Job Description

Our client represents the connected world, offering innovative and customer-centric information technology experiences, enabling Enterprises, Associates, and Society to Rise™.

They are a USD 6 billion company with 163,000+ professionals across 90 countries, helping 1,279 global customers, including Fortune 500 companies. They focus on leveraging next-generation technologies, including 5G, Blockchain, Metaverse, Quantum Computing, Cybersecurity, Artificial Intelligence, and more, to enable end-to-end digital transformation for global customers.

Our client is one of the fastest-growing brands and is among the top 7 IT service providers globally. They have consistently emerged as a leader in sustainability and are recognized among the ‘2021 Global 100 Most Sustainable Corporations in the World’ by Corporate Knights.

We are currently searching for a Senior Data Engineer:

Responsibilities:

  • Design, implement, and maintain scalable data pipelines on the Azure platform using core tools like Azure Data Factory and Synapse Analytics (a minimal pipeline sketch follows this list).
  • Develop, optimize, and manage Power BI dashboards to support enterprise reporting and data visualization needs.
  • Translate business requirements into efficient data models, leveraging DAX, dimensional modeling, and Power BI to communicate insights effectively.
  • Work closely with cross-functional teams to integrate data pipelines and ensure seamless delivery of analytics and reporting solutions.
  • Enhance Cosmos processing performance by writing and optimizing Scope Scripts for large-scale data jobs.
  • Maintain strong data governance by ensuring data quality, security, and compliance across all systems and processes.
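
For illustration only, here is a minimal PySpark sketch of the kind of pipeline work described above; the storage account, container paths, and column names are all hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

# All names below are hypothetical: storage account, containers, columns.
spark = SparkSession.builder.appName("sales-pipeline").getOrCreate()

# Ingest raw CSVs landed in an Azure Data Lake Storage container.
raw = (spark.read
       .option("header", True)
       .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/"))

# Light cleansing: cast the measure column and drop rows missing the key.
clean = (raw
         .withColumn("amount", F.col("amount").cast("double"))
         .dropna(subset=["order_id"]))

# Aggregate to the reporting grain a Power BI model might consume.
daily = (clean.groupBy("order_date")
         .agg(F.sum("amount").alias("total_amount")))

# Persist as Parquet to a curated zone for downstream Synapse / Power BI use.
daily.write.mode("overwrite").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/sales_daily/")
```

In practice, a job like this would typically be orchestrated by an Azure Data Factory or Synapse pipeline rather than run standalone.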

Requirements:

  • Tech Stack Must-Haves: Expertise in Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage, SQL, Power BI, and DAX (an illustrative query sketch follows this list).
  • Strong experience in designing and implementing data pipelines.
  • Proven ability to optimize and manage enterprise-level reporting solutions.
  • Excellent skills in translating business requirements into technical data solutions.
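
As a rough sketch of the SQL and dimensional-modeling skills listed above, the following Python snippet queries a hypothetical star schema in a Synapse SQL pool via pyodbc; the server, database, tables, and columns are placeholders:

```python
import pyodbc

# Placeholder connection string for a Synapse dedicated SQL pool.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-synapse.sql.azuresynapse.net;"
    "DATABASE=ReportingDb;Authentication=ActiveDirectoryInteractive;"
)

# A star-schema query of the kind a Power BI model would sit on:
# a fact table joined to a date dimension, aggregated to month.
sql = """
SELECT d.CalendarMonth, SUM(f.SalesAmount) AS TotalSales
FROM FactSales AS f
JOIN DimDate AS d ON f.DateKey = d.DateKey
GROUP BY d.CalendarMonth
ORDER BY d.CalendarMonth;
"""

with conn.cursor() as cur:
    for month, total in cur.execute(sql):
        print(month, total)
```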

Desired:

  • Experience with Cosmos Scope Scripting for large-scale data jobs.
  • Experience with PySpark for data processing.
  • Familiarity with Kusto Query Language (KQL); a brief illustrative query follows this list.
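
As a brief illustration of the KQL familiarity mentioned above, a small Python sketch using the azure-kusto-data client; the cluster URL, database, and table names are placeholders:

```python
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Placeholder cluster URL; authenticates with an existing Azure CLI login.
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://example.kusto.windows.net")
client = KustoClient(kcsb)

# A simple KQL query: event counts per status over the last day.
# The database and table are hypothetical.
query = """
PipelineEvents
| where Timestamp > ago(1d)
| summarize Count = count() by Status
"""

response = client.execute("TelemetryDb", query)
for row in response.primary_results[0]:
    print(row["Status"], row["Count"])
```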

Languages:

  • Advanced Oral English.
  • Native Spanish.

Note:

  • Fully remote.

If you meet these qualifications and are pursuing new challenges, start your application to join an award-winning employer. Explore all our job openings | Sequoia Careers Page: https://www.sequoia-connect.com/careers/.



Keywords:
Azure Data Factory, ADF, Synapse Analytics, Azure Data Lake Storage, ADLS, Power BI, DAX, SQL, PySpark, Cosmos Scope Scripting, KQL, Kusto Query Language, ETL, ELT
