Job Description

About the Role

We are seeking a highly skilled Data Engineer with deep expertise in the Microsoft Azure ecosystem and Microsoft Fabric to design, build, and optimize scalable data pipelines and Medallion lakehouse architectures. The ideal candidate will provide thought leadership in transitioning legacy environments to unified Fabric workspaces, delivering robust, secure, and high-performance solutions that power AI and analytics.


Key Responsibilities

  • Fabric & Lakehouse Implementation: Design and maintain unified data environments using Microsoft Fabric (OneLake, Lakehouses, and Warehouses) and Azure Synapse.
  • Pipeline Orchestration: Develop scalable data pipelines using Fabric Data Factory and Azure Data Factory to ingest data from diverse sources (APIs, On-premises, Cloud).
  • Architecture Optimization: Build and optimize Medallion architectures (Bronze/Silver/Gold) using Delta Lake and Fabric Notebooks (Spark); see the sketch after this list for a representative transformation.
  • Thought Leadership: Work with stakeholders to translate business requirements into technical roadmaps, specifically advising on when to use Fabric vs. Azure Databricks.
  • Unified Governance: Leverage Microsoft Purview and Fabric’s native security features to ensure data quality and consistency and to enforce RBAC and sensitivity labeling.
  • Infrastructure & Automation: Implement Infrastructure as Code (IaC) using Terraform or Bicep for automated provisioning of Fabric capacities and Azure resources.
  • Advanced Analytics Support: Design star schemas and utilize Direct Lake mode in Power BI to deliver high-performance reporting for data scientists and analysts.
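
To make the Medallion responsibility above concrete, the following is a minimal sketch of a Bronze-to-Silver step as it might look in a Fabric (Spark) notebook. It is illustrative only: the table and column names (bronze.sales_orders, order_id, order_ts, amount) are hypothetical placeholders, not artifacts of this role.

```python
# Minimal Bronze -> Silver sketch for a Fabric (Spark) notebook.
# All table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # Fabric notebooks provide a session automatically

# Read raw ingested records from a hypothetical Bronze Delta table.
bronze_df = spark.read.table("bronze.sales_orders")

# Typical Silver-layer cleanup: deduplicate, enforce types, drop invalid rows.
silver_df = (
    bronze_df
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())
)

# Persist the curated result as a Silver Delta table for the Gold layer to consume.
silver_df.write.format("delta").mode("overwrite").saveAsTable("silver.sales_orders")
```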


Required Technical Skills

  • Microsoft Fabric Ecosystem: Expert-level knowledge of OneLake, Fabric Capacities, Lakehouse/Warehouse artifacts, and Shortcuts.
  • Azure Data Services: Proven experience with Azure Data Factory, Azure Databricks, and Azure Data Lake Storage Gen2.
  • Data Processing: Strong proficiency in PySpark (Spark SQL & DataFrames) for complex transformations and performance tuning.
  • Languages: Advanced SQL and Python skills for pipeline development and orchestration.
  • DevOps & IaC: Experience with Terraform/Bicep and CI/CD practices using Azure DevOps or GitHub Actions.
  • Real-Time Data: Experience with streaming data using Fabric Eventstreams or Azure Event Hubs (see the streaming sketch after this list).
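
As an illustration of the real-time skill above, here is a minimal Structured Streaming sketch that reads events from Azure Event Hubs through its Kafka-compatible endpoint and lands them in a Bronze Delta table. The namespace, topic, connection string, checkpoint path, and table name are hypothetical placeholders, and the Spark Kafka connector is assumed to be available on the cluster.

```python
# Minimal sketch: stream Azure Event Hubs data (Kafka-compatible endpoint)
# into a Bronze Delta table with Spark Structured Streaming.
# Namespace, topic, connection string, paths, and table names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Event Hubs authenticates Kafka clients with a connection string (placeholder below).
jaas = (
    'org.apache.kafka.common.security.plain.PlainLoginModule required '
    'username="$ConnectionString" password="<event-hubs-connection-string>";'
)

raw_stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "my-namespace.servicebus.windows.net:9093")
    .option("subscribe", "telemetry")  # the Event Hub name acts as the Kafka topic
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option("kafka.sasl.jaas.config", jaas)
    .option("startingOffsets", "latest")
    .load()
)

# Decode the payload and keep the broker timestamp for downstream layers.
events = raw_stream.select(
    F.col("value").cast("string").alias("body"),
    F.col("timestamp").alias("ingested_at"),
)

# Continuously append raw events to a Bronze Delta table.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "Files/checkpoints/bronze_telemetry")
    .toTable("bronze_telemetry")
)
query.awaitTermination()
```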


Soft Skills & Mindset

  • Strategic Thinking: Ability to navigate the evolving Microsoft roadmap and choose the right tool for the specific scale/cost requirement.
  • Collaboration: Strong ability to bridge the gap between "Pro-code" engineering and "Low-code" analytics users.
  • Communication: Robust client management skills with the ability to explain complex cloud architectures to non-technical stakeholders.


Preferred Qualifications

  • Certifications: DP-600 (Implementing Analytics Solutions Using Microsoft Fabric) or DP-203 (Azure Data Engineer).
  • Advanced Tools: Experience with Unity Catalog, Delta Live Tables, or Fabric’s Copilot features.

Apply for this Position

Ready to join? Submit your application to be considered for this position.
