Job Description

GSPANN is hiring an Azure Data Engineer Lead to design, build, and operate large-scale, mission-critical data platforms on Microsoft Azure. The role focuses on leading the development of ETL/ELT pipelines and cloud-native data architecture, ensuring platform reliability, and mentoring data engineering teams using Azure Data Factory, Databricks, Synapse, and Microsoft Fabric.

Role and Responsibilities

  • Design, build, and support large-scale data platforms using Azure Data Factory, Azure Databricks, Azure DevOps, Azure Data Lake Storage, SQL/Synapse, Azure Cosmos DB, and Microsoft Fabric.
  • Develop, optimize, and operate complex ETL/ELT pipelines to ensure scalable, reliable, and high-performance data processing.
  • Review existing data platforms and applications to identify architectural improvements, modernization opportunities, and performance optimizations.
  • Define and implement cloud-native best practices for data architecture, scalability, governance, and security.
  • Participate in solution design reviews and collaborate with architects to align solutions with enterprise standards.
  • Support high-volume, mission-critical data platforms and ensure uninterrupted operations across environments.
  • Lead cross-functional triage calls, manage incidents, and drive coordinated resolution with clear stakeholder communication.
  • Support containerized workloads using Docker and Kubernetes in cloud-native environments.
  • Apply data virtualization concepts and platforms such as Denodo to improve data access and integration, where applicable.
  • Execute incident, change, and problem management processes using ITIL practices and ITSM tools.
  • Govern operational processes and drive continuous improvement initiatives across data platforms.
  • Lead and mentor data engineering and operations teams, providing technical guidance and ownership.
  • Participate in rotational shifts to support 24×7 operations and platform reliability.

Skills and Experience

  • 12+ years of experience in software engineering, data platforms, or technical operations for large-scale systems.
  • 8+ years of hands-on experience with Azure Data Factory, Azure Databricks, Azure DevOps, Azure Data Lake Storage, SQL/Synapse, Azure Cosmos DB, and Microsoft Fabric.
  • 5+ years of hands-on data engineering experience with strong coding capabilities.
  • Experience with data virtualization tools such as Denodo (preferred).
  • Strong understanding of container platforms, including Docker and Kubernetes.
  • Proven expertise supporting high-volume, mission-critical applications.
  • Excellent troubleshooting skills with the ability to resolve issues quickly with minimal business impact.
  • Strong experience leading incident triage calls and handling critical escalations end-to-end.
  • Solid understanding of ITIL frameworks and IT Service Management (ITSM) tools.
  • Experience leading teams, managing stakeholder expectations, and driving operational accountability.
  • Ability to assess platforms periodically and recommend architectural and operational improvements.
  • Hands-on experience with data pipelines, ETL/ELT orchestration, and Microsoft Fabric.
  • Familiarity with MLOps practices and cloud platforms such as Azure or AWS.
  • Exposure to cloud AI services, including Azure AI, Copilot Studio, and AI Foundry (advantage).
  • Experience designing end-to-end integrations across platforms and products.
  • Familiarity with modern AI patterns such as RAG, MCP, and multi-agent architectures (advantage).
  • Strong communication skills with fluency in English.