Job Description

Job Position: Azure Data Engineer Lead

Experience Required: 12+ Years.

Location: Hyderabad

Technical Skill Requirements: Azure Data Factory (API/APIM), Azure Databricks, Azure DevOps, Azure Data Lake Storage (ADLS), SQL, Synapse Data Warehouse, Azure Cosmos DB, Microsoft Fabric, Power BI


Roles & Responsibilities -

  • Lead the design, development, and support of large‑scale data platforms using Azure Data Factory (ADF), Azure Databricks, Azure DevOps, ADLS, SQL/Synapse Data Warehouse, Azure Cosmos DB, and Microsoft Fabric.
  • Develop, optimize, and maintain complex ETL/ELT pipelines, ensuring reliability, scalability, and high‑performance data processing.
  • Continuously assess existing applications and data platforms to identify architectural enhancements, modernization opportunities, and performance improvements.
  • Recommend and implement best practices for cloud-native data architecture, scalability, governance, and security.
  • Contribute to solution design reviews and collaborate with architects to ensure alignment with enterprise standards.
  • Manage and support high-volume, mission‑critical applications, ensuring uninterrupted operations across environments.
  • Lead triage calls involving cross‑functional stakeholders, ensuring timely communication and coordinated incident response.
  • Apply working knowledge of Docker, Kubernetes, and cloud‑native workloads to support containerized deployments and operational workflows.
  • Leverage experience with data virtualization platforms (e.g., Denodo) for improved data accessibility and integration (if applicable).
  • Apply strong knowledge of ITIL processes and ITSM tools for effective incident, change, and problem management.
  • Maintain governance over operational processes and promote continuous improvement across teams.
  • Lead and mentor data engineering/operations teams, guiding technical decision‑making and ensuring high performance.
  • Work flexibly in rotational shifts to support 24/7 operational needs and ensure platform reliability.


Requirements

  • 12+ years of experience in software development, technical operations, and running large-scale applications.
  • 8+ years of experience developing or supporting Azure Data Factory (API/APIM), Azure Databricks, Azure DevOps, Azure Data Lake Storage (ADLS), SQL/Synapse Data Warehouse, Azure Cosmos DB, and Microsoft Fabric.
  • 5+ years of hands-on data engineering and coding experience.
  • Experience with data virtualization products such as Denodo is desirable.
  • Azure Data Engineer or Azure Solutions Architect certification is desirable.
  • Should have a good understanding of container platforms like Docker and Kubernetes.
  • Ability to assess applications and platforms periodically for architectural improvements and provide input to the relevant teams.
  • Strong troubleshooting skills, with quick identification of application issues and rapid resolution with minimal or no user/business impact.
  • Hands-on experience working with high-volume, mission-critical applications.
  • Deep appreciation of IT tools, techniques, systems, and solutions.
  • Excellent communication skills and experience driving triage calls involving diverse technical stakeholders.
  • Creative problem-solving skills for cross-functional issues amid changing priorities.
  • Flexible and resourceful in swiftly managing changing operational goals and demands.
  • Proven experience leading teams and managing customer expectations.
  • Experience handling escalations, taking complete responsibility and ownership of critical issues through to technical/logical closure.
  • Good understanding of the IT Infrastructure Library (ITIL) framework and various IT Service Management (ITSM) tools available in the marketplace.
  • Flexibility to work in rotational shifts.
