Job Description

Role: Azure Data Engineer
Experience: 8+ Years
Work location: Hyderabad (Hybrid work mode)
Duration: Long-term
Job Description:
We are seeking a highly skilled and proactive Azure Data Factory (ADF) Developer to design, build, and maintain scalable, secure, and automated data integration pipelines across hybrid and cloud environments. The ideal candidate will have deep expertise in Azure Data Factory, data orchestration, ETL/ELT workflows, and integration with Azure services such as Azure SQL Data Warehouse and Blob Storage.
This role is critical in enabling real-time data delivery, end-to-end data consistency, and CI/CD-driven deployments, which are key pillars of modern data platforms.
Roles and Responsibilities:
1. Pipeline Design & Development
Design, develop, and implement robust Azure Data Factory pipelines for ETL/ELT processes across on-premises, cloud, and SaaS data sources.
Create reusable and modular pipeline components (e.g., templates, parameters, linked services) to ensure consistency and reduce duplication.
Implement incremental data loads, error handling, retry logic, and checkpointing for reliable data ingestion.
Optimize pipeline performance using parallel execution, partitioning, and data compression techniques.
2. Orchestration & Automation
Orchestrate complex workflows involving multiple data sources, transformations, and downstream systems (e.g., Power BI, Synapse, Databricks).
Integrate ADF with Azure Logic Apps, Azure Functions, and Event Grid for event-driven automation.
Implement pipeline triggers (scheduled, tumbling window, event-based) to support real-time and batch processing.
Ensure end-to-end data flow consistency across SSIS, ADF, and downstream systems (e.g., Power BI reports).
3. Data Quality & Governance
Integrate data validation checks within pipelines (e.g., row count validation, schema checks, null/invalid data detection).
Stop ingestion of bad or malformed data using conditional logic and failure triggers.
Collaborate with data engineers and analysts to ensure data lineage, metadata tracking, and compliance with data governance policies.
Support data quality dashboards and job status monitoring using Power BI or custom tools.
4. Collaboration & Documentation
Partner with data architects, BI developers, and business stakeholders to gather requirements and translate them into technical designs.
Maintain detailed documentation in Azure DevOps Wiki or Confluence covering pipeline logic, dependencies, error codes, and recovery procedures.
Participate in agile ceremonies (sprints, grooming, retrospectives), support sprint planning and estimation, and report coverage and defect trends to stakeholders.
