Job Description
Job Title
Senior Data Architect / Lead Data Engineer – Azure & Databricks
Job Summary
We are seeking an experienced Senior Data Architect / Lead Data Engineer to design, build, and lead cloud-native data platforms on Azure, with a strong focus on Databricks. The role involves architecting end-to-end modern data solutions, leading data strategy initiatives, enabling DataOps/MLOps practices, and collaborating closely with customers and internal teams to drive data modernization at scale.
Key Responsibilities
- Design and develop modern, cloud-native data architectures to support analytics, BI, and advanced insights.
- Build and optimize cost-effective data infrastructure using Azure Databricks.
- Orchestrate data workflows using Databricks Workflows and Azure Data Factory (ADF).
- Lead data strategy and architecture discussions focused on scalability, performance, and flexibility.
- Design and implement CI/CD pipelines for Databricks using Azure DevOps.
- Partner with customers to deliver data modernization and migration solutions.
- Develop training plans and learning materials to upskill internal associates.
- Build industry-specific and domain-driven data solutions tailored to customer needs.
- Design and implement a Smart Operations framework covering DataOps and MLOps best practices.
- Collaborate with Data Engineering, Data Management, BI, and Analytics teams in complex enterprise environments.
Required Skills & Experience
- 10+ years of overall experience in data engineering and analytics, including 4+ years delivering cloud-native, end-to-end data solutions (ingestion to consumption).
- Strong hands-on experience architecting and implementing Modern Data Platforms on Azure.
- Advanced experience with Databricks, PySpark, and distributed data processing frameworks.
- Proven expertise in cloud-native architecture, data governance, and security.
- Extensive experience migrating on-premises data platforms (Hadoop, Spark) to Azure Databricks.
- Strong understanding of data warehouse concepts, including columnar database architectures.
- Hands-on experience with ETL/ELT tools, Kafka or streaming platforms, and modern data integration patterns.
- Solid understanding of Agile/Scrum methodologies and working in iterative delivery models.
- Ability to collaborate with cross-functional teams in a large-scale IT ecosystem.
Preferred / Added Advantage
- Experience designing and implementing Data Mesh architectures and Data Products.
- Exposure to industry-specific data solutions.
- Knowledge of advanced analytics and ML enablement on cloud data platforms.
Apply for this Position
Ready to join? Click the button below to submit your application.
Submit Application