Job Description - Data Platform Lead
Job Summary:
We are seeking a highly skilled, forward-thinking Data Platform Lead with expertise across AWS/Azure cloud services, Databricks, and API development (FastAPI). This role involves leading a small but high-performing engineering team, architecting scalable data and application platforms, and driving modern data engineering practices across the organization.
The candidate will provide architectural solutions for cloud-based application development and data engineering initiatives.
Key Responsibilities:
- Design, develop, and deploy scalable cloud-native applications using AWS and Azure services.
- Lead the end-to-end implementation and optimization of the Databricks platform.
- Write clean, efficient, and modular Python code for data processing, transformation, and automation.
- Architect and develop high-performance REST APIs and microservices using Python (FastAPI) for data access, workflow automation, and platform integrations.
- Collaborate with teams to understand business requirements and translate them into technical solutions; maintain documentation and contribute to knowledge sharing across the team.
- Build and optimize ETL/ELT pipelines, ingestion frameworks, streaming workloads, and data transformation layers.
- Ensure data platform governance, including cost optimization, access controls, workspace management, and environment standardization.
Required Skills & Qualifications:
- 9+ years of experience in application and API development.
- Strong hands-on experience with cloud providers such as AWS and Azure.
- Deep understanding and practical experience with Databricks (including notebooks, Delta Lake, MLflow, and Spark).
- Proficiency in advanced Python, including libraries such as Pandas, NumPy, PySpark, and FastAPI.
- Familiarity with CI/CD pipelines, version control (Git), and DevOps practices.
- Excellent problem-solving, communication, and collaboration skills.
- Any certification in AWS, Azure, or Databricks.
Nice-to-Have:
- Experience with cross-platform integrations (Snowflake, Kafka, Redshift, Cosmos DB, S3/Azure Data Lake Storage Gen2).
- Experience building data quality frameworks.
- Understanding of cost governance and resource optimization.
- Exposure to modern front-end frameworks.
- IoT exposure (AWS IoT Greengrass).
