Job Description
Job Title: Databricks Architect
Location: Indore, Pune, Bangalore (Hybrid/Remote options available)
Work Timings: 2:00 PM to 11:00 PM IST
Experience Level: 10 to 15 Years
About the Role
We are seeking a seasoned Databricks Architect to lead the design and implementation of scalable data platforms from scratch. You will architect end-to-end ETL pipelines, optimize data workflows on Databricks, and drive innovation in big data solutions. This role demands hands-on expertise in greenfield builds, delivering high-performance, secure, and cost-efficient data architectures for enterprise clients.
Key Responsibilities
- Architect and deploy Databricks-based data platforms, including Unity Catalog, Delta Lake, and lakehouse architectures from scratch.
- Design, develop, and optimize ETL/ELT pipelines using PySpark, Python, and SQL for large-scale data processing.
- Lead data modeling, schema design, and performance tuning for complex projects involving batch, streaming, and real-time data.
- Collaborate with cross-functional teams to translate business requirements into scalable technical solutions.
- Implement best practices for data governance, security, CI/CD integration, and cost optimization on Databricks.
- Mentor junior engineers and conduct proof-of-concepts (POCs) for emerging Databricks features.
Must-Have Skills & Qualifications
- 10+ years of experience in data engineering/architecture, with 7+ years hands-on with Databricks.
- Expertise in Python, PySpark, and SQL for building high-performance ETL pipelines.
- Proven track record of architecting data projects from scratch (e.g., greenfield lakehouse implementations).
- Strong knowledge of ETL tools, Delta Lake, Spark SQL, and integration with cloud services (Azure/AWS/GCP preferred).
- Experience with data orchestration tools like Airflow, dbt, or Databricks Workflows.
- Bachelor's or Master's degree in Computer Science, IT, or a related field.
Preferred Skills
- Certifications: Databricks Certified Data Engineer/Architect Professional.
- Familiarity with MLflow, GenAI integrations, or advanced Spark optimizations.
- Exposure to multi-cloud environments and cost management strategies.
What We Offer
- Competitive salary and performance incentives.
- Flexible hybrid work model across Indore, Pune, and Bangalore.
- Opportunities for upskilling in cutting-edge data technologies.
- Collaborative team environment with a focus on work-life balance.
Apply for this Position
Ready to join? Click the button below to submit your application.