Job Description
Locations: Pan India
Experience: 10+ Years
Role: Databricks Architect
Overview:
We are seeking a highly skilled and experienced Databricks Architect to lead the design, implementation, and optimization of data solutions tailored for the Banking, Finance, and Insurance sectors. The ideal candidate will have a deep understanding of Databricks, big data technologies, and cloud platforms, with a proven track record of delivering scalable and secure data architectures.
Key Responsibilities:
Design and implement end-to-end data solutions using Databricks for financial data processing, analytics, and reporting.
Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
Develop and optimize scalable data pipelines and workflows for real-time and batch processing.
Ensure data governance, security, and compliance with industry regulations and best practices.
Lead the migration of legacy systems to modern cloud-based data platforms.
Provide technical leadership and mentorship to data engineering teams.
Stay updated with the latest advancements in Databricks and related technologies to drive innovation.
Required Skills and Qualifications:
Extensive experience with Databricks, Apache Spark, and Delta Lake.
Proficiency in cloud platforms such as Azure, AWS, or Google Cloud.
Strong knowledge of SQL, Python, and data pipeline optimization.
Familiarity with financial data processing, risk management, and regulatory compliance.
Excellent problem-solving and communication skills.
Databricks certification is a plus.
Good to have:
Experience with Azure Data Factory, Synapse Analytics, and Data Lake Storage.
Knowledge of machine learning frameworks and advanced analytics.
Understanding of data visualization tools like Power BI or Tableau.