Job Description

Key Responsibilities:

  1. Data Pipeline Development:  

  • Design, develop, and optimize scalable data pipelines using Databricks, Snowflake, and Azure Data Factory (ADF).  

  • Implement ETL/ELT processes for structured and unstructured data across data lakes and warehouses.  

  2. Data Modeling & Optimization:  

  • Create efficient data models (e.g., star schema, snowflake schema) for Snowflake to ensure optimal query performance.  

  • Tune Databricks workflows for performance and scalability.  

  3. Integration & Collaboration:  

  • Integrate Snowflake with other systems (e.g., ADLS Gen2, Salesforce) and configure APIs for seamless data flow.  

  • Collaborate with cross-functional teams, including data scientists, analysts, and infrastructure engineers, to deliver end-to-end solutions.  

  4. Automation & CI/CD:  

  • Develop reusable jobs and implement CI/CD pipelines using GitHub Actions or Azure DevOps.  

  • Automate source-to-target mappings and data lineage tracking using tools like Collibra.  

  5. Troubleshooting & Documentation:  

  • Identify and resolve performance bottlenecks in Snowflake queries or Databricks jobs.  

  • Document workflows, processes, and technical solutions to ensure clarity and maintainability.  

Required Skills  

  • Hands-on experience with Snowflake, including schema design, query optimization, and SnowSQL scripting.  

  • Proficiency in Databricks, including PySpark workflows and distributed computing frameworks.  

  • Strong knowledge of ETL/ELT processes and tools such as Azure Data Factory (ADF) or dbt.  

  • Expertise in SQL for data manipulation and transformation; familiarity with NoSQL databases is a plus.  

  • Experience integrating cloud platforms such as Azure or AWS with Snowflake/Databricks.  

  • Familiarity with CI/CD pipelines using GitHub Actions or Azure DevOps.  

Qualifications  

  • Bachelor’s degree in Computer Science, Information Systems, or a related field.  

  • 3–5 years of experience as a Database Engineer or Data Engineer working with Snowflake and Databricks.  

  • Certifications in Snowflake or Databricks are a plus (e.g., SnowPro Certification).  

Requirements

Preferred Qualifications:

  • MBA from a premier business school.  

  • Experience working with financial institutions and an understanding of their challenges.  

  • Exposure to AI-led transformation, Agile delivery, or digital banking programs.  

  • Experience in client-facing consulting roles within banking.  
