Job Description

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. 

Job Description – Data Engineer (Snowflake, dbt, Airflow, Insurance Domain)

Position: Data Engineer

Rank: Senior

Experience: 5–8 years

Employment Type: Full-time

Overview

We are seeking an experienced Data Engineer with strong expertise in Snowflake, dbt, and Astronomer-managed Apache Airflow, along with a solid understanding of insurance business processes. This role focuses on building and optimizing scalable data pipelines, ensuring data quality, and enabling analytics capabilities for insurance datasets.

Key Responsibilities

  • Design, develop, and maintain ELT data pipelines using Snowflake, dbt, and Airflow.
  • Build and optimize data models in Snowflake to support analytics and reporting requirements.
  • Implement data transformation logic in dbt for business rules and semantic layer creation.
  • Automate workflows and data ingestion processes with Airflow DAGs.
  • Collaborate with domain experts to translate insurance data concepts (policies, claims, underwriting) into technical pipelines.
  • Monitor system performance, troubleshoot data processing issues, and optimize queries.
  • Prepare documentation and conduct knowledge-sharing sessions for team members.
Required Skills

  • Hands-on expertise in Snowflake for data warehousing.
  • Strong proficiency with dbt for modular, testable SQL transformations.
  • Solid experience with Astronomer/Apache Airflow for orchestration and scheduling.
  • Strong SQL and data modelling skills.
  • Knowledge of ELT best practices and performance optimization.
  • Understanding of insurance data domains including policies, claims, actuarial data, and compliance.
  • Familiarity with version control (Git) and CI/CD practices for data projects.
  • Knowledge of Data Vault modelling and building Data Vault models in dbt is a plus.
Preferred Qualifications

  • Experience with cloud environments such as AWS, Azure, or GCP.
  • Knowledge of Python for scripting in Airflow workflows.
Education

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related field.
EY | Building a better working world
