Job Description

Experience:

  • 8+ years of professional experience working with data, ETL tools, and data architecture.
  • Expert-level proficiency in SQL.
  • Strong preference for experience with cloud data platforms such as Snowflake.
  • Deep experience with dbt (Data Build Tool) for managing transformations and data quality.
  • Solid working experience with Python for data manipulation and scripting.
  • Proven ability to implement and manage modern data architectures (e.g., Data Lake, Data Warehouse, Lakehouse).
  • Experience developing and maintaining production-level DAGs in Apache Airflow.

Responsibilities:

  • Design, develop, and maintain robust, scalable, and efficient ETL/ELT data pipelines.
  • Develop complex data transformation logic using dbt and advanced SQL.
  • Implement, monitor, and manage workflows using Apache Airflow for scheduling and orchestration.
