Job Description

Overview

You will design, develop, and maintain data pipelines and own the health of our data ecosystem, using Databricks Delta Lake and Auto Loader to ensure data quality and availability.

Responsibilities

  • Develop high-quality, scalable ETL/ELT pipelines using Delta Lake and DLT.
  • Implement ingestion patterns using Auto Loader with schema evolution.
  • Configure Unity Catalog: schemas, user access, and audit logging.
  • Troubleshoot and optimize jobs using the Photon engine.
  • Build secure DLT pipelines with Bronze/Silver/Gold layering.
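The ingestion and layering patterns above can be sketched in a Delta Live Tables (DLT) pipeline definition. This is a minimal, illustrative example, not our production code: it assumes a JSON landing zone, and all paths and table names are hypothetical placeholders. It runs only inside a Databricks DLT pipeline, where `spark` is provided by the runtime.

```python
# Sketch: Bronze/Silver layers with Auto Loader ingestion and schema evolution.
# Hypothetical paths/table names; executable only within a Databricks DLT pipeline.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Bronze: raw files ingested incrementally via Auto Loader")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        # Auto Loader tracks the inferred schema here and evolves it
        # as new columns appear in the source files.
        .option("cloudFiles.schemaLocation", "/Volumes/main/raw/_schemas/orders")
        .option("cloudFiles.schemaEvolutionMode", "addNewColumns")
        .load("/Volumes/main/raw/orders")
    )


@dlt.table(comment="Silver: validated orders with an ingestion timestamp")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("ingested_at", F.current_timestamp())
    )
```

A Gold layer would typically aggregate from the Silver table for consumption, and access to each schema would be governed through Unity Catalog grants.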

Must-Have Skills & Experience

  • 3+ years of hands-on experience in data engineering or a similar role.
  • Strong programming and debugging skills in Python.
  • Solid hands-on experience with PySpark for large-scale data processing.
  • Proven experience building ETL/ELT pipelines on Databricks.
  • Proficiency with Delta Lake, Auto...

Apply for this Position

Ready to join Accenture in the Philippines? Submit your application today.