Job Description
Position: Data Quality Engineer
Location: 100% remote (Mexico)
Skills Required
- Databricks, Azure
- Data quality management
- Governance and quality management: metadata management, data lineage tracking, KPI reporting
Job Summary
We are looking for a Data Quality Engineer to design and maintain data quality, governance, and monitoring solutions across enterprise data platforms. The role focuses on ensuring accurate, reliable, and well-governed data for analytics and reporting.
Key Responsibilities
- Build and automate data quality checks across data pipelines
- Develop ETL/ELT workflows using Azure Data Factory and Airflow
- Create KPI reports and dashboards
- Implement data governance, lineage, and metadata best practices
- Build data models and pipelines on Databricks, Azure Synapse, and Snowflake
- Manage CI/CD pipelines using GitHub Actions or Azure DevOps
- Monitor and report on data quality KPIs
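
For context, the kind of automated data quality check described above can be as simple as a rule that flags a column whose null rate exceeds a KPI threshold. The sketch below uses plain Python with hypothetical column names and a hypothetical 5% threshold; in this role such checks would typically run in PySpark on Databricks or through a tool like Ataccama.

```python
# Illustrative sketch only: a minimal completeness (null-rate) check.
# Column names and the 5% threshold are hypothetical examples.

def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for row in rows if row.get(column) is None)
    return missing / len(rows)

def completeness_check(rows, column, max_null_rate=0.05):
    """Return (passed, observed_rate) for a null-rate KPI threshold."""
    rate = null_rate(rows, column)
    return rate <= max_null_rate, rate

# Example: 1 missing email out of 4 rows -> 25% null rate, check fails
sample = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": None},
    {"customer_id": 3, "email": "c@example.com"},
    {"customer_id": 4, "email": "d@example.com"},
]
passed, rate = completeness_check(sample, "email", max_null_rate=0.05)
```

In production, the observed rate would be written to a monitoring table so it can feed the KPI dashboards mentioned above.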
Required Skills
- Strong SQL, Python, and PySpark
- Experience with Azure (ADF, Synapse, Databricks)
- Data quality and governance tools (Ataccama DQMS preferred)
- Knowledge of Delta Lake, Parquet, and data pipeline monitoring
- Understanding of data governance, lineage, and metadata management
Apply for this Position
Ready to join? Click the button below to submit your application.
Submit Application