Job Description
Responsibilities
- Participate in requirements gathering and analyze business processes to assess technical feasibility.
- Design and maintain ETL workflows using Airflow or similar tools (a minimal sketch follows this list).
- Develop Spark jobs for data transformation and optimization.
- Validate input and output data for quality and consistency.
- Configure and monitor data quality alarms and thresholds.
- Collaborate with QA and other engineers for testing and troubleshooting.
- Maintain documentation and ensure reproducibility of data processes.
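As a hedged illustration of the Airflow-based ETL work described above, here is a minimal sketch using Airflow's TaskFlow API. The DAG name, schedule, and task bodies are placeholder assumptions for the example, not an actual Quantrics pipeline.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_sales_etl():  # hypothetical DAG name
    @task
    def extract() -> list[dict]:
        # Placeholder: pull raw records from a source system.
        return [{"id": 1, "amount": 42.0}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Placeholder transformation; a real job might hand off to Spark here.
        return [{**r, "amount_cents": int(r["amount"] * 100)} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder load step; just reports the row count.
        print(f"loaded {len(records)} rows")

    # TaskFlow wiring gives Airflow the extract -> transform -> load dependency chain.
    load(transform(extract()))


daily_sales_etl()
```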
Core Competencies (Must-have)
- At least 5 years of experience with Apache Spark - Expertise in distributed data processing, Spark SQL, and performance tuning (see the PySpark sketch after this list).
- At least 5 years of experience in Airflow / Workflow Orchestration - Ability to design and manage complex ETL pipelines with dependency handling.
- At least 5 years of experience with BigQuery, SQL & PostgreSQL ...
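To illustrate the Spark SQL and performance-tuning skills listed above, here is a small PySpark sketch. The session config value, input path, and column names are assumptions made for the example only.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# One common tuning lever: sizing shuffle partitions for the cluster.
spark = (
    SparkSession.builder
    .appName("orders-daily-totals")  # hypothetical job name
    .config("spark.sql.shuffle.partitions", "200")
    .getOrCreate()
)

# Hypothetical input path and schema (order_date, status, amount).
orders = spark.read.parquet("s3://example-bucket/orders/")

# Spark SQL-style aggregation: daily totals over completed orders.
daily_totals = (
    orders
    .where(F.col("status") == "complete")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

daily_totals.write.mode("overwrite").parquet("s3://example-bucket/daily_totals/")
```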