Job Description

Job Title: AWS Data Engineer
Experience: 6+ Years
Positions: multiple
Engagement Type: Contract
Location: Remote
Role Overview
We are looking for two highly skilled Data Engineers to join our team on a contract basis. The ideal candidates will have strong experience in designing, building, and maintaining scalable data pipelines and working with modern cloud-based data platforms. This role requires hands-on expertise in AWS, Apache Airflow, Snowflake, and GitHub.
Key Responsibilities
Design, develop, and maintain robust and scalable data pipelines.
Build and orchestrate ETL/ELT workflows using Apache Airflow.
Ingest, process, and manage data using AWS services.
Develop and optimize data models and transformations in Snowflake.
Ensure data quality, reliability, and performance across pipelines.
Collaborate with analytics, product, and business teams.
Implement best practices for version control and CI/CD using GitHub.
Monitor, troubleshoot, and resolve data pipeline issues.
Document data workflows, architecture, and technical processes.
Required Skills & Experience
6+ years of experience in Data Engineering or similar roles.
Strong hands-on experience with AWS (S3, EC2, Lambda, Redshift, etc.).
Solid experience with Apache Airflow for workflow orchestration.
Strong expertise in Snowflake for cloud data warehousing.
Proficiency in GitHub for version control and collaboration.
Advanced SQL skills and experience working with large datasets.
Experience with ETL/ELT design patterns and data modeling.
Good understanding of data quality, monitoring, and performance tuning.
Good to Have
Experience with Python or PySpark.
Exposure to streaming tools like Kafka/Kinesis.
Experience with CI/CD and DevOps practices.
Knowledge of data governance and security best practices.
