Job Description

We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, SQL, and PySpark.

  • Experience - 4 to 6 years

Requirements

Key Responsibilities:

  • Create and support data pipelines built on AWS technologies including Glue, Redshift, EMR, Kinesis, and Athena
  • Participate in deep architectural discussions to build confidence and ensure customer success when building new solutions and migrating existing data applications to the AWS platform
  • Optimize the data integration platform to maintain performance as data volumes grow
  • Support the data architecture and data governance functions as they continually expand their capabilities
  • Experience developing solution architecture for enterprise data lakes (applicable to AM/Manager-level candidates)
  • Exposure to client-facing roles
  • Strong communication, interpersonal, and team management skills


THE INDIVIDUAL

  • Proficient in object-oriented/functional scripting languages: PySpark, Python, etc.
  • Experience using AWS SDKs to create data pipelines: ingestion, processing, and orchestration
  • Hands-on experience working with big data in an AWS environment, including cleaning, transforming, cataloguing, and mapping
  • Good understanding of AWS components, storage (S3), and compute (EC2) services
  • Hands-on experience with AWS managed services (Redshift, Lambda, Athena) and ETL (Glue)
  • Experience migrating data from on-premises sources (e.g. Oracle, API-based, data extracts) into AWS storage (S3)
  • Experience setting up a data warehouse with Amazon Redshift, creating Redshift clusters, and performing data analysis queries
  • Experience in ETL and data modelling with AWS ecosystem components: AWS Glue, Redshift, DynamoDB
  • Experience setting up AWS Glue to prepare data for analysis through automated ETL processes
  • Familiarity with AWS data migration tools such as AWS DMS, Amazon EMR, and AWS Data Pipeline
  • Hands-on experience with the AWS CLI, Linux tools, and shell scripts
  • AWS certifications are a plus

QUALIFICATION

  • BE/BTech/MCA
  • 4 to 6 years of strong experience in 3-4 of the skills listed above

Benefits

Work with one of the Big 4 firms in India.

Apply for this Position

Ready to join? Click the button below to submit your application.

Submit Application