Job Description

Overview

We are seeking a highly experienced Data Engineer (5+ years). This role is critical to meeting product rollout deadlines, as the team's work is a hard, direct dependency for other product feature rollouts. The ideal candidate will be a hands-on developer with deep expertise in the AWS data stack, focused primarily on data engineering and pipeline development.

Key Responsibilities
  • Develop and Implement Data Pipelines: Design, build, and maintain robust data pipelines primarily using AWS Glue and PySpark (an illustrative sketch follows this list).
  • Data Sourcing and Transformation: Source data from various systems, including Redshift and Aurora, performing the necessary streaming transformations and heavy data cleaning.
  • Data Delivery: Push the resulting cleaned datasets into S3 buckets.
  • External Integration: Manage the secure transfer of resulting files via SFTP ...
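
For illustration, the responsibilities above describe a fairly standard Glue-based flow: read from a catalogued source such as Redshift or Aurora, clean and transform with PySpark, and write the result to S3. The snippet below is a minimal sketch of such a job, assuming hypothetical database, table, column, and bucket names; the SFTP hand-off is not shown.

    # Minimal, illustrative Glue/PySpark job matching the flow above.
    # All names (database, table, columns, bucket) are placeholders for
    # illustration only; they do not reflect any actual InterSources system.
    import sys

    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext
    from pyspark.sql import functions as F

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])

    sc = SparkContext()
    glue_context = GlueContext(sc)
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Sourcing: read a table registered in the Glue Data Catalog
    # (for example, a Redshift or Aurora source added by a crawler).
    source_dyf = glue_context.create_dynamic_frame.from_catalog(
        database="analytics_db",   # hypothetical catalog database
        table_name="orders",       # hypothetical source table
    )

    # Transformation: basic cleaning -- drop rows missing the key field
    # and derive a partition column from an assumed timestamp column.
    df = source_dyf.toDF()
    cleaned = (
        df.dropna(subset=["order_id"])
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .withColumn("order_date", F.to_date("order_ts"))
    )

    # Delivery: write the cleaned dataset to S3 as partitioned Parquet.
    (
        cleaned.write.mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-bucket/cleaned/orders/")  # placeholder bucket
    )

    job.commit()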

Apply for this Position

Ready to join InterSources? Submit your application for this position.
