Job Description

Role: Data Engineer

Location: Remote

Experience: 5-10 years

Skill Set:

AWS, Snowflake, Kafka, Airflow, GitHub, PySpark, Python

Key Responsibilities:

  • Design, develop, and maintain scalable ETL/ELT pipelines
  • Ingest data from various sources (APIs, databases, files, etc.)
  • Implement both real-time and batch processing solutions based on use case requirements
  • Ensure data quality through validation and cleansing processes
  • Collaborate with Product Managers and Business Stakeholders to gather and understand data requirements
  • Translate business needs into technical specifications
  • Ensure data security, access control, and compliance with relevant policies
  • Maintain documentation and follow data engineering best practices

Apply for this Position

Ready to join Vriba Solutions? Submit your application below.
