Job Description

Job Responsibilities

  • Design, develop, and maintain ETL pipelines to ingest, transform, and load data from multiple sources.
  • Build and optimize data pipelines for performance, reliability, and scalability.
  • Work with cloud-based data lakehouse platforms such as Databricks, BigQuery, or similar technologies.
  • Develop and maintain data processing workflows using Python.
  • Write complex, optimized SQL queries to support analytics and reporting requirements.
  • Design and manage data models to support data warehousing and business intelligence use cases.
  • Process and manage structured and semi-structured data (e.g., JSON, CSV, logs).
  • Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
  • Ensure data quality, integrity, and consistency across data platforms.
  • Troubleshoot and resolve data pipeline and performance issues.
  …

Apply for this Position

Ready to join Yondu, Inc.? Submit your application below.