Job Description

Responsibilities:


  • Design and implement end-to-end data pipelines using Cloud Dataflow (Python/Apache Beam) for batch and streaming data.

  • Develop, optimize, and maintain BigQuery stored procedures (SPs), SQL scripts, and user-defined functions (UDFs) that implement complex transformations and business logic.

  • Build and manage data orchestration workflows using Cloud Composer (Airflow) with appropriate operators and dependencies.

  • Establish secure, efficient connections to source systems for data ingestion and integration.

  • Manage data ingestion workflows from on-premise and cloud sources into Google Cloud Storage (GCS) and BigQuery.

  • Execute history data migration from legacy data warehouses (preferably Snowflake, Teradata, Net...

Apply for this Position

Ready to join CYNET SYSTEMS? Submit your application.
