Job Description
Requirements
MS Fabric, SQL, Python
Job Summary:
We are seeking an experienced Data Engineer (GCP) to design, build, and maintain scalable, high-performance data solutions on Google Cloud Platform. The ideal candidate will have strong expertise in BigQuery, Apache Airflow, SQL, and Python or Java, with a proven ability to develop robust data pipelines that support analytics and business intelligence initiatives.
Key Responsibilities
- Design, develop, and optimize scalable data pipelines on Google Cloud Platform (GCP).
- Build and manage data solutions using BigQuery for large-scale data processing and analytics.
- Orchestrate and schedule workflows using Apache Airflow.
- Develop data processing logic using Python or Java.
- Write optimized and complex SQL queries for data transformation and reporting.
- Ensure data quality, performance, security, and reliability across data systems.
- Collaborate with data scientists, analysts, and business stakeholders to support data-driven initiatives.
- Troubleshoot data issues and continuously improve pipeline efficiency and scalability.
Required Skills & Qualifications:
- 6-8 years of experience as a Data Engineer.
- Strong hands-on experience with Google Cloud Platform (GCP).
- Expertise in BigQuery for data warehousing and analytics.
- Experience with Apache Airflow for workflow orchestration.
- Proficiency in Python or Java for data engineering tasks.
- Strong SQL skills for data modeling and transformations.
- Solid understanding of ETL/ELT processes, data architecture, and cloud-native data platforms.
Preferred Skills:
- Experience with GCP services such as Cloud Storage, Dataflow, Pub/Sub, or Composer.
- Knowledge of data governance, monitoring, and performance tuning on cloud platforms.