Job Description

TCS Hiring!! Virtual Drive: 21st, 22nd, 23rd January 2026


Skills: BigQuery, Cloud Storage, Dataproc, Airflow

We are hiring a GCP Data Engineer to design, build, and optimize scalable data pipelines and analytics solutions using BigQuery, Cloud Storage, Dataproc, and Airflow.



Drive schedule:

21st January (Wednesday), 12 to 1 PM | Hyderabad | 6-8 Years | Immediate Joiners

22nd January (Thursday), 12 to 1 PM | Hyderabad | 6-8 Years | Immediate Joiners

23rd January (Friday), 12 to 1 PM | Hyderabad | 6-8 Years | Immediate Joiners


Please read the job description before applying.


NOTE: If your skills/profile match and you are interested, please reply to this email with your latest updated CV attached, along with the details below:

Name:

Contact Number:

Email ID:

Highest Qualification: (e.g. B.Tech/B.E./M.Tech/MCA/M.Sc./MS/BCA/B.Sc./etc.)

Current Organization Name:

Total IT Experience (7+ Yrs):

Current Location (role is based in Hyderabad):

Current CTC:

Expected CTC:

Notice period:

Previously worked with TCS (Y/N):


Required skills:

· GCP Services: BigQuery, Cloud Storage, Dataproc, Cloud Composer (managed Airflow) or self-managed Airflow.

· Airflow: Strong experience in DAG creation, operators/hooks, scheduling, backfilling, retry strategies, and CI/CD for DAG deployments.

· Programming: Proficiency in Python (PySpark, Airflow DAGs) and SQL (advanced BigQuery SQL).

· Data Modeling: Dimensional modeling (star/snowflake schemas), data vault basics, and schema design for analytics.

· Performance Tuning: BigQuery partitioning/clustering, predicate pushdown, job stats review, and Dataproc executor tuning.

· Version Control & CI/CD: Git, branching strategies, and pipelines for deploying Airflow DAGs and configuration.

· Operational Excellence: Monitoring with Stackdriver/Cloud Logging, debugging pipeline failures, and root-cause analysis.

The role involves end-to-end ownership of data ingestion, transformation, orchestration, and performance tuning for batch and near real-time workflows.

Apply for this Position

Ready to join? Click the button below to submit your application.

Submit Application