Job Description

TCS Hiring!! GCP Data Engineer (BigQuery, Cloud Storage, Dataproc, Airflow)



Please read the job description before applying.


SKILLS: GCP Data Engineer (BigQuery, Cloud Storage, Dataproc, Airflow)


GCP Services: BigQuery, Cloud Storage, Dataproc, Cloud Composer (managed Airflow) or self-managed Airflow.

· Airflow: Strong experience in DAG creation, operators/hooks, scheduling, backfilling, retry strategies, and CI/CD for DAG deployments (a minimal sketch follows this list).

· Programming: Proficiency in Python (PySpark, Airflow DAGs) and SQL (advanced BigQuery SQL).

· Data Modeling: Dimensional modeling (star/snowflake schemas), data vault basics, and schema design for analytics.

· Performance Tuning: BigQuery partitioning/clustering, predicate pushdown, job stats review, and Dataproc executor tuning.

· Version Control & CI/CD: Git, branching strategies, and pipelines for deploying Airflow DAGs and configuration.

· Operational Excellence: Monitoring with Cloud Logging (Stackdriver), debugging pipeline failures, and root-cause analysis.

The role involves end-to-end ownership of data ingestion, transformation, orchestration, and performance tuning for batch and near real-time workflows.
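For illustration only, the following is a minimal sketch of the kind of Airflow and BigQuery work listed above; the project, dataset, and table names are hypothetical placeholders and do not refer to any TCS or client codebase:

from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical project/dataset/table names, used only for illustration.
PROJECT = "example-project"
DATASET = "analytics"

default_args = {
    "retries": 3,                          # retry strategy: up to three attempts per task
    "retry_delay": timedelta(minutes=5),   # wait between attempts
    "retry_exponential_backoff": True,
}

with DAG(
    dag_id="daily_orders_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # `schedule_interval` on Airflow versions before 2.4
    catchup=False,              # backfills are run explicitly via `airflow dags backfill`
    default_args=default_args,
) as dag:
    # Rebuild a date-partitioned, clustered reporting table from a staging table.
    load_orders = BigQueryInsertJobOperator(
        task_id="load_orders",
        configuration={
            "query": {
                "query": f"""
                    CREATE OR REPLACE TABLE `{PROJECT}.{DATASET}.orders`
                    PARTITION BY DATE(order_ts)
                    CLUSTER BY customer_id AS
                    SELECT * FROM `{PROJECT}.{DATASET}.orders_staging`
                """,
                "useLegacySql": False,
            }
        },
    )

Partitioning by date and clustering by a frequently filtered column lets BigQuery prune partitions and reduce scanned bytes, which is the kind of performance-tuning work the role expects.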


NOTE: If your profile matches the above skills and you are interested, please reply to this email with your latest updated CV attached and the details below:

Name:

Contact Number:

Email ID:

Highest Qualification: (e.g., B.Tech/B.E./M.Tech/MCA/M.Sc./MS/BCA/B.Sc., etc.)

Current Organization Name:

Total IT Experience (7+ years required):

Location: TCS Hyderabad

Current CTC:

Expected CTC:

Notice period: Immediate Joiner

Whether previously worked with TCS (Y/N):



Apply for this Position

Ready to join? Click the button below to submit your application.

Submit Application