Job Description

10+ years of experience in the design, architecture, implementation, and optimization of data engineering solutions over large volumes of data (TB to PB scale).
Expertise in designing and implementing end-to-end data architectures on Google Cloud Dataproc, including data ingestion pipelines, transformation logic, and data warehousing strategies for large-scale batch and real-time data processing.
Proven expertise in GCP services including Dataproc, Dataflow, Cloud Storage, BigQuery, Cloud Composer, and Cloud Functions; experience building scalable data lakes and pipelines.
Strong hands-on experience processing large volumes of data; proficiency in PySpark, Python, Spark SQL, and workflow automation.
Good exposure to implementing robust data governance with Dataplex, as well as security measures.
Proficiency in requirements analysis, solution design, development, testing, deployment, and ongoing support, including cloud migration projects for large-scale data platforms.
Location: Noida / Gurgaon / Indore / Bangalore / Hyderabad / Pune
Notice: Immediate to 30 days
