Job Description
Job Title: GCP Big Data Engineer
Experience: 5 to 8 Years
Budget: 15 to 20 LPA
Location: Gurgaon & Bangalore
Type: Full-Time
Job Description:
We are seeking a seasoned GCP Data Analytics professional with extensive experience in Big Data technologies and Google Cloud Platform services to design and implement scalable data solutions.
- Design, develop, and optimize data pipelines using GCP BigQuery, Dataflow, and Apache Airflow to support large-scale data analytics.
- Utilize the Big Data Hadoop ecosystem to manage and process vast datasets efficiently.
- Collaborate with cross-functional teams to gather requirements and deliver reliable data solutions.
- Ensure data quality, consistency, and integrity across multiple data sources.
- Monitor and troubleshoot data workflows to maintain high system availability and performance.
- Stay updated with emerging trends and best practices in GCP data analytics and big data technologies.
Roles and Responsibilities:
- Implement and manage ETL processes leveraging GCP services such as BigQuery, Dataflow, PySpark, and Airflow.
- Develop scalable, maintainable, and reusable data pipelines to support business intelligence and analytics needs.
- Optimize SQL queries and data models for performance and cost efficiency in BigQuery.
- Integrate Hadoop ecosystem components with GCP services to enhance data processing capabilities.
- Automate workflow orchestration using Apache Airflow for seamless data operations.
- Collaborate with data engineers, analysts, and stakeholders to ensure data solutions align with organizational goals.
- Participate in code reviews, testing, and deployment activities, adhering to best practices.
- Mentor junior team members and contribute to continuous improvement initiatives within the data engineering team.
Apply for this Position
Ready to join? Click the button below to submit your application.