Job Title
Full Stack Data Engineer
Job Description
● Programming: Python, SQL, and shell scripting.
● GCP Data Stack:
○ Data Storage: BigQuery, Cloud Storage
○ Data Processing: Dataflow (Apache Beam), Dataproc (Spark), Cloud Composer (Airflow)
○ Streaming: Pub/Sub
○ Orchestration: Cloud Composer / Airflow
○ APIs & Services: Cloud Run, Cloud Functions, API Gateway
○ Monitoring: Cloud Logging, Cloud Monitoring (formerly Stackdriver), or Prometheus/Grafana
● DevOps: Terraform, Git, Docker, Kubernetes, Cloud Build / Jenkins.
● Data Visualization: Looker Studio, Power BI, or Tableau.
● Version Control & CI/CD: GitHub, GitLab, or Bitbucket Pipelines
Key Responsibilities
● Build and maintain end-to-end GCP data pipelines (batch and streaming).
● Ensure data platform uptime and performance in alignment with defined SLAs/SLOs.
● Develop ETL/ELT workflows using Cloud Composer (Airflow), Dataflow (Apache Beam), and BigQuery.
● Manage and enhance data lakes and warehouses using BigQuery and Cloud Storage.
● Implement streaming data solutions using Pub/Sub, Dataflow, or Kafka.
● Build data APIs and microservices for data consumption using Cloud Run, Cloud Functions, or App Engine.
● Define and enforce data quality, governance, and lineage using Data Catalog and Cloud Data Quality tools.
● Collaborate with DevOps to build CI/CD pipelines, infrastructure as code, and automated monitoring for data workflows.
● Participate in incident management, root-cause analysis (RCA), and change control processes following ITIL best practices.
● Mentor junior engineers and ensure adherence to engineering best practices.
Key Competencies
● Experience in managed service delivery under strict SLAs (availability, latency, and resolution timelines).
● Strong understanding of GCP, IAM, and cost management.
● Strong knowledge of incident management, change management, and problem management using ITSM tools (e.g., ServiceNow, Jira).
● Ability to lead on-call operations in a 16/5 support model, coordinating with global teams across time zones.
● Understanding of data security, governance, and compliance.
● Excellent communication and client-handling skills.
Good to Have
● Certifications:
○ Google Professional Data Engineer
Mandatory Skills
Python, API Integration, GCP