Job Description
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL/ELT data pipelines using Python (see the sketch after this list).
- Architect enterprise data solutions leveraging technologies such as Kafka, GKE auto-scaling, load balancers, API management (APIGEE), DBT, and LLMs where they fit the solution.
- Ensure sensitive data is handled securely using DLP (Data Loss Prevention) and redaction techniques.
- Work extensively with Google Cloud Platform (GCP) services:
  - Dataflow for batch and real-time processing
  - Cloud Functions for serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer (Airflow) for workflow orchestration
  - Google Cloud Storage (GCS) for large-scale data storage
  - IAM for access control and security
  - Cloud Run for containerized applications
- Build,...
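
For illustration, a minimal sketch of the kind of Python batch step this role describes: loading a CSV from Google Cloud Storage into BigQuery with the official client library. The project, bucket, and table names are hypothetical, and a production pipeline would add validation, DLP redaction, and Cloud Composer orchestration around a step like this.

```python
# Illustrative sketch only: a minimal GCS -> BigQuery batch load.
# Resource names below are hypothetical placeholders.
from google.cloud import bigquery


def load_csv_to_bigquery(gcs_uri: str, table_id: str) -> None:
    """Load a CSV file from Google Cloud Storage into a BigQuery table."""
    client = bigquery.Client()  # uses Application Default Credentials

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # assume a header row
        autodetect=True,       # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()  # block until the load job completes

    table = client.get_table(table_id)
    print(f"Loaded {table.num_rows} rows into {table_id}")


if __name__ == "__main__":
    # Hypothetical bucket and table names for illustration.
    load_csv_to_bigquery(
        "gs://example-bucket/raw/events.csv",
        "example-project.analytics.events",
    )
```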