Job Description

About the Role:
We are looking for an experienced AI Python Engineer to design, develop, and maintain end-to-end data and analytics workflows. This role involves building robust data pipelines, ETL/ELT processes, and vectorization workflows to support advanced search and retrieval capabilities in AI-driven systems. You will work closely with data and platform teams to automate deployments, enhance monitoring, ensure reliability, and optimize performance across data environments.

Key Responsibilities:

  • Develop and maintain scalable data pipelines, ETL/ELT processes, and analytics engineering workflows.
  • Build vectorization workflows to support advanced search, retrieval, and AI-driven features.
  • Collaborate with data teams to capture requirements, automate deployment, and improve monitoring processes.
  • Optimize data storage systems, troubleshoot issues, and ensure high performance.
  • Implement strong unit and integration testing practices to ensure quality and reliability.
  • Participate in Agile workflows and contribute within a DevOps-oriented environment.

Must-Have Skills:

  • At least 8 years of experience in software or data engineering.
  • Strong proficiency in Python.
  • Solid experience in unit and integration testing.
  • Familiarity with DevOps practices and Agile methodologies.

Good-to-Have Skills:

  • Experience with AWS and Kubernetes (K8s).
  • Familiarity with data platforms such as Snowflake, Databricks, Apache Spark, Apache Hive, Delta Lake, Apache Iceberg, and vector databases.
  • Experience with orchestration tools like Apache Airflow, Dagster, Prefect, or Temporal.
  • Familiarity with GitHub workflows and Datadog.

Why Join Us:

  • Regional project exposure
  • Work on high-value data centre tenders
  • Opportunity to influence winning strategies

