Job Description

  • Core Frameworks: Apache Spark, PySpark, Airflow, NiFi, Kafka, Hive, Iceberg, Oozie
  • Programming Languages: Python, Scala, Java
  • Infrastructure as Code: Terraform
  • Data Formats & Storage: Parquet, ORC, Avro, S3, HDFS
  • Orchestration & Workflow: Airflow, DBT
  • Performance Optimization: Spark tuning, partitioning strategies, caching, YARN/K8s resource tuning
  • Testing & Validation: Great Expectations, Deequ, SQL-based QA frameworks
  • Observability & Monitoring: Datadog, Grafana, Prometheus
