Job Description

  • Design, build, and maintain scalable data platforms;
  • Collect, process, and analyze large and complex data sets from various sources;
  • Develop and implement data processing workflows using frameworks such as Apache Spark and Apache Beam;
  • Collaborate with cross-functional teams to ensure data accuracy and integrity;
  • Ensure data security and privacy through proper implementation of access controls and data encryption;
  • Extract data from various sources, including databases, file systems, and APIs;
  • Monitor system performance and optimize for high availability and scalability.
Qualifications:

  • Experience with cloud platforms and services for data engineering, particularly Google Cloud Platform (GCP);
  • Proficiency in programming languages like Python, Java, or Scala;
  • Experience with big data tools such as Spark, Flink, Kafka, Elasticsearch, Hadoop, Hive, Sqoop, Flume, Impala, Kafka Streams, and...

Apply for this Position

Ready to join act digital? Click the button below to submit your application.

Submit Application