Job Description

Company Description

AuxoAI is dedicated to making artificial intelligence accessible and impactful with our mission, "AI Made Real." We help organizations accelerate their business strategies by seamlessly integrating AI, data, and digital technologies, fostering enterprise growth and efficiency. Named after the Greek goddess of growth, AuxoAI enables businesses to modernize infrastructure, optimize workflows, and address operational challenges. Our flagship platform, Damia - Enterprise AI Studio, delivers rapid value by offering AI-powered solutions to drive innovation and momentum for enterprises.


Role Description

AuxoAI is seeking a skilled and experienced Data Engineer to join our dynamic team. The ideal candidate will have 3-6 years of experience in data engineering, with a strong background in AWS (Amazon Web Services) technologies. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize data pipelines and infrastructure.


Responsibilities:

  • Design, develop, and maintain scalable data pipelines and ETL processes leveraging AWS services such as S3, Glue, EMR, Lambda, and Redshift.
  • Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.
  • Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness.
  • Implement data governance and security best practices to ensure compliance and data integrity.
  • Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring.
  • Stay abreast of emerging technologies and industry trends, recommending innovative solutions to enhance data engineering capabilities.


Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 3-6 years of experience in data engineering, with a focus on designing and building data pipelines.
  • Proficiency in AWS services, particularly S3, Glue, EMR, Lambda, and Redshift.
  • Strong programming skills in languages such as Python, Java, or Scala.
  • Experience with SQL and NoSQL databases, data warehousing concepts, and big data technologies.
  • Familiarity with containerization technologies (e.g., Docker, Kubernetes) and orchestration tools (e.g., Apache Airflow) is a plus.



Apply for this Position

Ready to join? Click the button below to submit your application.

Submit Application