Job Description

Responsibilities

  • Design and maintain optimal data pipeline architecture using scheduling and authoring tools such as Airflow and dbt.
  • Wrangle large, complex data sets and model data to meet functional and non-functional business requirements.
  • Optimize data storage and query processing for performance and scalability.
  • Work with stakeholders, including the Executive, Product, Data, and Design teams, to resolve data-related technical issues and support their data infrastructure needs.
  • Implement data security and governance best practices.

Qualifications

  • Bachelor's degree or higher in Computer Science, Engineering, Information Technology, or a related field.
  • Proficiency in programming and query languages such as Python and SQL.
  • Good understanding of relational databases, data structures, data models, data warehouses, and data lakes.
  • Familiarity with databases and data warehouses such as MySQL, PostgreSQL, and MSSQL.

Apply for this Position

Ready to join Avinity Analytics? Submit your application for this position.
