Job Description

About Kipi.ai:


Kipi.ai is an Elite Snowflake Consulting Partner and the trusted source of data and analytics solutions for commercial, mid-market, enterprise, and strategic customers. As a leading service provider and expert in Snowflake, we're on a mission to help our clients deliver value and enable their businesses to thrive. From start-to-finish, we're committed to ensuring their business has the tools and support it needs to drive powerful data analytics and insights. Become a kipian and discover how together, we can help customers across every industry increase agility and create long-term success.


Success Traits of the Data Engineer:


  • You are an active participant and passionate developer/engineer with the in-depth technical expertise, credibility, and field experience to establish yourself as a subject-matter expert
  • You enjoy being mentored by Architects and Business Leaders and actively developing your skills within the team.

Job Responsibilities:


  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS "big data" technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
  • Create data tools for analytics and data engineering team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

Qualifications:


  • Bachelor's degree in Computer Science, Engineering, or a related field
  • 4+ years of total experience, including 2+ years of relevant experience with Snowflake
  • Hands-on experience with migration projects involving large volumes of data.
  • Experience with Python and SQL required.
  • Expertise in ETL, data warehousing, and analytics, including tools such as Ab Initio and DataStage. Experience with any reporting tool preferred.
  • Cloud experience required (AWS or Azure).
  • Advanced knowledge of leading industry architecture solutions
  • Excellent interpersonal and collaboration skills
  • Ability to explain technical concepts to non-technical audiences

Apply for this Position

Ready to join? Click the button below to submit your application.

Submit Application