Job Description

Key Responsibilities

  • Pipeline Development: Design and build data pipelines to integrate new data sources, with a heavy focus on API-based ingestion.
  • Unstructured Data Management: Develop and implement strategies to process, store, and utilize unstructured data formats.
  • Snowflake Operations: Handle ongoing Snowflake maintenance tickets, performance tuning, and regular housekeeping of the data warehouse.
  • Collaboration: Work closely with Data Quality Analysts and stakeholders to ensure data integrity and flow from scraped document outputs.
  • Data Quality: Explore ways to enhance data quality and reliability.
  • Data Acquisition: Identify and act on opportunities to acquire better raw data.
  • Data Migration: Execute data migrations from existing databases to the in-house data warehouse.


Qualifications & Skills

  • Experience: 5+ years of hands-on experience with Snowflake.
  • Python Proficiency: Strong proficiency in Python for data processing, scripting, and automation.
  • Strong SQL: Advanced proficiency in writing complex, high-performance SQL queries for data transformation and analysis.
  • Advanced Snowflake Expertise: Extensive hands-on experience with the Snowflake Data Cloud platform and its architecture, including Snowpipe, Streams & Tasks, Zero-Copy Cloning, and Time Travel.
  • API Expertise: Proven ability to build and maintain integrations between third-party APIs and Snowflake.
  • Cloud Knowledge: Experience running Snowflake on Azure and working knowledge of the Azure cloud platform.
  • Snowflake SnowPro Certification is a plus.
  • Strong understanding of, or practical experience with, at least one common enterprise Agile framework, e.g. Scrum, Kanban, or SAFe.
  • Strong understanding of ETL, data warehousing, BI, and advanced data analytics concepts.
  • Strong critical-thinking, analytical, and problem-solving skills.
  • Excellent communicator with a team-oriented approach.

Apply for this Position

Ready to join? Click the button below to submit your application.