Job Description

Goldcast (a Cvent company) is an AI-powered B2B Video Content Platform that enables marketers to put video content at the heart of the customer journey. We're building a browser-based, frame-accurate video editor that rivals desktop apps, using cutting-edge tech like WebCodecs, WASM, and Remotion.
Founded in mid-2020 at Harvard Business School, we serve 400+ customers across tech, professional services, manufacturing, and finance, helping them deliver engaging video content and digital events.
Role Overview
As a Senior Data Engineer, you will specialise in building data pipelines from multiple data sources, creating and managing data models, and building analytics for end-user consumption. You will optimise data ingestion, transformation, and storage workflows to meet scalability, reliability, and latency requirements.

We are looking for an individual with a strong data engineering skillset who can lead the design, development, and optimisation of enterprise-scale data pipelines and architectures to drive data-driven decision-making. You should bring deep hands-on expertise in Snowflake implementation and a strong background in data modeling and ETL tooling.
What You’ll Do

Design Data Pipelines:
Design and implement high-performance data pipelines using ETL processes and batch/streaming frameworks
Apply deep knowledge of Snowflake and familiarity with tools like dbt for data transformation
Work with the AWS tech stack, including Kinesis, Firehose, S3, Lambda, Spark, Airflow, and Flink
ETL Tooling:
Develop and maintain ETL/ELT pipelines integrating event heartbeat data, video heartbeat data, backend event data, and chat data
Work with ingestion tools such as Segment, Fivetran, Hightouch, and Estuary
Analytics:
Build data models in ThoughtSpot and Explo
Apply knowledge of Snowflake and star schema design
Work with materialised views
Engineering excellence and cross-org impact:
Lead at least one engineering-wide initiative and continuously improve processes within and across teams.
Partner with multiple stakeholders across various verticals
Apply a strong understanding of database internals and fundamentals, including indexing, Snowflake optimisation, and cost management

What We Seek

8+ years in data engineering, leading multiple initiatives
Strong technical judgment in data engineering, ETL, modeling, and analytics
Proven track record of delivering complex projects on time, improving team processes, and raising engineering quality bars
Excellent communication and stakeholder management with Product and Design
Nice to Have
Experience with AI/ML-enabled workflows using Snowflake Cortex
Experience with data quality frameworks, governance, and data cataloging
Familiarity with experimentation and analytics stacks for iteration and measurement
Operating experience in high-growth or startup environments leading multiple pods

Apply for this Position

Ready to join? Submit your application.