Job Description

Title: Snowflake Data Engineer
Location: Dallas, TX (onsite 3 days a week is a must)

Key Responsibilities

- Design and implement Snowflake schemas (star, snowflake, data vault) optimized with micro-partitioning, clustering keys, materialized views, and search optimization services.
- Build real-time and batch ingestion pipelines into Snowflake using Snowpipe, Kafka Connect, Fivetran, Matillion, Informatica, or dbt.
- Automate incremental data processing with Streams & Tasks to support change data capture (CDC).
- Use Zero-Copy Cloning for environment management, testing, and sandboxing.
- Apply Time Travel and Fail-safe features for data recovery and auditing.
- Develop data transformation logic in Snowpark (Python, SQL, or Scala) to push compute directly into Snowflake.
- Design integrations with cloud storage (S3, Azure ADLS, GCS) for staging and external tables.
- Implement data sharing and data marketplace solutions via Snowflake Secure Data Sharing and Snowflake Marketplace.
- Enable semi-structured data handling (JSON, Avro, Parquet, ORC, XML) using VARIANT columns and lateral flattening.
- Integrate Snowflake with BI tools (Power BI, Tableau) via live connections and semantic layers.
- Implement Role-Based Access Control (RBAC), Row Access Policies, and Dynamic Data Masking for data security.
- Optimize compute usage with multi-cluster warehouses, resource monitors, and query performance tuning.
- Manage cost optimization strategies (warehouse auto-suspend, query profiling, storage/compute separation).
- Integrate with data catalog and governance platforms (Collibra, Alation, Informatica CDGC) using Snowflake metadata and APIs.
- Work with domain teams to deliver data products leveraging Snowflake's data mesh-friendly features.
- Collaborate with architects to design a Snowflake-centric data fabric integrated with ETL/ELT and API layers.
- Support CI/CD automation for Snowflake code deployment using GitHub Actions, Azure DevOps, or dbt Cloud.

Qualifications

Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.

Experience:

- 10+ years of data engineering experience, with 5+ years on the Snowflake Data Cloud.
- Expertise in SQL optimization and Snowflake performance tuning.
- Hands-on experience with Snowpipe, Streams & Tasks, Snowpark, Zero-Copy Cloning, and Secure Data Sharing.
- Proficiency in Python, Scala, or Java for Snowpark development.
- Experience integrating with cloud platforms such as AWS.
- Exposure to ETL/ELT tools (Informatica, Matillion, Fivetran).
- Familiarity with CI/CD, Git, and DevOps practices for data operations.

Preferred Certifications:

- SnowPro Core Certification

Key Skills

- Snowflake-native feature design and implementation (Snowpark, Streams, Time Travel, Secure Data Sharing)
- Data ingestion (Snowpipe, CDC, Kafka, Fivetran)
- Semi-structured data handling (VARIANT, JSON, Avro, Parquet)
- Advanced SQL and performance tuning
- Data governance (RBAC, masking, lineage, catalogs)
- Cloud data platform integrations (AWS S3, Azure ADLS, GCP GCS)
- BI and analytics tool integration
- Cost optimization and warehouse orchestration

Apply for this Position

Ready to join? Click the button below to submit your application.

Submit Application