Job Description
Primary Roles and Responsibilities:
● Develop modern data warehouse solutions using Snowflake and Azure Data Factory (ADF).
● Provide forward-thinking solutions in the data engineering and analytics space.
● Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
● Triage issues to identify gaps in existing pipelines and fix them.
● Work with business stakeholders to understand reporting-layer needs and develop data models that fulfill them.
● Help junior team members resolve issues and technical challenges.
● Drive technical discussions with client architects and team members.
● Orchestrate data pipelines via the Airflow scheduler (an illustrative sketch follows this list).
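For illustration only, the sketch below shows a minimal Airflow DAG of the kind this role would orchestrate. The DAG name, task name, and schedule are hypothetical, and a real pipeline would invoke Snowflake/ADF steps rather than a placeholder function.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_snowflake_transform():
    # Placeholder step: in practice this might invoke SnowSQL, Snowpipe,
    # or a Snowflake operator/hook to run transformation SQL.
    print("Running Snowflake transformation step")


with DAG(
    dag_id="example_snowflake_dw_pipeline",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    transform = PythonOperator(
        task_id="transform_in_snowflake",    # hypothetical task name
        python_callable=run_snowflake_transform,
    )
```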
Skills and Qualifications:
● Bachelor's and/or Master's degree in Computer Science, or equivalent experience.
● 9+ years of total IT experience, including 4+ years of experience in data warehouse/ETL projects.
● Expertise in Snowflake security, Snowflake SQL, and designing/implementing other Snowflake objects.
● Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, Snowsight, and Snowflake connectors.
● Deep understanding of star and snowflake dimensional modeling.
● Strong knowledge of data management principles.
● Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
● Hands-on experience in SQL and Spark (PySpark); see the sketch after this list.
● Experience in building ETL/data warehouse transformation processes.
● Experience with open-source non-relational/NoSQL data repositories (including MongoDB, Cassandra, Neo4j).
● Experience working with structured and unstructured data, including imaging and geospatial data.
● Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
● Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting, and query optimization.
● Databricks Certified Data Engineer Associate/Professional certification (desirable).
● Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
● Experience working in an Agile methodology.
● Strong verbal and written communication skills.
● Strong analytical and problem-solving skills with a high attention to detail.
Mandatory Skills: Snowflake & Azure Data Factory
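For illustration only, the following PySpark sketch shows the kind of ETL transformation work referenced in the skills list above. The input path and table name are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_dw_etl").getOrCreate()

# Hypothetical landing path and target table, for illustration only.
raw = spark.read.json("/landing/orders/")
curated = (
    raw.filter(F.col("order_status") == "COMPLETE")  # keep completed orders
       .withColumn("load_date", F.current_date())    # add an audit column
)
curated.write.mode("overwrite").saveAsTable("curated.orders")
```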