Job Description
Job Title: Data Engineer Specialist (Snowflake)
Location: Offshore / India (Open to BLR & HYD)
Experience: 10–12 years
Key Responsibilities
- Analyze and solve complex problems using technical expertise, judgment, and prior experience
- Provide informal guidance and support to new team members
- Explain complex technical concepts in a clear and straightforward manner
1. Data Engineering & Modeling
- Design & Develop Scalable Data Pipelines: Design, develop, and manage end-to-end ETL data pipelines on AWS using Kafka, DMS, Glue, Lambda, and Step Functions
- Workflow Orchestration: Build, deploy, and manage automated workflows using Apache Airflow to ensure efficient data processing (see the sketch after this list)
- Snowflake Data Warehouse: Design, implement, and maintain Snowflake data warehouses ensuring optimal performance and scalability
- Infrastructure Automation: Automate cloud infrastructure provisioning using Terraform and CloudFormation, ensuring security and scalability
- Data Modeling: Design and implement high-performance logical and physical data models using Star and Snowflake schemas
- Modeling Tools: Use Erwin or similar tools to create, maintain, and optimize data models aligned with business requirements
- Continuous Optimization: Monitor and enhance data models to improve performance, scalability, and security
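To make the orchestration responsibility above concrete, here is a minimal sketch of an Airflow DAG (assuming Apache Airflow 2.4+). The DAG id, task names, and extract/load bodies are hypothetical placeholders, not details of this role's actual pipelines.

```python
# Minimal sketch: a daily extract-then-load workflow in Apache Airflow.
# All names here (dag_id, task ids, targets) are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_staging(**context):
    # Placeholder: a real task might trigger AWS Glue or DMS via boto3.
    print("extracting source data to the staging area")


def load_to_snowflake(**context):
    # Placeholder: a real task might run COPY INTO through a Snowflake hook.
    print("loading staged files into Snowflake")


with DAG(
    dag_id="daily_sales_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_staging",
                             python_callable=extract_to_staging)
    load = PythonOperator(task_id="load_to_snowflake",
                          python_callable=load_to_snowflake)

    # The >> operator declares the dependency: load runs only after a
    # successful extract.
    extract >> load
```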
2. Collaboration, Communication & Continuous Improvement
- Collaborate closely with data scientists, analysts, and business stakeholders to gather requirements and deliver tailored data solutions
- Provide guidance on data security best practices and ensure adherence to secure coding standards
- Stay updated with emerging trends in data engineering, cloud technologies, and data security
- Proactively identify opportunities for system optimization, automation, and performance improvements
Key Skills & Expertise
- Snowflake: Hands-on experience with performance tuning, RBAC, dynamic masking, data sharing, encryption, and row/column-level security (a masking-policy sketch follows this list)
- Data Modeling: Strong expertise in physical and logical data modeling using Star and Snowflake schemas
- AWS Services & Tooling: DMS, Glue, Step Functions, Lambda, CloudFormation, S3, IAM, and EKS, plus Apache Airflow and Terraform
- Programming: Proficiency in Python, R, Scala, PySpark, and SQL (including stored procedures)
- DevOps & CI/CD: Experience with CI/CD pipelines (e.g., Jenkins, JFrog Artifactory) and IaC tools such as Terraform and CloudFormation
- Problem Solving: Strong analytical and troubleshooting skills
- Communication: Excellent interpersonal and stakeholder management skills
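As a concrete illustration of the dynamic masking and RBAC skills listed above, here is a minimal sketch that creates and applies a Snowflake masking policy through the Python connector. The connection parameters, the PII_READER role, and the customers.email column are illustrative assumptions, not details from this posting.

```python
# Minimal sketch: Snowflake dynamic data masking via snowflake-connector-python.
# Account, credentials, role, table, and column names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="my_user",
    password="...",         # fetch from a secrets manager in practice
    warehouse="ADMIN_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# Define a masking policy: privileged roles see real values, others see a mask.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
           ELSE '***MASKED***' END
""")

# Attach the policy to a PII column; from now on, queries against
# customers.email return masked values unless run under PII_READER.
cur.execute(
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask"
)

conn.close()
```

Row-level security follows the same pattern with CREATE ROW ACCESS POLICY: both policy types attach declaratively to tables, so enforcement travels with the data rather than with individual queries.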
Qualifications & Experience
- Bachelor’s degree in Computer Science, Engineering, or a related field
- 7–8 years of experience in designing and implementing large-scale Data Lake and Data Warehouse solutions
Certifications
- AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect (Preferred)
- SnowPro Advanced: Architect and/or SnowPro Core Certification (Required)