Job Description
Role Title: Senior Data Engineer
Position Summary:
This role is responsible for analysis, data modeling, data collection, data integration, and preparation of data for consumption. It is responsible for creating and managing data infrastructure, designing and implementing data pipelines, and verifying data. Along with the team, it is responsible for ensuring the highest standards of data quality, security, and compliance. The role holder displays personal accountability for successful outcomes and supports quality efforts within the team, interfaces with colleagues and other stakeholders to evaluate defined business requirements and processes, and uses available approved technologies. This role will implement methods to improve data reliability and quality, combining raw information from different sources to create consistent data sets. The role holder must be well versed in DataOps and capable of using the relevant technologies. This is the second-level position in the Data Engineer job family. Those holding this position are typically assigned to lead small-scale projects and to participate as part of a development team on larger projects.
Job Responsibilities:
Data Acquisition:
- Develop solid knowledge of structured and unstructured data sources within each product journey (Underwriting and Risk; Client Service, Sales and Marketing; Claims; Account and Location Engineering) as well as emerging data sources (purchased data sets; external data; etc.)
- Partner with team members, product owners, developers, solution architects, business analysts, data engineers, data analysts, data scientists and others to understand data needs
- Develop solutions using data modeling techniques and technologies such as ER-Studio, Postgres, SQL Server, Azure Data Factory, Kafka, SSIS, and others as required
- Validate code through detailed and disciplined testing
- Participate in peer code review to ensure solutions are accurate
- Ensure tables and views are designed for data integrity, efficiency and performance, and are easy to comprehend
Move and Store Data:
- Design and build data flows, infrastructure pipelines, ETL/ELT processes, and movement and storage solutions for structured and unstructured data
- Design data models and data flows into and out of databases
- Understand and design data relationships between business and data subject areas
- Follow standards for naming conventions, code documentation and code review
Support data exploration and transformation needs:
- Support team members with data cleansing tasks
- Conduct data profiling to identify data anomalies
- Assist team members with data preparation tasks
Support users and production applications:
- Support developers, data analysts and data scientists who need to interact with data in the data warehouses
- Analyze and assess reported data quality issues, quickly identifying root cause
- Consult with DBAs and team members on configuration and maintenance of the warehouse infrastructure
- Monitor system performance and identify opportunities for optimization
- Monitor storage capacity and reliability
- Address production issues quickly, with appropriate validation and deployment steps
- Provide clear and professional communication to users, management, and teammates
- Provide ad hoc data extracts and analysis to respond to tactical business needs
Participate in effective execution of team priorities:
- Identify work tasks and capture them in the team backlog
- Organize known tasks, following provided prioritization
- Escalate colliding priorities
- Provide production support
- Network with product teams to stay abreast of database changes, as well as business process changes that affect data interpretation
Skill and Experience:
- 3-5 years of experience required to perform essential job functions
- Data modeling abilities
- Relational (Third Normal Form) and dimensional (Kimball / Inmon) database theory
- Design, build, maintain data solutions
- Understanding of database clustering
- Expertise with databases and storage platforms, including Postgres, SQL Server, and data lakes
- Knowledge of Azure Cloud applications
- ETL design
- Programming languages (SQL, C#, Python, PowerShell, KSQL)
- Collaboration & Standards: Experience with GraphQL, peer reviews, and adherence to coding and quality standards.
Must Have Skills:
- Data Engineering Tools: Proficiency in SQL, ADO, Azure Data Factory (ADF), Kafka, Azure Service Bus (ASB), and both stream and batch processing.
- CI/CD & Quality Assurance: Experience with continuous integration / deployment pipelines and implementing data quality checks.
- Cloud & Storage Technologies: Familiarity with unstructured data storage, data lakes, cloud-native development, and containerization.
- Software Quality & Security: Strong understanding of software quality practices, security principles, and API integration.
Education and Certifications:
- 4-year Bachelor's degree, preferably in Computer Science, Information Technology, or Computer Engineering, or equivalent experience
Work location: Bengaluru