Job Description
Position: Data Engineer II (Azure)
Primary Skills: Data Engineering, DevOps Automation, CI/CD, Databricks, Azure DevOps, PowerShell, etc.
The Databricks Platform Engineer II contributes to expanding and optimizing our data
and data pipeline architecture, as well as optimizing data flow and collection for
cross-functional teams. They will contribute to our data initiatives and ensure that an
optimal data delivery architecture is consistent across ongoing projects. They engage
through the entire lifecycle of a project, from data mapping and data pipelines through
data modeling and, finally, data consumption. They must be self-directed and
comfortable supporting the data needs of multiple teams, systems, and products. They
will learn to optimize, or even re-design, our company's data architecture to support our
next generation of products and data initiatives. They can take smaller projects from
start to finish, work on problems of moderate scope where analysis of situations or data
requires reviewing a variety of factors, and trace issues to their source. They develop
solutions to a variety of problems of moderate scope and complexity.
Duties & Responsibilities:
Builds, maintains, and supports Databricks compute and storage.
Solves simple to moderate application errors and resolves application problems,
following up promptly with all appropriate customers and IT personnel.
Reviews and contributes to QA test plans and supports QA team during test
execution.
Participates in developing data transformation and data pipelines.
Ensures change control and change management procedures are followed within
the program/project as they relate to requirements.
Develops BI datasets, reports, and dashboards to support decision-making within the
business and IT.
Interprets requirement documents and contributes to creating functional design
documents as part of the data development life cycle.
Documents all phases of work, including requirements gathering, join/relationship
diagrams, database diagrams, report layouts, and other program technical
specifications, using currently specified design standards for new or revised
solutions.
Relates information from various sources to draw logical conclusions.
Conducts unit testing on ELT and report development.
Conducts data lineage and impact analysis as a part of the change management
process.
Conducts data analysis (SQL, Excel, Data Discovery, etc.) on legacy systems and
new data sources.
Contributes to documenting reporting requirements through engagement with
business process SMEs.
Creates source-to-target data mappings for data pipelines and integration activities.
Assists in identifying the impact of proposed application development/enhancement
projects.
Performs data profiling and process analysis to understand key source systems and
uses knowledge of application features and functions to assess scope and impact
of business needs.
Works with business users to design, develop, test, and implement business
intelligence solutions in the Data & Analytics Platform.
Implements and maintains data governance policies and procedures to ensure data
quality, security, and compliance.
Ensures operational stability of a 24/7/365 grocery retail environment by providing
technical support, system monitoring, and issue resolution, which may be required
during off-hours, weekends, and holidays as needed.
Qualifications:
Bachelor's degree in Computer Science or a technical field; equivalent training,
certifications, or experience will be considered
3 or more years of experience in a relevant job or field of technology
3+ years of Databricks experience
3+ years of Terraform experience
Preferred Qualifications
Master's degree in a relevant field of study preferred; additional training or
certifications in a relevant field of study preferred
Experience on Agile teams and/or in a product/platform-based operating model
Experience in retail or grocery preferred