Job Description
Location: Pune (Kharadi), 3 days work from office (WFO)
CBR: INR 160 K/Month
JD: AD (Data Engineer)
Software Engineer: Python Data Engineer
Your Role
Design, prototype, build, and maintain new data pipeline features on our data platform, and support existing ones through debugging and optimization.
Implement quality assurance and data quality checks to ensure completeness, validity, consistency, and integrity of data as it flows through the pipeline.
Collaborate closely with a global team of researchers, engineers, and business analysts to build innovative data solutions.
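The data quality responsibility above can be sketched as a minimal, hypothetical row-level validation step. This is only an illustration, not part of the role description: the function name `check_record`, the field names, and the rules are invented examples of completeness, validity, and consistency checks.

```python
# Minimal sketch of a row-level data quality check, standard library only.
# Field names and rules are hypothetical examples, not from the JD.

def check_record(record: dict) -> list[str]:
    """Return a list of data quality violations for one record."""
    errors = []
    # Completeness: required fields must be present and non-empty.
    for field in ("trade_id", "amount", "currency"):
        if not record.get(field):
            errors.append(f"missing {field}")
    # Validity: amount must parse as a number.
    try:
        float(record.get("amount", ""))
    except (TypeError, ValueError):
        errors.append("amount is not numeric")
    # Consistency: currency must be one of a known set of codes.
    if record.get("currency") not in {"USD", "EUR", "INR", "CHF"}:
        errors.append("unknown currency")
    return errors

# Records that fail any check can be quarantined before loading downstream.
records = [
    {"trade_id": "T1", "amount": "100.5", "currency": "USD"},
    {"trade_id": "", "amount": "abc", "currency": "XXX"},
]
quarantined = [r for r in records if check_record(r)]
```

In a real pipeline, checks like these would typically run as a validation stage in ADF or a Databricks notebook, with failures routed to a quarantine table rather than silently dropped.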
---
Your team
You'll be working in Finance IT in Pune. We provide excellent exposure to the latest technologies and the ability to 'make a mark' in our delivery team. As a Software Engineer, you'll play an important role in achieving key change milestones.
We are a results-oriented, multicultural group and would welcome you on the UBS AI 'big rocks' journey with us.
---
Your expertise
10+ years of hands-on experience in the data engineering space, with the ability to translate a working solution into an implementable package and/or build new solutions on the Azure platform
High proficiency and experience in designing/developing data analytics and data warehouse solutions with Azure Data Factory (ADF), Azure Databricks, Azure Analytics services, Azure Data Lake Storage Gen2, and Azure SQL/PostgreSQL, plus a good understanding of Azure identity (SPN, system-assigned and user-assigned managed identities, etc.)
Experience designing large-scale data distribution, integration with service-oriented architecture, data mesh, data lake, and data analytics solutions using Azure data services with large, multi-format data
Proficient coding experience with Spark (Python/Scala) and SQL
Prior ETL development experience with industry tools (e.g., Informatica, SSIS, Talend); knowledge of Kafka streaming and Azure infrastructure is good to have
A proven track record of working in the cloud (MS Azure platform preferred) in an agile SDLC environment, leveraging modern programming languages, DevOps, and test-driven development
Exposure to Docker and Git
Hands-on experience with pipeline orchestration tools such as Apache Airflow, Autosys, or Control-M (Apache Airflow preferred)
Proficient at working with large and complex codebases using GitLab/GitHub, the fork/pull model, and CI/CD
Knowledge of DevOps tools and technologies (e.g., GitLab CI/CD pipeline creation, YAML, Terraform, ARM, Puppet); Azure Function Apps and Logic Apps are a plus
Working experience with Agile methodologies (Scrum, XP, Kanban); work within an agile SDLC environment to deliver high-quality data solutions
---
Apply for this Position
Ready to join? Click the button below to submit your application.
Submit Application