Job Description

<p>Location: Pune (Kharadi), 3 days WFO</p>
<p>CBR: INR 160 K/Month</p>
<p>JD:</p>
<p> </p>
<p><b>AD (Data Engineer)</b></p>
<p> </p>
<p>Software Engineer (Python Data Engineer)</p>
<p> </p>
<p>Your role</p>
<p> Design, prototype, build, and maintain new data pipeline features on our data platform, and support existing ones through debugging and optimization.</p>
<p> Implement quality assurance and data quality checks to ensure the completeness, validity, consistency, and integrity of data as it flows through the pipeline.</p>
<p> Collaborate closely with a global team of researchers, engineers, and business analysts to build innovative data solutions.</p>
<p> </p>
<p>---</p>
<p> </p>
<p>Your team</p>
<p>You'll be working in Finance IT in Pune. We provide excellent exposure to the latest technologies and the ability to 'make a mark' in our delivery team. As a Software Engineer, you'll play a key role in achieving important change milestones.<br /> <br /> We are a results-oriented, multi-cultural group and would welcome you on the UBS AI big rocks journey with us.</p>
<p> </p>
<p>---</p>
<p> </p>
<p>Your expertise</p>
<p> 10+ years of hands-on experience in the data engineering space, with the ability to translate a working solution into an implementable package and/or build new solutions on the Azure platform</p>
<p><br /> High proficiency and experience in designing and developing data analytics and data warehouse solutions with Azure Data Factory (ADF), Azure Databricks, Azure Analytics services, Azure Data Lake Storage Gen2, and Azure SQL/Postgres, plus a good understanding of Azure identity (SPN, SAMI, UAMI, etc.)</p>
<p><br /> Experience in designing large-scale data distribution, integration with service-oriented architecture, data mesh, data lake, and data analytics solutions using Azure data services with large, multi-format data</p>
<p><br /> Proficient coding experience using Spark (Python/Scala) and SQL</p>
<p><br /> Prior ETL development experience using industry tools (e.g., Informatica, SSIS, Talend); knowledge of Kafka streaming and Azure infrastructure is good to have</p>
<p><br /> A proven track record of cloud work experience (MS Azure platform preferred) in an agile SDLC environment, leveraging modern programming languages, DevOps, and test-driven development<br /> <br /> Exposure to Docker and Git</p>
<p><br /> Hands-on experience with pipeline orchestration tools such as Apache Airflow, Autosys, or Control-M (Apache Airflow preferred)</p>
<p><br /> Proficient at working with large, complex code bases using GitLab/GitHub, the fork/pull model, and CI/CD</p>
<p><br /> Knowledge of DevOps tools and technologies (e.g., GitLab CI/CD pipeline creation, YAML, Terraform, ARM, Puppet); Azure Function Apps and Logic Apps are a plus</p>
<p><br /> Working experience in Agile methodologies (Scrum, XP, Kanban); work within an agile SDLC environment to deliver high-quality data solutions</p>
<p> </p>
<p>---</p>
