Job Description
Our Technology, Data and Innovation (TDI) strategy is focused on strengthening engineering expertise, introducing an agile delivery model, and modernising the bank's IT infrastructure through long-term investment and the adoption of cloud computing.
You will be working in the Transaction Monitoring and Data Controls team, designing, implementing, and operationalising Java components.
What we’ll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best-in-class leave policy
Gender-neutral parental leave
100% reimbursement under the childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive Hospitalization Insurance for you and your dependents
Accident and Term Life Insurance
Complimentary health screening for those aged 35 years and above
Your key responsibilities
Design, build, and maintain scalable and reliable PySpark/DBT/BigQuery data pipelines, predominantly on Google Cloud Platform (GCP), to process high-volume transaction data for regulatory and internal compliance monitoring.
Implement robust data quality frameworks and monitoring solutions to ensure the accuracy, completeness, and timeliness of data within our critical transaction monitoring systems.
Contribute to DevOps capabilities to ensure maximum automation of our applications.
Collaborate across TDI areas such as Cloud Platform, Security, Data, and Risk & Compliance to create optimum solutions for the business, increasing re-use, creating best practice, and sharing knowledge.
Your skills and experience
Expert hands-on data engineering using at least one of: Java/Scala/Kotlin with a toolset such as Apache Spark, Dataflow/Apache Beam, or Apache Flink; Python with a toolset such as PySpark or Dataflow/Apache Beam; or SQL-based development using DBT.
Professional experience with at least one data warehousing technology (ideally Google BigQuery), including knowledge of partitioning, clustering, and cost/performance optimisation strategies.
Hands-on experience writing and maintaining DevOps pipelines in at least one CI/CD tool such as TeamCity, Jenkins, or GitHub Actions.
Experience contributing to software design and architecture, including consideration of non-functional requirements (e.g., reliability, scalability, observability, testability) and an understanding of relevant architecture styles and their trade-offs, e.g., data warehouse, ETL, ELT, monolith, batch, incremental loading vs. stateless processing.
Experience navigating and engineering within a secure, enterprise hybrid-cloud environment in a large, regulated, and complex technology landscape.
Experience working with a globally distributed team, requiring remote interaction across locations, time zones, and diverse cultures, along with excellent communication skills (verbal and written).
How we'll support you
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs