Job Description

Management Level

G

Equiniti is a leading international provider of shareholder, pension, remediation, and credit technology. With over 6000 employees, it supports 37 million people in 120 countries.

EQ India began its operations in 2014 as a Global India Captive Centre for Equiniti, a leading fintech company specialising in shareholder management. Within a decade, EQ India strengthened its operations and transformed from a capability centre into a Global Competency Centre supporting EQ's growth story worldwide.

Capitalising on India’s strong reputation as a global talent hub for IT/ITES, EQ India has structured the organisation to be part of this growth story. Today, EQ India has evolved into an indispensable part of EQ Group, providing critical fintech services to the US and UK.

Role Overview

The Senior Data & Reporting Engineer plays a critical role in designing, building, and operating scalable data platforms that underpin enterprise reporting and analytics across Equiniti Retirement Solutions (EQRS) in the UK and India.

This role is primarily focused on data engineering, including the development of robust ETL/ELT pipelines, Lakehouse and Warehouse architectures, and the implementation of medallion (Bronze–Silver–Gold) data models that enable high-quality, trusted, and performant reporting.

The successful candidate will work closely with business stakeholders, analysts, and reporting teams across the UK and India to ensure data is well-engineered, governed, and fit for purpose, enabling interactive reporting and analytics via Power BI and other consumption tools.

Key Responsibilities

Data Engineering & Architecture

  • Design, build, and maintain end-to-end ETL/ELT data pipelines using Microsoft Fabric or equivalent modern data platforms (e.g. Azure, Databricks, Snowflake).
  • Develop and manage Lakehouse and Data Warehouse architectures, ensuring scalability, performance, and cost efficiency.
  • Implement and evolve a medallion architecture (Bronze, Silver, Gold layers) to standardise data ingestion, transformation, and presentation (a minimal sketch follows this list).
  • Engineer curated, analytics-ready datasets to support interactive reporting and self-service BI.
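
For illustration, here is a minimal sketch of the Bronze–Silver–Gold flow described above, assuming a PySpark environment with Delta tables (as in Fabric or Databricks). The table names, paths, and columns are invented for the example and are not taken from this role description.

```python
# Medallion flow sketch: Bronze (raw landing) -> Silver (cleaned) -> Gold (curated).
# All names below (paths, tables, columns) are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_sketch").getOrCreate()

# Bronze: land the source file as-is, stamped with load metadata for audit.
bronze = (spark.read.option("header", True).csv("/landing/trades.csv")
          .withColumn("_ingested_at", F.current_timestamp()))
bronze.write.mode("append").format("delta").saveAsTable("bronze.trades")

# Silver: enforce types, deduplicate on the business key, drop invalid rows.
silver = (spark.table("bronze.trades")
          .withColumn("trade_date", F.to_date("trade_date"))
          .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
          .dropDuplicates(["trade_id"])
          .filter(F.col("amount").isNotNull()))
silver.write.mode("overwrite").format("delta").saveAsTable("silver.trades")

# Gold: aggregate into an analytics-ready table for reporting consumption.
gold = (spark.table("silver.trades")
        .groupBy("client_id", "trade_date")
        .agg(F.sum("amount").alias("total_amount"),
             F.count("trade_id").alias("trade_count")))
gold.write.mode("overwrite").format("delta").saveAsTable("gold.daily_client_trades")
```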
Data Integration & Ingestion

  • Lead the integration of new data sources using APIs, secure file transfers, databases, and event-based feeds.
  • Build resilient ingestion frameworks to handle structured and semi-structured data from multiple internal and external systems.
  • Ensure data pipelines are monitored, reliable, and recoverable with appropriate logging and error handling (see the ingestion sketch after this list).
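
As a hedged sketch of the kind of resilient ingestion wrapper this implies: retries with exponential backoff, structured logging, and a checkpoint so a failed run can resume. The endpoint, paging scheme, and checkpoint path are hypothetical.

```python
# Illustrative ingestion loop: retry with exponential backoff, logging,
# and a checkpoint file as a simple recoverable restart point.
import json, logging, time
from pathlib import Path
import requests

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("ingest")
CHECKPOINT = Path("checkpoint.json")  # hypothetical checkpoint location

def fetch_page(url: str, page: int, retries: int = 3) -> list:
    for attempt in range(1, retries + 1):
        try:
            resp = requests.get(url, params={"page": page}, timeout=30)
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException as exc:
            log.warning("page %d attempt %d failed: %s", page, attempt, exc)
            time.sleep(2 ** attempt)  # back off before retrying
    raise RuntimeError(f"page {page} failed after {retries} attempts")

def ingest(url: str) -> None:
    state = json.loads(CHECKPOINT.read_text()) if CHECKPOINT.exists() else {"page": 1}
    page = state["page"]
    while True:
        records = fetch_page(url, page)
        if not records:
            break
        # Land the batch somewhere durable here (e.g. a Bronze table or file).
        log.info("landed %d records from page %d", len(records), page)
        page += 1
        CHECKPOINT.write_text(json.dumps({"page": page}))  # persist progress
```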
Data Quality, Governance & Performance

  • Apply data quality checks, validation rules, and reconciliation processes to ensure accuracy and consistency (illustrated after this list).
  • Optimise data storage, partitioning, and query performance across Lakehouse and Warehouse layers.
  • Work closely with governance and security teams to ensure data access controls, lineage, and documentation are in place.
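
By way of example, a minimal set of completeness, uniqueness, and validity checks of the kind described above, here in pandas; the column names, rules, and file path are assumptions for illustration only.

```python
# Simple data quality gate: fail the pipeline run if basic checks break.
# Column names and the input path are illustrative, not from this posting.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list:
    failures = []
    for col in ("trade_id", "client_id", "amount"):   # completeness
        if df[col].isna().any():
            failures.append(f"nulls found in {col}")
    if df["trade_id"].duplicated().any():             # uniqueness of the key
        failures.append("duplicate trade_id values")
    if (df["amount"] < 0).any():                      # validity rule
        failures.append("negative amounts present")
    return failures

df = pd.read_parquet("silver/trades.parquet")
failures = run_quality_checks(df)
if failures:
    raise ValueError("data quality failures: " + "; ".join(failures))
```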
Reporting Enablement

  • Design data models optimised for reporting consumption, enabling Power BI and other tools to perform efficiently at scale (a star-schema sketch follows this list).
  • Support reporting teams by providing well-structured semantic datasets rather than focusing on dashboard visual design.
  • Collaborate on standards for reusable KPIs, measures, and reporting datasets.
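
As a sketch of what "well-structured semantic datasets" can mean in practice, the pattern below splits a flat curated extract into a narrow fact table plus a conformed dimension, the shape BI tools such as Power BI model most efficiently. Names and paths are invented for the example.

```python
# Star-schema shaping sketch: one fact table of measures and keys,
# plus a dimension with a surrogate key. All names are illustrative.
import pandas as pd

flat = pd.read_parquet("gold/daily_client_trades.parquet")  # hypothetical input

# Dimension: one row per client, keyed by a generated surrogate key.
dim_client = (flat[["client_id"]].drop_duplicates()
              .reset_index(drop=True)
              .rename_axis("client_key")
              .reset_index())

# Fact: foreign key plus measures only, keeping the table narrow and fast.
fact_trades = (flat.merge(dim_client, on="client_id")
               [["client_key", "trade_date", "total_amount", "trade_count"]])

dim_client.to_parquet("model/dim_client.parquet")
fact_trades.to_parquet("model/fact_trades.parquet")
```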
Collaboration & Delivery

  • Partner with business stakeholders to translate reporting and analytics requirements into robust data solutions.
  • Work collaboratively with UK- and India-based teams to deliver against agreed roadmaps and delivery plans.
  • Manage multiple concurrent initiatives, ensuring high-quality documentation and adherence to engineering standards.
Essential Skills & Experience

  • Fluent business-level communication in English—both written and verbal—is essential as you’ll work directly with stakeholders.
  • 6+ years’ experience in data engineering, BI engineering, or analytics engineering roles.
  • Strong hands-on experience with Microsoft Fabric (Lakehouse, Data Factory, Pipelines, OneLake) or equivalent platforms.
  • Strong Python experience for building, optimising and maintaining scalable data pipelines, transformations and data workflows. 
  • Strong SQL expertise, including complex transformations and performance optimisation.
  • Experience enabling and building Power BI dashboards.
  • Proven experience designing and implementing ETL/ELT pipelines at enterprise scale.
  • Solid understanding and practical implementation of medallion data architectures.
  • Experience integrating data via APIs, databases, flat files, and third-party platforms.
  • Familiarity with data modelling concepts for analytical workloads (star schemas, dimensional models).
  • Experience enabling reporting tools such as Power BI through well-engineered datasets (not visual-first development).
  • Strong documentation, communication, and stakeholder engagement skills.
Desirable Experience

  • Experience in financial services, pensions, or regulated environments.
  • Exposure to data governance, data cataloguing, and lineage tooling.
  • Understanding of DevOps / CI-CD practices for data platforms.
  • Experience supporting large, multi-client reporting environments.
Benefits

    As a permanent member of the team at EQ, you will be rewarded with our company benefits; these are just a few of what is on offer:

  • 31 days of annual leave + 9 bank holidays (UK)
  • Comprehensive Medical Assurance cover
  • Two-way cab transport for staff working UK & US shifts
  • Maternity leave of 6 months at full pay; 10 days' paid paternity leave
  • Accident & life cover of 3 times the applicable CTC

    We are committed to equality of opportunity for all staff, and applications from individuals are encouraged regardless of age, disability, sex, gender reassignment, sexual orientation, pregnancy and maternity, race, religion or belief, and marriage and civil partnership. Please note that any offer of employment is subject to satisfactory pre-employment screening checks.

    Apply for this Position

    Ready to join? Click the button below to submit your application.

    Submit Application