Job Description
Location: Bengaluru, India
Experience: 8+ years
Role Type: Full-time
Role Overview
We are seeking an Analytics Engineer (AWS) to support a UK-based client, with end-to-end ownership of the semantic modelling and governed metrics layer on AWS. In this role, you will design robust, modular ELT transformations; keep metrics trusted, consistent, and scalable; and enable stakeholders with high-performance, well-governed analytics products.
You will collaborate closely with Data Engineering, BI, and business teams to deliver reliable, scalable, and well-documented analytics solutions.
Key Responsibilities
Modelling & ELT
- Design star schemas and domain data marts
- Build modular ELT models with incremental strategies (see the sketch after this list)
- Implement testing for schema, data quality, and freshness
- Maintain high-quality documentation and lineage
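As a purely illustrative sketch (not the client's codebase), a modular dbt model with an incremental strategy of the kind this role would own might look like the following; the model, source, and column names (fct_daily_orders, stg_orders, order_id, updated_at) are hypothetical placeholders.

```sql
-- models/marts/fct_daily_orders.sql (hypothetical model and column names)
-- Incremental strategy: only rows newer than the latest loaded row are
-- reprocessed on each run, keeping warehouse cost and runtime down.
{{
    config(
        materialized='incremental',
        unique_key='order_id',
        incremental_strategy='delete+insert'
    )
}}

select
    order_id,
    customer_id,
    order_date,
    order_total,
    updated_at
from {{ ref('stg_orders') }}

{% if is_incremental() %}
  -- On incremental runs, scan only rows newer than what is already in the target
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

Schema, data-quality, and freshness tests would normally sit alongside such a model in the project's YAML files and run in CI.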
Metrics Governance
- Define, version, and govern business metrics (ownership, contracts, change control)
- Implement and manage a semantic layer (dbt Semantic Layer / MetricFlow)
- Ensure metric consistency across BI and downstream consumers (illustrated in the sketch below)
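To illustrate the "define once, consume everywhere" principle behind metric consistency, here is a minimal SQL-only sketch in which a single governed dbt mart carries the revenue definition that BI and downstream consumers all read. In practice the definitions would typically live in dbt Semantic Layer / MetricFlow YAML rather than a plain model; all names below (fct_orders, mart_monthly_revenue, refund_total) are hypothetical.

```sql
-- models/marts/mart_monthly_revenue.sql (hypothetical)
-- One governed definition of "net revenue", versioned and change-controlled
-- in Git, so dashboards never re-derive the metric with divergent logic.
{{ config(materialized='table') }}

select
    date_trunc('month', order_date)               as revenue_month,
    sum(order_total - coalesce(refund_total, 0))  as net_revenue,
    count(distinct order_id)                      as order_count
from {{ ref('fct_orders') }}
group by 1
```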
BI Delivery
- Support development of BI datasets and dashboards
- Implement row-level and column-level security (RLS/CLS; sketched below)
- Optimise BI performance (SPICE, query tuning, UX)
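A hedged sketch of row- and column-level security using Amazon Redshift's native RLS policies and column-level grants; the table, column, and role names are placeholders, and the syntax should be confirmed against current Redshift documentation.

```sql
-- Row-level security: members of the (hypothetical) emea_analyst role
-- only see rows whose region column is 'EMEA'.
CREATE RLS POLICY emea_only
WITH (region VARCHAR(16))
USING (region = 'EMEA');

ATTACH RLS POLICY emea_only ON sales.fct_orders TO ROLE emea_analyst;
ALTER TABLE sales.fct_orders ROW LEVEL SECURITY ON;

-- Column-level security: expose only non-sensitive columns to the BI role.
GRANT SELECT (order_id, order_date, order_total)
    ON sales.fct_orders TO ROLE bi_reader;
```

In QuickSight, the equivalent is usually a rules dataset attached to the dataset's RLS configuration; the SQL above shows the warehouse-side approach.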
Quality & Reliability
- Embed data quality checks into pipelines (see the example after this list)
- Monitor data freshness, accuracy, and reliability
- Partner with Core Data Engineering on upstream contracts and SLAs
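One way to embed such checks directly in the pipeline is a dbt singular test, i.e. a SQL file that fails the build when it returns rows; the rules, threshold, and names below (fct_orders, loaded_at, a 24-hour freshness window) are illustrative assumptions only.

```sql
-- tests/assert_orders_fresh_and_valid.sql (hypothetical singular test)
-- The build fails if any row is returned: either an invalid order total
-- exists, or no data has landed within the assumed 24-hour freshness window.
with violations as (

    select order_id, 'negative_order_total' as failure_reason
    from {{ ref('fct_orders') }}
    where order_total < 0

    union all

    select null as order_id, 'stale_data' as failure_reason
    from {{ ref('fct_orders') }}
    having max(loaded_at) < dateadd(hour, -24, getdate())

)

select * from violations
```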
CI/CD & Workflow
- Enable Git-driven development with PR reviews and environment promotion
- Automate model validation, testing, and BI artefact deployment
Performance Tuning
- Optimise Redshift (sort/dist keys, WLM, concurrency scaling)
- Improve Athena performance using partitioning and efficient file formats (both tuning levers are sketched below)
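Hedged, self-contained illustrations of the two tuning levers above; every table name, key choice, and the S3 location are placeholders rather than the client's actual objects.

```sql
-- Redshift: pick a distribution key that co-locates common joins and a
-- sort key that supports the dominant date-range filters (illustrative names).
CREATE TABLE analytics.fct_orders (
    order_id    BIGINT,
    customer_id BIGINT,
    order_date  DATE,
    order_total DECIMAL(12, 2)
)
DISTSTYLE KEY
DISTKEY (customer_id)
SORTKEY (order_date);

-- Athena: a partitioned Parquet table registered in the Glue Catalog, so
-- queries prune on dt and scan far less data (placeholder S3 path).
CREATE EXTERNAL TABLE analytics.raw_events (
    event_id   STRING,
    event_type STRING,
    payload    STRING
)
PARTITIONED BY (dt STRING)
STORED AS PARQUET
LOCATION 's3://example-bucket/raw_events/';
```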
Enablement & Documentation
- Translate business requirements into analytics solutions
- Maintain a governed catalogue of metrics and datasets in Glue Catalog
- Run enablement sessions and training for stakeholders
Success Outcomes (First 60–90 Days)
- Deliver a governed KPI suite (metric catalogue + dbt models)
- Ship at least two business-critical dashboards with RLS enabled
- Establish CI/CD for analytics repositories with automated testing and promotions
- Reduce dashboard query times through model and dataset optimisation
- Publish clear documentation covering metrics, dimensions, lineage, and ownership
Required Skills & Experience
- 8+ years in Analytics Engineering or equivalent roles
- Expert SQL, solid Python
- Strong experience in semantic data modelling and analytics documentation
- Clear understanding of attribute-based access control (ABAC) on data products
- 3+ years hands-on experience with:
  - dbt (models, macros, tests, exposures)
  - Amazon Redshift (including Serverless) and/or Athena
  - Glue Catalog integration
- 3+ years delivering governed data products on cloud
- 5+ years designing data architectures using the medallion (bronze/silver/gold) pattern
- Strong experience with Git-based version control and CI/CD
- Proficiency in YAML and Jinja
- Experience with metric governance, change management, and stakeholder engagement
Nice-to-Have Skills
- dbt Semantic Layer / MetricFlow
- BI tools: Amazon QuickSight (including QuickSight Q), Tableau, Power BI
- Apache Iceberg, Redshift Spectrum
- Data quality frameworks such as Great Expectations
- Familiarity with Lake Formation policies and policy-as-code approaches
- Experience with data mesh / domain ownership
- Exposure to feature store patterns (e.g., SageMaker Feature Store)