Job Description
This position is based in Mérida, Yucatán, México. If you do not live in or around Mérida, relocation will be required within 60 days of accepting this position. Verato will provide a relocation bonus of MXN $42,500 to help with your move.
About Verato
Verato, the identity intelligence experts, powers exceptional experiences everywhere by solving the problem that drives everything else: knowing who is who. The Verato MDM Cloud, the next generation of MDM, delivers unprecedented identity intelligence by uniquely combining extraordinary identity resolution and enrichment with identity verification, AI-powered data governance, and advanced insights. Verato re-imagines MDM to be purpose-built and nimble to drive a complete and trusted 360-degree view of people, organizations, and networks across complex ecosystems with unmatched speed to value, enterprise-grade performance, and customer success. More than 75% of the US population flows through Verato, powering a single source of truth for identity across critical industries.
Core to Verato's strategy for sustained growth is our commitment to building a people-first culture that attracts, develops, and retains top talent worldwide. Verato operates on the simple principle that a company must put its employees first. Employees are given opportunities to expand their knowledge in areas such as technology (big data, cloud computing, complex algorithms), healthcare, and organizational development.
About The Position
We are seeking a detail-oriented and analytical BI Engineer to support business decision-making through data-driven insights. The role focuses on building, maintaining, and optimizing business intelligence dashboards, reports, and data models that enable stakeholders to track performance, identify trends, and improve operational and strategic outcomes. This is a technology leadership role within the Enterprise Data Systems department, reporting to the Director of Data Insights in the Technology department.
Essential Functions And Responsibilities
- Analyze large and complex datasets to identify trends, patterns, and actionable insights
- Design, develop, and maintain business intelligence dashboards and reports for stakeholders
- Translate business requirements into data models, metrics, and visualizations
- Ensure data accuracy, consistency, and reliability across data sources, reports, and dashboards
- Partner with business teams to understand reporting needs and provide support
- Schedule and automate reports and improve reporting efficiency, including performance and cost
- Document data definitions, reporting metric definitions, and operational processes
- Continuously improve data quality and reporting best practices
- Collaborate with other engineering and DevOps team members to implement, test, deploy, and operate data pipelines in support of insights
- Build necessary components to ensure data quality, monitoring, alerting, integrity, and governance standards are met
- Navigate ambiguity and thrive in a fast-paced environment
- Work proactively and deliver results with minimal supervision
- Communicate insights clearly to technical and non-technical audiences
Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- 3+ years of experience as a Data Analyst or Business Intelligence Analyst
- 3+ years of experience working in data‑centric organizations, collaborating closely with data engineering teams
- 3+ years of experience with reporting and analytics tools such as Tableau, Power BI, QuickSight, or Looker Studio
- Solid understanding of data modeling, KPIs, and reporting frameworks
- Strong Python or R skills and familiarity with common Python data-analysis libraries such as pandas, PySpark, and boto3
- 3+ years of experience with AWS S3, Lambda, Airflow, Athena, and Redshift; experience with comparable GCP/Azure services is a plus
- 3+ years of hands-on experience with the Snowflake cloud data warehouse, including integration with BI tools and query/cost optimization techniques
- Working knowledge of common data formats such as CSV and JSON
- Experience working with Agile development methodologies
- Experience with CI/CD and release processes, and proficiency with GitHub, CodeCommit, or other source control management systems to streamline development and deployment workflows
- Knowledge of building infrastructure in the AWS cloud using CloudFormation, Liquibase, or Terraform
- Knowledge of data warehousing concepts and ETL processes
- Familiarity with cloud data platforms (Snowflake, BigQuery, Redshift)
- Experience working with large, multi‑source datasets
- Understanding of Master Data Management (MDM) solutions such as IBM Initiate, and knowledge of the healthcare domain
- Exposure to multi‑tenant/multi‑customer environments
Apply for this Position
Ready to join Verato? Submit your application.