Job Description
Position Overview:
As a key member of the McDonald's Global Data Platform & Operations team, this role focuses on the physical design, management, and optimization of cloud database platforms, with a strong emphasis on GCP BigQuery. The role supports the implementation and maintenance of database structures, metadata definitions, and source-to-target mappings, while ensuring database platforms are reliable, scalable, and cost-efficient in support of advanced analytics and AI use cases. The ideal candidate is passionate about databases, brings hands-on experience with cloud databases, strong SQL skills, and a practical understanding of batch, event-based, and streaming data ingestion into database layers, and partners closely with data engineers, analysts, and platform stakeholders to ensure physical data designs and platform operations meet performance, governance, and business needs.
Responsibilities:
- Meet with data analysts, business intelligence teams, and other key stakeholders to gather, understand, and address complex data requirements. Ensure that the data solutions designed are both technically sound and aligned with business needs.
- Hands-on design, development, deployment, and management of database architecture solutions. Ensure that the solutions are scalable, flexible, and tailored for advanced analytics and AI scenarios utilizing GCP BigQuery.
- Execute ongoing GCP BigQuery platform administration and health checks, including configuration management, slot and reservation allocation, capacity monitoring, and remediation of platform-level issues to ensure stable, scalable, and cost-efficient BigQuery services.
- Implement database operations best practices, including:
- Alerts & Monitoring: Configure proactive alerts for query performance, slot utilization, storage consumption, and failed jobs; monitor cost and usage trends; automate health checks for schema integrity.
- SQL Tuning: Review execution plans, optimize joins and aggregations, leverage partitioning and clustering, avoid SELECT *, and recommend materialized views for frequent queries.
- User Recommendations: Provide query optimization guidelines, educate analysts on efficient SQL practices, encourage parameterized queries, and share dashboards for performance and cost transparency.
- Plan and coordinate database downtime for maintenance and other operational activities.
- Work closely with the IT security team to ensure that all database platforms adhere to strict security protocols for data privacy and protection.
- Participate in data governance initiatives. Collaborate with cross-functional teams to drive best practices, emphasizing data accuracy and security and ensuring that all architectures comply with regulatory standards.
- Stay updated with the latest trends and best practices for database platform architecture. Continuously evaluate the existing design and processes, recommending and implementing improvements as necessary.
- Convey complex ideas in a clear and understandable manner, tailoring communication strategies to the audience and the nature of the project.
- Support deployment efforts through CI/CD pipelines or manual intervention.
- Be available to work in a 24/7 environment and to provide on-call support.
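As a flavor of the SQL tuning practices above (partitioning, clustering, avoiding SELECT *), here is a minimal sketch in Python. The table, columns, and review rules are invented for illustration and are not part of McDonald's platform; real reviews would inspect BigQuery execution plans and INFORMATION_SCHEMA job metadata.

```python
import re

# Hypothetical BigQuery DDL illustrating partitioning and clustering
# (schema, table, and column names are invented for this sketch).
ORDERS_DDL = """
CREATE TABLE sales.orders (
  order_id STRING,
  store_id STRING,
  order_ts TIMESTAMP,
  total    NUMERIC
)
PARTITION BY DATE(order_ts)   -- prune scans to the dates actually queried
CLUSTER BY store_id           -- co-locate rows for common store filters
"""

def review_query(sql: str, partition_col: str = "order_ts") -> list:
    """Flag two of the anti-patterns named above: SELECT * and
    queries that never reference the partitioning column."""
    findings = []
    if re.search(r"select\s+\*", sql, re.IGNORECASE):
        findings.append("avoid SELECT *; project only the columns needed")
    if partition_col not in sql:
        findings.append("no filter on partition column; full table scan likely")
    return findings

# An inefficient query vs. one following the guidelines above.
bad_sql = "SELECT * FROM sales.orders"
good_sql = (
    "SELECT store_id, SUM(total) FROM sales.orders "
    "WHERE DATE(order_ts) = '2024-01-01' GROUP BY store_id"
)
```

Sharing such automated findings back to analysts is one lightweight way to deliver the "query optimization guidelines" responsibility listed above.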
Qualifications:
- Bachelor's or master's degree in information technology or a related field.
- 2+ years of experience in data design for logical and physical models, including ER, dimensional, and canonical modeling approaches for analytics and data warehousing.
- 3+ years of experience with cloud services such as GCP and AWS (GCP preferred).
- 3+ years of hands-on experience managing and optimizing GCP BigQuery platforms, including workload management, capacity planning, and cost optimization through FinOps-based consumption budgeting and forecasting.
- 3+ years of experience developing advanced SQL for complex analytical and data warehousing use cases, including large-scale analytical queries and complex transformations.
- Experience supporting batch, event-based, and streaming data ingestion into cloud database platforms, including managing schemas, partitioning strategies, and ingestion patterns aligned with downstream analytics requirements.
- Working knowledge of medallion architecture concepts, including how raw data is refined through intermediate layers and optimized in the Gold layer for performant, governed analytical access.
- 1+ years of experience with ERwin Data Modeler, including logical, physical, and conceptual modeling and translating business requirements into technical designs.
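The medallion-architecture qualification above can be sketched as a toy pipeline: raw records refined through an intermediate (Silver) layer and aggregated into a Gold layer for governed analytical access. The record shapes and layer functions below are invented for illustration only.

```python
from collections import defaultdict

# Raw ("Bronze") layer: records as ingested, untyped and messy
# (store IDs and amounts are hypothetical sample data).
raw = [
    {"store": "  chi-001 ", "amount": "12.50"},
    {"store": "chi-001", "amount": "7.25"},
    {"store": "nyc-042", "amount": "not-a-number"},  # invalid record
]

def to_silver(rows):
    """Silver layer: cleanse and type-cast, dropping rows that fail validation."""
    cleaned = []
    for r in rows:
        try:
            cleaned.append({"store": r["store"].strip(),
                            "amount": float(r["amount"])})
        except ValueError:
            continue  # a real pipeline would quarantine bad records for review
    return cleaned

def to_gold(rows):
    """Gold layer: aggregate into a performant, query-ready shape."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["store"]] += r["amount"]
    return dict(totals)
```

In BigQuery these layers would typically be separate datasets, with the Gold layer backed by partitioned, clustered tables or materialized views as described in the responsibilities above.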
Preferred Skills:
- Familiarity with cloud-managed data integration, transformation, and movement tools such as dbt, Talend, or Confluent (Kafka).
- Demonstrated expertise in data governance, data management, and data quality geared towards data analytics, data science, and GenAI contexts.
- Excellent written, verbal, and meeting facilitation skills.
- Strong analytical and critical thinking skills.
- Self-driven with the ability to set priorities and mentor others in a performance-driven environment.
Work Shift:
- The role requires flexibility to work either a 5:00 am to 2:00 pm or a 2:00 pm to 11:00 pm shift.