Job Description

- Design end-to-end enterprise data architectures for large-scale analytics and operational use cases.
- Translate business requirements into conceptual, logical, and physical data models.
- Hands-on experience developing data pipelines on GCP using Dataflow for batch and/or streaming processing.
- Hands-on experience building transformation layers using Dataform and/or dbt (modeling, testing, documentation, and deployment patterns).
- Deep BigQuery experience, including schema design, partitioning/clustering strategies, and cost/performance optimization.
- Expert SQL capability to write complex transformations and analytics queries across large datasets.
- Programming experience in at least one language (e.g., Python, Java, Scala) to support automation, pipeline logic, and data utilities.
- Strong proficiency with data modeling tools such as SAP PowerDesigner and/or ERwin for enterprise-grade modeling and documentation.
- Familiarity with CI/CD and Git-based workflows for data/analytics engineering (b...

Apply for this Position

Ready to join Miracle Software? Click the button below to submit your application.

Submit Application