Job Description
Preferred Qualifications
- Design and own the end-to-end data architecture, with Snowflake as the core data warehouse and GCP for machine learning pipelines.
- Establish standards and patterns for ingesting, storing, cataloging, and accessing structured, semi-structured, and unstructured data (e.g., JSON, logs, documents, media).
- Design and review enterprise-grade data pipelines that ingest data from operational systems, SaaS sources, APIs, and external feeds into GCP/Snowflake.
- Define standards for using tools such as Dataflow/Beam, Pub/Sub, and Composer/Airflow, as well as other orchestration/ELT tools such as dbt.
- Design GCP-based data platforms using services such as Cloud Storage, Pub/Sub, Dataflow, Composer, GKE/Cloud Run (for data services), and IAM for security.
- Define network, security, and identity patterns for data workloads.
- Establish standards for data quality (DQ rules, validation checks, profiling) and metadat...
Apply for this Position
Ready to join HD Supply? Submit your application below.