Job Description

  • Ensure stable, scalable, and secure operation of the Azure-based Data & Analytics platform, including Databricks, Azure-native components, Power BI, and CI/CD infrastructure
  • Offload operational workload from platform architects by taking ownership of infrastructure, deployment automation, and pipeline reliability
  • Enable smooth execution and troubleshooting of data pipelines written in Scala and PySpark, including hybrid integration scenarios such as Power BI with gateway infrastructure
  • Reports to: Head of Data & Analytics IT Competence Center
  • Collaborates with: Platform Architects, Data Engineers, ML Engineers, Power BI Developers
  • Geography: Global (stakeholders in Germany, India, Manila)
  • Operational Scope: Azure services, Databricks workspaces, CI/CD toolchains, Power BI service (incl. gateways), and Spark-based data pipelines
  • Main Tasks

    - Operate and optimize Azure resources (ADF, Key Vault, Monitor, Event Hub)
    - Administer Databricks workspace access and cluster configurations
    - Apply Infrastructure-as-Code (Terraform/Bicep)
    - Manage CI/CD pipelines for Scala- and PySpark-based data applications
    - Integrate build steps (Maven/SBT, Python wheels) into automated deployments
    - Enforce DevSecOps and IaC standards
    - Monitor Spark job execution; analyze failures and stage-level issues using the Spark UI and logs
    - Configure alerts, metrics, and dashboards for pipelines and infrastructure
    - Lead post-incident reviews and reliability improvements
    - Administer Power BI tenant configuration, workspace access, and usage monitoring
    - Operate and monitor on-premises or VM-hosted enterprise gateways
    - Troubleshoot dataset refreshes and hybrid data integration
    - Support runtime execution of production pipelines and ensure SLA adherence
    - Collaborate with engineers to resolve Spark performance issues and deployment errors
    - Participate in schema evolution and environment transitions
    - Enforce platform policies (tagging, RBAC, audit logging)
    - Maintain credential and secrets security using Key Vault and managed identities
    - Conduct audits across Azure, Databricks, and Power BI environments
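
    As a purely illustrative sketch of the Spark log-monitoring work listed above (not part of the formal role description): a small helper that scans driver log lines for failed stages. The log format and the `STAGE_FAILURE` pattern here are simplified assumptions for illustration, not an official Spark log contract.

```python
import re

# Hypothetical pattern for stage failures in driver logs; the exact
# message format is an assumption, not an official Spark contract.
STAGE_FAILURE = re.compile(r"Stage (\d+).*failed", re.IGNORECASE)

def failed_stages(log_lines):
    """Return the set of stage IDs that reported a failure."""
    stages = set()
    for line in log_lines:
        match = STAGE_FAILURE.search(line)
        if match:
            stages.add(int(match.group(1)))
    return stages

# Example with fabricated log lines:
sample = [
    "INFO DAGScheduler: Stage 1 finished in 2.1 s",
    "ERROR DAGScheduler: Stage 3 failed due to executor loss",
]
print(sorted(failed_stages(sample)))  # [3]
```

    In practice this kind of check would feed alerts and dashboards (e.g. Azure Monitor) rather than print to stdout.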

    Qualifications

  • Education / Certification:

    Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field.

    Preferred: Azure DevOps Engineer Expert, Power BI Admin, or Databricks Admin certifications
  • Professional Experience:

    Minimum 5 years in cloud platform engineering, DevOps, or SRE roles within data or analytics platforms

    Hands-on experience with Spark (Databricks), PySpark, and CI/CD for JVM-based data applications
  • Project or Process Experience:

    Proven ability to deploy and operate complex data pipeline ecosystems using Scala and PySpark

    Experience in managing Power BI service in enterprise setups, including hybrid gateway environments
  • Leadership Experience:

    No formal people leadership required; expected to lead through technical authority and cross-team collaboration
  • Intercultural / International Experience:

    Experience working in distributed teams across time zones and cultures; strong communication skills and resilience
  • Additional Information

    The well-being of our employees is important to us. That's why we offer exciting career prospects and support you in achieving a good work-life balance with additional benefits such as:

  • Training opportunities
  • Mobile and flexible working models
  • Sabbaticals
  • and much more...

    Does this sound interesting to you?

    Diversity and inclusion are important to us and make our company strong and successful. We offer equal opportunities to everyone, regardless of age, gender, nationality, cultural background, disability, religion, ideology or sexual orientation.

    Apply for this Position

    Ready to join us? Click the button below to submit your application.

    Submit Application