Position Description:
We are looking for a highly hands-on Data Engineer – Technical Lead who will own IT-side deployments, release engineering, and production support across the enterprise analytics stack: Databricks, Denodo, Tableau, and Power BI. This role will design and run CI/CD pipelines, automate environment promotions (Dev → QA/UAT → Prod), manage Airflow orchestration, support Python-based packaging and scripting, and ensure stable, secure, and compliant production operations. You will act as the technical backbone for business teams, ensuring platforms and reporting tools are reliably deployed, monitored, and supported.
Your future duties and responsibilities:
A) Platform Deployment & Release Engineering (Databricks + Analytics Tools)
• Lead and implement end-to-end CI/CD using Jenkins pipelines (or equivalent) for:
o Databricks (jobs/workflows, notebooks, repos, libraries/wheels, cluster policies/configs, secrets integration).
o Denodo (deployment/migration of views, data services, connectors, caching-related configs as applicable).
o Tableau (publishing/promoting workbooks, data sources, extract refresh scheduling, server deployment support).
o Power BI (workspace/pipeline promotions, dataset refresh gateway coordination, deployment pipeline support).
• Define deployment standards: versioning, approvals, change control, rollback strategy, release notes.
• Maintain and standardize YAML-driven configuration and environment parameterization.
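To make the promotion pattern concrete, here is a minimal sketch of the kind of YAML-driven deployment utility described above, targeting one component of the stack. It assumes Python with the requests and PyYAML libraries and uses the Databricks Workspace Import API (POST /api/2.0/workspace/import); the config file layout, paths, and environment-variable names are illustrative placeholders, not a prescribed implementation.
```python
"""Sketch: promote a notebook to a target environment driven by YAML config.

Assumed (illustrative) config/envs.yml layout:
  dev:
    host: https://dev-workspace.cloud.databricks.com
    token_env_var: DATABRICKS_TOKEN_DEV
    workspace_root: /Shared/analytics
"""
import base64
import os
import sys

import requests
import yaml


def load_env_config(env: str, path: str = "config/envs.yml") -> dict:
    # One YAML file parameterizes all environments (dev / qa / prod).
    with open(path) as f:
        return yaml.safe_load(f)[env]


def deploy_notebook(cfg: dict, local_file: str, workspace_path: str) -> None:
    # Push a notebook source file into the target workspace, overwriting
    # the previous version so repeated promotions stay repeatable.
    with open(local_file, "rb") as f:
        content = base64.b64encode(f.read()).decode()
    resp = requests.post(
        f"{cfg['host']}/api/2.0/workspace/import",
        headers={"Authorization": f"Bearer {os.environ[cfg['token_env_var']]}"},
        json={
            "path": workspace_path,
            "format": "SOURCE",
            "language": "PYTHON",
            "content": content,
            "overwrite": True,
        },
        timeout=60,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    env = sys.argv[1]  # e.g. "dev", "qa", "prod" -- passed in by the CI stage
    cfg = load_env_config(env)
    deploy_notebook(cfg, "notebooks/ingest.py", f"{cfg['workspace_root']}/ingest")
```
In practice a Jenkins stage would invoke this script once per environment, with the token injected from a credentials store rather than committed config.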
B) Orchestration & Automation
• Build and maintain Airflow DAGs for orchestration, including integrations with Databricks jobs and downstream dependencies (see the sketch after this list).
• Create automation scripts for:
o Environment validation (pre/post deploy checks)
o Smoke tests and connectivity checks
o Job health checks and SLA verification
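As an illustration, here is a minimal sketch of such a DAG: it triggers a promoted Databricks job and then runs a placeholder post-deploy check. It assumes Airflow 2.x with the apache-airflow-providers-databricks package installed and an Airflow connection named databricks_default; the DAG id, schedule, job_id, and check logic are placeholders only.
```python
"""Sketch: orchestrate a Databricks job run followed by a smoke test."""
import pendulum

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.databricks.operators.databricks import (
    DatabricksRunNowOperator,
)


def post_run_check() -> None:
    # Placeholder validation: replace with real row-count, freshness,
    # or connectivity assertions against downstream targets.
    pass


with DAG(
    dag_id="daily_analytics_refresh",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule="0 5 * * *",
    catchup=False,
) as dag:
    run_databricks_job = DatabricksRunNowOperator(
        task_id="run_databricks_job",
        databricks_conn_id="databricks_default",
        job_id=123,  # placeholder: the promoted Databricks job's id
    )
    smoke_test = PythonOperator(
        task_id="smoke_test",
        python_callable=post_run_check,
    )
    # The smoke test only runs if the Databricks job succeeds.
    run_databricks_job >> smoke_test
```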
C) Production Support & Operational Excellence (L2/L3)
• Own production support for Databricks + Denodo + Tableau + Power BI:
o Incident triage, deep troubleshooting, RCA, and permanent fixes.
o Handle performance issues (pipeline latency, refresh failures, cache issues, concurrency bottlenecks).
• Implement monitoring/alerting for:
o Job failures, cluster health, pipeline SLAs, refresh schedules, platform uptime (a minimal health-check sketch follows this subsection).
• Build runbooks, on-call procedures, and operational playbooks.
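For a sense of the automation involved, here is a minimal health-check sketch against the Databricks Jobs API (GET /api/2.1/jobs/runs/list). The host, token variable, and job inventory are placeholders, and the alerting action is stubbed out.
```python
"""Sketch: flag jobs whose most recent completed run did not succeed."""
import os

import requests

HOST = "https://example-workspace.cloud.databricks.com"  # placeholder
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}


def latest_run_failed(job_id: int) -> bool:
    # Fetch the most recent completed run and inspect its result state.
    resp = requests.get(
        f"{HOST}/api/2.1/jobs/runs/list",
        headers=HEADERS,
        params={"job_id": job_id, "limit": 1, "completed_only": "true"},
        timeout=30,
    )
    resp.raise_for_status()
    runs = resp.json().get("runs", [])
    return bool(runs) and runs[0]["state"].get("result_state") != "SUCCESS"


if __name__ == "__main__":
    for job_id in (101, 102):  # placeholder job inventory
        if latest_run_failed(job_id):
            # Stub: page the on-call channel / open an incident here.
            print(f"ALERT: job {job_id} last run did not succeed")
```
A scheduled wrapper (an Airflow DAG or a Jenkins cron job) would run this check and feed the alerting channel of choice.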
D) Security, Governance & Compliance Support
• Implement secure deployment patterns:
o Secrets management, least privilege access, controlled promotions, audit trails (see the sketch below).
• Ensure audit-ready traceability for deployments and environment changes across tools.
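By way of illustration, here is a minimal sketch of one secure deployment step: syncing a CI-injected credential into a Databricks secret scope via the Secrets API (POST /api/2.0/secrets/put), so jobs never embed credentials and reads are governed by scope ACLs. The scope, key, host, and environment-variable names are placeholders.
```python
"""Sketch: store a CI-provided credential in a Databricks secret scope."""
import os

import requests

HOST = "https://example-workspace.cloud.databricks.com"  # placeholder
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}


def put_secret(scope: str, key: str, value: str) -> None:
    # Secrets live server-side under scope ACLs (least privilege);
    # the pipeline only holds them for the duration of this step.
    resp = requests.post(
        f"{HOST}/api/2.0/secrets/put",
        headers=HEADERS,
        json={"scope": scope, "key": key, "string_value": value},
        timeout=30,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    # Assumes the CI system (e.g. Jenkins credentials binding) exposes
    # WAREHOUSE_PASSWORD as an environment variable for this step only.
    put_secret("analytics-prod", "warehouse-password",
               os.environ["WAREHOUSE_PASSWORD"])
```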
E) Partnering with Business & Data Teams
• Act as a technical enabler for business users and reporting teams:
o Ensure reliable publishing, refresh, and consumption paths for dashboards/reports.
o Standardize production-readiness (logging, parameterization, idempotency, error handling); a minimal sketch follows this list.
• Mentor engineers and raise engineering standards for deployments and support.
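To illustrate those production-readiness conventions, here is a minimal sketch of a parameterized, idempotent task skeleton with structured logging and explicit error handling. The load_partition helper and its target are hypothetical.
```python
"""Sketch: production-readiness conventions for a batch task."""
import logging
import sys

logging.basicConfig(
    level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s"
)
log = logging.getLogger("loader")


def load_partition(run_date: str) -> None:
    # Idempotency: derive the target partition from the parameter and
    # overwrite it wholesale, so re-running the same date is safe.
    log.info("loading partition for %s (overwrite)", run_date)
    # ... replace the partition's contents here (hypothetical target) ...


if __name__ == "__main__":
    run_date = sys.argv[1]  # parameterized, never hardcoded
    try:
        load_partition(run_date)
    except Exception:
        log.exception("load failed for %s", run_date)
        raise  # fail loudly so the orchestrator can retry or alert
```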
Required qualifications to be successful in this role:
• 8–14+ years in data engineering / platform engineering / DevOps / production support roles.
• Strong hands-on experience with Databricks on cloud (AWS/Azure): jobs, workflows, clusters, libraries, repos.
• Strong Jenkins CI/CD experience: pipeline design, branching strategy, release management, environment promotion.
• Strong Airflow experience: DAG design, scheduling, operational handling, dependency patterns.
• Strong Python + scripting (Shell/Bash): automation, packaging, deployment utilities.
• Strong experience with YAML and configuration-as-code practices.
• Strong Git practices (PR workflow, tagging/releases) and production change discipline.
• Hands-on deployment/support experience for at least 2 of the following:
o Denodo
o Tableau Server/Cloud
o Power BI Service (including deployment pipelines/gateway coordination)