Job Description
Databricks with SAP BO:
Key Responsibilities
- Design, build, and maintain scalable ETL/ELT pipelines using Databricks (PySpark, Delta Lake, SQL Warehouse)
- Transform and curate data into bronze, silver, and gold layers following medallion architecture best practices
- Publish and expose gold layer datasets through Databricks SQL Warehouse for consumption by SAP BO
- Collaborate with BO developers to ensure semantic layer alignment
- Conduct data validation and reconciliation between Databricks outputs and BO report datasets
- Optimize data models, queries, and partitions for performance, cost, and scalability
Required Skills and Experience
- 5 years of experience with Azure Databricks (PySpark, Delta Lake, SQL Warehouse)
- Proficiency in SQL and data modelling (star and snowflake schemas)
- Familiarity with SAP BusinessObjects universes and report structures; able to validate and support BO da...
Apply for this Position
Ready to join Virtusa? Click the button below to submit your application.