Job Description
BeOne continues to grow at a rapid pace with challenging and exciting opportunities for experienced professionals. When considering candidates, we look for scientific and business professionals who are highly motivated, collaborative, and most importantly, share our passionate interest in fighting cancer.
General Description:
This role reports to the Director of Enterprise Data Architecture, Solutions, and Strategy. The AD, Platform and Solution Engineering must be an expert in Databricks solution technologies, designing scalable, high-performance data solutions that empower our organization to leverage data effectively. The ideal candidate will possess strong technical knowledge and experience in cloud data architectures, big data processing, and real-time analytics, coupled with the ability to collaborate cross-functionally to drive data-driven decision-making across the organization.
Essential Functions of the job:
The individual in this position should expect significant day-to-day variability in tasks and challenges.
Primary duties include but are not limited to the following:
- Design and implement robust data architectures using Databricks, ensuring integration with existing systems and scalability for future growth.
- Establish data management frameworks, optimizing ETL/ELT processes and data models for performance and accuracy.
- Evaluate and recommend modern architectural patterns, including Lakehouse, Delta Live Tables, Data Mesh, and real-time streaming.
- Drive rapid Proof-of-Concepts (POCs) to validate new architectural approaches, tools, and design patterns before enterprise rollout.
- Partner with data engineers, scientists, and business stakeholders to develop seamless data pipelines prioritizing data integrity and usability.
- Implement and uphold data governance practices that enhance data accessibility while ensuring compliance with regulations.
- Integrate external systems, APIs, and cloud-native services to support new data products and analytics use cases.
- Prototype and test new connectors, ingestion frameworks, and integration patterns to accelerate innovation.
- Monitor data pipelines and infrastructure performance, troubleshooting issues as they arise and ensuring high availability.
- Optimize and enhance existing data systems for performance, reliability, and cost-efficiency.
- Collaborate with data analysts and data scientists to understand data requirements and implement solutions that support data-driven insights and models.
- Monitor and enhance system performance, employing tools and methodologies to optimize data processing and storage solutions.
- Optimize compute costs, job orchestration, workflow efficiency, and data storage strategies.
- Troubleshoot and resolve data-related issues to maintain optimal system functionality.
- Experiment with new Databricks features (Unity Catalog updates, AI/ML runtimes, Photon, DBRX, Delta Sharing, serverless SQL/compute, etc.) through quick hands-on evaluations.
- Develop and enforce data governance standards, including data quality, security, and compliance.
Innovation & Rapid Prototyping:
- Conduct fast-turnaround POCs to explore new technical capabilities, libraries, and features across Databricks, Azure, Informatica, Reltio, and other ecosystem tools.
- Build lightweight demo pipelines, dashboards, and micro-solutions to demonstrate feasibility, guide architectural choices, and influence roadmap decisions.
- Stay current with emerging technologies, industry trends, and platform advancements; translate insights into actionable recommendations.
- Collaborate with vendors and internal teams to evaluate beta features, pilot new capabilities, and provide technical feedback for adoption decisions.
Education Required: Bachelor’s Degree in Information Technology or a related field/experience
Qualifications:
- Proven experience (8+ years) in data architecture or a similar role, with extensive experience in Databricks and cloud-based data solutions.
- 8+ years of experience in solution engineering, architecture, or related roles, preferably in platform development.
- Strong proficiency in Apache Spark, Unity Catalog, Python, SQL, and data processing frameworks.
- Experience with APIs and with integrating diverse technology systems.
- Familiarity with modern development frameworks, DevOps methodologies, and CI/CD processes.
- Experience with data warehousing solutions, delta lakes, and ETL/ELT processes.
- Familiarity with cloud environments (AWS, Azure) and their respective data services.
- Solid understanding of data governance, security, and compliance best practices.
- Excellent communication and interpersonal skills, with the ability to articulate complex technical concepts to diverse audiences.
- Experience leading cross-functional teams and projects is a plus.
Supervisory Responsibilities: No
Global Competencies
When we exhibit our values of Patients First, Driving Excellence, Bold Ingenuity, and Collaborative Spirit, through our twelve global competencies below, we help get more affordable medicines to more patients around the world.
- Fosters Teamwork
- Provides and Solicits Honest and Actionable Feedback
- Self-Awareness
- Acts Inclusively
- Demonstrates Initiative
- Entrepreneurial Mindset
- Continuous Learning
- Embraces Change
- Results-Oriented
- Analytical Thinking/Data Analysis
- Financial Excellence
- Communicates with Clarity
Apply for this Position
Ready to join? Click the button below to submit your application.
Submit Application