Job Description
We are seeking a Data Engineer with strong expertise in Azure Databricks. This role will focus on building, supporting, and administering scalable, high-performance data pipelines that power real-time and batch analytics for trading, risk, and operational use cases. The ideal candidate will have a deep background in Databricks data engineering, platform administration, and capital markets data, and will thrive in a fast-paced Agile environment.
Key Responsibilities:
- Design, develop, and maintain robust data pipelines using Azure Databricks, Confluent, Delta Live Tables (DLT), Spark pipelines, and Delta Lake to support trading and market data workflows.
- Independently study the existing data pipelines and enhance them, ensuring continuity, scalability, and performance improvements.
- Provide production support for pipelines, including job monitoring, incident resolution, and performance tuning.
- Administer Databricks workspaces and Unity Catalog, including cluster configuration, job scheduling, access control, and workspace optimization.
- Build and maintain CI/CD pipelines using GitLab, enabling automated testing, deployment, and versioning of data engineering code.
- Follow and enforce best practices in code management, including modular design, code reviews, and documentation using GitLab workflows.
- Collaborate with fellow team members, business analysts, and data architects to understand data requirements and deliver high-quality solutions.
- Build reusable components and frameworks to accelerate development and ensure consistency across data platforms.
- Actively participate in Agile ceremonies (e.g., sprint planning, stand-ups, retrospectives) and contribute to continuous improvement of team processes.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, with at least 2 years working with Azure Databricks.
- Strong proficiency in PySpark, SQL, and Python.
- Experience supporting production pipelines, including monitoring, alerting, and troubleshooting.
- Experience with GitLab CI/CD, including pipeline configuration, runners, and integration with cloud services.
- Familiarity with the capital markets domain, including market data feeds, order books, trade execution, and risk metrics.
- Proven ability to work effectively in Agile development environments.
- Azure certifications (e.g., Azure Data Engineer Associate).
- Experience with real-time data processing using Kafka or Event Hubs.
Additional Information
What you will love about Exinity:
“Freedom to succeed” is our core belief. It’s not just a promise we make to our clients and partners, but to our people too. We want our people to LEAP and so in this role you will…
[Learn] from each other and from new projects.
[Exchange] information and best practices in an open-minded environment.
[Advance] by developing skills and accepting greater responsibilities, progressing and diversifying your career.
[Prosper] by acquiring new skills and nurturing a team of x people.
Exinity is an equal opportunities employer and positively encourages applications from suitably qualified and eligible candidates regardless of gender, sexual orientation, marital or civil partner status, gender reassignment, race, colour, nationality, ethnic or national origin, religion or belief, disability or age.
Apply for this Position
Ready to join? Click the button below to submit your application.
Submit Application