Job Description

**Job Summary**



We are seeking a Developer with 4 to 8 years of experience who is proficient in Spark in Scala, Delta Sharing, Databricks Unity Catalog administration, Databricks CLI, Delta Live Pipelines, Structured Streaming, risk management, Apache Airflow, Amazon S3, Amazon Redshift, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark. The candidate should have domain expertise in Property & Casualty Insurance. The work model is hybrid with day shifts and no travel required.



**Responsibilities**



+ Develop and maintain scalable data pipelines using Spark in Scala and PySpark to ensure efficient data processing and transformation.

+ Implement Delta Sharing and administer Databricks Unity Catalog to ensure secure and efficient data sharing and cataloging.

+ Utilize the Databricks CLI and Delta Live Pipelines to automate data workflows and ensure seamless data integration.

+ Design and implement structured streaming solutions to handle real-time data processing and analytics.

+ Apply risk management techniques to identify, assess, and mitigate potential data risks.

+ Integrate and manage data workflows using Apache Airflow to ensure smooth and reliable data operations.

+ Utilize Amazon S3 and Amazon Redshift for efficient data storage and retrieval, ensuring optimal performance.

+ Develop and optimize SQL queries using Databricks SQL to support data analysis and reporting.

+ Implement and manage Databricks Delta Lake to ensure data reliability and consistency.

+ Design and manage Databricks Workflows to automate and streamline data processes.

+ Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.

+ Provide technical expertise and support to ensure the successful implementation of data projects.

+ Stay updated with the latest industry trends and technologies to continuously improve data solutions.



**Qualifications**



+ Possess strong experience in Spark in Scala, Delta Sharing, Databricks Unity Catalog administration, Databricks CLI, Delta Live Pipelines, Structured Streaming, risk management, Apache Airflow, Amazon S3, Amazon Redshift, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark.

+ Demonstrate domain expertise in Property & Casualty Insurance.

+ Exhibit excellent problem-solving and analytical skills.

+ Show strong communication and collaboration abilities.

+ Have a proactive and self-motivated approach to work.

+ Display a keen attention to detail and a commitment to quality.

+ Possess the ability to work effectively in a hybrid work model.

+ Demonstrate the ability to manage multiple tasks and projects simultaneously.

+ Exhibit a strong understanding of data security and compliance requirements.

+ Show a commitment to continuous learning and professional development.

+ Have experience working in an agile development environment.

+ Possess a strong understanding of data architecture and design principles.

+ Demonstrate the ability to work independently and as part of a team.

Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
