Job Description

Job Title: Databricks with PySpark

Location: AIA Hyderabad

Experience: Minimum 4 to 6 years



1. Job Title: Developer

2. Job Summary: Join our dynamic team as a Developer, where you will leverage your expertise in Spark in Scala, Delta Sharing, and Databricks Unity Catalog administration to drive impactful projects. With a focus on innovation and efficiency, you will contribute to the development and optimization of data workflows, ensuring seamless integration and management of data across platforms. You will collaborate with cross-functional teams to enhance data solutions and support strategic initiatives, all within a hybrid work model.

3. Experience: 4 - 8 years

4. Required Skills:
Technical Skills: Spark in Scala, Delta Sharing, Databricks Unity Catalog Admin, Databricks CLI, Delta Live Tables pipelines, Structured Streaming, Risk Management, Apache Airflow, Amazon S3, Amazon Redshift, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, PySpark
Domain Skills: NA

5. Nice to Have Skills:
Domain Skills: NA

6. Technology: Cloud Modernization/Migration

7. Shift: Day

8. Responsibilities:
- Develop and optimize data workflows using Spark in Scala to enhance data processing capabilities.
- Implement Delta Sharing to facilitate secure and efficient data exchange across platforms.
- Administer Databricks Unity Catalog to ensure robust data governance and cataloging.
- Use the Databricks CLI for streamlined management and automation of Databricks environments.
- Design and manage Delta Live Tables pipelines to support real-time data processing and analytics.
- Implement Structured Streaming for continuous data flow and real-time analytics (an illustrative PySpark sketch appears after the listing details below).
- Apply risk management strategies to identify and mitigate potential data-related risks.
- Integrate Apache Airflow to orchestrate complex data workflows and processes.
- Leverage Amazon S3 for scalable and secure data storage.
- Use Amazon Redshift for efficient data warehousing and analytics.
- Develop Python scripts to automate data processing and improve workflow efficiency.
- Execute Databricks SQL queries to extract and analyze data for business insights.
- Implement Databricks Delta Lake for reliable and scalable data storage.
- Manage Databricks Workflows to ensure seamless execution of data tasks and processes.
- Use PySpark for efficient data manipulation and processing within the Databricks environment.

Qualifications:
- Strong expertise in Spark in Scala and Databricks Unity Catalog administration.
- Proficiency with Delta Sharing and the Databricks CLI for data management.
- Experience with Delta Live Tables pipelines and Structured Streaming for real-time analytics.
- Capability in risk management and Apache Airflow for workflow orchestration.
- Knowledge of Amazon S3 and Amazon Redshift for data storage and warehousing.
- Proficiency in Python scripting for automation and data processing.
- Skill in Databricks SQL and Databricks Delta Lake for data analysis and storage.
- Experience with Databricks Workflows and PySpark for efficient data operations.

9. Job Location:
Primary Location: INTSHYDA15 (ITIND GAR - Tower 5, HYD - GAR STD)
Alternate Location: NA
Alternate Location 1: NA

10. Job Type: Associate - Projects [65PM00]

11. Demand Requires Travel?: No

12. Certifications Required: NA
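For candidates curious what this kind of work looks like in practice, below is a minimal, illustrative PySpark Structured Streaming sketch of the sort of pipeline described in the responsibilities above. It is not part of the formal requirements; the catalog/table names and checkpoint path are hypothetical placeholders.

# Minimal PySpark Structured Streaming sketch: read a Delta table as a
# stream, apply a light transformation, and append to another Delta table.
# Table names and the checkpoint path are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Stream new rows from a (hypothetical) Unity Catalog Delta table.
orders = spark.readStream.table("main.sales.orders_raw")

# Light cleansing: drop rows without an order_id and derive a date column.
cleaned = (
    orders
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_date", F.to_date(F.col("order_ts")))
)

# Append the cleaned stream to a target Delta table; the checkpoint
# location lets the query restart exactly where it left off.
query = (
    cleaned.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/tmp/checkpoints/orders_clean")
    .toTable("main.sales.orders_clean")
)

# In a standalone job (rather than a notebook), block until the stream ends:
# query.awaitTermination()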

Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
