Job Description
1. Job Title : Developer
2. Job Summary : Join our dynamic team as a Developer, where you will leverage your expertise in Spark in Scala, Amazon S3, Python, and Databricks technologies to drive innovative solutions. With a hybrid work model and a focus on day shifts, you will contribute to impactful projects that enhance our data capabilities and support our mission to deliver excellence.
3. Experience : 4 - 8 years
4. Required Skills :
   Technical Skills: Spark in Scala, Amazon S3, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, PySpark
   Domain Skills: NA
5. Nice to have skills :
   Domain Skills: NA
6. Technology : Cloud Modernization/Migration
7. Shift : Day
8. Responsibilities :
   - Develop robust data processing solutions using Spark in Scala to optimize performance and scalability.
   - Implement efficient data storage and retrieval systems utilizing Amazon S3 to ensure seamless data access.
   - Design and execute complex data workflows with Databricks Workflows to streamline operations and improve efficiency.
   - Utilize Python for scripting and automation tasks to enhance productivity and reduce manual intervention.
   - Apply Databricks SQL for querying and analyzing large datasets to derive actionable insights.
   - Integrate Databricks Delta Lake for reliable and consistent data management across various platforms.
   - Collaborate with cross-functional teams to understand data requirements and deliver tailored solutions.
   - Troubleshoot and resolve technical issues related to PySpark applications to maintain system integrity.
   - Provide technical expertise in optimizing data pipelines to support business objectives and drive growth.
   - Ensure data security and compliance with industry standards to protect sensitive information.
   - Participate in code reviews and contribute to the continuous improvement of development practices.
   - Stay updated with the latest advancements in data technologies to incorporate innovative solutions.
   - Support the company's purpose by enhancing data-driven decision-making processes for societal impact.
Qualifications :
   - Strong experience in Spark in Scala and Amazon S3 for data processing and storage.
   - Proficiency in Python and PySpark for scripting and data manipulation tasks.
   - Hands-on experience with Databricks SQL and Delta Lake for efficient data management.
   - Familiarity with Databricks Workflows to automate and optimize data operations.
   - Experience in hybrid work environments with a focus on day shifts for effective collaboration.
   - Ability to work independently and as part of a team to achieve project goals.
   - Strong problem-solving skills and attention to detail in data-related tasks.
9. Job Location :
   Primary Location : INTSHYDA15 (ITIND GAR - Tower – 5 HYD - GAR STD)
   Alternate Location : NA
   Alternate Location 1 : NA
10. Job Type : Associate - Projects [65PM00]
11. Demand Requires Travel? : No
12. Certifications Required : NA
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.