Job Description

Role: DEVELOPER

Relevant Experience (in Yrs): 4-6 YRS

Must-Have Technical/Functional Skills:

 • 3 years of experience in Databricks engineering solutions on the AWS Cloud platform using PySpark, Databricks SQL, and data pipelines with Delta Lake.

 • 3 years of experience in ETL, Big Data/Hadoop, and data warehouse architecture & delivery.

 • Minimum 3 years of experience in one or more programming languages: Python, Java, Scala.

 • 2 years of experience developing CI/CD pipelines using Git, Jenkins, Docker, Kubernetes, shell scripting, and Terraform.

 • Good knowledge of data warehousing.

Experience Required: 4-6 YRS

Roles & Responsibilities:

 • Work on client projects to deliver AWS, PySpark, and Databricks-based data engineering & analytics solutions.

 • Build and operate very large data warehouses or data lakes.

 • ETL optimization: designing, coding, and tuning big data processes using Apache Spark.

...
