Job Description
Python PySpark - CREQ
Description: Design and develop Hadoop applications.
Hands-on experience developing jobs in PySpark with Python/Scala (preferred) or Java/Scala; a minimal sketch follows this list.
Experience with Core Java, MapReduce programs, Hive programming, and Hive query performance concepts.
Experience with source code management using Git repositories.
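To illustrate the kind of PySpark job development described above, here is a minimal sketch of a batch job that reads a hypothetical orders dataset and runs a Hive-style aggregation. The paths, dataset, and column names are illustrative assumptions, not part of this requisition.

# Minimal PySpark batch job sketch; paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def main():
    spark = (
        SparkSession.builder
        .appName("sample-pyspark-job")
        .enableHiveSupport()  # allows reading/writing Hive tables when a metastore is configured
        .getOrCreate()
    )

    # Hypothetical input: CSV order records with customer_id and amount columns.
    orders = spark.read.option("header", "true").csv("/data/orders/2024-01-01/")

    # Aggregate spend per customer -- the kind of transformation a Hive query would express.
    spend = (
        orders.groupBy("customer_id")
        .agg(F.sum(F.col("amount").cast("double")).alias("total_spend"))
    )

    spend.write.mode("overwrite").parquet("/data/output/customer_spend/")
    spark.stop()

if __name__ == "__main__":
    main()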
Secondary skills
Exposure to the AWS ecosystem with hands-on knowledge of EC2, S3, and related services; a minimal example follows this list.
Basic SQL programming.
Knowledge of agile methodology for delivering software solutions.
Build scripting with Maven/Gradle; exposure to Jenkins.
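As an illustration of basic hands-on S3 access, the sketch below uses boto3; the bucket name, key prefix, and object key are hypothetical, and AWS credentials are assumed to be configured in the environment.

# Minimal boto3 S3 access sketch; bucket and key names are hypothetical.
import boto3

s3 = boto3.client("s3")

# List objects under a prefix.
response = s3.list_objects_v2(Bucket="example-bucket", Prefix="data/orders/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Download a single object to the local filesystem.
s3.download_file("example-bucket", "data/orders/2024-01-01.csv", "orders.csv")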
Primary Location: Hyderabad, Andhra Pradesh, India
Job Type: Experienced
Primary Skills: Python, PySpark
Years of Experience: 4
Travel: No