Job Description
Our client is currently seeking a Sr. PySpark Engineer:
The ideal candidate will have strong hands-on experience with PySpark, Apache Spark, and Python, along with deep knowledge of SQL and NoSQL databases such as DB2, PostgreSQL, and Snowflake. This role requires proven proficiency in data modeling, ETL pipeline development, and workflow orchestration using tools like Airflow.
Qualifications & Requirements:
- 10+ years of experience in big data and distributed computing.
- Strong hands-on experience with PySpark, Apache Spark, and Python.
- Strong hands-on experience with SQL and NoSQL databases (DB2, PostgreSQL, Snowflake).
- Proficiency in data modeling and ETL workflows.
- Proficiency with workflow schedulers such as Airflow.
- Hands-on experience with AWS cloud-based data platforms.
- Experience with DevOps, CI/CD pipelines, and containerization (Docker, Kubernetes) is a plus.
Apply for this Position
Ready to join The Judge Group? Submit your application to apply for this position.