Job Description
Overview:
Join a rapidly growing team building world-class, large-scale Big Data architectures. This is a hands-on coding role focused on developing and optimizing data solutions using modern cloud and distributed computing technologies.
Key Responsibilities:
Development & Engineering
- Write high-quality, scalable code using Python.
- Work with SQL, PySpark, Databricks, and Azure cloud environments.
- Optimize Spark performance and ensure efficient data processing pipelines.
- Apply sound programming principles including version control, unit testing, and deployment automation.
- Design and implement APIs, abstractions, and integration patterns for distributed systems.
- Define and implement ETL, data transformation, and automation workflows in parallel processing environments.
Apply for this Position
Ready to join Sigmoid? Submit your application to be considered.