Job Description
Experience: 7-10 years
Location: Mumbai/Pune (local candidates only)
Notice Period: Immediate joiners to 15 days only
Key Responsibilities:
Develop and maintain data integration workflows using IICS CDI for cloud-based data processing and ETL tasks.
Work with Snowflake to load, query, and maintain data within the cloud environment.
Write and maintain Unix scripts to automate data processes and enhance system operations.
Use ESP Scheduler or comparable scheduling tools to manage, monitor, and ensure smooth execution of data jobs and workflows.
Deploy and manage code using GitHub (or a similar version control system) for versioning and release management.
Collaborate with the team to ensure high performance, reliability, and security in data pipelines.
Participate in Agile ceremonies, including sprint planning, daily stand-ups, and retrospectives.
Assist in debugging and troubleshooting data integration issues and performance bottlenecks.
Contribute to technical documentation and share knowledge across teams.
Support the creation and maintenance of automated CI/CD pipelines to streamline deployments.
Required Skills and Experience:
Bachelor’s degree in Computer Science, Information Technology, or a related field.
Hands-on experience with IICS CDI (Cloud Data Integration) for ETL processes.
Exposure to Snowflake for data storage and processing in the cloud.
Solid knowledge of Unix/Linux for system administration, automation, and scripting.
Experience with ESP Scheduler or similar job scheduling tools.
Familiarity with GitHub or other version control systems for code management and deployment.
Basic understanding of Agile methodologies and experience working in an Agile environment.
Strong problem-solving and troubleshooting skills.
Ability to work effectively in a collaborative team environment.