Job Description
Roles and Responsibilities
- Design, implement, and manage Hadoop clusters for big data processing.
- Configure and troubleshoot HDFS, YARN, and MapReduce components.
- Monitor cluster health and perform proactive maintenance tasks.
- Collaborate with development teams to ensure smooth deployment of applications.
- Develop and maintain documentation of Hadoop infrastructure and procedures.
- Ensure compliance with security best practices and industry standards.
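Cluster-health monitoring, one of the responsibilities above, is typically scripted. A minimal illustrative sketch in Python, assuming it is fed the text output of `hdfs dfsadmin -report` (the function names and the 20% threshold are hypothetical choices, not part of this role's specification):

```python
import re

def dfs_remaining_percent(report_text):
    """Parse the cluster-wide 'DFS Remaining%' line from
    `hdfs dfsadmin -report` output. Returns a float, or None if absent."""
    m = re.search(r"DFS Remaining%:\s*([\d.]+)%", report_text)
    return float(m.group(1)) if m else None

def needs_attention(report_text, threshold=20.0):
    """Flag the cluster when remaining DFS capacity drops below threshold.
    The threshold is an illustrative default, not a Hadoop setting."""
    pct = dfs_remaining_percent(report_text)
    return pct is not None and pct < threshold

# Sample fragment in the format emitted by `hdfs dfsadmin -report`:
sample = """Configured Capacity: 1099511627776 (1 TB)
DFS Remaining: 109951162777 (102.40 GB)
DFS Remaining%: 10.00%
"""
print(needs_attention(sample))  # 10% remaining is below the 20% threshold
```

In practice a script like this would run from cron, capturing the live report via `subprocess` and alerting on the result.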
Job Requirements
- Strong understanding of Hadoop ecosystem technologies including HDFS, YARN, and MapReduce.
- Experience with Linux operating systems and scripting languages such as Python or Shell.
- Knowledge of database management systems such as MySQL or NoSQL databases like Cassandra or MongoDB.
- Excellent problem-solving skills and ability to work under pressure.
- Strong communication and collaboration skills.