Job Description

Title:               Java Developer - Intermediate

Location:       Remote

Duration:       0-8+ Months (Possible Extension depending on performance)

Shift:              1st Shift (M-F)

Pay Rate:       Negotiable on W2

  

Duties:

Java backend developer (Java, AWS, Kafka, EMR)

·       The applications use a modern tech stack centered on Java, Apache Kafka, AWS cloud services, CI/CD practices, and GitLab.

·       Data streaming and processing: Implement and manage Kafka-based data pipelines for real-time data ingestion and processing (a minimal consumer sketch appears after this list).

·       Leverage AWS cloud services (e.g., EC2, Lambda, S3, SQS, DynamoDB, ECS, EKS) to build and deploy cloud-native applications and microservices (an SQS polling sketch also appears after this list).

·       Design and implement CI/CD pipelines using GitLab to automate build, test, and deployment processes, ensuring continuous delivery and smooth releases.

·       Write clean, well-documented, and testable code following best practices and coding standards.

·       Actively participate in code reviews and provide constructive feedback to team members.

·       Collaborate with cross-functional teams, including product owners, architects, and QA engineers, to define requirements, design solutions, and deliver features.

·       Monitor application performance, identify bottlenecks, troubleshoot issues, and implement solutions to optimize performance and reliability.

·       Ensure the security, scalability, and maintainability of applications in production environments.   
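
For context on the Kafka work described above, the following is a minimal sketch of a consumer loop for a real-time pipeline. The broker address, topic name (order-events), and consumer group are placeholders, not details from this posting; actual pipelines on the team may use Avro serialization and different delivery guarantees.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

// Minimal consumer loop: read events from a topic and hand them to a processing step.
public class OrderEventsConsumer {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-events-processor");  // placeholder group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false"); // commit only after successful processing

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("order-events")); // hypothetical topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    process(record.key(), record.value());
                }
                consumer.commitSync(); // at-least-once delivery
            }
        }
    }

    private static void process(String key, String value) {
        // Real pipelines would deserialize (e.g. Avro), transform, and write downstream.
        System.out.printf("key=%s value=%s%n", key, value);
    }
}
```
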
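Likewise, a minimal sketch of consuming work from an SQS queue with the AWS SDK for Java 2.x; the queue URL and region are placeholders, and the downstream handling is only hinted at in a comment.

```java
import java.util.List;

import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.DeleteMessageRequest;
import software.amazon.awssdk.services.sqs.model.Message;
import software.amazon.awssdk.services.sqs.model.ReceiveMessageRequest;

// Long-poll an SQS queue and delete each message after it has been handled.
public class QueueWorker {

    public static void main(String[] args) {
        String queueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/example-queue"; // placeholder

        try (SqsClient sqs = SqsClient.builder().region(Region.US_EAST_1).build()) {
            ReceiveMessageRequest request = ReceiveMessageRequest.builder()
                    .queueUrl(queueUrl)
                    .maxNumberOfMessages(10)
                    .waitTimeSeconds(20) // long polling reduces empty responses
                    .build();

            List<Message> messages = sqs.receiveMessage(request).messages();
            for (Message message : messages) {
                handle(message.body());
                sqs.deleteMessage(DeleteMessageRequest.builder()
                        .queueUrl(queueUrl)
                        .receiptHandle(message.receiptHandle())
                        .build());
            }
        }
    }

    private static void handle(String body) {
        // Real services would parse the payload and write to S3, DynamoDB, etc.
        System.out.println("received: " + body);
    }
}
```
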

 

Qualifications:

·       3-5 years of experience in Java 17 development, including the Spring Boot framework and microservices architecture.

·       Kafka expertise: Demonstrated experience with Apache Kafka, including design, deployment, performance tuning, and troubleshooting.

·       Proven experience with AWS cloud services (e.g., EC2, Lambda, S3, SQS, DynamoDB, Aurora, EKS).

·       CI/CD tools and methodologies: Strong understanding of and hands-on experience with tools such as Jenkins, Git, and Gradle.

·       Big data technologies: Experience with Spark and Avro is preferred.

·       Databases: Experience with relational and NoSQL databases such as AWS DynamoDB, AWS RDS Aurora, and Cassandra is beneficial.

·       Strong problem-solving skills and ability to troubleshoot complex technical issues.

·       Excellent communication and collaboration skills, with the ability to work effectively in an Agile/Scrum environment.

·       Bachelor’s degree in Computer Science, Software Engineering, or a related field.
