Job Description

About Statusneo

We accelerate your business transformation by leveraging best-fit CLOUD NATIVE technologies wherever feasible. We are DIGITAL consultants who partner with you to solve and deliver. We are experts in CLOUD NATIVE TECHNOLOGY CONSULTING & SOLUTIONS. We build, maintain, and monitor highly scalable, modular applications that leverage the elastic compute, storage, and networking of leading cloud platforms. We CODE your NEO transformations. #StatusNeo

Business domain experience is vital to the success of neo transformations empowered by digital technology. Domain experts ask the right business questions to diagnose and address the real problem. Our consultants combine your domain expertise with our digital excellence to build cutting-edge cloud solutions.

Job Title: Big Data Developer – AWS

Location: Bangalore (Work From Office)

Experience: 4+ Years

Notice Period: Immediate Joiners Preferred

Shift Timing: General

Job Description:

We are seeking a skilled Big Data Developer with hands-on experience in AWS cloud services to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining scalable big data pipelines and data processing solutions.

Key Responsibilities:

  • Design and implement scalable data processing frameworks using AWS services.

  • Develop ETL pipelines for ingestion, transformation, and processing of large datasets.

  • Collaborate with data engineers, analysts, and architects to ensure efficient data workflows.

  • Optimize data storage and processing performance.

  • Ensure data quality, security, and compliance standards are met.

  • Troubleshoot and resolve issues related to data ingestion, transformation, and integration.

Technical Skills Required:

  • Big Data Tools: Spark, Hadoop, Hive, HDFS

  • Programming: Python, Scala, or Java

  • AWS Services: S3, Glue, EMR, Lambda, Redshift, Athena, Kinesis

  • Data Integration & Workflow Tools: Airflow, Step Functions, or similar

  • Databases: SQL / NoSQL (DynamoDB, PostgreSQL, etc.)

  • Version Control: Git / Bitbucket

  • Good understanding of data warehousing concepts and data lake architectures.

Preferred Qualifications:

  • Experience in performance tuning and optimizing large-scale data pipelines.

  • Exposure to CI/CD for data applications.

  • AWS certification is an added advantage.

Employment Type: Full-time
