Job Description

Big Data Tech Lead - CREQ195153

Responsibilities:

  • Responsible for the design, build, and deployment of Big Data solutions.
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques.
  • Analytical, flexible, and team-oriented, with good interpersonal and communication skills.
  • Apply internal standards for reuse, architecture, testing, and general best practices.
  • Responsible for the full software development life cycle.
  • Responsible for the on-time delivery of high-quality code with low rates of production defects.
  • Research and recommend technologies to improve current systems.
  • Communicate status and risk to stakeholders and escalate as appropriate.
  • Flexible and able to manage time effectively.
  • Ability to learn new skills quickly with little supervision while keeping attention on high-priority details.
  • Excellent communication (verbal and written) and interpersonal skills, with the ability to communicate well at all levels.
  • Efficiently and effectively manage work, time, and resources.
  • Strong problem-solving and program execution skills while being process-oriented.
  • Self-motivated and delivery-focused individual.
Required Skills:

  • Minimum of 6 to 8 years of experience in Big Data and data engineering using Scala or Java (Python is good to have).
  • Experience designing and developing Scala-based data pipelines and REST APIs.
  • Ability to write robust code in Scala.
  • Deep knowledge of the Spark and Scala APIs is required.
  • Good experience with Hadoop, Big Data, and Hadoop-centric schedulers.
  • Knowledge of integrating Big Data and reporting systems through APIs is an added advantage.
  • Understanding of data structures, data modelling, and software architecture.
  • Excellent communication skills.
  • Ability to work in a team.
  • Outstanding analytical and problem-solving skills.
  • Banking domain knowledge is an added advantage.
  • Knowledge of the Regulatory Reporting domain is a strong plus.
Technology Minimum Experience:

  • Big Data / Hadoop: 6 years
  • Spark: 4 years
  • Scala: 4 years
  • SQL: 4 years
  • Kafka: 2 years
  • Unix & Shell Scripting: 2 years
Nice-to-Have Skills:

  • Domain knowledge in Finance / Liquidity reporting is an added advantage.
  • Basic project/release management experience with Jira, RLM, and ServiceNow.
Primary Location: Chennai, Tamil Nadu, India
Job Type: Experienced
Primary Skills: Hive
Years of Experience: 6
Travel: No
