Job Description

  • At least 8 years of overall experience in building ETL/ELT, data warehousing, and big data solutions.
  • At least 5 years of experience in building data models and data pipelines to process different types of large datasets.
  • At least 3 years of experience with Python, Spark, Hive, Hadoop, Kinesis, and Kafka.
  • Proven expertise in relational and dimensional data modeling.
  • Understanding of PII standards, processes, and security protocols.
  • Experience building data warehouses using cloud technologies such as AWS or GCP services, and with a cloud data warehouse, preferably Google BigQuery.
  • In-depth knowledge of big data solutions and the Hadoop ecosystem.
  • Strong SQL knowledge – able to translate complex scenarios into queries.
  • Strong programming experience in Java or Python.
  • Experience with Google Cloud Platform (especially BigQuery and Dataflow).
  • Experience with the Google Cloud SDK and API scripting.
  • Ex...