Job Description

Dear Candidates,

Greetings from TCS!!!

TCS is looking for an Informatica ETL Developer.

Experience: 5-8 years

Location: PAN India


Required Technical Skill Set: Informatica ETL Developer


Must-have skills:

  • Experience in ETL development with Informatica PowerCenter (experience with BDM/CDI preferred).
  • Strong knowledge of PL/SQL, Greenplum, and data warehousing concepts.
  • Hands-on experience with Hadoop ecosystem (HDFS, Hive) and Spark for big data processing.
  • Familiarity with Kafka and real-time streaming ETL.
  • Experience with Unix/Linux scripting, scheduling tools, and workflow automation.
  • Understanding of data governance, metadata management, and compliance (GDPR, PII masking).


Good to have:

  • Exposure to cloud-native ETL architectures and containerization (Kubernetes) is a plus.


Roles and responsibilities:

The Senior Informatica ETL Developer will design, develop, and optimize ETL workflows to support the enterprise data ecosystem, which includes the Greenplum Data Warehouse, the HDFS-based Data Lake, and real-time streaming pipelines. This role ensures efficient data integration, high performance, and compliance with governance standards while enabling analytics and BI platforms such as MicroStrategy, Power BI, and Tableau.

  • Design, develop, and maintain ETL workflows using Informatica PowerCenter and Informatica BDM/CDI for batch and streaming data ingestion.
  • Integrate data from structured, semi-structured, and unstructured sources into the Greenplum Data Warehouse and HDFS Data Lake.
  • Collaborate with data architects, BI teams, and data governance teams to ensure alignment with architecture and compliance requirements.
  • Implement error handling, logging, recovery mechanisms, and data quality checks.
  • Perform performance tuning of ETL processes and SQL queries, optimizing for MPP engines.
  • Support metadata management and lineage using Informatica EDC.
  • Provide production support, troubleshoot ETL failures, and ensure pipeline observability.

  • Contribute to real-time data integration leveraging Kafka and streaming frameworks.

Apply for this Position

Ready to join? Click the button below to submit your application.

Submit Application