Job Description

ESSENTIAL DUTIES AND RESPONSIBILITIES


Following is a summary of the essential functions for this job. Other duties may be performed, both major and minor, which are not mentioned below. Specific activities may change from time to time.
  • Architect and implement robust ETL workflows using tools such as Informatica PowerCenter and Ab Initio, including data mapping, transformation logic, error handling, and performance optimization for high-volume data processing.
  • Design and develop data pipelines in Snowflake for efficient data warehousing, querying, and analytics, leveraging features such as Snowpark for custom processing and zero-copy cloning for cost-effective data sharing.
  • Build and maintain distributed data processing systems on Hadoop ecosystems (e.g., Hive, Spark, HDFS), ensuring scalability, fault tolerance, and seamless integration with upstream and downstream systems.
  • Develop advanced SQL queries, stored procedures, and optimizations for both relational and...