Job Description
Dear Candidate,
Greetings from Tata Consultancy Services!
Thank you for expressing your interest in exploring a career possibility with the TCS Family.
Hiring for: Databricks
Location: Pune, Hyderabad, Indore
Experience: 6 to 8 years
Key Responsibilities
- Data Pipeline Development: Design, build, and maintain robust, scalable ETL/ELT pipelines on Databricks.
- Platform Expertise: Utilize Databricks features like Delta Lake, Databricks Workflows, and Delta Live Tables (DLT).
- Performance Tuning: Optimize Apache Spark jobs for efficiency and cost-effectiveness on large datasets.
- Collaboration: Work with data scientists, analysts, and architects to understand requirements and deliver solutions.
- Data Quality & Governance: Implement data quality checks, security, and access controls (e.g., Unity Catalog).
- Cloud Integration: Manage data infrastructure on AWS, Azure, or GCP.
- Documentation: Create technical specifications and process documentation.
Key Qualifications
- Technical Skills: Proficiency in SQL, Python/PySpark, Apache Spark, Delta Lake, and the Databricks platform.
- Cloud: Experience with AWS, Azure, or GCP.
- Databases: Strong SQL and relational database experience.
- Concepts: Knowledge of Data Warehousing, Data Modeling, and Data Architecture.
- Best Practices: Familiarity with CI/CD, code reviews, and DevOps principles.
- Education: Bachelor's degree in Computer Science, Engineering, or related field.