Job Description
Job Responsibilities:
- In-depth knowledge of Data Lake, Lakehouse, and Data Mesh architectures
- Experience building data platforms using Databricks, Delta Lake (on-prem), or Snowflake.
- Proficient in ingesting structured, semi-structured, and unstructured data.
- Strong hands-on experience with Python, PySpark, SQL, and APIs for data ingestion and transformation.
- Experience with ETL/ELT pipelines, streaming (Kafka, Kinesis), and batch processing (Spark, Glue, dbt).
- Strong experience working with Parquet, JSON, and CSV files, sensor data, and optimizing large-scale analytical datasets.
- Ability to collaborate with team members to design and operationalize data lake solutions.
- Excellent problem-solving, communication, and team collaboration skills.
Job Requirements:
- B.Tech. in Computer Science or Master of Computer Applications (MCA).
- 6-8 years of relevant experience.
- Expertise in data modeling techniq...
Interested in joining Bebo Technologies Private Limited? Submit your application for this position.