Job Description

· Technical Skillset
- Strong hands-on experience with AWS cloud services.
- Proficiency in Big Data tools and frameworks.
- Expertise in Kafka for data streaming.
- Solid understanding of ETL tools and practices.
- Experience with Snowflake and other cloud data warehouses.
- Strong SQL skills and familiarity with RDBMS and NoSQL databases.
- Experience in Power BI for data visualization and reporting.
We are primarily looking for a Big Data Architect who has worked on large-scale data implementations (i.e., ingesting at least 1-5 TB of real-time data per day, with a storage layer holding hundreds of TBs or petabytes of data). The Big Data Architect should also have worked as a developer for at least 1-2 years in Spark (either Scala or Python). They will not be asked to code as a developer, but they need strong development experience to address challenges in a large-scale data ecosystem and to guide developers on designs and best practices. In addition, per the JD, they should have good experience in AWS (especially big data services such as EMR, Glue, Lambda, and S3) and in Snowflake/AWS Redshift.

Apply for this Position

Ready to join? Click the button below to submit your application.

Submit Application