Job Description
We are looking for an experienced, resilient person to join our growing team of analytics experts, someone eager to learn by taking on bigger challenges.
What You Will Do
- Work closely with the product and engineering teams to understand domains, features, and metrics
- Design and build scalable data pipelines to handle data from different sources
- Extract data using ETL tools and load it into a data warehouse (AWS Redshift / Google BigQuery); see the illustrative sketch after this list
- Implement batch processing for structured and unstructured data
- Analyse data and create visualizations in tools such as Tableau, Metabase, or Google Data Studio to support business decisions
- Work on the core data team that designs and maintains the data warehouse
- Troubleshoot and resolve issues in data processing and pipelines
- Anticipate problems and build processes to avoid them
- Learn new technologies quickly
- Set up CI/CD pipelines
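For context, the pipeline duties above amount to extract-and-load steps like the rough Python sketch below. It is illustrative only: the REST endpoint, connection string, `staging.events` table, and response shape are assumptions for illustration, not part of this role or any specific stack.

```python
# Illustrative sketch of a minimal extract-and-load step (not a production pipeline:
# real jobs would add batching, retries, schema management, and orchestration).
import requests
import psycopg2

API_URL = "https://api.example.com/v1/events"  # hypothetical source endpoint
REDSHIFT_DSN = (
    "host=example-cluster.redshift.amazonaws.com port=5439 "
    "dbname=analytics user=etl password=..."   # placeholder credentials
)

def extract(url: str) -> list[dict]:
    """Pull a page of records from the source REST API."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()["data"]  # assumed response shape

def load(records: list[dict], dsn: str) -> None:
    """Insert records into a warehouse staging table (Redshift speaks the Postgres protocol)."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO staging.events (event_id, user_id, event_type, occurred_at) "
            "VALUES (%s, %s, %s, %s)",
            [(r["id"], r["user_id"], r["type"], r["occurred_at"]) for r in records],
        )

if __name__ == "__main__":
    load(extract(API_URL), REDSHIFT_DSN)
```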
Skills Required
- Proficiency in database design and writing SQL queries
- Experience with any of the data warehouse solutions AWS Redshift / Google BigQuery / Snowflake
- Knowledge of platforms such as Segment / HevoData / Stitch / Amplitude / CleverTap
- Hands-on experience with Apache Spark / Python / R / Hadoop / Kafka
- Knowledge of working with connectors (REST / SOAP, etc.)
- Experience with BI platforms such as Metabase / Power BI / Tableau / Google Data Studio

Hadoop, AWS Redshift, Power BI, Kafka, SOAP, Tableau, SQL, REST, Snowflake, ETL Tools, Python
Apply for this Position
Ready to join? Submit your application.