Job Description
• Proficient in programming languages (Python, Java, etc.) and REST APIs (Azure API Management, MuleSoft, etc.) to process data
• Proficient in integrating structured, semi-structured, and unstructured data sources to deliver analytics solutions using Data Warehouse, Data Lake, and Data Hubs
• Proficient in developing and operationalizing various data distribution patterns such as APIs, event-based messaging, and pub/sub models
• Experience in migrating data warehouses from on-premises (Teradata or similar) to cloud (Azure Synapse Analytics or similar)
• Experience in monitoring the performance of data workflows and infrastructure through the use of monitoring tools (Azure Monitor, Informatica, AppDynamics, etc.) and enabling alerts for prompt resolution
• Experience with DevOps toolset (Azure DevOps, Jenkins, Bitbucket, etc.) to collaborate and deploy code
• Knowledge of Big Data Technologies (Hadoop, Spark, etc.)