Job Description
What you’ll be doing
Architecture & Design
• Design and implement Kafka-based streaming solutions for high availability and fault tolerance.
• Define topic structure, partitioning, replication, retention, and compaction policies.
• Architect stream processing pipelines using Apache Flink for real-time analytics.
• Implement disaster recovery, geo-replication, and security (TLS, SASL, RBAC, ACLs).
• Set up Confluent Kafka on on-premises and cloud platforms (hands-on).
• Optimize cloud resources for cost efficiency and performance.
• Deploy Dynatrace for end-to-end monitoring of Kafka clusters and streaming jobs.
• Configure dashboards, alerts, and performance metrics for proactive issue detection.
• Build producers, consumers, and connectors using Kafka Connect.
• Integrate with data lakes, databases, and APIs across hybrid environments.
• Build proofs of concept (POCs) for proposed solutions.
Collaboration ...
Ready to join BT Group? Submit your application.