Job Description
Key Responsibilities:
- Design, develop, and maintain data ingestion pipelines using Apache NiFi for structured and unstructured data.
- Implement data integration workflows between various databases, APIs, and cloud storage systems.
- Develop and optimize SQL queries, stored procedures, and database objects to support ETL processes.
- Build data models (conceptual, logical, and physical) to support analytics and reporting requirements.
- Use Python for custom data transformations, automation scripts, and API integration tasks.
- Monitor, troubleshoot, and optimize NiFi data flows for performance and reliability.
- Collaborate with data architects and analysts to understand data requirements and ensure alignment with business needs.
- Implement best practices for data governance, security, and scalability in data workflows.
Ideal Profile:
- Hands-on experience with Apache NiFi for data ingestion, flow design, and transformation.
- Strong understanding of data modelling concepts (logical, physical, and dimensional).
- Proficiency in SQL and database technologies (e.g., Oracle, PostgreSQL, MySQL).
- Experience with ETL processes, data integration, and data governance.
- Excellent communication and collaboration skills.
What's on Offer?
- Flexible working options.