Job Description
Roles & Responsibilities
- Monitor operational workflows and system processes (data ingestion, post-ingestion, and UI delivery) to ensure smooth and uninterrupted service.
- Monitor the health of our applications and troubleshoot by following provided runbooks or documentation.
- Investigate and understand root causes of issues in business operations; develop and implement corrective actions.
- Log issues, delays, and anomalies accurately and escalate critical problems per internal protocols.
- Respond to client queries promptly and professionally, ensuring resolution within defined timelines.
- Work with clients and internal teams to provide timely solutions while ensuring compliance with company standards and regulations.
- Deliver outputs accurately and within agreed SLAs, maintaining high reliability standards.
- Collaborate with Engineering, Product, and Client Success teams to resolve incidents, align on expectations, and drive process improvements.
- Maintain detailed records of incidents, client interactions, and operational metrics for reporting, audits, and analysis.
- Identify recurring issues and contribute to the design of preventive measures and long-term solutions.
- Use SQL, shell scripting, and Python programming as part of technical operations support.
- Support automation and workflow improvements to increase operational efficiency.
Required Qualifications
- 5+ years of experience in operations, data engineering, or production support roles.
- Proven hands-on experience querying and analyzing data, including advanced Excel and MySQL skills.
- Working knowledge of Linux basics and networking fundamentals.
- Experience using cloud technologies such as AWS.
- Strong analytical and troubleshooting skills.
- Familiarity with workflow orchestration tools (e.g., Airflow), cloud infrastructure, and containerization (e.g., Kubernetes).
- Strong communication, documentation, and cross-team collaboration skills.
- Ability to work in a fast-paced, client-focused environment.
- Proactive, detail-oriented, and committed to continuous improvement.
Preferred Qualifications
- Experience working in SaaS, energy, or data-driven industries.
- Experience with Datadog or similar health monitoring systems.
- Experience with PySpark, data management, data integration, ETL, data quality and controls, data analysis, reporting, and testing.
- Familiarity with UI-based client platforms and Excel API integrations.
- Exposure to monitoring and alerting tools, including custom scripting.
- Exposure to Git.
- Experience using Agile methodologies.
- Experience working with Jira or other issue-tracking systems.
Apply for this Position
Ready to join? Click the button below to submit your application.