Job Description
MLOps Engineer - AWS Workflow Specialist
- Location: Permanent remote (India), with two mandatory in-office days per month at the Gurgaon or Bengaluru office
- Employment Type: 6-month contract
- Primary Focus: Production ML systems, MLOps, and scalable deployment on AWS for financial applications
- Immediate Joiners Only
- Budget: 250k per month
- Experience: 6+ years
Role Summary
We are looking for a strong MLOps Engineer (AWS Workflow Specialist) to design, orchestrate, and deploy end-to-end machine learning workflows on AWS for financial applications. You will productionize models following the Bank's approved patterns (to be provided), using AWS-native services and robust CI/CD to automate the full ML lifecycle from data ingestion to monitored inference.
Key Responsibilities
· Convert ML prototypes into robust, low-latency services for batch and real-time inference.
· Design and implement feature stores, training pipelines, and model registries using AWS-native tools.
· Build and deploy end-to-end ML pipelines on AWS using SageMaker Pipelines, SageMaker Endpoints, Glue, Lambda, Step Functions, and Redshift.
· Implement secure and compliant AWS integrations using S3, KMS, Lambda, and Secrets Manager.
· Automate deployments with AWS CI/CD tooling (CodeBuild, CodePipeline) and infrastructure-as-code patterns as per Bank standards.
· Orchestrate complex batch and event-driven workflows using Apache Airflow.
· Integrate streaming data and real-time inference triggers using Kafka.
· Optimize cost, performance, and reliability of production ML workloads on AWS.
· Develop PySpark and SQL transformations to support large-scale financial datasets.
· Ensure data quality, reproducibility, and observability across training and inference pipelines.
· Implement MLOps practices including CI/CD for ML, model versioning, and automated retraining.
· Set up monitoring for model drift, performance degradation, and security/compliance controls.
· Collaborate with Data Scientists and stakeholders to align ML solutions with business goals.
· Document architecture, runbooks, and operational guidelines for smooth handover and support.
Required Skills & Qualifications
· Strong programming skills in Python, PySpark, and SQL.
· Hands-on experience with AWS services: SageMaker, Glue, Lambda, Redshift, Step Functions (and related ecosystem).
· Hands-on experience designing and deploying SageMaker Pipelines and SageMaker Endpoints for production inference.
· Strong understanding of AWS security and platform services: S3, KMS, Lambda, and Secrets Manager.
· Experience with CI/CD automation on AWS using CodeBuild and CodePipeline (and related tooling).
· Workflow orchestration experience with Apache Airflow; streaming integration exposure with Kafka.
· Expertise in MLOps practices and production deployment of ML models.
· Familiarity with financial data and compliance requirements.
· Strong software engineering fundamentals (testing, code quality, API design, performance troubleshooting).
Preferred Qualifications
· Experience with SageMaker Feature Store.
· Knowledge of streaming inference and event-driven architectures.
· AWS certifications (Machine Learning Specialty, Solutions Architect) are a plus.
· Experience implementing Bank/enterprise ML patterns, including governance, approvals, and standardized deployment templates.
· Experience with AWS EMR or Spark on AWS for large-scale data processing.