Job Description

Overview

WayOps, Madrid, Community of Madrid, Spain. Join to apply for the AI Engineer (Remote) role at WayOps.

Responsibilities

The selected candidate will join a newly formed team whose mission is to automate, via MLOps, the responsible creation of models on the analytics platform. Working with the AI Architect, the AI Engineer will develop Python libraries that integrate scientific code with the MLOps process, along with code archetypes that use those libraries. A technical background in programming and familiarity with both DevOps and the lifecycle of scientific models are essential.

Project & Team

The project aims to adapt the existing analytics platform to integrate responsible model creation and to automate deployment via MLOps. Since the base technology already includes Azure Databricks and Azure Machine Learning, the implementation will follow a process governed from Azure DevOps, integrating with other services via SDK for automatic resource creation or through deployment pipelines. To ensure success, the project needs specialists who can design the architecture, configure pipelines, and create code archetypes that integrate with Azure Machine Learning services. Each initiative must support local or remote development against a Databricks cluster or an Azure Machine Learning Compute target, and all project and deployment configuration must be automated.

The project team will include MLOps automation engineers (MLOps Engineer) and AI industrialization engineers (AI Engineer) supervised by the Team Lead. The AI Architect will work closely with the Team Lead and will lead the AI industrialization tasks. The project will also be overseen by the Enterprise Architect, with support from the platform specialist. The total project team will be about eight people.

Experience & Knowledge

The candidate should have 2-3 years of experience as an AI Engineer participating in the industrialization of scientific models and developing reference Python libraries in production environments, plus 3-5 years of experience as a Software Engineer developing data transformation applications or operationalizing scientific models. Experience as a data scientist and/or data engineer is valued, as is experience defining programming best practices, knowledge of and experience with CI/CD in Azure DevOps, and any prior experience with the Azure Machine Learning SDK v2.

Required experience with the following technologies:
Azure (Databricks, Azure Machine Learning, Storage, Data Factory)
Azure Machine Learning (Experiment Tracking, Model Registry, AML Studio, AML SDK v2, MLflow)
Azure DevOps (Boards)
Databricks (PySpark, DataFrames, Delta Tables, Unity Catalog, Databricks Connect)
Python development (Click, Poetry, Pipx, OpenCensus, Black, pdb, FastAPI)
QA & testing (Kiuwan, JMeter, PyTest)
Tools (Visual Studio Code, Git)

Additional valued experience:
Azure (Cosmos DB, SQL Databases, Application Insights, Azure Monitor, App Service)
Azure Machine Learning (AML Pipelines, AML Endpoints, AML Environments, AML Compute)
Azure DevOps (Pipelines, Repos, Test Plans, Artifacts)
Machine Learning (scikit-learn, MLlib, H2O, TensorFlow, Keras)
Responsible AI (AML RAI SDK, Fairlearn, InterpretML, DiCE, EconML)
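As a rough illustration of the kind of SDK-driven automation described in the Project & Team section, the minimal sketch below submits a training script to an Azure Machine Learning compute cluster with the AML SDK v2. It is not code from this project: the subscription, resource group, workspace, compute target, curated environment, and train.py script are all placeholder assumptions.

```python
from azure.ai.ml import MLClient, command
from azure.identity import DefaultAzureCredential

# Connect to an AML workspace; the identifiers below are placeholders.
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Define a command job that runs a local training script on a remote
# AML compute cluster, using a curated environment (placeholder names).
job = command(
    code="./src",                      # local folder containing train.py
    command="python train.py --epochs 10",
    environment="azureml:AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",
    compute="cpu-cluster",
    experiment_name="mlops-archetype-demo",
    display_name="example-training-run",
)

# Submit the job and print a link to it in AML Studio.
returned_job = ml_client.jobs.create_or_update(job)
print(returned_job.studio_url)
```

In a setup like the one described, a script along these lines would typically be invoked from an Azure DevOps pipeline rather than run by hand.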
Contract & Location

The engagement will be a renewable one-year, full-time contract as a freelancer (autónomo). The work will be performed remotely, preferably within the client's office hours to facilitate coordination with the rest of the team. The salary band is negotiable depending on the experience provided. Immediate start.

Seniority level: Mid-Senior level
Employment type: Contract
Job function: Information Technology
Industries: IT Services and IT Consulting
