This position is no longer available.

Data/Machine Learning Ops Engineer

Job summary
Permanent contract
Paris
Salary: €50K to €70K
Partial remote (a few days from home)
Experience: 3+ years
Skills and knowledge
Data confidentiality
Database management
Problem-solving
Kubernetes
Git

Lalilo



The position

Job description

About the position

We are looking for an experienced Data/ML Ops Engineer to join our Data team. This role is pivotal in maintaining the scalability, performance, and reliability of our data platform, which processes billions of data points daily. You will also lead the improvement of our ML pipelines. The ideal candidate has a strong background in data engineering and DevOps, and a passion for finding simple solutions to complex problems in a fast-paced, dynamic environment.

Key Responsibilities:

  • Improve our data infrastructure, focusing on automation, scalability, and security.

  • Build and maintain scalable, reliable data processing pipelines to support real-time data analytics and machine learning models.

  • Implement robust monitoring, alerting, and incident response systems to ensure the availability and performance of our data platform.

  • Improve our ML pipeline infrastructure.

  • Collaborate with cross-functional teams to implement solutions that enhance the efficiency and reliability of our data-intensive applications.

  • Participate in on-call rotations, providing expert-level troubleshooting and resolution for production issues.

Who would you work with?

You will be part of the Data team, which is composed of 8 people: data analysts, data scientists, ML engineers, and an analytics data engineer. You will report to Morgane Daniel, our AI manager, and will work closely with Kim Nguyen, our analytics engineer.


Requirements

About you

  • You have significant experience in the data world

  • You have an interest in Cloud computing technologies

  • You’re familiar with Terraform, Docker and Kubernetes

  • You have strong experience coding in Python

  • You have a good knowledge of data transformation tools and cloud platforms

  • You like to innovate and make data architectures evolve towards greater scalability

Requirements

  • 3+ years of experience in MLOps / DataOps / DevOps / Data Engineering

  • Proficiency in distributed, high-volume data processing and storage technologies such as Snowflake, Glue, etc.

  • Strong background in cloud platforms (AWS, GCP or Azure), and infrastructure as code (Terraform)

  • Knowledge of Python, development best practices (versioning, testing, …), and core tools (git, terminal, …)

  • Experience in SQL database administration

  • Excellent problem-solving, communication, and collaboration skills

  • Outstanding verbal and written communication skills, professional English

  • Experience with CI/CD and ETL tools

Nice to have

  • Containerization (Docker, Kubernetes)

  • Data modeling (DBT)

  • MLOps practices (DVC)

  • Understanding of data security, privacy and compliance


Hiring process

  • Screening call: 30 min

  • Technical test: 1h30 max

  • Use case test: 1h30 max

  • Motivation/culture fit interview: 30 min

  • Reverse calls

  • Reference calls

Benefits:

⭐ An outstanding, mission-driven, and fun work environment :)

🌴 25 days of PTO + 10 days of RTT + additional time off as needed

🏥 Health insurance with Swisslife (basic package paid 100% by Lalilo)

💙 Take care of your mental health and get free counseling with Lifeworks

🧘🏻‍♀️ Meditate with a free subscription to Calm

🚉 75% refund for your Navigo card

🥗 Luncheon vouchers

💻 Hybrid and flexible working environment

🧘🏻‍♀️Weekly yoga classes at the office or on Zoom

🕺🏼Get involved with Lali-tivities! (after works, running club, pastry of the week, and more!)
