
Data/Machine Learning Ops Engineer

Job summary
Salary: €50K to €70K
Frequent remote work
Experience: > 3 years
Skills & expertise
Data privacy
Database management
Problem-solving skills




About the position

We are looking for an experienced Data/ML Ops Engineer to join our Data team. This role is pivotal in maintaining the scalability, performance, and reliability of our data platform, which processes billions of data points daily. You will also lead the improvement of our ML pipelines. The ideal candidate has a strong background in data engineering and DevOps, and a passion for finding simple solutions to complex problems in a fast-paced, dynamic environment.

Key Responsibilities:

  • Improve our data infrastructure, focusing on automation, scalability, and security.

  • Build and maintain scalable and reliable data processing pipelines to support real-time data analytics and machine learning models.

  • Implement robust monitoring, alerting, and incident response systems to ensure the availability and performance of our data platform.

  • Improve our ML pipelines infrastructure.

  • Collaborate with cross-functional teams to implement solutions that enhance the efficiency and reliability of our data-intensive applications.

  • Participate in on-call rotations, providing expert-level troubleshooting and resolution for production issues.

Who would you work with?

You will be part of the data team, which is composed of eight people: data analysts, data scientists, ML engineers, and an analytics data engineer. You will report to Morgane Daniel, our AI manager, and will work closely with Kim Nguyen, our analytics engineer.


About you

  • You have significant experience in the data world

  • You have an interest in Cloud computing technologies

  • You’re familiar with Terraform, Docker and Kubernetes

  • You have strong experience coding in Python

  • You have a good knowledge of data transformation tools and cloud platforms

  • You like to innovate and make data architectures evolve towards greater scalability


  • 3+ years of experience in MLOps / DataOps / DevOps / data engineering

  • Proficiency in distributed, high-volume data processing and storage technologies such as Snowflake, Glue, etc.

  • Strong background in cloud platforms (AWS, GCP or Azure), and infrastructure as code (Terraform)

  • Knowledge of Python, development best practices (versioning, testing, …), and core tools (git, terminal, …)

  • Experience in SQL database administration

  • Excellent problem-solving, communication, and collaboration skills

  • Outstanding verbal and written communication skills; professional English

  • Experience with CI/CD and ETL tools

Nice to have

  • Containerization (Docker, Kubernetes)

  • Data modeling (DBT)

  • MLOps practices (DVC)

  • Understanding of data security, privacy and compliance


Hiring process

  • Screening call: 30 min

  • Technical test: 1h30 max

  • Use case test: 1h30 max

  • Motivation/culture fit interview: 30 min

  • Reverse calls

  • Reference calls


⭐ Outstanding mission-driven and fun work environment :)

🌴 25 days of PTO + 10 days of RTT + additional time off as needed

🏥 Health insurance with Swisslife (basic package paid 100% by Lalilo)

💙 Take care of your mental health and get free counseling with Lifeworks

🧘🏻‍♀️ Meditate with a free subscription to Calm

🚉 75% refund for your Navigo card

🥗 Luncheon vouchers

💻 Hybrid and flexible working environment

🧘🏻‍♀️ Weekly yoga classes at the office or on Zoom

🕺🏼 Get involved with Lali-tivities! (afterworks, running club, pastry of the week, and more!)
