Sr. Data Platform Engineer

Job summary

  • Contract: Freelance
  • Location: Paris
  • Remote: A few days at home
  • Salary: Not specified
  • Skills & expertise: Composer, Vertex AI, Google Cloud Storage, GitHub Actions, Apache Airflow, +5 more

Lenstra

Job description

Lenstra was founded by passionate computer science engineers with a proven track record of delivering high-quality solutions. By combining technical excellence with a strong vision, we support top-tier clients across industries such as Banking & Insurance, Luxury, and Technology.

Our expertise is structured around four core pillars: Software Development, DevSecOps, Data & AI, and Product. Through a holistic understanding of our clients’ environments, we help them address their most complex challenges—from building robust software and secure cloud platforms to designing data-driven solutions that accelerate business impact.

As a Sr. Data Platform Engineer, you will:

  • Provide hands-on technical support to the data team, including troubleshooting, debugging, and performance optimisation.

  • Assist and mentor Data Analytics Engineers, helping them resolve technical issues and grow their skill sets.

  • Actively contribute to the upskilling of local data teams through coaching and best practices.

  • Identify, specify, and industrialise new features for the data platform, following an inner-source model.

  • Contribute to the evolution of shared data assets, pipelines, and tooling across the group (a minimal pipeline sketch follows this list).

  • Ensure scalability, reliability, and maintainability of data workflows.

  • Serve as a key bridge between the local team and the central Data Platform team.

  • Translate business and technical needs from the local team into actionable platform requirements.

  • Ensure alignment with group-wide standards, processes, and governance frameworks.

  • Monitor and ensure compliance with data processes and best practices.

  • Participate in defining and improving delivery, CI/CD, and operational processes.

  • Support the adoption of data mesh principles across domains.
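
For illustration, here is a minimal sketch of the kind of workflow this role would industrialise, assuming a Cloud Composer (Airflow 2.x) environment with dbt available on the workers; the DAG id, project path, and task names are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily pipeline: materialise dbt models, then run their tests.
# Assumes a dbt project is synced to the Composer environment's GCS mount.
with DAG(
    dag_id="daily_dbt_refresh",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    tags=["data-platform"],
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /home/airflow/gcs/dags/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /home/airflow/gcs/dags/dbt",
    )
    dbt_run >> dbt_test  # tests gate the freshly built models
```

Under an inner-source model, a DAG like this would live in a shared repository and be promoted across entities through the group's CI/CD process.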


Preferred experience

  • Strong expertise in Google Cloud Platform, with a focus on data and ML services: BigQuery, Vertex AI / Vertex Pipelines, Cloud Run, Google Cloud Storage (GCS).

  • Solid experience with GitHub Actions for CI/CD automation.

  • Good knowledge of Apache Airflow / Cloud Composer.

  • Strong proficiency with dbt (data modeling, testing, deployment).

  • Experience with Kubeflow and ML orchestration (illustrated in the sketch after this list).

  • Familiarity with data mesh architectures and federated data governance models.

  • Experience working in large, multi-entity organisations or luxury/retail environments.
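
As a rough illustration of the Vertex Pipelines / Kubeflow side of this stack, here is a minimal Kubeflow Pipelines (KFP v2) sketch; the component bodies, table name, and URIs are hypothetical placeholders rather than a real training flow:

```python
from kfp import compiler, dsl


@dsl.component(base_image="python:3.11")
def extract_features(source_table: str) -> str:
    """Hypothetical step: derive features from a BigQuery table."""
    # A real component would query BigQuery and write features to GCS.
    return f"gs://example-bucket/features/{source_table}"  # placeholder


@dsl.component(base_image="python:3.11")
def train_model(features_uri: str) -> str:
    """Hypothetical step: train a model from the prepared features."""
    return features_uri.replace("features", "models")  # placeholder


@dsl.pipeline(name="demo-training-pipeline")
def training_pipeline(source_table: str = "analytics.events"):
    features = extract_features(source_table=source_table)
    train_model(features_uri=features.output)


if __name__ == "__main__":
    # Compile to a spec that Vertex AI Pipelines can run.
    compiler.Compiler().compile(training_pipeline, "training_pipeline.json")
```

The compiled spec could then be submitted to Vertex AI Pipelines, for example with a PipelineJob from the google-cloud-aiplatform client.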


Recruitment process

  • 30-minute recruiter screen

  • 1-hour role and cultural-fit interview

  • 1-hour Dive Deep interview
