Senior Data Engineer

Job summary
Salary: €65K to €75K
Hybrid (a few days of remote work)
Skills and knowledge
Time management
Collaboration and teamwork
Problem-solving skills




The position

Job description

Lenstra is currently helping multiple major companies (> $3B revenue) set up their self-service Data Platforms. Our scope of work encompasses developing the Data Platform product vision, taking into account each client's strategic goals and existing technological environment, and then implementing the Data Platform, including its deep integration with the client's IT, security, and legal requirements.

In this context, we are looking for passionate Data Engineers to join our teams (freelancers welcome) and help put in place self-service tools for data practitioners working on our platforms. The scope of work includes providing data ingestion tools, orchestration tools, and data transformation engines as a service.

You will have the opportunity to work on a state-of-the-art stack that includes tools such as GitHub Actions, dbt, Terraform, and Vault, as well as newer open-source players such as Dagster.

If you are a highly motivated individual with a passion for Data Engineering and are excited to build tools used by large teams, this is a great opportunity for you.


  • Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, Statistics, or a related field.

  • 5+ years of experience in data engineering.

  • Comfortable with Infrastructure as Code (IaC) tools like Terraform for managing cloud resources.

  • Experience deploying and managing data pipelines, data models, and other data applications in cloud environments such as AWS, Azure, or GCP.

  • Familiarity with security best practices and tools like Vault for secrets management, encryption, and key management.

  • Experience working with CI/CD pipelines and tools like Jenkins, CircleCI, or GitLab to automate building, testing, and deploying data applications.

  • Familiarity with monitoring tools like Prometheus and Grafana to track the health and performance of data pipelines and systems, and to set up alerts when issues occur.

  • Excellent analytical and problem-solving skills, with the ability to work on complex and ambiguous problems.

  • Strong communication and collaboration skills, with the ability to work in a team environment.

  • Ability to manage multiple priorities and work effectively.

  • Experience with dbt or Dataform.

  • Strong skills in Python.

  • Familiarity with data visualization tools such as Tableau, Power BI, or similar.

Selection process

  • 1 introductory call with the recruiter

  • 1 technical interview with our CTO

  • 1 behavioural interview with one of the partners

  • 1 final discussion with the client
