Job summary
Permanent contract
Salary: Not specified
A few days at home
Skills & expertise
Time management
Collaboration and teamwork
Problem-solving skills


Job description

Lenstra is currently helping several major companies (revenues above $3B) set up their self-service Data Platforms. Our scope of work encompasses developing the Data Platform product vision, taking into account each client’s strategic goals and existing technological environment, and then implementing the Data Platform, including its deep integration with the client’s IT, security, and legal requirements.

In this context, we are looking for passionate Data Engineers to join our teams (freelancers welcome) and help put self-service tools in place for the data practitioners working on our platforms. The scope of work includes providing, as a service, data ingestion tools, orchestration tools, and data transformation engines.

You will have the opportunity to work on a state-of-the-art stack that includes tools such as GitHub Actions, dbt, Terraform, and Vault, as well as newer open-source players such as Dagster.

If you are a highly motivated individual with a passion for data engineering and are excited to build tools used by large teams, this is a great opportunity for you.

Preferred experience

  • Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, Statistics, or a related field.

  • 5+ years of experience in data engineering, analytics engineering, or a related field.

  • Comfortable with Infrastructure as Code (IaC) tools like Terraform for managing cloud resources.

  • Experience deploying and managing data pipelines, data models, and other data applications in cloud environments such as AWS, Azure, or GCP.

  • Familiarity with security best practices and tools like Vault for secrets management, encryption, and key management.

  • Experience working with CI/CD pipelines and tools such as Jenkins, CircleCI, or GitLab CI to automate building, testing, and deploying data applications.

  • Familiarity with monitoring tools like Prometheus and Grafana to track the health and performance of data pipelines and systems, and to set up alerts when issues occur.

  • Excellent analytical and problem-solving skills, with the ability to work on complex and ambiguous problems.

  • Strong communication and collaboration skills, with the ability to work in a team environment.

  • Ability to manage multiple priorities and work effectively under changing demands.

  • Experience with dbt or Dataform.

  • Strong skills in Python.

  • Familiarity with data visualization tools such as Tableau, Power BI, or similar.

Recruitment process

  • 1 introductory call with the recruiter

  • 1 technical interview with our CTO

  • 1 behavioural interview with one of the partners

  • 1 final discussion with the client
