Job summary
Salary: Not specified
Frequent remote work
Skills & expertise
Time management
Collaboration and teamwork
Problem solving




The position

Job description

Lenstra is currently helping several major companies (> $3B revenue) set up their self-service Data Platforms. Our scope of work covers defining the Data Platform product vision, taking into account each client’s strategic goals and existing technological environment, and then implementing the Data Platform, including its deep integration with the client’s IT, security, and legal requirements.

In this context, we are looking for passionate Data Engineers to join our teams (freelancers welcome) and help put in place self-service tools for the data practitioners working on our platforms. The scope of work includes providing data ingestion tools, orchestration tools, and data transformation engines as a service.

You will have the opportunity to work on a state-of-the-art stack that includes tools such as GitHub Actions, dbt, Terraform, and Vault, as well as newer open-source players such as Dagster.

If you are a highly motivated individual with a passion for data engineering and are excited to build tools used by large teams, this is a great opportunity for you.

Desired profile

  • Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, Statistics, or a related field.

  • 5+ years of experience in data engineering, analytics engineering, or a related field.

  • Comfortable with Infrastructure as Code (IaC) tools like Terraform for managing cloud resources.

  • Experience deploying and managing data pipelines, data models, and other data applications in cloud environments such as AWS, Azure, or GCP.

  • Familiarity with security best practices and tools like Vault for secrets management, encryption, and key management.

  • Experience working with CI/CD pipelines and tools like Jenkins, CircleCI, or GitLab to automate building, testing, and deploying data applications.

  • Familiarity with monitoring tools such as Prometheus and Grafana to track the health and performance of data pipelines and systems, and to set up alerts when issues occur.

  • Excellent analytical and problem-solving skills, with the ability to work on complex and ambiguous problems.

  • Strong communication and collaboration skills, with the ability to work in a team environment.

  • Ability to manage multiple priorities and work effectively under deadlines.

  • Experience with dbt or Dataform.

  • Strong skills in Python.

  • Familiarity with data visualization tools such as Tableau, Power BI, or similar.

Interview process

  • 1 introductory call with the recruiter

  • 1 technical interview with our CTO

  • 1 behavioural interview with one of the partners

  • 1 final discussion with the client
