Senior Data Engineer (full remote or hybrid)


  • Artificial Intelligence / Machine Learning, Logistics
  • 50 to 250 employees

The position


Who are they?

Founded in 2014, Shippeo is a global leader in real-time multimodal transportation visibility, helping major shippers and logistics service providers operate more collaborative, automated, sustainable, profitable, and customer-centric supply chains.

Hundreds of customers, including global brands such as Coca-Cola HBC, Carrefour, Renault Group, Schneider Electric, Total, Faurecia, Saint-Gobain and Eckes Granini, trust Shippeo to track more than 28 million shipments per year across 92 countries.

Having already raised €110 million in funding, Shippeo grows on average by 80% year on year. Their team of Shippians comprises 28 different nationalities, speaking a total of 24 languages.


Job description

We are looking for a Data Engineer to join our Data Intelligence Tribe.

The Data Intelligence Tribe is responsible for leveraging Shippeo’s data from our large shipper and carrier base to build data products that help our users (shippers and carriers alike) and ML models that provide predictive insights. The tribe typically enables users to:

  • be accurately alerted, in advance, to potential delays or anomalies on their multimodal flows, so they can proactively anticipate any resulting disruptions
  • extract the data they need, access it directly, or analyze it on the platform to gain actionable insights that improve their operational performance and the quality and compliance of their tracking

The tribe also delivers best-in-class data quality by implementing advanced cleansing and enhancement rules.

As a Data Engineer at Shippeo, your objective is to ensure that data is available to, and readily usable by, our Data Scientists and Analysts across our data platforms.
You will help build and maintain Shippeo’s modern data stack, which is composed of several technology blocks:

  • Data Acquisition (Kafka, KafkaConnect, RabbitMQ),
  • Batch data transformation (Airflow, DBT),
  • Cloud Data Warehousing (Snowflake, BigQuery),
  • Stream/event data processing (Python, Docker, Kubernetes), and all the underlying infrastructure that supports these use cases.
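To give a flavor of the stream/event-processing block above, here is a minimal sketch of an asyncio producer/consumer pipeline in Python. The event shape, the `delay_minutes` field, and the enrichment step are illustrative assumptions, not Shippeo's actual code.

```python
import asyncio

async def producer(queue: asyncio.Queue, events: list[dict]) -> None:
    """Push raw tracking events onto the queue, then a sentinel."""
    for event in events:
        await queue.put(event)
    await queue.put(None)  # sentinel: no more events

async def consumer(queue: asyncio.Queue, sink: list[dict]) -> None:
    """Enrich each event and append it to the sink."""
    while True:
        event = await queue.get()
        if event is None:
            break
        # Illustrative enrichment: flag shipments reported late.
        event["delayed"] = event["delay_minutes"] > 0
        sink.append(event)

async def run_pipeline(events: list[dict]) -> list[dict]:
    """Run producer and consumer concurrently over a bounded queue."""
    queue: asyncio.Queue = asyncio.Queue(maxsize=100)
    sink: list[dict] = []
    await asyncio.gather(producer(queue, events), consumer(queue, sink))
    return sink

if __name__ == "__main__":
    raw = [
        {"shipment_id": "A1", "delay_minutes": 35},
        {"shipment_id": "B2", "delay_minutes": 0},
    ]
    print(asyncio.run(run_pipeline(raw)))
```

In production this pattern would typically sit behind a Kafka consumer and run in a Docker container on Kubernetes, as the stack list suggests; the bounded queue provides simple backpressure between stages.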

Preferred experience


  • You have a degree (MSc or equivalent) in Computer Science.
  • 5+ years of experience as a Data Engineer.
  • Experience building, maintaining, testing, and optimizing data pipelines and architectures.
  • Programming skills in Python and experience with asynchronous event processing (asyncio).
  • Advanced working knowledge of SQL, experience working with relational databases and familiarity with a variety of databases.
  • Working knowledge of message queuing and stream processing.
  • Knowledge of Docker and Kubernetes.
  • Knowledge of a cloud platform (preferably GCP).
  • Experience working with workflow management systems such as Airflow.


Nice to have:

  • Experience with cloud-based data warehouse solutions (BigQuery, Snowflake).
  • Experience with Kafka and KafkaConnect (Debezium).
  • Experience with Infrastructure as code (Terraform/Terragrunt).
  • Experience building and evolving CI/CD pipelines with Github Actions.
  • Experience with monitoring and alerting using Grafana / Prometheus.
  • Experience with Apache NiFi.
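As a small illustration of the "advanced working knowledge of SQL" called for above, here is a sketch using Python's built-in sqlite3 module with a window function to pick each shipment's latest status. The table and column names are hypothetical, chosen only for this example.

```python
import sqlite3

# In-memory database with a toy shipment-events table (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE shipment_events (
        shipment_id TEXT,
        status      TEXT,
        recorded_at TEXT
    );
    INSERT INTO shipment_events VALUES
        ('A1', 'picked_up', '2024-01-01T08:00:00'),
        ('A1', 'delivered', '2024-01-02T17:30:00'),
        ('B2', 'picked_up', '2024-01-01T09:15:00');
""")

# ROW_NUMBER() over a per-shipment window, ordered newest-first,
# lets us keep exactly one row (the latest) per shipment.
latest = conn.execute("""
    SELECT shipment_id, status
    FROM (
        SELECT shipment_id, status,
               ROW_NUMBER() OVER (
                   PARTITION BY shipment_id
                   ORDER BY recorded_at DESC
               ) AS rn
        FROM shipment_events
    )
    WHERE rn = 1
    ORDER BY shipment_id
""").fetchall()

print(latest)  # → [('A1', 'delivered'), ('B2', 'picked_up')]
```

The same deduplication pattern carries over directly to BigQuery or Snowflake, where window functions use identical syntax.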
