We are looking for a Data Engineer to strengthen our data infrastructure by improving our existing data pipeline. You will work closely with Santiago, Head of Data, to design the foundations of our upcoming data projects.
Missions
• Build scalable pipelines (recommendation systems, optimized data storage, API connections, etc.)
• Build cost-efficient pipelines (data warehousing principles, computing and storage costs)
• Ensure code quality remains high on the data server and in new structures
Our current technical stack:
• Python 3
• TensorFlow
• PostgreSQL
• AWS
Profile
• You have at least 2 years of data engineering experience at another company.
• You are proficient in Python and SQL and use them daily.
• You have experience with distributed systems such as Hadoop or Spark.
• You have built connections to public APIs before.
• You have experience with task scheduling using Airflow.
Recruitment process
• A first 30-minute interview with HR or the Head of Data
• A take-home technical challenge
• An in-person meeting at our offices with the data team (2h), including logic exercises or pair programming and a fit discussion
• An in-person meeting at our offices with the CEO
• Final steps (reference checks) towards an offer