- IT / Digital, Big Data
- Between 250 and 2,000 employees
Lead Big Data Engineer
- Education: Not specified
- Experience: > 2 years
Who are they?
AB Tasty is a customer experience optimization and feature management company. We help brands build better user experiences and unlock new possibilities, faster. Thanks to our two platforms (AB Tasty and Flagship by AB Tasty), our ambition is to reinvent the way Marketing, Product, and Tech teams develop products (websites/apps) by easily shipping new features and messages.
We have:
- 1000 customers, including Le Bon Coin, Cdiscount, Carrefour …
- 250+ employees in 7 countries on 3 continents (Americas, Europe, Asia)
- Raised $64 million to grow globally
AB Tasty (www.abtasty.com) is the customer experience optimization company several times rated as one of the Best Places to Work. We help build the internet of the future by allowing brands to address their users in a personalized way. Images, messages, page structure... everything can be adapted to meet the needs, wishes, and emotions of website visitors or app users.
We are looking for a Lead Big Data Engineer to join our data team. The team is composed of 7 members who work closely with both the DevOps and Data Science teams, and is mainly in charge of developing and monitoring the data collection pipeline. This pipeline, which processes a few terabytes of data per day, is deployed on a Google Cloud Platform environment and is critical to AB Tasty and Flagship. Several GCP environments are available to support the development and deployment of features, as well as data modeling and documentation.
You will report to the Data Engineering Team Leader, based in Paris.
📍Contract & Location
👀 A few examples of your responsibilities
Onboard, mentor, and upskill the Data Engineering team
Be responsible for the technical quality of deliverables and their compliance with the technical framework
Ensure the Data Engineering team uses appropriate engineering practices
Ensure the quality of collected data, both from a technical point of view (data consistency) and from a business-analysis point of view for the client (time series analysis)
Develop streaming pipelines in the GCP environment (Dataflow, Pub/Sub, Bigtable, BigQuery)
Design new APIs and architecture optimizations to improve performance
Test and learn new technologies
Invest in continuous improvement of the platform to meet changing needs and environments
Be actively involved in the Data Engineering community
🎁 What we offer
🕵️ What we look for