Data Engineer - Permanent Contract
This position has been filled!
Who are they?
Join us! Innovation, adventure, an agile environment, a start-up context… A dynamic team spirit and a human-scale company aiming for excellence: that's what defines us.
ORIS is the first digital platform for sustainable infrastructure materials. Supported by high-level artificial intelligence, ORIS evaluates road designs from a global perspective to improve the sustainability of road construction. This solution can reduce the cost of road construction projects by a third and carbon dioxide emissions by half, while tripling the durability and service life of roads. Through intelligent design, ORIS helps managers, road authorities and investors to increase sustainability and reduce inefficiencies in road construction. ORIS also helps material suppliers to present and manage their product catalogue and identify road project opportunities.
website: www.oris-connect.com
Meet Laurent, Data Officer
Job description
We are looking for a Data Engineer. We offer a permanent contract based in Lyon, starting as soon as possible.
Main tasks:
Develop and maintain data integration pipelines from internal and external data sources to a data lake (see the pipeline sketch after this list)
Develop data flows out of the data lake so the data can be exploited (data marts, re-injection into ORIS, manual verification reporting, …)
Ensure automated verification and continuous improvement of data quality
Ensure data security, accessibility and integrity within the framework of ISO 27001 certification
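For illustration only, here is a minimal sketch of what one such ingestion step might look like in Python, using requests and boto3. The endpoint, bucket name and key layout are hypothetical placeholders, not ORIS's actual pipeline.

```python
import json
from datetime import datetime, timezone

import boto3
import requests

# Hypothetical names for illustration only: the real source endpoint,
# bucket and key layout depend on the ORIS platform.
SOURCE_URL = "https://example.com/api/road-materials"
DATA_LAKE_BUCKET = "example-data-lake"


def ingest_to_data_lake() -> str:
    """Pull one batch from an external API and land it as raw JSON in S3."""
    response = requests.get(SOURCE_URL, timeout=30)
    response.raise_for_status()
    records = response.json()

    # Partition the raw zone by ingestion date so downstream jobs
    # (data marts, quality checks, reporting) can process increments.
    ingestion_date = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    key = f"raw/road_materials/ingestion_date={ingestion_date}/batch.json"

    boto3.client("s3").put_object(
        Bucket=DATA_LAKE_BUCKET,
        Key=key,
        Body=json.dumps(records).encode("utf-8"),
    )
    return key


if __name__ == "__main__":
    print(f"Wrote s3://{DATA_LAKE_BUCKET}/{ingest_to_data_lake()}")
```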
You may also be required to work on related data analysis or machine learning topics:
Implementing machine learning models with AWS SageMaker, for example on geospatial data or for text recognition (see the training-job sketch after this list)
Developing AI solutions to assess the climate-change resilience of infrastructure
Creating interactive dashboards for internal and external users with AWS QuickSight
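As a rough sketch of the SageMaker side, launching a training job from the SageMaker Python SDK looks like the following. The training image URI, IAM role and S3 paths are placeholders and not ORIS's actual setup.

```python
import sagemaker
from sagemaker.estimator import Estimator

# Placeholder values for illustration: substitute a real training image,
# IAM role ARN and S3 locations before running.
session = sagemaker.Session()
estimator = Estimator(
    image_uri="<account>.dkr.ecr.<region>.amazonaws.com/geo-model:latest",
    role="arn:aws:iam::<account>:role/SageMakerExecutionRole",
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://example-data-lake/models/",
    sagemaker_session=session,
)

# Train on features previously prepared in the data lake.
estimator.fit({"train": "s3://example-data-lake/curated/geo_features/"})
```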
Preferred experience
Knowledge of Python for building ETL and machine learning solutions
Knowledge of the AWS cloud environment and services such as S3, AWS Lambda, Step Functions, SageMaker, … (see the Lambda sketch at the end of this list)
Good knowledge of version control with GitLab
Knowledge of SQL databases
Ability to extract data via APIs
Knowledge of Docker containers
Knowledge of a BI tool, ideally AWS QuickSight, or alternatives such as Tableau, Qlik Sense or Power BI
A good command of English is essential
Mastery of algorithms and the development of structured, modular and reusable code
Ability to develop solutions iteratively in an agile environment (Kanban, Scrum)
Good knowledge of data modelling and business intelligence concepts
Ability to clearly document the code and methodologies used
Desire to work in a start-up structure, focused on innovation and sustainable development
Enjoy teamwork and knowledge sharing
Strong technical curiosity and the capacity to learn and adapt quickly
Good communication skills and an entrepreneurial spirit
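To illustrate the S3 and Lambda skills mentioned above, here is a minimal, hypothetical sketch of a Lambda function that runs a basic quality check on a file newly landed in the data lake. The required fields and check logic are assumptions for the example, not the actual ORIS code.

```python
import json

import boto3

s3 = boto3.client("s3")

# Hypothetical schema for the example only.
REQUIRED_FIELDS = {"material_id", "supplier", "co2_kg_per_ton"}


def lambda_handler(event, context):
    """Triggered by an S3 'object created' event; counts records missing required fields."""
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    rows = json.loads(body)

    invalid = [row for row in rows if not REQUIRED_FIELDS.issubset(row)]
    return {
        "bucket": bucket,
        "key": key,
        "rows": len(rows),
        "invalid_rows": len(invalid),
    }
```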
Requirements:
Master's degree with an Information Systems background, ideally with a Data specialisation
Previous experience in data management and the development of data integration solutions in Python, ideally in the AWS environment
Recruitment process
A data test to complete
Several interviews: first with the HR Manager and the CDO, then with the COO and CEO.