This position is no longer available.

Data Dev Engineer

Full-time
Paris
Salary: Not specified
Occasional remote work

L'olivier Assurance


Position

Job description

You will build the data processing pipelines for data preparation and transformation, covering both raw data ingestion and the curated data layer. You will also be responsible for implementing the data models that contain the KPIs and key analyses defined by the business.

Key Objectives

  • Work with the Data Architect to ensure development is aligned with the target architecture.
  • Develop the data ingestion processes to stream data into the data lake.
  • Implement the mechanism that generates the curated data layer with the different datasets available to the business.
  • Create the data models that cover the KPIs defined by the business.

Main Responsibilities

  • Focus on the development of the target architecture, following the design set up by the Data Architect.
  • Develop the data ingestion pipelines to stream data into the data platform.
  • Participate in the creation and structuring of the curated layer.
  • Contribute to the DWH implementation and support KPI generation.
  • Work closely with the Data Architect to implement the target architecture.
  • Participate in the ART events with the team.
  • Communicate with the Scrum Master to escalate impediments or improvements for the wagon.
  • Work closely with the Product Owner and Scrum Master to execute the assigned tasks and achieve the goals of the wagon.
  • Maintain an agile mindset every day.
  • Participate in team and train ceremonies: PI planning, sprint planning, retrospectives, and daily scrum (stand-up) meetings.

Position requirements

You are enthusiastic about working in a European team to develop the technical capabilities of a common European Data Platform.

You have an agile mindset and are open to new ideas and creative ways of working.

You are organized and get things done!

Must have:
Spark, PySpark, Python, SQL
Data warehouse modeling
BI tools: MicroStrategy, QuickSight
AWS Cloud Services (EC2, S3, VPC, RDS, IAM, SQS, SNS, ECS, ECR, Fargate, API Gateway, CloudWatch, CloudFront, Redshift, Kinesis, Glue, Athena, QuickSight, GuardDuty, Inspector, Transcribe, Polly, Lex…)

Nice to have:
Docker, Git, Gitflow, Jupyter Notebook, pip, R, Java
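To give a flavor of the raw-to-curated-to-KPI flow this role describes, here is a minimal sketch in plain Python. It is an illustration only: a real pipeline would use PySpark on the AWS services listed above, and all field names (`policy_id`, `product`, `premium`) are hypothetical.

```python
# Hypothetical sketch of the raw -> curated -> KPI flow; a production
# pipeline would use PySpark and AWS Glue/Redshift instead of plain Python.
from collections import defaultdict

def curate(raw_rows):
    """Curated-layer step: drop records missing required fields
    and normalize types and casing."""
    curated = []
    for row in raw_rows:
        if row.get("policy_id") and row.get("premium") is not None:
            curated.append({
                "policy_id": str(row["policy_id"]),
                "product": str(row.get("product", "unknown")).lower(),
                "premium": float(row["premium"]),
            })
    return curated

def kpi_premium_by_product(curated_rows):
    """Data-model step: aggregate a business KPI
    (total premium per product)."""
    totals = defaultdict(float)
    for row in curated_rows:
        totals[row["product"]] += row["premium"]
    return dict(totals)

raw = [
    {"policy_id": 1, "product": "Auto", "premium": 320.0},
    {"policy_id": 2, "product": "auto", "premium": 410.0},
    {"policy_id": None, "product": "home", "premium": 99.0},  # dropped
]
print(kpi_premium_by_product(curate(raw)))  # {'auto': 730.0}
```

The same shape maps directly onto Spark: `curate` becomes DataFrame filters and casts, and the KPI step becomes a `groupBy` with an aggregation written to the warehouse.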
