Our team context:
Dailymotion is seeking an Analytics Engineer to join the Data Engineering team. Our craft is responsible for most of the data products at Dailymotion; your work will have an impact across Dailymotion's business and help drive data-driven decisions on products and strategy.
You'll join the Data Engineering craft at Dailymotion to create and maintain data products. The Analytics Engineering team focuses on providing reliable data for analysis company-wide. This includes building and managing our multi-petabyte data warehouse, highly scalable client-facing analytics, data ingestion and distribution, and data synchronization systems. As an Analytics Engineer, you'll blend software engineering and data skills to maintain code and model data effectively. If you're eager to solve challenging business problems, this role offers broad impact across all of Dailymotion's businesses.
Responsibilities:
Our stack runs almost exclusively on Google Cloud Platform. You will work in an environment made up of data lakes (BigQuery, etc.), data streaming platforms (Beam / Dataflow, Flink, etc.), orchestration and scheduling platforms (Airflow), container-oriented deployment and management platforms (Docker, K8s, Jenkins X), SQL, and data quality tools (dbt, Sifflet). You will also participate in data modeling and in the design of data flows, from initial design through implementation and support in production.
- Ingest extensive volumes of raw data from both internal and external sources using batch and streaming methods.
- Expose the data through various means such as APIs, datamarts, and flat files for both internal and external users.
- Build complex and efficient SQL queries to transform data within our data lake into reliable business entities and reporting aggregates. Identify and manage dependencies for these transformations, scheduling them using tools like Airflow.
- Investigate discrepancies and quality issues in the data, and address performance issues.
- Design optimized and cost-efficient data models in BigQuery while addressing business use cases.
- Design Druid (preferred) datasets tailored for external consumers, prioritizing speed, consistency, cost-effectiveness, and efficiency.
- Ensure data cleanliness, consistency, and availability by performing data quality checks and implementing monitoring.
- Catalog and document various aspects of the data, including business entities, datamarts, dimensions, metrics, and business rules.
- Serve as a subject matter expert on business entities and datamarts, providing training to users on SQL and analytics best practices (in collaboration with Business Insight).
- Innovate by proposing new tools, processes, documentation, and exploring emerging technologies during designated cool-down periods.
Additional information
What we offer you:
• Additional opportunities as we grow and learn together.
• Join our open, collaborative culture.
• Exciting, dynamic projects to work on.
• Flexibility.
For the France offices
- 🏡 Hybrid Work Framework (4 types of remote work: Full office / Flex office (1-2 days remote) / Flex remote (1-2 days at the office) / Full remote, plus the ability to work 3 months abroad)
- 💰 Vivendi Savings Plan
- 🍼 Extended paternity or co-parental leave
- 🕶️ Living Employee Culture (events / trainings / parties / all-hands / Dailymotion traditions…)
- 🚀 Career development support (training / internal mobility / compensation cycle / 360° quarterly feedback review…)
- 🏥 High-end Health Insurance and Personal Services Vouchers (CESU)
- ⛱️ Paid time off – RTT and time savings plan (CET)
- ✅ Meal vouchers – public transport and bike expense reimbursement
- 🎡 Social and Economic Committee (CSE) benefits (sports memberships / cinema vouchers / gift vouchers / discounts)