Analytics Engineer – AI Enablement [Freelance]

Freelance
Paris
Frequent remote work
Salary: not specified

Equativ


Job description

About the team

At Equativ, we’re on a mission to develop advertising technologies that empower our customers to reach their digital business goals. This means we rely on massively scalable, widely distributed, highly available, and efficient software systems: the platform handles over 200 billion auctions per day and sends 20 TB per day to the data stack.

Our analytics team is composed of 8 skilled data analysts, mainly based in Paris and working with all teams in the company — Sales, Product, Finance, Account Management… — across 15+ locations worldwide.

Our Mission

Our Analytics team is at the heart of Equativ’s data-centric business, bridging the gap between raw data and business-critical insights. We are currently modernizing our data stack to make our data easier to manage, access, query, and use for AI applications.

To accelerate this transition, our current objective is to finish cleaning up our legacy infrastructure, implement dbt at scale, and build out our new Snowflake Intelligence features so our business teams can get what they need faster and more reliably.

What you will do

As a Freelance Analytics Engineer, you will play a pivotal role in finalizing our data stack migration and setting up our Snowflake Intelligence environment. You will take ownership of technical tasks focused on clean architecture and code standardization. Your mission will be split across three main areas:

Snowflake Migration:

  • Migrate existing Snowflake tables from legacy databases to our new medallion-like data warehouse structure
  • Migrate remaining legacy pipelines (mostly in Talend) to generic or custom Python jobs, then deprecate them
  • Identify and mark tables for deprecation to finalize the retirement of our old warehouse structure

dbt Implementation:

  • Lead the operational shift to dbt by migrating legacy aggregation jobs, Tableau views, and dynamic tables into our dbt project
  • Optimize our codebase by developing and implementing dbt macros to harmonize data transformations and reduce duplication.
AI & Snowflake Intelligence:

  • Define user stories and build out new data sources for our Snowflake Intelligence agent
  • Enrich our semantic layer and existing models to handle more complex analyses and increase user adoption
  • Accelerate the documentation of all tables and columns to provide context to all AI tools and users connected to Snowflake
About you

  • Available for 3 months, at least 3 days per week
  • Master’s degree in Computer Science or a similar technical field
  • 3+ years as an Analytics Engineer, BI Engineer, or Data Engineer, with proven autonomy in freelance missions
  • Passion for working with large datasets and a commitment to data-driven decision-making
  • SQL mastery is a must
  • Knowledge of Python for custom sourcing jobs
  • Hands-on experience with Snowflake and dbt (specifically writing macros and working with semantic models)
  • Good understanding of software development processes (Git, CI/CD); experience setting up analytics AI agents is a plus
  • Organized problem-solver, comfortable managing your own backlog and communicating technical changes
  • Working proficiency and strong communication skills in verbal and written English
