HAVAS MARKET - PERMANENT CONTRACT - DATA ENGINEER (DATA SCIENCE)

Permanent contract
Leeds
Salary: Not specified
Havas France

The position

Job description

Data Engineer (Data Science) 

 

Reporting To: Head of Data Science 

Hiring Manager: Head of Data Science 

Office Location: BlokHaus West Park, Ring Rd, Leeds LS16 6QG 

 

About Us – Havas Media Network 

Havas Media Network (HMN) employs over 900 people in the UK & Ireland. We are passionate about helping our clients create more Meaningful Brands through the creation and delivery of more valuable experiences. Our Havas mission: to make a meaningful difference to the brands, the businesses and the lives of the people we work with. 

HMN UK spans London, Leeds, Manchester & Edinburgh, servicing our clients brilliantly through our agencies including Ledger Bennett, Havas Market, Havas Media, Arena Media, DMPG and Havas Play Network.  

This role will be part of Havas Market, our performance-focused digital marketing agency.  

Our values shape the way we work and define what we expect from our people: 

  • Human at Heart: You will respect, empower, and support others, fostering an inclusive workplace and creating meaningful experiences. 

  • Head for Rigour: You will take pride in delivering high-quality, outcome-focused work and continually strive for improvement. 

  • Mind for Flair: You will embrace diversity and bold thinking to innovate and craft brilliant, unique solutions. 

These behaviours are integral to our culture and essential for delivering impactful work for our clients and colleagues. 

The Role 

In this position, you'll play a vital role in delivering a wide variety of projects for our clients and internal teams. You’ll be responsible for creating solutions to a range of problems – from bringing data together from multiple sources into centralised datasets, to building predictive models to drive optimisation of our clients’ digital marketing. 

We are a small, highly collaborative team, and we value cloud-agnostic technical fundamentals and self-sufficiency above specific platform expertise. The following requirements reflect the skills needed to contribute immediately and integrate smoothly with our existing workflow. 

Key Responsibilities 

  • Build and maintain data pipelines to integrate marketing platform APIs (Google Ads, Meta, TikTok, etc.) with cloud data warehouses, including custom API development where platform connectors are unavailable 

  • Develop and optimize SQL queries and data transformations in BigQuery and AWS to aggregate campaign performance data, customer behavior metrics, and attribution models for reporting and analysis 

  • Design and implement data models that combine first-party customer data with marketing performance data to enable cross-channel analysis and audience segmentation 

  • Deploy containerized data solutions using Docker and Cloud Run, ensuring pipelines run reliably at scale with appropriate error handling and monitoring 

  • Implement statistical techniques such as time series forecasting, propensity modeling, or multi-touch attribution to build predictive models for client campaign optimization 

  • Develop, test, and deploy machine learning models into production environments with MLOps best practices including versioning, monitoring, and automated retraining workflows 

  • Translate client briefs and business stakeholder requirements into detailed technical specifications, delivery plans, and accurate time estimates 

  • Configure and maintain CI/CD pipelines in Azure DevOps to automate testing, deployment, and infrastructure provisioning for data and ML projects 

  • Create clear technical documentation including architecture diagrams, data dictionaries, and implementation guides to enable team knowledge sharing and project handovers 

  • Participate actively in code reviews, providing constructive feedback on SQL queries, Python code, and infrastructure configurations to maintain team code quality standards 

  • Provide technical consultation to clients on topics such as data architecture design, measurement strategy, and the feasibility of proposed ML applications 

  • Support Analytics and Business Intelligence teams by creating reusable data assets, troubleshooting data quality issues, and building datasets that enable self-service reporting 

  • Train and mentor junior team members through pair programming, code review feedback, and guided project work on data engineering and ML workflows 

  • Implement workflow orchestration using tools like Kubeflow to coordinate complex multi-step data pipelines with appropriate dependency management and retry logic 

  • Stay current with developments in cloud data platforms, digital marketing measurement, and ML techniques relevant to performance marketing optimization 

  • Identify and implement improvements to team infrastructure, development workflows, and data quality processes 

 

Core Skills and Experience We Are Looking For: 

  • Expert-level proficiency in Python for building robust APIs, scripting, and maintaining complex data/ML codebases. 

  • Strong SQL expertise and deep familiarity with data warehousing concepts relevant to tools like BigQuery. 

  • Practical experience with Docker and a firm grasp of Linux, enabling you to manage local devcontainers, servers, and Cloud Run deployments. 

  • Advanced Git proficiency and active experience participating in PR reviews to maintain code quality. 

  • Solid understanding of CI/CD principles and practical experience defining or managing pipelines, preferably using a tool like Azure DevOps. 

  • Proven ability to quickly read, understand, and apply technical documentation to translate broad business requirements into precise technical specifications. 

  • Excellent written and verbal communication skills for proactive knowledge sharing, constructive PR feedback, participating in daily standups, and documenting processes. 

Beneficial skills and experience to have:  

  • Hands-on experience with any major cloud ML platform, focusing on MLOps workflow patterns. 

  • Practical experience with stream or batch processing frameworks such as Apache Beam, or managed services like GCP Dataflow. 

  • Familiarity with Python ML frameworks or data modeling tools like Dataform/dbt. 

  • Familiarity with the structure and core offerings of GCP or AWS. 

 

Contract Type: Permanent 

Here at Havas, across the group, we pride ourselves on our commitment to offering equal opportunities to all potential employees and have zero tolerance for discrimination. We are an equal opportunity employer and welcome applicants irrespective of age, sex, race, ethnicity, disability, and other factors that have no bearing on an individual's ability to perform their job. 

 

 

#LI-PH1
