Data & Automation Engineer - Internship (4 months)

Job summary
Internship
Paris
No remote work
Salary: €1K a month
Skills & expertise
Technical writing
Attention to detail
Technical aptitude
Communication skills
Collaboration and teamwork

PIGMENT



Job description

Join Pigment: The AI Platform Redefining Business Planning

Pigment is the AI-powered business planning and performance management platform built for agility and scale. We connect people, data, and processes in one intuitive, feature-rich solution, empowering every team—from Finance to HR—to build, adapt, and align strategic plans in real time.

Founded in 2019, Pigment is one of the fastest-growing SaaS companies globally. Industry leaders like Unilever, Snowflake, Siemens, and DPD use Pigment daily to make more informed decisions and confidently navigate any scenario.

With a team of 500+ across Paris, London, New York, San Francisco, and Toronto, we've raised nearly $400M from top-tier investors and were named a Visionary in the 2024 Gartner® Magic Quadrant™ for Financial Planning Software.

At Pigment, we take smart risks, celebrate bold ideas, and challenge the status quo—all while working as one team. If you're driven by innovation and ready to make an impact at scale, we’d love to hear from you.

The Data & Automations Team

The Data & Automations team designs, implements, and maintains intelligent automation solutions that streamline business processes and enhance data-driven decision making. We empower business and corporate functions through technology integration, robust data architecture, and process optimization, enabling teams to focus on high-value activities while ensuring accuracy and scalability.

Our team operates across multiple domains, maintaining shared services and supporting Pigment’s internal functions through automation solutions. We work with a variety of technologies to build reliable data pipelines and automation workflows that support Pigment's growth and operational efficiency.

Role

This internship focuses on several key projects that improve code quality, documentation, and developer experience for the Data & Automations team's internal tools and practices.

The internship is organized in phases that progress from revisiting one of the key internal development libraries we rely on and integrating it more tightly with the rest of our modern data stack, through documentation and typing work, to more complex infrastructure and orchestration tasks.

This role involves collaborating with stakeholders across different teams at Pigment and requires strong communication skills in both French and English.

Tech environment

Our core stack relies on Google Cloud Platform (GCP) and Python, with components including BigQuery, Cloud Run, and Cloud Composer (Airflow), as well as dbt, Fivetran, and Hightouch.

You will have the opportunity to learn and grow your skills across these technologies while making a meaningful impact on Pigment's operational efficiency.

Internship Overview

The internship is structured in three progressive phases, moving from foundational improvements to more complex data infrastructure work:

Phase 1: Onboarding & Peacock Improvements

Focus: Transform Peacock into a core asset of our data engineering toolbox by improving typing, documentation, and refactoring, while enabling its direct usage within DAG components

Key projects:

Type Peacock Objects: Add Python type hints to improve IDE support and catch bugs early

Document Peacock Package: Create comprehensive documentation and usage examples

Enable DAG Integration: Refactor Peacock to support direct usage from Airflow DAG components, eliminating the need for wrapper services

Skills learned: Python type hints, library design, cross-service dependencies, technical writing, documentation best practices, Airflow integration patterns

Phase 2: Code Quality & Standards

Focus: Implement automated CI agents, enhance linting standards, and create unified frameworks

Key projects:

Add Automated CI Agents: Build CI agents for PII detection, anonymization checks, and other data-related validations

Review Linting: Enhance SQLFluff, Yamllint, and Ruff configurations across repositories

Shared Utils Documentation: Audit and document duplicated utility functions

Unified Automation Framework: Design standardized framework for automation services

Skills learned: SQL best practices, code linting, dbt conventions, code architecture, refactoring strategies, system design trade-offs, software architecture, design patterns, framework design, testing strategies
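As a flavor of what an automated PII-detection agent might look like, a minimal sketch (assumed design, not the team's actual implementation): scan file contents for email-like strings and fail the check when any are found.

```python
import re

# Minimal CI-style PII check: flag email addresses in text (e.g. a seed file
# or SQL fixture in a pull request). A real agent would cover more PII
# classes (phone numbers, IBANs, ...) and report file/line context.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")


def find_pii(text: str) -> list[str]:
    """Return all email-like strings found in `text`."""
    return EMAIL_RE.findall(text)


def check_file(contents: str) -> bool:
    """CI gate: True when the file is clean, False when PII is detected."""
    return not find_pii(contents)
```

A check like this slots naturally into the same CI stage as the SQLFluff, Yamllint, and Ruff runs mentioned above.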

Phase 3: Data Synchronization Projects

Focus: Implement data synchronization projects from the team backlog

Approach:

Start with easy single-direction syncs to build confidence

Progress to medium complexity bi-directional syncs

Follow established patterns: source system → BigQuery → destination system

Skills learned: API integration, data mapping, ETL patterns, Airflow DAG development
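The single-direction pattern above (source system → BigQuery → destination system) can be sketched in a few functions. This is a simplified stand-in using in-memory structures; a real pipeline would use API clients and the BigQuery SDK, orchestrated by an Airflow DAG.

```python
# Sketch of a single-direction sync: extract from the source, land rows in
# the warehouse, then upsert into the destination. All three systems are
# simulated with plain Python containers for illustration.

def extract(source: dict[str, dict]) -> list[dict]:
    """Pull records from the source system as flat rows."""
    return [{"id": key, **fields} for key, fields in source.items()]


def load_to_warehouse(rows: list[dict], warehouse: list[dict]) -> None:
    """Land raw rows in the warehouse (stand-in for a BigQuery load job)."""
    warehouse.extend(rows)


def sync_to_destination(warehouse: list[dict], dest: dict[str, dict]) -> int:
    """Upsert warehouse rows into the destination, keyed by id."""
    for row in warehouse:
        dest[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return len(warehouse)
```

Routing everything through the warehouse keeps a queryable audit trail of what was synced, which is one reason the BigQuery-in-the-middle pattern is standard for this kind of work.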

Qualifications

Academic Background

Currently enrolled in the 4th year of an engineering curriculum

Strong academic foundation in Software Engineering, Data Engineering, or related technical field

Available for a 4-month internship

Technical Skills

Solid programming skills in Python (experience with libraries for automation and data processing is a plus)

Good understanding of SQL and database concepts

Familiarity with version control systems (Git)

Basic knowledge of cloud platforms (GCP experience is a plus but not required)

Understanding of API integrations and RESTful services

Soft Skills

Strong problem-solving mindset and analytical thinking

Excellent attention to detail and an understanding of what clean, maintainable code looks like

Good communication skills in French and English (both written and verbal)

Eagerness to learn new technologies and best practices

Comfortable working in a collaborative, fast-paced environment

Nice to Have

Personal projects or open-source contributions demonstrating coding skills

Some exposure to modern data stack tools (dbt, Airflow, BigQuery)

Understanding of CI/CD concepts and practices

Experience with technical documentation and writing

We conduct background checks as part of our hiring process, in accordance with applicable laws and regulations in the countries where we operate. This may include verification of employment history, education, and, where legally permitted, criminal records. Any checks will be conducted lawfully prior to formal employment contracts being signed, with candidate consent, and information will be treated confidentially.

Pigment is an equal opportunity employer. We believe diversity is a strength and fosters innovation. We are committed to enabling everyone to feel included and valued at the workplace. All qualified applicants will receive consideration for employment without regard to age, color, family, gender identity, marital status, national origin, physical or mental disability, sex (including pregnancy), sexual orientation, social origin, or any other characteristic protected by applicable laws. We may process your personal data in accordance with our HR Data Protection Notice.
