A strong advantage for this role is prior experience in sectors such as banking, finance, or energy—particularly in environments where automation is applied to complex, domain-specific workflows.
The team’s core responsibility involves transforming business-developed code (commonly written in VBA or Python) into robust, automated pipelines orchestrated through Apache Airflow. A key focus is on observability and reliability: business stakeholders must be able to monitor execution, identify failures, and review outputs of automated processes with ease.
The hiring manager places significant emphasis on deep technical expertise in Python and software architecture. Candidates are expected to confidently engage in technical discussions with experienced engineers, propose well-reasoned solutions, and select appropriate tools for specific challenges. Topics such as Python performance optimization are frequently explored, including:
Concurrency and parallelism, with particular attention to Python’s constraints (e.g., the GIL) compared to other languages
Asynchronous programming patterns and execution models
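To make the discussion concrete, here is a minimal sketch (standard library only; the plant names and delays are invented for the example) of why async I/O matters despite the GIL: the GIL serializes CPU-bound threads, but I/O-bound coroutines overlap their waiting time.

```python
import asyncio
import time

async def fetch_status(plant: str, delay: float) -> str:
    """Simulate an I/O-bound API call, e.g. polling a plant's status."""
    await asyncio.sleep(delay)  # stands in for network latency
    return f"{plant}: online"

async def main() -> list[str]:
    # Three simulated calls of 0.05 s each run concurrently,
    # so the whole batch completes in roughly 0.05 s, not 0.15 s.
    return await asyncio.gather(
        fetch_status("Alpha", 0.05),
        fetch_status("Beta", 0.05),
        fetch_status("Gamma", 0.05),
    )

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
```

For CPU-bound work the same pattern would not help; that distinction (threads vs. processes vs. coroutines) is exactly the kind of trade-off explored in interviews.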
The team operates within a microservices architecture, relying heavily on APIs to retrieve and process data—ranging from power plant operational status to production planning and market bidding data. Relevant experience includes:
Developing RESTful applications in Python using frameworks such as Flask or FastAPI
Building API clients and services using libraries like requests, aiohttp, or httpx
Implementing authentication mechanisms such as OAuth2
Familiarity with Domain-Driven Design principles (e.g., using Pydantic for data validation and modeling)
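The validate-at-the-boundary idea behind the Pydantic mention can be sketched with stdlib dataclasses alone (Pydantic expresses the same constraints declaratively, with parsing and richer error reporting built in); the `BidOrder` model and its fields are invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BidOrder:
    """Domain model for a market bid: invalid data is rejected on construction."""
    plant: str
    volume_mw: float
    price_eur: float

    def __post_init__(self) -> None:
        # Enforce domain invariants at the edge, so the rest of the
        # service can trust any BidOrder instance it receives.
        if not self.plant:
            raise ValueError("plant must be non-empty")
        if self.volume_mw <= 0:
            raise ValueError("volume_mw must be positive")

order = BidOrder(plant="Alpha", volume_mw=50.0, price_eur=42.5)

try:
    BidOrder(plant="Alpha", volume_mw=-1.0, price_eur=42.5)
    rejected = False
except ValueError:
    rejected = True
```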
In addition to writing code, engineers are responsible for deployment and operational reliability. Key competencies include:
Experience with CI/CD pipelines and tools such as Git, Kubernetes, Azure Pipelines, GitLab CI, or Jenkins
Using Docker and docker-compose for local development and end-to-end testing
Implementing regression testing to ensure stability and maintain backward compatibility
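A regression test in this setting typically pins down a previously validated output so a refactor cannot silently change it. A minimal stdlib sketch (the `settlement_total` function and its figures are invented):

```python
import unittest

def settlement_total(volumes_mw: list[float], price_eur: float) -> float:
    """Toy business calculation whose behaviour we want to freeze."""
    return round(sum(volumes_mw) * price_eur, 2)

class SettlementRegressionTest(unittest.TestCase):
    def test_known_output_is_preserved(self):
        # Golden value captured from a validated run; a failure here
        # signals a backward-compatibility break, not just a bug.
        self.assertEqual(settlement_total([10.0, 15.5], 40.0), 1020.0)

suite = unittest.TestLoader().loadTestsFromTestCase(SettlementRegressionTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```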
The role is primarily focused on backend development and system architecture. This includes working with a variety of data storage solutions, both SQL and NoSQL (e.g., PostgreSQL, Redis), as well as experience with ORMs—particularly SQLAlchemy.
Finally, strong debugging, monitoring, and observability practices are essential. The team views software not just as code that runs, but as systems that must be diagnosable when failures occur. Experience with logging and monitoring tools—such as Splunk or Kubernetes logs—and workflow orchestration platforms like Apache Airflow is highly valued.
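The diagnosability point can be made concrete with stdlib logging: attach enough context to every record that a failed run can be traced after the fact in Splunk or `kubectl logs` (the task name and payload shape here are invented).

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("pipeline")

def run_task(name: str, payload: dict) -> bool:
    """Run one pipeline step, logging enough context to diagnose failures."""
    try:
        if "plant" not in payload:
            raise KeyError("plant")
        logger.info("task=%s status=ok plant=%s", name, payload["plant"])
        return True
    except KeyError:
        # logger.exception records the traceback alongside the message,
        # which is what makes the failure diagnosable later.
        logger.exception("task=%s status=failed payload=%r", name, payload)
        return False

ok = run_task("fetch_status", {"plant": "Alpha"})
failed = run_task("fetch_status", {})
```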
Profile: a back-end Python developer with a DevOps mindset.
At least 3 years of proven experience.
Extensive knowledge of Python (e.g., performance optimization).
Airflow experience is strongly preferred, nearly a must-have.
DevOps mindset: familiarity with CI/CD pipelines and containerization.
Fluent in English (minimum).
Experience in software architecture: able to challenge and propose design solutions.
First interview: HR screening.
Second interview: technical.
Third interview: onsite with the business manager and the team.