Your Mission: You will play a key role in driving the success of “AI & Data-Driven” initiatives for our clients. Within a project, you will be responsible for both implementing agentic solutions and establishing data architecture and data processing workflows. You will be expected to:
Contribute to the delivery of client projects, from scoping and iterative development through to deployment and operational support.
Understand client business challenges and needs, translating them into technical & data requirements.
Assess the data ecosystem of the application to be developed or enhanced, proposing optimal approaches (architecture, tech stack, data sources, off-the-shelf vs custom model, etc.), presenting options and their impacts to stakeholders to guide decision-making.
Collaborate with software engineers to define architecture and interfaces to embed AI & Data solutions into applications, APIs, or services.
Sketch prototypes and turn them into production-grade systems, ensuring data pipelines & AI systems are robust, scalable, monitored, and follow security best practices.
Identify all required tasks and prerequisites, organize and sequence your work, estimate workloads, and highlight the critical path.
Propose alternative approaches as needed to overcome challenges and ensure on-time, high-quality delivery.
Your Environment:
In client projects, you may work within a large cross-functional agile team (Architects, Data Scientists, DevOps Engineers, UX/UI Designers, Product Owners, Scrum Masters, Backend and Frontend Developers) or independently (with DevOps support mainly for deployment).
Internally, you will contribute to Fieldbox's shared expertise in “Data & AI-Driven” solutions: staying abreast of technological and scientific advancements, leveraging the open-source ecosystem, and mastering major cloud & SaaS platforms.
Personal Qualities:
Business-value oriented
Comfortable collaborating with multidisciplinary teams
Able to communicate complex topics effectively to diverse audiences
Skilled in solving complex problems using varied approaches
Fluent in English
Preferred Experience: 5+ years in Data & AI engineering, with a proven track record of deploying Data workflows & AI systems in production.
Mandatory:
Ability to drive data solution architecture
Proficiency in Python, with production-grade coding standards (tests, modularity, reproducibility, etc.)
Hands-on experience developing and industrializing agentic solutions (bonus for tool use, human-in-the-loop, and advanced workflows)
Experience with API design and development
Experience working with diverse data and storage types (tabular, images, time series, SQL, object storage, etc.)
Familiarity with at least one major data or cloud platform (GCP, AWS, Azure, Databricks, Snowflake, etc.)
Nice-to-Have:
Knowledge of edge AI, on-premise deployment, or hybrid cloud environments.
Experience with workload orchestration frameworks (e.g. Airflow, Argo, Metaflow)
Experience with agentic cloud platforms (e.g., Azure AI Foundry Agent Service, Vertex AI Agent Builder, Amazon Bedrock Agents)
Ability to containerize applications (Docker) and set up CI/CD pipelines autonomously
The typical process includes four steps:
An HR interview
A technical test
A technical interview with a Tech Lead
An interview with the management team to validate the application
In compliance with GDPR, all collected information is used strictly as part of our recruitment process. You may request the modification or deletion of your data at any time by contacting us directly.