42Next is our new initiative aimed at expanding our innovative educational model to meet evolving industry needs. This offering builds upon our core curriculum while integrating new technologies and methodologies to prepare students for the rapidly changing digital landscape.
As an Analytics Engineering Intern at 42, you will:
Refine our data modeling approach across the organization, improving how we structure and organize information
Integrate multiple data sources from both legacy systems and new 42Next products into a cohesive data architecture
Design and build data pipelines using SQL and other relevant tools
Create and improve dashboards in Superset to enhance data visualization and accessibility for various stakeholders
Write technical documentation and proposals for data initiatives
Collaborate with team members to understand their data needs and provide effective solutions
Support day-to-day analytics projects and contribute to the existing codebase
Learn about Analytics Engineering principles and best practices while gaining hands-on experience with relevant tools and technologies
Level: Bachelor’s degree (BAC+3) or higher (or equivalent)
Fields: Computer Science, Data Science, Statistics, Software Engineering, Information Systems, Big Data, or related
Academic Projects: end-to-end data integration, dashboard development, group deliverables with milestones
Open-source contributions (e.g., Dagster, Superset, dlthub)
Hackathons, data meetups, Kaggle competitions
Technical blogging or conference/student-club presentations
SQL & Data Modeling: advanced query writing, schema design (star/snowflake), optimization
Scripting & Programming: Python (Pandas, SQLAlchemy) or similar
ETL & Orchestration: Building and scheduling pipelines with Airflow, dbt, Dagster or equivalents
Version Control & CI/CD: Git workflows, basic Docker, automated testing (e.g., pytest), familiarity with CI/CD concepts
Linux/Unix: Command-line comfort, shell scripting
Visualization & Dashboards: Apache Superset (dashboard creation, filters, parameterization)
Open-Source Engagement: Experience contributing to OSS (bug fixes, feature PRs)
Cloud & Infrastructure as Code: AWS/GCP/Azure fundamentals (e.g., S3, BigQuery), Terraform/CloudFormation
Container Orchestration: Kubernetes basics
Testing & Quality: Writing unit/integration tests, code reviews, documentation
Analytical Mindset: Translate business needs into data solutions
Communication: Explain technical concepts clearly to diverse stakeholders
Collaboration: Pair-programming, code reviews, cross-team alignment
Autonomy & Curiosity: Self-driven learning and initiative
Organization: Task prioritization, clear documentation
Languages: French (fluent) and English (professional proficiency)