SQL Superhero: Help build and maintain SQL-based data models in BigQuery and dbt.
Pipeline Whisperer: Support automated ELT processes and ensure our data is reliable and trustworthy across GCP.
Dashboard Wizard: Create and update Looker Studio dashboards that make key business and operational KPIs easy to track.
Quality Guardian: Perform data validation and quality checks to keep our reports accurate and actionable.
Insight Detective: Analyse Cohabs' operational data to uncover insights that help internal teams make better decisions.
Documentation Maestro: Collaborate with the team to document models, workflows, and business logic.
Troubleshooting Ninja: Solve pipeline issues, spot performance bottlenecks, and investigate unusual data behaviour.
Support Sidekick: Help internal stakeholders with ad hoc SQL queries and exploratory analyses.
You are comfortable writing clean, well-structured SQL.
You have a basic understanding of data warehousing concepts (ETL/ELT, modelling, pipelines).
You are familiar with BigQuery, dbt, or similar modern data stack tools.
You have experience building dashboards using Looker Studio or another BI tool.
You are detail-oriented and can identify inconsistencies or irregularities in data.
You have strong analytical skills and enjoy digging into data to uncover insights.
You are proactive, curious, and eager to learn within a fast-paced tech environment.
You have basic knowledge of Python (a plus, not required).
You are familiar with Git or willing to learn version-controlled workflows.
You are motivated to continually improve the Cohabs product and contribute to process enhancements.
Initial conversation with Femi, Data Architect: a chance to get to know you and your background.
Technical assessment: an opportunity to demonstrate your skills in practice.
Final interview with Alexandre (CTO) and/or Rémi (VP Tech): to discuss your approach, vision, and potential contribution to the team.
Meet Rémi, VP of Technology
Meet François, Chief Strategy Officer & Cofounder