Vizzia is accelerating its growth and rolling out an IoT infrastructure scaling from 500 to over 2,000 sensors. To support this scale-up, we are hiring a Lead Data Platform Engineer responsible for designing, building, and operating the unified data platform that powers our AI, SaaS, and BizOps teams. This platform will be the technical foundation enabling self-service access to reliable, secure, governed, and scalable data — a key lever for our mission and the impact of our products.
Design, operate, and scale the unified data platform to serve AI, product, and operations.
Structure and lead data governance (canonical model, data contracts, quality, GDPR) while providing technical leadership to the data team.
Design, build, and operate a unified data lakehouse (IoT, CRM, internal systems, product data, AI outputs) with a high level of reliability.
Implement data quality, monitoring, observability, and lineage workflows, targeting a platform incident rate below 0.1%.
Deploy a self-service data infrastructure used by 80% of internal teams and reduce manual data requests by 50%.
Optimize the platform to support 2,000+ IoT devices while keeping the per-device cost increase under 5% and maintaining FinOps oversight.
Design and maintain reliable datasets for the AI team, with automated monitoring of data freshness and drift.
Work closely with AI, SaaS, BizOps, and Infrastructure teams to ensure consistency, performance, and continuous delivery.
Provide technical guidance to more junior engineers and promote engineering best practices (code reviews, standards, CI/CD).
Map the existing landscape: systems, flows, schemas, and identifiers (IoT, VPN/RMS, HubSpot, SaaS, Strapi, provisioning processes, etc.) and make this mapping understandable for product, operations, and business teams.
Define and maintain a canonical data model (customers, sites, IoT assets, tickets, events, users) serving as a shared reference for AI, SaaS, and operations.
Formalize and maintain data contracts between systems and teams (IoT, HubSpot, SaaS, Strapi, billing, etc.) to ensure consistency, completeness, and stability of key data.
Implement and operate a data catalog / data dictionary (business definitions, data owners, quality rules) and ensure internal adoption.
Co-build with Product, BizOps, and Support teams standardized datasets and analytical views (assets, fleet health, incidents, product usage, service quality) powering business dashboards.
Lead regular data quality and consistency reviews, track data quality indicators, and drive resolution of structural issues with the relevant teams.
Implement and enforce GDPR requirements: PII management, access policies, retention, auditability, and ensure their application across the data platform.
Excellent command of cloud data warehouses (Snowflake or equivalent) and distributed data platform architectures.
Strong expertise in ETL/ELT (Airbyte, Airflow, dbt) and AWS data stack (S3, Lambda, Glue, Athena, or equivalents).
Advanced skills in data modeling (conceptual and logical), canonical schemas, and database optimization (time-series / IoT databases, relational/columnar stores).
Solid experience in data reliability engineering: monitoring, alerting, incident management, SLOs/SLIs.
Strong understanding of GDPR requirements applied technically (PII, access controls, retention, auditability).
Proficiency in software engineering: Python, advanced SQL, API design, CI/CD.
Proven experience in business data modeling and building shared “core” models across multiple domains (IoT, CRM, product, support).
Hands-on experience with data governance: data dictionaries, data catalogs, data ownership, and implementing data contracts between teams.
Ability to facilitate workshops with non-technical stakeholders (field ops, support, product, sales) to understand processes, define key entities, and align KPI definitions.
Ability to collaborate with cross-functional teams and communicate technical decisions clearly.
Cloud certifications (AWS, Snowflake, GCP) are a plus.
Certifications in data engineering, data governance, or privacy engineering are a plus.
Experience with time-series / IoT databases (TimescaleDB or equivalent) and modern BI tools (Metabase, Looker, Tableau, etc.) is a plus.
Strong sense of ownership: demonstrated ability to own a platform, resolve incidents, and continuously improve systems.
Pragmatic, results- and quality-oriented mindset with a focus on simplicity, scalability, and impact.
Ability to translate business challenges (service quality, field deployment, product performance) into data architecture decisions.
Experience scaling IoT or highly distributed environments.
Strong interest in mentoring, structuring, and upskilling teams around data.