Data Engineer — Analytics Engineering

  • Permanent contract 
  • Starting date:  
  • Possible full remote
  • Master's Degree

The company





    At Upflow, we’re building the platform revolutionizing how B2B businesses get paid.

    Today, most companies are still struggling to collect payments from their customers: hundreds of unpaid invoices, anarchic communications, multiple payment methods. They lose tons of hours on zero value-added tasks, suffer from late payments that hinder their growth, and sometimes simply go bust because of cash flow issues. It’s time for a change.

    Upflow is the modern hub to manage all data exchanges, communications, and payments so you get paid faster and more simply. We are a product-led organization solving this problem with a tech approach.

    We launched in 2018 and are trusted by hundreds of awesome users in the EU & US including Lattice, Front, Triplebyte, Iziwork, Adikteev, Proxyclick, and more. We’re also backed by leading investors (YCombinator, 9yards, eFounders) and top executive business angels from N26, Square, Mercury, Uber, and Netsuite.

    We’re a remote-friendly, distributed team across 3 continents and rapidly expanding. Our primary focus is to support our fast growth in the US. Now’s a perfect time to join if you’re looking for an exciting international experience.

    For more information, please visit our website www.upflow.io, or check out our product demo here: https://demo.upflow.io

    Job description

    About Upflow

    Getting paid on time represents a significant problem for B2B companies. Unlike consumer payments, where we’ve seen massive amounts of innovation in the form of companies like Venmo & Revolut, B2B payments remain archaic, with most of the work being done in spreadsheets and involving significant amounts of back and forth between different stakeholders.

    These inefficiencies are extremely problematic for companies, so much so that some go bankrupt because of this - and COVID hasn’t helped! We’re on a mission to fix this, and bring delightful B2C experiences to B2B finance teams in the process.


    Why we need your help

    We’re building a state-of-the-art data team at Upflow and want to make sure that, from the outset, we bring software engineering best practices to our analytics codebase, apply an engineering mindset to discussions on how data is modeled, and introduce practices such as non-regression testing, data quality certification, and self-service tooling (data catalog, data status information, data lineage, …) early.

    You will be responsible not only for building pipelines & datasets but also for making sure they run reliably and in a timely fashion.

    Our current stack

    • BigQuery as our data warehouse.
    • Postgres for production databases.
    • Segment for frontend events.
    • Stitch pulling data into our BigQuery warehouse from Postgres, Segment, and multiple third-party SaaS vendors (Salesforce, …).
    • Periscope/Sisense for BI on top of BigQuery.
    • dbt for data transformation.

    What you will do

    • Ingest, clean & normalize raw data into BigQuery.
    • Build and maintain a set of core tables in BigQuery (using SQL) to efficiently describe our business through data, making the data warehouse the trusted basis for analysis by business & product analysts. Investigate data discrepancies, perform data quality checks, and build monitoring systems.
    • Take part in discussions with product managers and analysts to guide them in their understanding of the data, help shape product solutions, and better grasp the context of requirements coming your way.
    • Be responsible for the overall data architecture and data governance in the company: document all data models, mappings, business logic, and data flows across all our tools & apps. Track costs & performance, and define & enforce retention policies where appropriate.
    • Train data users on SQL and analytics best practices.
    • Introduce new tools & practices, as needed.
    • Promote and embody our strong data culture at Upflow — with emphasis on data quality/consistency.
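    As a flavor of the data quality checks and monitoring mentioned above, here is a minimal sketch. The table and column names are hypothetical, and SQLite stands in for BigQuery purely for portability; a production version would run against the warehouse (e.g. via dbt tests or Great Expectations):

```python
import sqlite3

# Hypothetical "invoices" core table, standing in for a BigQuery table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE invoices (id INTEGER PRIMARY KEY, customer TEXT, amount_cents INTEGER);
INSERT INTO invoices VALUES (1, 'acme', 12000), (2, 'acme', 5000), (3, 'globex', 800);
""")

def check_no_negative_amounts(conn):
    """Data quality check: no invoice should carry a negative amount."""
    (bad,) = conn.execute(
        "SELECT COUNT(*) FROM invoices WHERE amount_cents < 0"
    ).fetchone()
    return bad == 0

def check_not_empty(conn):
    """Freshness proxy: the table should never be empty after a load."""
    (n,) = conn.execute("SELECT COUNT(*) FROM invoices").fetchone()
    return n > 0

print(check_no_negative_amounts(conn), check_not_empty(conn))  # True True
```

    Checks like these are the building blocks of the non-regression testing and data certification practices described earlier.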

    Why join Upflow?

    • Join a product-driven company with a focus on delivering exceptionally high-quality software.
    • Work on-site in Paris, 100% remote, or something in between.
    • Join us at an early stage; most of the work lies ahead of us.
    • Join a passionate, caring, ambitious, and intelligent team that shares the common goal of changing the way B2B payments take place.
    • Enjoy a very high degree of autonomy with a hands-off management style.
    • All you can expect from a great place to work: free lunches, cool offices, top-of-the-range equipment, great healthcare, and competitive salary and equity.
    • Regular off-sites with the team, meetups, and strong connections to the startup ecosystem.

    Preferred experience

    About yourself

    • You are analytical and bring structure to your work; you champion data consistency and predictability. You are curious & self-driven.
    • You know advanced SQL and have experience with at least some of the following: aggregation & window functions, complex joins, subqueries, and CTEs. You have experience with analytical data models and, ideally, with the methodologies to design such models.
    • You must have worked with a modern data warehouse (BigQuery, Redshift, Snowflake). An ideal candidate will have experience with BigQuery.
    • Ideally, you have experience using modern analytics tooling such as dbt, Great Expectations, data catalogs, etc. You know enough Python to automate your analytics work in Airflow or a similar tool.
    • You have good communication & collaboration skills and enjoy nurturing relationships inside the company: you will constantly work with data analysts and software engineers, as well as product & business people.
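    To illustrate the kind of SQL the role calls for, here is a minimal sketch combining a CTE with a window function: ranking each customer's invoices by amount and keeping the largest one. The table and column names are hypothetical, and SQLite (via Python) stands in for BigQuery for portability:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE invoices (id INTEGER, customer TEXT, amount_cents INTEGER);
INSERT INTO invoices VALUES
  (1, 'acme', 12000), (2, 'acme', 5000), (3, 'globex', 800), (4, 'globex', 9100);
""")

# CTE + window function: rank each customer's invoices by amount, largest first,
# then keep only the top-ranked invoice per customer.
rows = conn.execute("""
WITH ranked AS (
  SELECT customer, amount_cents,
         ROW_NUMBER() OVER (PARTITION BY customer ORDER BY amount_cents DESC) AS rk
  FROM invoices
)
SELECT customer, amount_cents FROM ranked WHERE rk = 1 ORDER BY customer
""").fetchall()

print(rows)  # [('acme', 12000), ('globex', 9100)]
```

    The same `ROW_NUMBER() OVER (PARTITION BY … ORDER BY …)` pattern works unchanged in BigQuery standard SQL.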

    Recruitment process

    1. Discovery call with Talent Acquisition Manager (45 minutes).
    2. Screening call with VP Engineering (45 minutes).
    3. Technical home assessment (no time limit).
    4. Technical review (60 minutes).
    5. Team interview (60 minutes).
    6. Culture interview (30 minutes).

    The recruitment process takes a maximum of 15 business days, and steps 4, 5, and 6 can be scheduled (almost) back-to-back.

    The entire recruitment process can be done over Zoom.

