Data Engineer (Analytics Engineering)

  • Permanent contract 
  • Starting date:  
  • Possible full remote
  • Master's Degree
  • > 2 years

The company




    At Upflow, we’re building the platform revolutionizing how B2B businesses get paid.

    Today, most companies are still struggling to collect payments from their customers: hundreds of unpaid invoices, anarchic communications, multiple payment methods. They lose tons of hours on zero value-added tasks, suffer from late payments that hinder their growth, and sometimes simply go bust because of cash flow issues. It’s time for a change.

    Upflow is the modern hub for managing all data exchanges, communications, and payments so businesses get paid faster and more simply. We are a product-led organization solving this problem with a tech approach.

    We launched in 2018 and are trusted by hundreds of awesome users in the EU & US, including Lattice, Front, Triplebyte, Iziwork, Adikteev, Proxyclick, and more. We’re also backed by leading investors (Y Combinator, 9yards, eFounders) and by business angels who are top executives from N26, Square, Mercury, Uber, and NetSuite.

    We’re a remote-friendly, distributed team across 3 continents and rapidly expanding. Our primary focus is to support our fast growth in the US. Now’s a perfect time to join if you’re looking for an exciting international experience.

    For more information, please visit our website www.upflow.io, or check out our product demo here: https://demo.upflow.io

    Job description

    We’re building a state-of-the-art data team at Upflow. From the outset, we want to bring software engineering best practices to our analytics code base, apply an engineering mindset to discussions about how data is modeled, and introduce practices such as non-regression testing, data quality certification, and self-service tooling (data catalog, data status information, data lineage, …) early.

    You will be responsible not only for building pipelines & datasets but also for making sure they run reliably and in a timely fashion.

    Our current stack

    • BigQuery as our data warehouse.
    • Postgres for production databases.
    • Segment for frontend events.
    • Stitch pulling data into our BigQuery warehouse from Postgres, Segment, and multiple third-party SaaS vendors (Salesforce…).
    • Periscope/Sisense for BI on top of BigQuery.
    • dbt for data transformation.

    What you will do

    • Ingest, clean & normalize raw data into BigQuery.
    • Build and maintain a set of core tables in BigQuery (using SQL) that efficiently describe our business through data, making the data warehouse the trusted basis for analysis by business & product analysts. Investigate data discrepancies, perform data quality checks, and build monitoring systems.
    • Take part in discussions with product managers and analysts to guide their understanding of the data, help shape product solutions, and better grasp the context of the requirements coming your way.
    • Be responsible for the overall data architecture and data governance in the company: document all data models, mappings, business logic, and data flows across all our tools & apps. Track costs & performance, and define & enforce retention policies where appropriate.
    • Train data users on SQL and analytics best practices.
    • Introduce new tools & practices, as needed.
    • Promote and embody our strong data culture at Upflow — with emphasis on data quality/consistency.
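The data-quality-check responsibility above could, in its simplest form, look something like the following sketch. This is a hypothetical illustration, not Upflow's actual tooling: it runs SQL checks against an in-memory SQLite table (a real setup would target BigQuery, e.g. via dbt tests or Great Expectations), and each check passes when its query finds zero bad rows.

```python
import sqlite3

# Toy warehouse table, purely illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE payments (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO payments VALUES (1, 'acme', 100.0), (2, 'globex', 80.0);
""")

def check(conn, name, sql):
    """A data-quality check passes when its SQL query counts zero bad rows."""
    bad_rows = conn.execute(sql).fetchone()[0]
    return (name, bad_rows == 0)

results = [
    check(conn, "id is unique",
          "SELECT COUNT(*) - COUNT(DISTINCT id) FROM payments"),
    check(conn, "amount is non-negative",
          "SELECT COUNT(*) FROM payments WHERE amount < 0"),
    check(conn, "customer is never null",
          "SELECT COUNT(*) FROM payments WHERE customer IS NULL"),
]
for name, ok in results:
    print(f"{'PASS' if ok else 'FAIL'}: {name}")
```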

    Preferred experience

    • You are analytical and bring structure to your work; you champion data consistency and predictability. You are curious & self-driven.
    • You know advanced SQL and have experience with at least some of the following: aggregation & window functions, complex joins, subqueries, and CTEs. You have experience with analytical data models and, ideally, with the methodologies used to design them.
    • You have worked with a modern data warehouse (BigQuery, Redshift, Snowflake); the ideal candidate has experience with BigQuery.
    • Ideally, you have experience with modern analytics tooling such as dbt, Great Expectations, and data catalogs. You know enough Python to automate your analytics work in Airflow or a similar tool.
    • You have good communication & collaboration skills and enjoy nurturing relationships inside the company: you will constantly work with data analysts, software engineers, and product & business people.
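As an illustration of the kind of SQL fluency the list above describes (a hypothetical example, not part of the role's actual codebase): a CTE combined with a window function computing a per-customer running total of invoiced amounts. SQLite via Python's standard library is used here only for portability; the same query shape works in BigQuery.

```python
import sqlite3

# In-memory SQLite table standing in for a warehouse table (illustrative only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE invoices (customer TEXT, issued_at TEXT, amount REAL);
    INSERT INTO invoices VALUES
        ('acme',   '2021-01-05', 100.0),
        ('acme',   '2021-02-10', 250.0),
        ('globex', '2021-01-20',  80.0);
""")

# A CTE plus a window function: running total of amount per customer over time.
rows = conn.execute("""
    WITH ordered AS (
        SELECT customer, issued_at, amount FROM invoices
    )
    SELECT customer,
           issued_at,
           SUM(amount) OVER (
               PARTITION BY customer
               ORDER BY issued_at
           ) AS running_total
    FROM ordered
    ORDER BY customer, issued_at
""").fetchall()

for row in rows:
    print(row)
```

Each customer's invoices accumulate independently thanks to `PARTITION BY`, while `ORDER BY` inside the window turns the plain `SUM` into a running total.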

    Recruitment process

    Applying at Upflow is a two-way process between you and us. We need you to want to work with us as much as we want to work with you!

    We strive to keep things efficient for you & us by going through the whole process in 2-3 weeks end-to-end.

    1️⃣ You apply
    Show us you’re smart, show us you’re a builder — be convincing about why you, why us, why now!

    2️⃣ Discovery Call (with Antoine, Senior Recruiter)
    A 30-minute video call to answer your first questions about Upflow and make sure we’re on the right path.

    3️⃣ Screening Interview (with Philippe, VP Engineering)
    A 45-minute video call with your hiring manager to get to know each other and answer your questions. Be prepared, be curious.

    4️⃣ Home assessment and review (with two engineers)
    Your time to shine! We usually provide a detailed written technical test allowing you to showcase your capabilities.

    5️⃣ Team Interview (with two other peers)
    Two 30-minute video calls to get to know your (hopefully) future colleagues better.

    6️⃣ Final Interview (with Barnaby, cofounder & CTPO)
    A 30-minute video call to discuss the vision and ask as many questions as you want.

    7️⃣ Reference checks
    We always take the time to do a few reference checks from your previous roles. That helps us understand how best to get you on board.

    🎉 Offer! 🎉
    We look forward to having you onboard! 😃
