6 reasons to work @ Beamy 🔥
- Work with the best technology and the best technologists.
- Be a strong voice in the product development lifecycle.
- A key role within our organization, with the opportunity to be a decisive player in Beamy's transition from startup to scale-up, while being part of the world's biggest startup campus (we're located @ Station F).
- Have a wide scope of responsibilities: take part in the architecture design process, develop and release new features, and learn a lot along the way.
- Attractive salary and benefits package (starting from 48K€ for a junior engineer + stock options + benefits).
- Full ownership and strong autonomy, free of micro-management.
How we work ⚙️
- We build a tech product, by tech people, for tech customers.
- You’ll be in contact with peers: developers, product engineers, designers, data engineers, business teams.
- We follow a Scrum workflow: sprint planning, two-week sprints, daily stand-ups, sprint reviews, and sprint retrospectives. We demo the work accomplished at the end of each sprint.
- Communication, pair programming, and code reviews are important to the team.
- We work remotely most of the time: only two on-site days per sprint are required.
Your role 👉
We’re looking for a Data Ops Engineer to help us build and develop the data architecture that lets Beamy offer its users an ever more efficient service. You will:
- Build and maintain the Beamy data architecture
- Guarantee the maintainability and scalability of its components, relying on the services of Beamy's cloud providers and DevOps best practices
- Adapt the technical stack to anticipate user needs and the limitations of existing systems
- Collaborate with our Machine Learning Engineer and developers to build future data-driven features of the Beamy product
- Participate in improving the team and its agile rituals
- Design, develop, and maintain robust, scalable data flows, from R&D to production
- Design and develop tools to ensure the quality of those data flows
- Industrialize and automate data processes with a CI/CD approach
- Ingest client data into our solution
Tech environment 🛠
GitHub, Docker, Kubernetes, Helm, Azure, Google Cloud Platform, Terraform, Python, PySpark.
Team spirit 👨👩👦👦
- We have people from different horizons, cultures, and backgrounds. There are several Slack channels you can join to talk about different subjects: wellness, food, sports, fun, and random things. We want our teams to be inclusive and diverse.
- We have a product- and tech-oriented culture: everybody understands each other and speaks the same language.
- We’re still a small team with a strong sense of solidarity; helping each other is always a priority.
- We regularly organize events with the entire team; we get on very well and like to spend time together.