At Budbee we are obsessed with fueling e-commerce growth. We offer groundbreaking last-mile solutions for the e-commerce industry, and even though it's pretty advanced stuff, it's all about having the right people on board!
Since April 2018, Budbee has been backed by the prestigious investor Kinnevik. Other major investors include Stena Sessan, H&M Co:Lab and AMF. To date we have raised over 90 MEUR in total, and we are already doing millions of home and box deliveries each year, operating in Sweden, the Netherlands, Finland and Denmark. Since we are growing incredibly fast, with new markets or products being added pretty much every month, we are not looking for the average Joe. The right people for us love change and challenges, and at the same time have an inner drive to get things done. By joining us on this journey, you also become part of the Budbee family.
We are aware that our operations impact the world around us. That is why we have offset our carbon emissions right from the very first Budbee delivery, with the help of ZeroMission. The need to take extra care of our planet affects every decision we make, however big or small it may be.
Everyone has heard about tech, or even fintech or medtech... Join us and be a part of the latest and greatest - Logtech!
About the role
We are looking for a passionate Data Engineer to join our Data & Analytics team in Stockholm. You will focus on designing and implementing data pipelines for Budbee's product teams, as well as providing reports to our business units and merchant partners. Working closely with product stakeholders, you'll help translate business requirements and KPIs into technical tasks that you will then implement and maintain. Together with our VP of Data, your main stakeholder, you'll also contribute to the design of our data architecture.
We are an international team, so Swedish is not a requirement as long as you are fluent in English.
Our tech stack
- Java / Kotlin
- MySQL / BigQuery
- Tableau / Klipfolio
- Snowflake / Athena
- Event-driven architecture
What we're looking for
- 3+ years working with Python and data pipelines
- 1+ years working with event-driven/event-oriented architectures and data models
- Experience with maintaining ETL scripts
- Comfortable with cloud providers such as AWS
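To give a flavor of the event-driven ETL work described above, here is a minimal Python sketch of consuming a delivery event, transforming it, and loading it into a warehouse-bound sink. The event shape, field names, and the `transform`/`handle` functions are all hypothetical illustrations, not Budbee code:

```python
import json
from datetime import datetime, timezone

def transform(event: dict) -> dict:
    """Flatten a raw delivery event into a warehouse-friendly row.
    The field names here are invented for illustration."""
    return {
        "parcel_id": event["parcel"]["id"],
        "status": event["status"].lower(),
        # Normalize the epoch timestamp to an ISO-8601 UTC string.
        "event_time": datetime.fromtimestamp(event["ts"], tz=timezone.utc).isoformat(),
    }

def handle(raw: str, sink: list) -> None:
    """Consume one raw JSON event, transform it, and load it into the sink."""
    sink.append(transform(json.loads(raw)))

# In practice the sink would be a warehouse writer (e.g. BigQuery or Snowflake);
# a plain list stands in for it here.
warehouse_rows: list = []
handle('{"parcel": {"id": "p-1"}, "status": "DELIVERED", "ts": 1700000000}', warehouse_rows)
print(warehouse_rows[0]["status"])  # prints "delivered"
```

The same extract-transform-load shape applies whether events arrive from a message queue, a webhook, or a change stream; only the consumer wiring changes.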
Extra brownie points
- Experience developing with Java and/or Kotlin
- NoSQL and/or Graph database knowledge
- Experience with data visualization/reporting tools such as Tableau
- Experience with setting up and maintaining data warehouses and data lakes
- Experience with building and evolving machine learning models
Feel excited about this role? Then we’d love to talk to you!