UK (remote or office-based), or remote within UTC +/- 2
Who are we?
We are a team of engineers who want to get things done.
Our open core behavioural data platform is used by data teams to drive a better understanding of their customers - the right way. The open-sourced data pipeline stack and tracking SDKs support teams to make key decisions on high quality, trusted data. We empower and actively encourage companies to own their data strategy and the data it creates. Data quality, privacy and security are our top priorities.
Snowplow is a technical product, so our Engineering team is at the forefront of what we’re offering our customers and community. We have steadily grown the team over the last two years, following investment from Atlantic Bridge and MMC Ventures. Our aim through this growth has been to maintain, if not increase, our productivity from those early days. We focus on building autonomous teams of engineers who understand where they fit in the wider Engineering efforts and collaborate closely with our top team of Product Managers. That’s us in 164 words, but there is so much more we could tell you!
We’re looking for an experienced Scala Engineer to enhance and extend the capabilities of our data pipeline. You’ll be solving the complex problems that come with building data pipelines that serve a wide variety of use cases. You’ll also be a key open source maintainer on our popular pipeline components and have the opportunity to make decisions that will affect our entire estate. You will be making data teams the world over very happy.
Our ethos is that we want the right people making the best decisions, and you can expect we’ll be asking you what you think. Our engineers take on a lot of responsibility, including product ownership, testing (automated wherever possible) and the route to live. We aren’t micromanaged, and we don’t throw our work over the fence. By learning about and taking on these responsibilities, we believe we can make quicker decisions.
You will be joining a wider team of 25+ remote engineers who work directly and closely with other teams in the business. There is a huge opportunity at Snowplow to learn more about all aspects of engineering and data, from code to customers, and how to be the most productive and empowered version of you.
What you’ll be doing
● Enhancing the Snowplow data pipeline. You’ll contribute to our open source projects and extend the capability of our data pipeline across AWS and GCP.
● Working within the Scala ecosystem. You’ll primarily be working with purely functional Scala, along with cats, cats-effect, fs2 and various other libraries, mostly from the Typelevel stack. You might also get the chance to explore other languages we often use, such as Go or Python.
● Looking after the route to live. You’ll be creating Docker images and writing the Terraform to automate the deployment process of hundreds of data pipelines.
● Automation and Testing. You’ll continue to improve our automation and testing, extending our existing GitHub Actions CI/CD processes to make it easier to work on and contribute to our projects.
● Open Source. As one of Snowplow’s open source maintainers, you’ll manage open source projects, plan and communicate upcoming releases, engage with our users via slick documentation, create easy-to-understand READMEs and discuss problems with users through our forums.
● Empowered. Working in a productive, empowered team. Most companies talk about the importance of this, but we’re really doing it. Come talk to us about how.
We’d love to hear from you if
● You have Scala experience. You have experience working within the Scala ecosystem; even better if that experience involves data or data pipelines.
● You have used cloud technologies. The Snowplow pipeline currently runs on AWS and GCP. Cloud experience isn’t essential, but exposure to cloud technologies would be beneficial.
● You care about open source. You like the idea of working on open source projects and interacting with the developer community who use them.
● You enjoy working remotely. Our remote team depends on expert collaborators to work effectively. You’ll be a great communicator and enjoy working closely with the team.
● Experience working with data stacks. Previous experience in data is a plus, but most importantly you have an interest in data and how it empowers companies to make better decisions.
● Security. Data security and integrity are fundamental to what we do, so you’ll need a solid understanding of security threats and how to overcome them.
● Self-motivated. You don’t wait to be told what to do. You can understand a problem, drive toward a solution and recognise when you need support or more direction.
● Pragmatic. We can’t do everything today. You’ll be pragmatic in your approach to software delivery and balance our speed of learning with our commitment to providing a reliable and trusted service to customers.
What you get in return for being awesome:
● A competitive package, including share options
● 25 days of holiday a year (plus public holidays)
● Freedom to work from wherever suits you best
● Cycle to work scheme if UK-based
● Two fantastic company Away Weeks in a different European city each year (or, when this isn’t possible, “Stay Away Weeks”)
● Mental health support including therapy sessions
● Work alongside a supportive and talented team with the opportunity to work on cutting edge technology and challenging problems
● Grow and develop in a fast-moving, collaborative organisation
● MacBook and home office equipment
● Enjoy fun events organised by our Cultural Work Committee
● Convenient location in central London for those who want to work there or when you come to visit
● Continuous supply of Pact coffee and healthy snacks in the office when you’re here!
Snowplow is dedicated to building and supporting a brilliant, diverse and hugely inclusive team. We don't discriminate against gender, race, religion or belief, disability, age, marital status or sexual orientation. Whatever your background may be, we welcome anyone with talent, drive and emotional intelligence.