
Data Engineer

BeZero Carbon

Posted over 30 days ago

Join the Climate Revolution as a Data Engineer with BeZero Carbon's Innovative Team

Overview

  • Salary: Not declared
  • Location: London
  • Remote friendly: Yes, 98% remote (UK-based)
  • Expires: At any time

BeZero Carbon is a pioneering ratings agency revolutionizing the Voluntary Carbon Market with its high-tech SaaS platform. The company provides essential ratings and research tools that support various stakeholders in the carbon offset space. With a diverse team of over 150 professionals across five continents, BeZero Carbon leverages expertise from multiple fields to drive the Net-Zero transition. Having secured substantial Series B funding, the company is on a rapid growth trajectory, offering an exhilarating opportunity for professionals to contribute to meaningful climate solutions. Learn more at www.bezerocarbon.com.

Role Summary:

  • Develop carbon offset-related data products for clients.
  • Build internal data tools to boost the efficiency of the Ratings teams.
  • Collaborate cross-functionally with product, ratings, and software engineering teams.
  • Design back-end API services, knowledge management tools with AI, and standardize quantitative analyses for renewable energy projects.
  • Deploy web crawlers and develop standardized data models for internal and client-facing use.

Role Requirements:

  • Passionate about climate change and carbon markets.
  • At least 2 years of experience in building ELT/ETL pipelines using Python and SQL.
  • Proficient with workflow orchestration tools (e.g., Airflow, Prefect, Dagster), Docker, and AWS.
  • Skilled in writing maintainable, scalable, and robust code in Python and SQL.
  • Experience with back-end service design and API deployment, ideally using FastAPI.
  • Knowledgeable in deploying and maintaining cloud resources with tools like AWS CloudFormation or Terraform.

Application Process Details:

  • Initial screening interview with a recruiter (15 mins).
  • Introduction call with Chief Data Officer (30 mins).
  • Two technical interviews with data engineering team members (60-90 mins each).
  • Reference checks and job offer.


About Us

BeZero Carbon is a global ratings agency for the Voluntary Carbon Market. We distribute our ratings via our SaaS Product, BeZero Carbon Markets, informing all market participants on how to price and manage risk. Our ratings and research tools support buyers, intermediaries, investors and carbon project developers.

Founded in April 2020, our 150+ strong team combines climatic and earth sciences, sell-side financial research, earth observation, machine learning, data and technology, engineering, and public policy expertise. We work from five continents.

We raised a significant Series B funding round in late 2022, and are growing rapidly as a company, accelerating the Net-Zero transition through ratings.

www.bezerocarbon.com

Job Description

BeZero is looking for a mid-level data engineer to join our existing data products and tooling team, which sits within the broader data organisation. The team is focussed on developing carbon offset-related data products for our clients, as well as building internal data tools to increase the efficiency of our Ratings teams.

You’ll be responsible for building data products and tools that directly affect the way our ratings teams analyse carbon offset projects. This is a cross-functional role: you will be working together with colleagues from our product, ratings, and software engineering teams every day.

This role is full time, and can be performed either hybrid (London office) or remotely elsewhere in the UK.

To give you a flavour of the kind of work this team does, here are some of the projects members of our team have been working on recently:

● Designing robust back-end API services that power our in-house central data portal, enabling ratings analysts to access prepared and curated data essential for evaluating carbon offset projects.

● Introducing an in-house knowledge management tool, with generative AI capabilities, to help rating analysts in navigating the large amounts of unstructured document data that exists in the carbon market.

● Collaborating closely with our rating analysts to standardise and automate quantitative analyses central to assessing renewable energy offsetting projects.

● Deploying a system of web crawlers to aggregate carbon project related data into our data warehouse, alongside developing a standardised data model so the data can be used internally and displayed on our client-facing platform.
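As a rough illustration of the third project above (standardising quantitative analyses for renewable energy offset projects), the core of such an analysis might compute a capacity factor and avoided grid emissions. The formulas are the standard textbook definitions; the function names, parameters, and figures below are hypothetical, not BeZero's actual methodology:

```python
# Hypothetical sketch of a standardised quantitative check for a renewable
# energy offset project. The formulas are the standard definitions; the
# numbers and parameter names are illustrative only.

def capacity_factor(generation_mwh: float, capacity_mw: float, hours: float) -> float:
    """Actual generation divided by the maximum possible generation."""
    return generation_mwh / (capacity_mw * hours)

def avoided_emissions_tco2e(generation_mwh: float, grid_ef_tco2e_per_mwh: float) -> float:
    """Emissions avoided by displacing grid electricity at a given emission factor."""
    return generation_mwh * grid_ef_tco2e_per_mwh

# Example: a 10 MW plant generating 21,900 MWh over one year (8,760 hours)
cf = capacity_factor(21_900, 10, 8_760)          # 0.25
avoided = avoided_emissions_tco2e(21_900, 0.5)   # 10,950 tCO2e
```

Wrapping checks like these in shared, tested functions is what makes an analysis "standardised": every analyst runs the same logic instead of reimplementing it in a spreadsheet.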

If you’re excited by working on such problems and making impactful contributions to data in the climate space, then we’re looking for you.

Tech stack

As a data team, we have a bias towards shipping products, staying close to our internal and external customers, and end-to-end ownership of our infrastructure and deployments. This is a team that follows software engineering best practices closely. Our data stack includes the following technologies:

- AWS serves as our cloud infrastructure provider.

- Snowflake acts as our central data warehouse for tabular data. AWS S3 is used for any of our geospatial raster data, and we use PostGIS for storing and querying geospatial vector data.

- We use dbt for building SQL-style data models and Python jobs for non-SQL data transformations.

- Our computational jobs are executed in Docker containers on AWS ECS, and we use Prefect as our workflow orchestration engine.

- GitHub Actions for CI/CD.

- Metabase serves as a dashboarding solution for end-users.
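To sketch the kind of ELT job that runs on a stack like this, here is a minimal extract-and-transform step in plain Python. The record fields and values are invented for illustration; in the stack described above, each function would typically be wrapped as a Prefect task and the load step would write to Snowflake rather than return a list:

```python
import json

# Hypothetical ELT sketch: raw carbon-project records arrive as JSON strings
# and are normalised (types, casing) before loading to the warehouse.
# Field names such as "project_id" and "credits_issued" are illustrative.

RAW_RECORDS = [
    '{"project_id": "BZ-001", "country": "ke", "credits_issued": "1200"}',
    '{"project_id": "BZ-002", "country": "BR", "credits_issued": "850"}',
]

def transform(raw: str) -> dict:
    """Parse one raw JSON record and normalise its types and casing."""
    rec = json.loads(raw)
    return {
        "project_id": rec["project_id"],
        "country": rec["country"].upper(),
        "credits_issued": int(rec["credits_issued"]),
    }

def run_pipeline(records: list[str]) -> list[dict]:
    """Extract -> transform; a real load step would write to the warehouse."""
    return [transform(r) for r in records]

rows = run_pipeline(RAW_RECORDS)
```

Keeping each step a small, pure function is what makes pipelines like this easy to test in CI and to re-run idempotently under an orchestrator.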

We are a remote-friendly company and many of our colleagues work fully remotely; however, for this position we will only consider applications from candidates based in the UK. If you live in or near London, you are welcome (but not required!) to work from our London office.

Responsibilities:

You will be an individual contributor in our data engineering team, focused on designing and building robust data pipelines for the ingestion and processing of carbon offset-related data.

You will contribute to and maintain our analytical data models in our data warehouse.

You will work with our internal research and ratings teams to integrate the outputs of (analytical) data pipelines into BeZero’s business processes and products.

You will work with other teams in the business to enable them to be more efficient, by building data tools and automations.

You’ll be our ideal candidate if:

You care deeply about the climate and carbon markets and are excited by solutions for decarbonising our economy.

You are a highly collaborative individual who wants to solve problems that drive business value.

You have at least 2 years of experience building ELT/ETL pipelines in production for data engineering use cases, using Python and SQL.

You have hands-on experience with workflow orchestration tools (e.g., Airflow, Prefect, Dagster), containerization using Docker, and a cloud platform like AWS.

You can write clean, maintainable, scalable, and robust code in Python and SQL, and are familiar with collaborative coding best practices and continuous integration tooling.

You are well-versed in code version control and have experience working in team setups on production code repositories.

You’ve designed back-end services and deployed APIs yourself, ideally using a framework like FastAPI.

You have experience in deploying and maintaining cloud resources in production using tools such as AWS CloudFormation, Terraform, or others.

Our interview process:

● Initial screening interview with a recruiter (15 mins)

● Introduction call with our Chief Data Officer (30 mins)

● Two technical interviews with members of the data engineering team (60-90 mins each)

● Reference checks and offer

We value diversity at BeZero Carbon. We need a team that brings different perspectives and backgrounds together to build the tools needed to make the voluntary carbon market transparent. We’re therefore committed to not discriminate based on race, religion, colour, national origin, sex, sexual orientation, gender identity, marital status, veteran status, age, or disability.
