 

Using machine learning to track greenhouse gas emissions


15 December 2025




By Michelle Willebrands

PhD candidate Julia Wąsala searches for greenhouse gas emissions using satellite data. As a computer scientist, she bridges the gap between her own field and space research. “We really can’t do this research without collaboration.”

Wąsala collaborates with atmospheric scientists from SRON (Space Research Organisation Netherlands) on machine learning models that detect large greenhouse gas emissions from space. There is too much data to review manually, and such models offer a solution.

How much greenhouse gas do humans emit?

The machine learning method Wąsala refers to detects emissions from point sources, which show up in the data as plumes. “That project gives a better picture of how much methane humans emit,” she says. “This allows us to contribute to detecting leaks in gas pipes or emissions from landfills, for example, and then fixing them.”

The PhD candidate designed a method that can also detect plumes of other gases. “My most important contribution is that this is done completely automatically. This method automatically adjusts the model for different gases, such as methane, carbon monoxide and, in the future, nitrogen dioxide.”
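The article does not go into implementation details, but the idea of a configuration-driven, gas-agnostic pipeline can be illustrated with a minimal sketch. Everything below (the GasConfig fields, channel names and window sizes) is hypothetical and not taken from Wąsala’s method; it only shows how swapping in a per-gas configuration could adapt the same preprocessing code to methane, carbon monoxide or, later, nitrogen dioxide.

```python
# Hedged sketch of a gas-agnostic detection pipeline.
# All names and values (GasConfig, channel keys, window sizes) are hypothetical
# illustrations, not the method described in the article.
from dataclasses import dataclass

import numpy as np
from scipy.ndimage import median_filter


@dataclass(frozen=True)
class GasConfig:
    name: str             # gas species, e.g. "CH4"
    channel: str          # key of the retrieval field in a scene dictionary
    background_size: int  # median-filter window for local background estimation


# One entry per gas; supporting a new gas means adding a config, not new code.
GAS_CONFIGS = {
    "methane":         GasConfig("CH4", "xch4", 33),
    "carbon_monoxide": GasConfig("CO", "co_total_column", 33),
    # "nitrogen_dioxide" could be added later in the same way.
}


def make_preprocessor(gas: str):
    """Return a function that turns a raw scene into a plume-enhanced image for `gas`."""
    cfg = GAS_CONFIGS[gas]

    def preprocess(scene: dict[str, np.ndarray]) -> np.ndarray:
        field = scene[cfg.channel]
        background = median_filter(field, size=cfg.background_size)
        # Subtracting the local background makes point-source plumes stand out.
        return field - background

    return preprocess
```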

It’s not only the model that has to be good, but also the training data

It comes as no surprise that the model must be well constructed, but the training data is just as important, explains Wąsala. The vast majority of satellite images do not contain gas plumes, but the model must still have enough examples to learn from. This means that someone has to manually label hundreds of training images. “That’s a huge job. We’re very happy that someone from SRON was able to do it,” she adds.
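The interview does not say how this imbalance between plume-free and plume-containing images is handled during training. One standard remedy, shown here purely as a hedged sketch with toy data, is to oversample the rare positive class so that each training batch contains both kinds of images; the dataset, image shapes and the 3% positive rate below are invented for illustration.

```python
# Sketch: oversampling rare plume images during training with PyTorch.
# A common remedy for class imbalance, not necessarily the one used in this project.
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

# Toy stand-in for the labelled satellite scenes: 1000 images, roughly 3% contain a plume.
images = torch.randn(1000, 1, 64, 64)
labels = (torch.rand(1000) < 0.03).long()
dataset = TensorDataset(images, labels)

# Weight each sample inversely to its class frequency, so plume and
# non-plume images appear roughly equally often in the sampled batches.
class_counts = torch.bincount(labels, minlength=2).float().clamp(min=1)
sample_weights = (1.0 / class_counts)[labels]
sampler = WeightedRandomSampler(sample_weights, num_samples=len(dataset), replacement=True)

loader = DataLoader(dataset, batch_size=32, sampler=sampler)
for batch_images, batch_labels in loader:
    pass  # train the plume detector on re-balanced batches here
```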

Wąsala talks to kids about science during teach-out

Clouds throw a spanner in the works

However, the biggest challenge lies in unexpected biases in the data. “Many people know that AI models can be biased, but we usually associate such bias with people: skin colour, gender or socio-economic status,” says Wąsala. “We also see bias in satellite data – it just looks different.”

Many satellite images have missing pixels, for example because clouds obscure the Earth’s surface. Often, these missing pixels do not occur randomly but follow a pattern: there are more clouds above the equator, for example. “The result was that my model learned to predict that a plume was present when there were few missing pixels in an image. But the two aren’t related at all.”
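One simple way to surface this kind of bias is to check whether the model’s plume score tracks the fraction of missing pixels on scenes that are known to contain no plume; a strong correlation reproduces exactly the failure Wąsala describes. The sketch below assumes a hypothetical model object with a predict_proba method and is only an illustration of the diagnostic, not of her actual workflow.

```python
# Hedged sketch of a missingness-bias check. `model` and its predict_proba
# method are hypothetical; scenes are 2D arrays with NaN for missing pixels.
import numpy as np


def missingness_bias_check(model, plume_free_scenes: list[np.ndarray]) -> float:
    """Correlation between the missing-pixel fraction and the predicted plume
    probability on scenes known to contain no plume. Values near 0 are what we want."""
    missing_fraction = np.array([np.isnan(s).mean() for s in plume_free_scenes])
    scores = np.array([model.predict_proba(np.nan_to_num(s)) for s in plume_free_scenes])
    return float(np.corrcoef(missing_fraction, scores)[0, 1])

# A strongly negative value would reproduce the failure described above:
# fewer missing pixels -> higher plume score, regardless of any actual plume.
```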

Collaboration between computer science and earth sciences

As a computer scientist, Wąsala works “under the bonnet”, as she puts it. She writes the code but needs the earth scientists to interpret the data – and thus the results of her model. Fortunately, she easily crosses the boundaries of her field. “I enjoy working together. It’s also necessary: I don’t have the expertise to analyse the satellite images myself,” says Wąsala. “But sometimes it can be a challenge. We all speak our own language.”

“I want to show how much fun this research is”

The PhD candidate does a lot to make AI more accessible. She has a blog in which she informs earth scientists about the possibilities of machine learning. She appeared at the Weekend of Science event and is a member of the “Ask it Leiden” panel, where she answers children’s questions about AI. “I want to show how much fun this research is.”

Wąsala made a comic about her research






Universiteit Leiden






