#NeurIPS – tackling climate change with machine learning

by Lucy Smith
08 January 2020




Climate change was one of the many topics covered at NeurIPS 2019 (the Thirty-third Annual Conference on Neural Information Processing Systems), with a day-long workshop dedicated to the theme. The session was organised by Climate Change AI, a group of volunteers from academia and industry that seeks to facilitate work in climate change and machine learning.

Included in the workshop were four invited talks. These are summarised below, and the full presentations can be found in the embedded videos.

Jeff Dean, Google AI
“Computation + Systems vs Climate Change”

In his talk Jeff covered three areas relating to climate and energy where machine learning is being applied to good effect. The first concerns fusion energy, where Bayesian inference and TensorFlow Probability (a Python library for probabilistic reasoning and statistical analysis) are used to help fusion researchers find stable conditions for plasmas. Other uses of Bayesian inference include forecasting greenhouse gas concentrations, forecasting energy demand, predicting forest fire occurrence and spread, detecting deforestation and environmental change, and modelling salinity in estuaries and coral reefs.
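The talk does not go into implementation detail, but the basic workflow of Bayesian inference with TensorFlow Probability can be sketched as below. The toy model (a noisy linear relationship with a single unknown slope) and all variable names are illustrative assumptions, not the fusion group's actual code.

```python
# A minimal sketch of Bayesian inference with TensorFlow Probability.
# The toy model and all names here are illustrative assumptions only.
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Synthetic observations: y = 2.5 * x + noise
x = np.linspace(0.0, 1.0, 50).astype(np.float32)
y = 2.5 * x + np.random.normal(scale=0.1, size=50).astype(np.float32)

def target_log_prob(slope):
    """Log posterior: Normal prior on the slope plus a Gaussian likelihood."""
    prior = tfd.Normal(loc=0.0, scale=10.0).log_prob(slope)
    likelihood = tf.reduce_sum(
        tfd.Normal(loc=slope * x, scale=0.1).log_prob(y))
    return prior + likelihood

# Sample from the posterior with Hamiltonian Monte Carlo.
kernel = tfp.mcmc.HamiltonianMonteCarlo(
    target_log_prob_fn=target_log_prob,
    step_size=0.01,
    num_leapfrog_steps=5)

samples, _ = tfp.mcmc.sample_chain(
    num_results=1000,
    num_burnin_steps=500,
    current_state=tf.constant(0.0),
    kernel=kernel,
    trace_fn=lambda _, pkr: pkr.is_accepted)

print("posterior mean slope:", tf.reduce_mean(samples).numpy())
```

The same posterior-sampling pattern scales to the richer physical models mentioned in the talk; only the target log-probability function changes.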

The second part of Jeff’s talk focussed on simulation of large-scale climate models, with the example of flood forecasting. By using tensor processing units (TPUs) the computation speed of complex models can be improved significantly, and this increased speed means that the spatial resolution of studies can be much improved. For example, flooding predictions can be made at a more localised level. (Note: A TPU is a type of integrated circuit developed by Google specifically for neural network machine learning.)

The final section of the talk covered the use of neural networks as a proxy for more expensive mathematical computation, exemplified by weather forecasting. For very short-term forecasting (a few hours into the future), neural networks can provide accurate results at a computational cost much lower than that of mathematical models (which are typically based on partial differential equations). The goal is to extend the capabilities of neural network predictions so that they are accurate over longer timescales.
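As a rough illustration of the "network as proxy" idea, a small convolutional model can map a stack of recent observation frames directly to a near-future frame, sidestepping the numerical integration of a physics model. The architecture, grid size and variable names below are assumptions for illustration, not the system described in the talk.

```python
# A minimal sketch of a neural network used as a cheap surrogate for very
# short-term ("nowcasting") prediction: given a stack of recent precipitation
# frames, predict the next frame directly. Shapes and layers are illustrative.
import tensorflow as tf

HISTORY_FRAMES = 4      # number of past frames fed to the network
HEIGHT, WIDTH = 64, 64  # spatial grid size (toy resolution)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(HEIGHT, WIDTH, HISTORY_FRAMES)),
    tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"),
    # One output channel: the predicted field a short time ahead.
    tf.keras.layers.Conv2D(1, 3, padding="same"),
])

model.compile(optimizer="adam", loss="mse")

# Training would use pairs of (past frames, observed future frame), e.g.:
# model.fit(past_frames, future_frame, epochs=..., batch_size=...)
```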


Felix Creutzig, MCC Berlin
“Leveraging digitalization for urban solutions in the Anthropocene”

In his presentation Felix talked about different ways in which emissions could be lowered in urban areas and demonstrated how machine learning could help. The first of these possible solutions relates to urban layout. The buildings and infrastructure that are put in place now will have a dramatic impact on our future emissions and energy use. Machine learning and big data can be used in urban planning to help contribute to lower-carbon cities. Data on current energy and transport usage can be used to train models that identify patterns, and the findings can then be used to optimise future urban developments.
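The talk stays at the level of ideas, but one simple way such patterns can be surfaced is by clustering daily demand profiles so that planners can see a handful of typical usage curves. The sketch below uses plain k-means rather than a neural network, purely to illustrate the step from raw usage data to identified patterns; the data shapes and names are assumptions.

```python
# A minimal sketch of pattern discovery in urban energy data: cluster daily
# electricity-demand profiles into a few "typical" usage curves.
import numpy as np
from sklearn.cluster import KMeans

# daily_profiles: one row per building-day, one column per hour of the day.
rng = np.random.default_rng(0)
daily_profiles = rng.random((1000, 24))

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0)
labels = kmeans.fit_predict(daily_profiles)

# Each cluster centre is a typical 24-hour demand curve that could inform
# decisions about where and how to build lower-carbon infrastructure.
for i, centre in enumerate(kmeans.cluster_centers_):
    print(f"cluster {i}: peak hour {int(np.argmax(centre))}")
```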

Felix presented research showing that cities with higher fuel taxes have considerably lower greenhouse gas emissions. The implications of these higher prices reach further than a drop in car use. Without a high number of cars to be “stored”, less space is required in urban areas, the available land can be used more efficiently, and the energy required for new buildings falls.

There is much potential for AI to help understand our everyday energy consumption. Machine learning tools can build up a detailed picture of individuals’ energy usage throughout the day. This information can be fed back to consumers, enabling them to optimise their energy usage.

The talk concluded with calls for climate researchers, AI researchers and public policy makers to collaborate more widely.


Carla Gomes, Cornell University
“Computational Sustainability: Computing for a Better World and a Sustainable Future”

The field of computational sustainability is highly interdisciplinary and involves development of methods to tackle environmental challenges facing the planet. In her talk, Carla detailed three projects that are taking place at Cornell.

The first concerns the discovery of clean energy materials, primarily for use in fuel cells and as solar fuels. The team collaborated with materials scientists with the aim of finding new, stable materials. Experiments involved producing mixtures of three metals; the goal was to use x-ray diffraction (XRD) patterns of these new materials to determine their composition and crystal structure. Carla and her team developed a deep reasoning network method, which combines machine learning techniques and symbolic AI. This method enables much faster analysis of the XRD data than was previously possible, allowing many more metal mixtures to be analysed within a given time-frame.
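The deep reasoning network itself is beyond a short sketch, but the underlying decomposition problem can be illustrated with a much simpler classical baseline: treat each measured XRD pattern as a non-negative mixture of a few unknown basis patterns (candidate phases) and factorise. This is explicitly not the method from the talk; the data shapes and names below are illustrative assumptions.

```python
# A simple classical stand-in for the XRD phase-mapping problem:
# non-negative matrix factorisation of measured diffraction patterns.
import numpy as np
from sklearn.decomposition import NMF

# patterns: one row per sample composition, one column per diffraction angle.
rng = np.random.default_rng(0)
patterns = rng.random((300, 500))

nmf = NMF(n_components=4, init="nndsvd", max_iter=500)
weights = nmf.fit_transform(patterns)   # per-sample phase fractions
bases = nmf.components_                 # candidate phase patterns

print(weights.shape, bases.shape)       # (300, 4), (4, 500)
```

The deep reasoning network adds what a plain factorisation cannot: learned representations constrained by symbolic rules from crystallography.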

The second topic concerned biodiversity, where the fundamental question is: how are different species distributed across landscapes over time? An example given was tracking the progress of migratory birds. Much of the data were obtained from volunteers through a citizen science bird-watching project. These observations were combined with environmental data from remote sensors, with machine learning techniques used to build up a complete picture.

Finally, Carla talked about the socio-economic impacts of hydropower dam placement in the Amazon basin. Here, machine learning techniques are used to determine the optimum locations for new dams. From a computational perspective this is a multi-objective optimisation problem: the goal is to find an optimal balance between the hydropower energy gained and the ecological impact.
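The multi-objective framing can be made concrete with a small sketch: each candidate set of dams has an energy value and an ecological cost, and only the non-dominated (Pareto-optimal) options are worth presenting to decision makers. The numbers and function below are invented for illustration and have nothing to do with the actual Amazon-basin study.

```python
# A minimal sketch of the Pareto view of dam placement: keep only options
# that are not beaten on both objectives by some other option.
from typing import List, Tuple

def pareto_front(options: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Return options not dominated by any other (higher energy, lower impact)."""
    front = []
    for energy, impact in options:
        dominated = any(
            (e >= energy and i <= impact) and (e > energy or i < impact)
            for e, i in options)
        if not dominated:
            front.append((energy, impact))
    return front

# (energy gained in GWh, ecological impact score) for candidate dam portfolios
candidates = [(120, 0.8), (90, 0.3), (150, 0.9), (60, 0.1), (100, 0.35), (80, 0.5)]
print(pareto_front(candidates))   # (80, 0.5) is dominated and dropped
```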


Lester Mackey, Microsoft Research & Stanford University
“Improving Subseasonal Forecasting in the Western US”

Subseasonal forecasting covers the time period between a short-term weather forecast, which can give good predictions up to two weeks ahead, and seasonal forecasts, which provide long-term trends. Subseasonal forecasters are typically interested in the weather two-to-six weeks in advance. This advance climate information can be vital for allocating water resources, managing wildfires and preparing for weather extremes.

Numerical weather prediction models, based on physical laws, use partial differential equations and have become very accurate in the short term. However, the chaotic nature of weather systems means that a small change in initial conditions can lead to a large difference in a prediction two or so weeks later. The lack of accurate forecasting at these longer lead times led the United States Bureau of Reclamation to launch a competition (the “Subseasonal Climate Forecast Rodeo”) with the aim of yielding improved subseasonal forecast models.

In his talk Lester presented the machine learning forecasting models that his group developed for the competition. The competition required teams to submit, every two weeks, predictions of average temperature and precipitation for a number of grid locations across the western United States. In all cases Lester and his team improved on the existing numerical models. The team also released a new dataset (the SubseasonalRodeo Dataset) that the machine learning community can use to train and benchmark subseasonal forecasting systems in the future.
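For a sense of what a statistical subseasonal forecaster looks like in code, the sketch below regresses a temperature target two-to-four weeks ahead onto lagged climate features with ridge regression. This is a generic baseline, not the team's actual models, and the feature set and shapes are illustrative assumptions.

```python
# A minimal sketch of a statistical baseline for subseasonal prediction:
# linear regression of a future temperature anomaly on lagged features.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_features = 2000, 20   # e.g. lagged temperature, pressure, SST indices
X = rng.normal(size=(n_samples, n_features))
y = X @ rng.normal(size=n_features) + rng.normal(scale=0.5, size=n_samples)

# Keep the split chronological (shuffle=False) to mimic forecasting practice.
X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)

model = Ridge(alpha=1.0).fit(X_train, y_train)
print("held-out R^2:", model.score(X_test, y_test))
```

In practice, skill at these lead times comes from careful feature engineering and from combining several such learners, which is where datasets like SubseasonalRodeo are intended to help.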


Following the invited talks, contributed talks and a poster session, the workshop concluded with a panel discussion.

 

You can read more about tackling climate change with machine learning in one of our previous blog posts, by Jessica Montgomery, Senior Policy Adviser at The Royal Society.



Lucy Smith, Managing Editor for AIhub.