
New satellite mapping with AI can quickly pinpoint hurricane damage


20 October 2022



Satellite image of a damaged area.

By Zhe Zhu, University of Connecticut and Su Ye, University of Connecticut

Hurricane Ian left an extraordinarily broad path of destruction across much of South Florida. That was evident in reports from the ground, but it also shows up in satellite data. Using a new method, our team of spatial and environmental analysts was able to quickly provide a rare big picture view of damage across the entire state.

Satellite images and artificial intelligence reveal Hurricane Ian’s widespread damage across a large swath of Florida, from Charlotte Harbor to the Space Coast. The dark areas have a high probability of damage. Su Ye

By using satellite images from before the storm and real-time images from four satellite sensors, together with artificial intelligence, we created a disaster monitoring system that can map damage in 30-meter resolution and continuously update the data.

It’s a snapshot of what faster, more targeted disaster monitoring can look like in the future – and something that could eventually be deployed nationwide.

How artificial intelligence spots the damage

Satellites are already used to identify high-risk areas for floods, wildfires, landslides and other disasters, and to pinpoint the damage after these disasters. But most satellite-based disaster management approaches rely on visually assessing the latest images, one neighborhood at a time.

Our technique automatically compares pre-storm images with current satellite images to spot anomalies quickly over large areas. Those anomalies might be sand or water where that sand or water shouldn’t be, or heavily damaged roofs that don’t match their pre-storm appearance. Each area with a significant anomaly is flagged in yellow.

Damage detected in Matlacha, Florida. Su Ye.

Five days after Ian lashed Florida, the map showed yellow alert polygons all over South Florida. We found that it could spot patches of damage with about 84% accuracy.
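To make the comparison concrete, here is a minimal sketch of the pre/post anomaly-flagging idea in Python. The band layout, the use of a simple pre-storm average as the baseline, and the fixed change threshold are illustrative assumptions, not the authors’ published algorithm.

```python
# Minimal sketch of the pre-/post-storm comparison described above.
# The threshold and the simple pre-storm mean baseline are assumptions.
import numpy as np

def flag_anomalies(pre_stack, post_image, threshold=0.1):
    """Flag pixels whose post-storm reflectance departs strongly from the pre-storm baseline.

    pre_stack  : array (time, bands, height, width) of pre-storm surface reflectance
    post_image : array (bands, height, width) of post-storm surface reflectance
    threshold  : change magnitude above which a pixel is flagged (assumed value)
    """
    # Baseline: per-band mean reflectance over the pre-storm images
    baseline = pre_stack.mean(axis=0)

    # Per-pixel spectral change magnitude (Euclidean distance across bands)
    change = np.sqrt(((post_image - baseline) ** 2).sum(axis=0))

    # Boolean mask of "yellow alert" pixels
    return change > threshold
```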

A natural disaster like a hurricane or tornado often leaves behind large areas of spectral change at the surface, meaning changes in how light reflects off whatever is there, such as houses, ground or water. Our algorithm compares the reflectance predicted by models built from pre-storm images with the reflectance observed after the storm.

Damage detected in part of Punta Gorda, Florida. Su Ye.

The system spots both changes in the physical properties of natural areas, such as wetness or brightness, and the overall intensity of the change. An increase in brightness is often related to sand or bare land exposed by hurricane damage.
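As a rough illustration of the kind of indicators involved, the sketch below computes simple brightness and wetness proxies before and after a storm. The specific indices (a band average for brightness, a normalized-difference water index for wetness) and the band positions are assumptions made for demonstration; the actual system may use different spectral transforms.

```python
# Illustrative brightness and wetness proxies; not the authors' exact indices.
import numpy as np

def brightness(bands):
    """Mean reflectance across bands: rises when sand or bare ground is exposed."""
    return bands.mean(axis=0)

def wetness(green, nir):
    """Normalized-difference water index: rises where open water appears."""
    return (green - nir) / (green + nir + 1e-6)

def index_changes(pre_bands, post_bands, green_idx=1, nir_idx=3):
    """Per-pixel change in brightness and wetness between two dates.

    pre_bands, post_bands : arrays (bands, height, width); band order is assumed.
    """
    d_brightness = brightness(post_bands) - brightness(pre_bands)
    d_wetness = (wetness(post_bands[green_idx], post_bands[nir_idx])
                 - wetness(pre_bands[green_idx], pre_bands[nir_idx]))
    return d_brightness, d_wetness
```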

A machine-learning model uses those images to predict disturbance probabilities, which measure the impact of the natural disaster on the land surface. This approach allows us to automate disaster mapping and provide full coverage of an entire state as soon as the satellite data is released.
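A hedged sketch of that last step might look like the following: per-pixel change features are fed to an off-the-shelf classifier, and the predicted class probability is read as the disturbance probability. The feature set, the choice of a random forest, and the source of the training labels are all assumptions for illustration, not the authors’ exact model.

```python
# Sketch of predicting per-pixel disturbance probabilities with a generic classifier.
# Features, labels and model choice are assumptions made for illustration.
from sklearn.ensemble import RandomForestClassifier

def train_disturbance_model(features, labels):
    """Fit a classifier on per-pixel change features.

    features : array (n_pixels, n_features), e.g. change magnitude, brightness and wetness changes
    labels   : array (n_pixels,), 1 = disturbed in a reference map, 0 = undisturbed
    """
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(features, labels)
    return model

def disturbance_probability_map(model, features, height, width):
    """Probability of the 'disturbed' class for every pixel, reshaped into a map."""
    proba = model.predict_proba(features)[:, 1]
    return proba.reshape(height, width)
```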

The system uses data from four satellites: Landsat 8 and Landsat 9, both operated by NASA and the U.S. Geological Survey, and Sentinel-2A and Sentinel-2B, launched as part of the European Commission’s Copernicus program.

Real-time monitoring, nationwide

Extreme storms with destructive flooding have been documented with increasing frequency over large parts of the globe in recent years.

While disaster response teams can rely on airplane surveillance and drones to pinpoint damage in small areas, it’s much harder to see the big picture in widespread disasters like hurricanes and other tropical cyclones, and time is of the essence. Our system provides a fast way, using free government-produced images, to see that big picture. One current drawback is the timing of those images, which often aren’t released publicly until a few days after the disaster.

We are now working on developing near real-time monitoring of the whole conterminous United States to quickly provide the most up-to-date land information for the next natural disaster.

Zhe Zhu, Assistant Professor of Natural Resources and the Environment, University of Connecticut and Su Ye, Postdoctoral researcher in environment and remote sensing, University of Connecticut

This article is republished from The Conversation under a Creative Commons license. Read the original article.




The Conversation is an independent source of news and views, sourced from the academic and research community and delivered direct to the public.



