New Publication on Data-Driven Wildfire Spread Modeling of European Wildfires

July 8, 2024

Moritz Rösch, together with colleagues from the German Aerospace Center (DLR), has published "Data-Driven Wildfire Spread Modeling of European Wildfires Using a Spatiotemporal Graph Neural Network" in the journal Fire.

From the abstract: Wildfire spread models are an essential tool for mitigating catastrophic effects associated with wildfires. However, current operational models suffer from significant limitations regarding accuracy and transferability. Recent advances in the availability and capability of Earth observation data and artificial intelligence offer new perspectives for data-driven modeling approaches with the potential to overcome the existing limitations. Therefore, this study developed a data-driven Deep Learning wildfire spread modeling approach based on a comprehensive dataset of European wildfires and a Spatiotemporal Graph Neural Network, which was applied to this modeling problem for the first time. A country-scale model was developed on individual wildfire time series in Portugal, while a second, continental-scale model was developed with wildfires from the entire Mediterranean region. While neither model was able to predict the daily spread of European wildfires with sufficient accuracy (weighted macro-mean IoU: Portugal model 0.37; Mediterranean model 0.36), the continental model was able to learn the generalized patterns of wildfire spread, achieving similar performance in various fire-prone Mediterranean countries, indicating an increased capacity in terms of transferability. Furthermore, we found that the spatial and temporal dimensions of wildfires significantly influence model performance. Inadequate reference data quality most likely contributed to the low overall performance, highlighting the current limitations of data-driven wildfire spread models.
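For readers unfamiliar with the evaluation metric quoted in the abstract, the following is a minimal pure-Python sketch of how a weighted macro-mean IoU can be computed on flattened label maps. The function name, the flattened-list input format, and the class weights are illustrative assumptions, not the paper's actual implementation.

```python
def weighted_macro_mean_iou(pred, target, class_weights):
    """Illustrative sketch: per-class IoU (Jaccard index), averaged with
    per-class weights. `pred` and `target` are flattened label lists;
    `class_weights` maps class id -> weight (an assumed input format)."""
    ious = {}
    for c in class_weights:
        p = {i for i, v in enumerate(pred) if v == c}    # pixels predicted as c
        t = {i for i, v in enumerate(target) if v == c}  # pixels labeled c
        union = p | t
        # Convention chosen here: a class absent from both maps scores 1.0.
        ious[c] = len(p & t) / len(union) if union else 1.0
    total = sum(class_weights.values())
    return sum(class_weights[c] * ious[c] for c in class_weights) / total

# Toy example with two classes and equal weights:
# class 0: IoU = 1/2, class 1: IoU = 2/3, weighted mean = 7/12 ≈ 0.583
score = weighted_macro_mean_iou([0, 1, 1, 1], [0, 1, 0, 1], {0: 0.5, 1: 0.5})
print(round(score, 3))
```

In this macro-style formulation each class contributes its own IoU regardless of pixel count, and the weights control how much each class influences the final score.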

You may also like:

Our research site and project covered by BR

The University forest at Sailershausen is a unique forest owned by the University of Wuerzburg. It hosts a high diversity of trees and, most importantly, is part of various research projects. We conducted various UAS/UAV/drone flights with Lidar, multispectral and...

Meeting of the FluBig Project Team

During the last two days, the team of the FluBig project (remote-sensing.org/new-dfg-project-on-fluvial-research/) met at the EORC to discuss the ongoing work on fluvial biogeomorphology. After returning from a successful field expedition to Kyrgyzstan a couple of...

‘Super Test Site Würzburg’ project meeting

After the successful "Super Test Site Würzburg" measurement campaign in June (please see here: https://remote-sensing.org/super-test-site-wurzburg-from-the-idea-to-realization/), the core team from the University of Würzburg, the Karlsruhe Institute of Technology,...

EORC Talk: Geolingual Studies: A New Research Direction

On July 19th, Lisa Lehnen and Richard Lemoine Rodríguez, two postdoctoral researchers of the Geolingual Studies project, gave an inspiring presentation at the EORC talk series. In the talk titled "Geolingual Studies – a new research direction", they...

EO support for UrbanPArt field work

From May to September, Karla Wenner, a PhD student at the Juniorprofessorship for Applied Biodiversity Science, will be sampling urban green spaces and semi-natural grasslands in Würzburg as part of the UrbanPArt project. Our cargo bikes support the research project...

Cinematic drone shots

We spend quite some time conducting field work, from lidar measurements to vegetation samples, in order to correlate the data with remote sensing observations and answer various research questions concerning global change. Field work is always a 24/7 work load and...