A multimodal data fusion and deep learning framework for large-scale wildfire surface fuel mapping

Accurate estimation of fuels is essential for wildland fire simulations as well as decision-making related to land management. Numerous research efforts have leveraged remote sensing and machine learning for classifying land cover and mapping forest vegetation species. In most cases that focused on surface fuel mapping, the spatial scale of interest was smaller than a few hundred square kilometers; thus, many small-scale, site-specific models had to be created to cover the landscape at the national scale. The present work aims to develop a large-scale surface fuel identification model using a custom deep learning framework that can ingest multimodal data. Specifically, we use deep learning to extract information from multispectral signatures, high-resolution imagery, and biophysical climate and terrain data in a way that facilitates their end-to-end training on labeled data. A multi-layer neural network is used with spectral and biophysical data, and a convolutional neural network backbone is used to extract the visual features from high-resolution imagery. A Monte Carlo dropout mechanism was also devised to create a stochastic ensemble of models that can capture classification uncertainties while boosting the prediction performance. To train the system as a proof-of-concept, fuel pseudo-labels were created by a random geospatial sampling of existing fuel maps across California. Application results on independent test sets showed promising fuel identification performance, with an overall accuracy ranging from 55% to 75% depending on the level of granularity of the included fuel types. As expected, including the rare (and possibly less consequential) fuel types reduced the accuracy. On the other hand, the addition of high-resolution imagery improved classification performance at all levels.
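The Monte Carlo dropout ensemble mentioned in the abstract can be illustrated with a minimal sketch: dropout is left active at inference time, and averaging the class probabilities over many stochastic forward passes yields both a prediction and a per-class uncertainty estimate. This is an illustrative NumPy toy, not the authors' implementation; the network, feature count, and class count below are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class DropoutMLP:
    """Tiny MLP classifier whose dropout layer stays active at
    inference, which is the core of Monte Carlo dropout."""
    def __init__(self, n_in, n_hidden, n_classes, p_drop=0.5):
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_classes))
        self.p_drop = p_drop

    def forward(self, x):
        h = np.maximum(x @ self.W1, 0.0)            # ReLU hidden layer
        mask = rng.random(h.shape) > self.p_drop    # dropout NOT disabled
        h = h * mask / (1.0 - self.p_drop)          # inverted-dropout scaling
        return softmax(h @ self.W2)

def mc_predict(model, x, n_samples=50):
    """Average class probabilities over stochastic forward passes;
    the spread across passes serves as an uncertainty estimate."""
    probs = np.stack([model.forward(x) for _ in range(n_samples)])
    return probs.mean(axis=0), probs.std(axis=0)

model = DropoutMLP(n_in=12, n_hidden=32, n_classes=5)
x = rng.normal(size=(3, 12))          # 3 pixels, 12 fused input features
mean_p, std_p = mc_predict(model, x)
fuel_class = mean_p.argmax(axis=1)    # predicted fuel type per pixel
```

In the paper's framework the forward pass would instead fuse CNN image features with the spectral and biophysical inputs; the averaging-and-spread logic of `mc_predict` is the same regardless of the underlying network.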

Resource Type publication
Temporal Range Begin N/A
Temporal Range End N/A
Temporal Resolution N/A
Bounding Box North Lat N/A
Bounding Box South Lat N/A
Bounding Box West Long N/A
Bounding Box East Long N/A
Spatial Representation N/A
Spatial Resolution N/A
Related Links N/A
Additional Information N/A
Resource Format PDF
Standardized Resource Format PDF
Asset Size N/A
Legal Constraints Copyright author(s). This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Access Constraints None
Software Implementation Language N/A

Resource Support Name N/A
Resource Support Email opensky@ucar.edu
Resource Support Organization UCAR/NCAR - Library
Distributor N/A
Metadata Contact Name N/A
Metadata Contact Email opensky@ucar.edu
Metadata Contact Organization UCAR/NCAR - Library

Author Alipour, Mohamad
La Puma, Inga
Picotte, Joshua
Shamsaei, Kasra
Rowell, Eric
Watts, Adam
Kosovic, Branko
Ebrahimian, Hamed
Taciroglu, Ertugrul
Publisher UCAR/NCAR - Library
Publication Date 2023-02-01T00:00:00
Digital Object Identifier (DOI) Not Assigned
Alternate Identifier N/A
Resource Version N/A
Topic Category geoscientificInformation
Progress N/A
Metadata Date 2023-08-18T18:42:46.614973
Metadata Record Identifier edu.ucar.opensky::articles:26188
Metadata Language eng; USA
Suggested Citation Alipour, Mohamad, La Puma, Inga, Picotte, Joshua, Shamsaei, Kasra, Rowell, Eric, Watts, Adam, Kosovic, Branko, Ebrahimian, Hamed, Taciroglu, Ertugrul. (2023). A multimodal data fusion and deep learning framework for large-scale wildfire surface fuel mapping. UCAR/NCAR - Library. http://n2t.net/ark:/85065/d71j9fq5. Accessed 04 April 2025.

Harvest Source N/A