Wildfire detection technology hasn’t changed much in 200 years: fires are found only when people see them. Spotters sit in towers watching for smoke and field reports of fires sighted by the public. Forest rangers fly over reported lightning-strike locations looking for flames. But by the time those signs are visible, it’s often too late to contain a blaze.
With several record wildfire seasons already in the books, and climate change threatening to increase and intensify wildfires, researchers at the University of California, Berkeley are proposing to monitor the most fire-prone regions of the U.S. with a highly sensitive satellite. The team published their plan in the October issue of the journal Remote Sensing.
The proposed system could detect fires as small as 12 square meters and send out an alert at any sign of a growing blaze. Early detection would let authorities make decisions faster and better prepare for evacuations, explained Scott Stephens, a UC Berkeley associate professor of environmental science. “Wildfires would be smaller in scale if you could detect them before they got too big, like less than an acre.”
Launching one geosynchronous satellite to monitor the western United States would cost a few hundred million dollars, the report estimates. That is a fraction of the record $2 billion spent fighting fires in 2012.
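To get a feel for the numbers above, here is a rough back-of-envelope sketch (not from the report itself) of the angular scale a 12-square-meter fire presents to a satellite at geosynchronous altitude. The altitude constant is the standard value for geosynchronous orbit; the small-angle calculation is an illustration, and in practice such systems generally detect a fire’s thermal signature within a much larger image pixel rather than spatially resolving the flames.

```python
import math

GEO_ALTITUDE_M = 35_786_000  # approximate geosynchronous orbit altitude, meters
FIRE_AREA_M2 = 12            # smallest detectable fire cited in the article

# Side length of a square fire with that area
fire_side_m = math.sqrt(FIRE_AREA_M2)

# Angle the fire subtends at the satellite (small-angle approximation)
angle_rad = fire_side_m / GEO_ALTITUDE_M
angle_arcsec = math.degrees(angle_rad) * 3600

print(f"fire side length: {fire_side_m:.2f} m")
print(f"subtended angle:  {angle_arcsec:.3f} arcsec")
```

The fire’s roughly 3.5-meter side subtends only about 0.02 arcseconds from that distance, which is why early detection from geosynchronous orbit hinges on sensitive infrared instruments picking out a sub-pixel hot spot rather than on sharp imagery.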