Expert Commentary

Long-term perspective on wildfires in the western United States

A 2012 study in PNAS examines the frequency of wildfires in the American West over the past 3,000 years and the implications for larger wildfires.

Forest fires (iStock)

As Anglo-American settlers moved across North America, they had a significant impact on the land, clearing trees, expanding agriculture and building towns. As settlement expanded, forest fires, once an integral part of the natural world, were systematically suppressed. That practice had consequences: when fires did break out, they tended to be larger and more severe.

In recent decades, Western wildfires have become larger and more severe, and scientists note that this is often related to accumulated materials, known as fuel loads, that historically would have been consumed in smaller blazes. The combination of human suppression and a changing climate could make for an increasingly volatile mix in the coming years, and debate continues over best practices for forest management. The cost of fighting fires is high: the federal government spent more than $1.5 billion in 2014 alone, according to the National Interagency Fire Center. The U.S. Forest Service released a report in August 2015 predicting that its costs will rise sharply through 2025.

To better understand long-term variations in wildfires in the American West, a group of researchers looked at 3,000 years of sedimentary charcoal accumulation rates, obtained from the Global Palaeofire Working Group at the University of Bristol, U.K., and matched them to fire scars and historical accounts of blazes. The results of the 2012 study, “Long-term Perspective on Wildfires in the Western USA,” were published in the journal Proceedings of the National Academy of Sciences.

The area studied includes the U.S. states of Washington, Oregon, California, Idaho, Nevada, Montana, Wyoming, Colorado, New Mexico and Arizona.

Key findings include:

  • Burning in the American West declined slightly over the past 3,000 years, with low levels between 1400 and 1700 CE and throughout the 20th century. Peaks in burning occurred between 950 and 1250 CE and then again during the 1800s CE.
  • Fire activity was historically highest around 1000, 1400 and 1800 CE. The rise in fires around 1000 CE occurred when temperatures were high and drought was widespread. Another increase in fires occurred around 1400 CE, when drought conditions intensified rapidly.
  • Humans began to have a significant impact on fires in the 1800s. During the expansion of Anglo-American settlement in the 19th century, evidence of burning increased. In the 20th century the trend reversed, due in part to nationwide changes in fire-suppression practices. Since then, “fire activity has strongly diverged from the trend predicted by climate alone and current levels of fire activity are clearly out of equilibrium with contemporary climate conditions.”
  • Burning is currently at its lowest level in the 3,000-year record. The previous minimum occurred between 1440 and 1700 CE, driven by a decrease in drought and a drop in temperatures, which reached a 1,500-year low. Before the current era, fire activity hit its lowest point around 1500 CE, coinciding with the collapse of several Native American populations.

The authors of the paper conclude: “The divergence in fire and climate since the mid 1800s CE has created a fire deficit in the West that is jointly attributable to human activities and climate change and unsustainable given the current trajectory of climate change.”

Related research: An August 2015 study in Ecological Economics examines the economic impacts of wildland fires as a result of climate change. A 2015 study published in the International Journal of Wildland Fire explores the likelihood that climate change will increase the potential for very large fires (VLFs) in the U.S.

 
