Understanding the causes and consequences of wildfires in forests of the western United States requires integrated information about fire, climate change, and human activity on multiple temporal scales. We use sedimentary charcoal accumulation rates to reconstruct long-term variations in fire during the past 3,000 y in the American West and compare this record to independent fire-history data from historical records and fire scars. There has been a slight decline in burning over the past 3,000 y, with the lowest levels attained during the 20th century and during the Little Ice Age (LIA, ca. 1400–1700 CE [Common Era]). Prominent peaks in forest fires occurred during the Medieval Climate Anomaly (ca. 950–1250 CE) and during the 1800s. Analysis of climate reconstructions beginning in 500 CE and population data shows that temperature and drought predict changes in biomass burning up to the late 1800s. Since the late 1800s, human activities and the ecological effects of recent high fire activity caused a large, abrupt decline in burning similar to the LIA fire decline. Consequently, there is now a forest “fire deficit” in the western United States attributable to the combined effects of human activities, ecological change, and climate change. Large fires in the late 20th and 21st centuries have begun to address the fire deficit, but it is continuing to grow.
Marlon JR, Bartlein PJ, Gavin DG, Long CJ, Anderson SR, Briles CE, Brown KJ, Colombaroli D, Hallett DJ, Power MJ, et al. Long-term perspective on wildfires in the western USA. Proc Natl Acad Sci USA. 2012;109(9):E535–E543. Available from: http://www.pnas.org/content/109/9/E535
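The analytical chain summarized in the abstract, standardizing many site-level charcoal records into a regional biomass-burning composite and then relating that composite to temperature and drought reconstructions, can be illustrated with a minimal sketch. This is not the authors' code: the data are synthetic, and the log transform, smoothing window, and least-squares regression form are placeholder assumptions standing in for the paper's actual methods.

```python
# Hypothetical sketch (not the authors' code): composite a set of charcoal
# accumulation-rate (CHAR) records into a regional burning index, then regress
# that index on temperature and drought reconstructions. All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 50 "sites" x 3,000 annual charcoal accumulation rates.
years = np.arange(-1000, 2000)            # 1000 BCE to 1999 CE
char = rng.lognormal(mean=0.0, sigma=1.0, size=(50, years.size))

# 1. Transform and standardize each site record (log transform, then z-scores)
#    so sites with different sediment and charcoal characteristics are comparable.
log_char = np.log(char + 1e-6)
z = (log_char - log_char.mean(axis=1, keepdims=True)) / log_char.std(axis=1, keepdims=True)

# 2. Composite: mean z-score across sites, smoothed with a simple moving average
#    (a stand-in for the locally weighted smoothing used in charcoal syntheses).
composite = z.mean(axis=0)
window = 50
burning_index = np.convolve(composite, np.ones(window) / window, mode="same")

# 3. Regress the burning index on synthetic temperature and drought series over
#    500-1900 CE to estimate how much variance the climate predictors explain.
mask = (years >= 500) & (years <= 1900)
temp = rng.normal(size=years.size).cumsum() * 0.01     # synthetic temperature anomaly
drought = rng.normal(size=years.size).cumsum() * 0.01  # synthetic drought index
X = np.column_stack([np.ones(mask.sum()), temp[mask], drought[mask]])
y = burning_index[mask]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"intercept={coef[0]:.3f}, temp={coef[1]:.3f}, drought={coef[2]:.3f}, R^2={r2:.3f}")
```

With real charcoal, temperature, and drought reconstructions in place of the synthetic arrays, the residuals from such a regression after the late 1800s are what would reveal the "fire deficit" described above: observed burning falling below the level the climate predictors alone would indicate.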