University of California
Central Sierra

Posts Tagged: climate change

Study: Wildfires, Climate Change Could Make Sierra a Polluter

Reposted from UC Merced News


Yosemite Valley in the western Sierra Nevada Mountains.

What if nature were to become a polluter, discharging millions of tons of planet-warming carbon into the atmosphere in much the same way as diesel-fueled trucks or coal-fired power plants?

This nature-as-polluter scenario might seem far-fetched, but it's well on its way to becoming reality, according to a recent study co-authored by UC Merced Professor LeRoy Westerling.

In a paper published recently in Scientific Reports — “Potential decline in carbon carrying capacity under projected climate-wildfire interactions in the Sierra Nevada” — Westerling and collaborators from the University of New Mexico and Penn State University used three climate models and data from the Intergovernmental Panel on Climate Change to examine how rising global temperatures and increasingly severe wildfires will affect Sierra Nevada forests.

Their conclusion: Changing conditions will turn today's Sierra Nevada forests into tomorrow's greenhouse gas emitters.

“Forests play an important part in regulating the levels of atmospheric carbon,” Westerling explained. “Forests are carbon sinks, essentially giant stockpiles of carbon. Forests are also active carbon consumers. They remove carbon dioxide from the air and convert it into biomass. This traps the carbon, which is no longer free to act as a greenhouse gas in Earth's atmosphere.”

Professor LeRoy Westerling

But projections from Westerling and colleagues suggest that this may change. According to their models, Sierra Nevada forests will experience both a dramatic loss of stored carbon and a substantial decline in their ability to remove CO2 from the atmosphere.

Rising temperatures are creating a warmer, drier Sierra Nevada climate. Westerling previously showed that these changes are leading to dramatic increases in the frequency, size and duration of wildfires. The new study suggests that these same changes will make it harder for forests to regenerate, leading to a loss of forest density, with plants better suited to the new climate eventually replacing trees.

“As trees are displaced, the Sierra Nevada will lose its ability to sequester carbon,” Westerling explained. “The plants that spring up in their place will be significantly smaller, making them less effective carbon sinks than the trees they replaced.”

But the carbon stored in forest trees has to go somewhere.

As trees are burned in more frequent wildfires, and as dead trees undergo decomposition, Westerling and his colleagues predict that as much as 73 percent of the carbon in Sierra Nevada forests will be released, resulting in a dramatic spike in atmospheric carbon. This will transform the Sierra Nevada from a carbon sink into a carbon emitter, making the nature-as-polluter scenario a reality.

Westerling and his collaborators note that their predictions are actually conservative. The effects might be more extreme than their models suggest.

“Our study does not account for a number of factors that might influence the dynamics of forest carbon,” Westerling said. “However, the factors we ignored are likely to accelerate the loss of forest. Our predictions likely underestimate the severity of actual effects.”

Though the predictions are alarming, the authors remain optimistic, hopeful that their findings can contribute to a larger conversation about environmental policy and promote avenues of research that lead to sustainable forest management.

Posted on Thursday, June 15, 2017 at 2:56 PM
  • Author: Jason Alvarez. University Communications

UC helps forest owners adapt to climate change

Reposted from the UCANR Green Blog

To help California forest property owners adapt to the changing climate, UC Agriculture and Natural Resources (UC ANR) has produced a 13-page peer-reviewed paper that outlines actions owners can take to sustain their forests' value even when temperatures rise.

“Managers of forest land have always had to adapt to changing conditions – such as markets, urban encroachment, droughts and floods,” said Susie Kocher, UC Cooperative Extension forestry and natural resources advisor. “We wrote this paper to help forest managers better understand the evolving science of climate change and how they can help their forests adapt to the climate of the future.”

Thinning trees in a mixed conifer forest is one way to improve its resiliency to climate change. (Photo: Will Suckow)
 

Forests are shaped by the climates in which they grow. The current rapid pace of climate change has not happened for thousands of years, according to climate scientists. Nevertheless, the authors assure forest landowners that there are land management decisions they can make to ensure the resiliency of their resources, and perhaps even improve them.

“Some trees may grow faster under the warmer conditions we experience with climate change,” Kocher said, “especially those at highest elevation where there is adequate precipitation.”

The paper details the solid scientific evidence for the rise in global average temperatures over the past 100 years. Temperatures, it says, “will likely continue to rise in the future, with impacts on natural and human systems.”

The document provides specific recommendations for care of three common types of forest in California: mixed conifer, oak woodland and coastal redwood forests.

Oak woodland forests have moderate to high capacity to adapt to climate change. (Photo: Stephanie Drill)
 

Mixed conifer forests – typically composed of white fir, sugar pine, ponderosa pine, incense cedar and California black oak – are susceptible to moisture stress caused by warmer temperatures and reduced snow and rain. The drier conditions make the trees more vulnerable to fire and insect attack.

The drought of 2010-2016 has already had a substantial impact on mixed conifer forests in the Sierra Nevada. Aerial detection surveys show that more than 102 million trees have died since 2010; more than 62 million died in 2016 alone.

The UC ANR climate change adaptation paper suggests reducing competition for water by thinning trees and managing for species and structural diversity. The authors suggest property owners consider the source of seedlings when planting new trees.

“Select seedlings adapted to a slightly lower elevation or latitude than your property,” Kocher said. “These would be more likely to thrive under the 3- to 5-degree warmer temperatures we expect in 50 years or so.”
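As a rough back-of-the-envelope illustration of Kocher's seedling-sourcing advice (not a calculation from the paper itself), a typical environmental lapse rate can translate the projected warming into an equivalent downslope elevation shift. The lapse-rate value and the assumption that the quoted degrees are Fahrenheit are mine, not the authors':

```python
# Back-of-the-envelope translation of projected warming into an
# equivalent downslope elevation shift. Assumptions (mine, not the
# paper's): the 3-5 degrees are Fahrenheit, and temperature falls
# with elevation at a typical lapse rate of ~6.5 C per 1,000 m.

LAPSE_RATE_C_PER_M = 6.5 / 1000  # assumed average lapse rate

def equivalent_elevation_shift_m(warming_f: float) -> float:
    """Elevation drop whose present climate matches the projected warming."""
    warming_c = warming_f * 5.0 / 9.0
    return warming_c / LAPSE_RATE_C_PER_M

for degrees_f in (3.0, 5.0):
    shift = equivalent_elevation_shift_m(degrees_f)
    print(f"{degrees_f:.0f} F warming ~ seedlings from ~{shift:.0f} m lower")
```

Under these assumptions, 3 to 5 degrees of warming corresponds to seedlings sourced from roughly 250 to 430 meters lower in elevation, which is consistent with the advice to look slightly downslope or down-latitude.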

Oak woodlands are widely distributed and diverse in California, which gives them moderate to high capacity to adapt to climate change. Mature oaks are more resilient than young trees and seedlings.

One potential impact of climate change on oak woodlands is increasing precipitation variability and increasing spring rains. The moisture change could increase the spread and prevalence of Sudden Oak Death (SOD), a disease caused by a water mold (Phytophthora ramorum) that was introduced into California from outside the U.S. SOD is primarily a concern in areas with tanoaks in Central to Northern California coastal areas.

“To reduce the spread of sudden oak death, land owners should prevent the movement of infected leaves, wood and soil,” according to the paper.

Fog frequency in California's redwood forests is significantly lower now than 100 years ago. (Photo: Jason Sturner, Wikimedia Commons.)
 

The primary concern for coastal redwood forests is the decline in fog. Fog frequency in coastal redwoods is 33 percent lower now compared to the early 20th Century. Less fog and rain plus warmer temperatures would leave coastal areas where redwoods typically thrive drier. But that doesn't mean redwoods will disappear. Areas with deep soil and areas close to streams and rivers may provide refuge for redwood forests.

The new publication, Adapting Forests to Climate Change, can be downloaded free from the UC ANR Catalog. It is the 25th in the Forest Stewardship series, developed to help forest landowners in California learn how to manage their land. It was written by Adrienne Marshall, a doctoral student at the University of Idaho; Susie Kocher, UC Cooperative Extension forestry and natural resources advisor; Amber Kerr, postdoctoral scholar with the UC John Muir Institute of the Environment; and Peter Stine, U.S. Forest Service.

Posted on Wednesday, April 5, 2017 at 9:11 PM
  • Author: Jeannette Warnert

Global warming hiatus disproved — again

Reposted from UC Berkeley News 

A controversial paper published two years ago that concluded there was no detectable slowdown in ocean warming over the previous 15 years — widely known as the “global warming hiatus” — has now been confirmed using independent data in research led by scientists at UC Berkeley and Berkeley Earth, a non-profit research institute focused on climate change.

an Argo float

A NEMO float, part of the global Argo array of ocean sensing stations, deployed in the Arctic from the German icebreaker Polarstern Bremerhaven. (Photo courtesy of Argo)

The 2015 analysis showed that the modern buoys now used to measure ocean temperatures tend to report slightly cooler temperatures than older ship-based systems, even when measuring the same part of the ocean at the same time. As buoy measurements have replaced ship measurements, this had hidden some of the real-world warming.

After correcting for this “cold bias,” researchers with the National Oceanic and Atmospheric Administration concluded in the journal Science that the oceans have actually warmed 0.12 degrees Celsius (0.22 degrees Fahrenheit) per decade since 2000, nearly twice as fast as earlier estimates of 0.07 degrees Celsius per decade. This brought the rate of ocean temperature rise in line with estimates for the previous 30 years, between 1970 and 1999.
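To make the quoted rates concrete, here is a short sketch. The two rates come from the article; the 15-year window is my choice for illustration (roughly 2000-2015):

```python
# Compare the earlier and corrected ocean-warming rates quoted in the
# article. The rates come from the text; the 15-year window is an
# illustrative assumption (roughly 2000-2015).

OLD_RATE_C_PER_DECADE = 0.07   # earlier estimate
NEW_RATE_C_PER_DECADE = 0.12   # NOAA's bias-corrected estimate

def total_warming_c(rate_per_decade: float, years: float) -> float:
    """Cumulative warming implied by a constant decadal rate."""
    return rate_per_decade * years / 10.0

years = 15
print(f"Earlier estimate:   {total_warming_c(OLD_RATE_C_PER_DECADE, years):.3f} C")
print(f"Corrected estimate: {total_warming_c(NEW_RATE_C_PER_DECADE, years):.3f} C")
print(f"Ratio: {NEW_RATE_C_PER_DECADE / OLD_RATE_C_PER_DECADE:.2f}x")
```

The ratio of the two rates is about 1.7, which matches the article's "nearly twice as fast" characterization.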

This eliminated much of the global warming hiatus, an apparent slowdown in rising surface temperatures between 1998 and 2012. Many scientists, as well as the Intergovernmental Panel on Climate Change, acknowledged the puzzling hiatus, while those dubious about global warming pointed to it as evidence that climate change is a hoax.

Climate change skeptics attacked the NOAA researchers and a House of Representatives committee subpoenaed the scientists' emails. NOAA agreed to provide data and respond to any scientific questions but refused to comply with the subpoena, a decision supported by scientists who feared the “chilling effect” of political inquisitions.

The new study, which uses independent data from satellites and robotic floats as well as buoys, concludes that the NOAA results were correct. The paper will be published Jan. 4 in the online, open-access journal Science Advances.

“Our results mean that essentially NOAA got it right, that they were not cooking the books,” said lead author Zeke Hausfather, a graduate student in UC Berkeley's Energy and Resources Group.

Long-term climate records

Hausfather said that years ago, mariners measured the ocean temperature by scooping up a bucket of water from the ocean and sticking a thermometer in it. In the 1950s, however, ships began to automatically measure water piped through the engine room, which typically is warm. Nowadays, buoys cover much of the ocean and that data is beginning to supplant ship data. But the buoys report slightly cooler temperatures because they measure water directly from the ocean instead of after a trip through a warm engine room.

rising ocean temperatures

A new UC Berkeley analysis of ocean buoy (green) and satellite data (orange) shows that ocean temperatures have increased steadily since 1999, as NOAA concluded in 2015 (red) after adjusting for a cold bias in buoy temperature measurements. NOAA's earlier assessment (blue) underestimated sea surface temperature changes, falsely suggesting a hiatus in global warming. The lines show the general upward trend in ocean temperatures. (Zeke Hausfather graphic)

NOAA is one of three organizations that keep historical records of ocean temperatures – some going back to the 1850s – widely used by climate modelers. The agency's paper was an attempt to accurately combine the old ship measurements and the newer buoy data.

Hausfather and colleague Kevin Cowtan of the University of York in the UK extended that study to include the newer satellite and Argo float data in addition to the buoy data.

“Only a small fraction of the ocean measurement data is being used by climate monitoring groups, and they are trying to smush together data from different instruments, which leads to a lot of judgment calls about how you weight one versus the other, and how you adjust for the transition from one to another,” Hausfather said. “So we said, ‘What if we create a temperature record just from the buoys, or just from the satellites, or just from the Argo floats, so there is no mixing and matching of instruments?'”

In each case, using data from only one instrument type – either satellites, buoys or Argo floats – the results matched those of the NOAA group, supporting the case that the oceans warmed 0.12 degrees Celsius per decade over the past two decades, nearly twice the previous estimate. In other words, the upward trend seen in the last half of the 20th century continued through the first 15 years of the 21st: there was no hiatus.

“In the grand scheme of things, the main implication of our study is on the hiatus, which many people have focused on, claiming that global warming has slowed greatly or even stopped,” Hausfather said. “Based on our analysis, a good portion of that apparent slowdown in warming was due to biases in the ship records.”

Correcting other biases in ship records

In the same publication last year, NOAA scientists also accounted for changing shipping routes and measurement techniques. Their correction – giving greater weight to buoy measurements than to ship measurements in warming calculations – is also valid, Hausfather said, and a good way to correct for this second bias, short of throwing out the ship data altogether and relying only on buoys.
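The kind of correction described above, removing a ship-minus-buoy offset and weighting buoy readings more heavily, can be sketched as follows. The offset and weight constants here are placeholders I chose for illustration; they are not NOAA's actual published parameters:

```python
# Illustrative blend of ship and buoy sea-surface readings: subtract an
# assumed warm bias from the ship data, then weight the buoy reading
# more heavily. Both constants are placeholders, not NOAA's values.

SHIP_WARM_BIAS_C = 0.1   # assumed ship-minus-buoy offset
BUOY_WEIGHT = 0.75       # assumed weight on buoy readings

def blended_sst_c(ship_c: float, buoy_c: float) -> float:
    """Bias-correct the ship reading, then take a weighted average."""
    corrected_ship = ship_c - SHIP_WARM_BIAS_C
    return BUOY_WEIGHT * buoy_c + (1.0 - BUOY_WEIGHT) * corrected_ship

# Example: a ship reads 15.2 C and a nearby buoy reads 15.05 C.
print(f"Blended estimate: {blended_sst_c(15.2, 15.05):.3f} C")
```

The design point is that once the fixed offset is removed, down-weighting the noisier instrument (ships, in this case) changes the blended trend less than mixing the two sources with equal weight would.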

hadley data

Berkeley's analysis of ocean buoy (green) and satellite data (orange) and NOAA's 2015 adjustment (red) are compared to the Hadley data (purple), which have not been adjusted to account for some sources of cold bias. The Hadley data still underestimate sea surface temperature changes. (Zeke Hausfather graphic)

Another repository of ocean temperature data, the Hadley Climatic Research Unit in the United Kingdom, corrected their data for the switch from ships to buoys, but not for this second factor, which means that the Hadley data produce a slightly lower rate of warming than do the NOAA data or the new UC Berkeley study.

“In the last seven years or so, you have buoys warming faster than ships are, independently of the ship offset, which produces a significant cool bias in the Hadley record,” Hausfather said. The new study, he said, argues that the Hadley center should introduce another correction to its data.

“People don't get much credit for doing studies that replicate or independently validate other people's work. But, particularly when things become so political, we feel it is really important to show that, if you look at all these other records, it seems these researchers did a good job with their corrections,” Hausfather said.

Co-author Mark Richardson of NASA's Jet Propulsion Laboratory and the California Institute of Technology in Pasadena added, “Satellites and automated floats are completely independent witnesses of recent ocean warming, and their testimony matches the NOAA results. It looks like the NOAA researchers were right all along.”

Other co-authors of the paper are David C. Clarke, an independent researcher from Montreal, Canada, Peter Jacobs of George Mason University in Fairfax, Virginia, and Robert Rohde of Berkeley Earth. The research was funded by Berkeley Earth.

 

Posted on Tuesday, January 24, 2017 at 9:50 AM
  • Author: Robert Sanders

It's not just climate change: Study finds human activity is a major factor driving wildfires

 Reposted from the UCANR news

A new study examining wildfires in California found that human activity explains as much about their frequency and location as climate influences. The researchers systematically looked at human (anthropogenic) behaviors and climate change together, an approach rarely attempted on an area of land this large.

The findings suggest that many wildfire-prediction models do not adequately account for anthropogenic factors and may therefore be misleading when identifying the main drivers of wildfires. The new model accounts proportionately for climate change and human behavioral threats, allowing experts to more accurately predict how much land is at risk of burning in California through 2050 – estimated at more than 7 million acres over the next 25 years.

Climate change affects the severity of the fire season and the amount and type of vegetation on the land, which are major variables in predicting wildfires. However, humans contribute another set of factors that influence wildfires, including where structures are built, and the frequency and location of ignitions from a variety of sources—everything from cigarettes on the highway, to electrical poles that get blown down in Santa Ana winds. As a result of the near-saturation of the landscape, humans are currently responsible for igniting more than 90 percent of the wildfires in California.

“Individuals don't have much control over how climate change will affect wildfires in the future. However, we do have the ability to influence the other half of the equation, those variables that control our impact on the landscape,” said Michal Mann, assistant professor of geography at George Washington University and lead author of the study. “We can reduce our risks by disincentivizing housing development in fire-prone areas, better managing public land, and rethinking the effectiveness of our current firefighting approach.”

The researchers found that by omitting the human influence on California wildfires, they were overstating the influence of climate change. The authors recommend considering climate change and human variables at the same time for future models.

“There is widespread agreement about the importance of climate on wildfire at relatively broad scales. At more local scales, however, you can get the story quite wrong if you don't include human development patterns,” said co-author Max Moritz, UC Cooperative Extension fire ecology specialist whose lab is at the University of California, Berkeley. “This is an important finding about how we model climate change effects, and it also confirms that getting a handle on where and how we build our communities is essential to limiting future losses.”

Between 1999 and 2011, California reported an average of $160 million in annual wildfire-related damages, with nearly 13,000 homes and other structures destroyed in so-called state responsibility areas – fire jurisdictions maintained by California – according to Mann. During this same period, California and the U.S. Forest Service spent more than $5 billion on wildfire suppression.

In a model from 2014 that examined California wildfires' destruction over the last 60 years, Dr. Mann estimated that fire damage will more than triple by mid-century, increasing to nearly half a billion dollars annually. “This information is critical to policymakers, planners, and fire managers, to determine wildfire risks,” he said.
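As a quick check of the figures in this passage (the numbers come from the article; taking the "more than triple" multiplier as exactly three is my simplification):

```python
# Sanity-check the damage projection quoted above: $160 million per year
# on average for 1999-2011, projected to "more than triple" by mid-century.

avg_annual_damage_usd = 160e6
projected_usd = 3 * avg_annual_damage_usd  # lower bound of "more than triple"

print(f"Tripled damages: ${projected_usd / 1e9:.2f} billion per year")
```

Tripling $160 million gives $480 million per year, consistent with the article's "nearly half a billion dollars annually."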

The paper, “Incorporating Anthropogenic Influences into Fire Probability Models: Effects of Human Activity and Climate Change on Fire Activity in California,” published Thursday in PLOS ONE.

Press release written by Emily Grebenstein, George Washington University, emgreb@gwu.edu, 202-994-3087

Posted on Monday, May 2, 2016 at 8:32 AM
  • Author: Jeannette Warnert
Tags: climate change (7), land use (1), Max Moritz (1), wildfire (1)

Burning Down the House: Should Private Assets be Sacrificed to Protect Public Land?

Reposted from California Magazine

Back when mastodons and giant ground sloths still roamed the earth – the late 70s and early 80s – I worked as a wildfire fighter for the U.S. Forest Service, both on hand crews and engine crews. Our training was narrow but relatively deep. Mainly, we were taught to construct fire lines with hand tools and chain saws. Water, when it was available, generally was used to protect the line and firefighters; seldom was it employed to directly extinguish the flames.

Our basic strategy consisted of digging and cutting line around the flanks of the fire, then burning out fuels ahead of the advancing flames with fusees (devices resembling highway flares) or drip torches. In this way, the “head” of the fire could be steered to natural barriers or areas sufficiently devoid of fuels to make a direct attack possible. We received zero training for structure firefighting. The one time I responded to a burning structure was in Trinity County: A vacation cabin was ablaze due to a faulty propane line. Several engines responded. Federal Forest Service engines are smaller and hold far less water than municipal or state engines, but collectively, we mustered a lot of water on the scene. A direct attack might have been possible, but we knew our training for battling such a fire was inadequate. Instead, we dug a line around the cabin so the flames wouldn't encroach into the surrounding woods, and watched it burn to the ground.

Things are different now. For one thing, wildfires are bigger and more frequent. This is due to drought, climate change, and the sins of past forest managers. In the sixties, seventies and eighties, vast tracts of old growth timber were liquidated in massive clear cuts. These deforested landscapes were replanted as conifer monocrops, resulting in expansive stands of spindly, closely-spaced, second-growth trees that are as flammable as kerosene.

Meanwhile, the goal for wildfire fighters has changed drastically. The emphasis now is on “protecting interface,” which means preventing fires from immolating the homes that have sprouted across the West's woodlands like morel mushrooms after a rain (back when we had rain). This shift has made fighting wildfires far more expensive and more dangerous for firefighters, and has altered priorities from protecting public forests to protecting private assets. Wildfire fighters now receive training in structure fires, but that has diluted, perhaps even vitiated, their original mission. As Berkeley environmental science professor and wildfire researcher Scott Stephens noted, more than half the U.S. Forest Service budget for the current fiscal year is dedicated to fire suppression; in the early 1990s, that figure was about 20 percent. Assuming the trend continues, which seems certain, firefighting could consume 70 percent of the agency's budget by the 2020s.

That means there's less money than ever for restorative work. And this is work that must be done, and soon. Unless we alter the essential characteristics of our coniferous forests, they will quite literally vanish. It's already happening: Stephens observes that significant portions of California's forests are shifting from pine and fir to mixed hardwoods or even grasslands, the result of repeated, high-intensity fires and drought. And once our conifers are gone, we're not getting them back. The change will be permanent.

Even with drought and accelerating climate change, we can still have healthy coniferous forests in the West. But we won't get them by simply letting them grow — and burn (and burn). Stephens observes we need active management: intensive thinning by both mechanical means and prescriptive fire. This will result in forests with fewer but healthier trees, forests that are largely resistant to any but the most catastrophic fires.

A hundred years ago, disastrous wildfires were rare in California. Forests were characterized by widely spaced, extremely large trees; you could ride through them on horseback, unimpeded. Any fires that did ignite generally crept along. They didn't have the “fuel ladders” — dead limbs and needles on the ground, brush and ascending foliage higher up — needed to climb into the crowns of the trees and explode into rolling fireballs. Being large, the trees were thick-barked and resistant to fire. Indeed, periodic low-level fires disposed of deadwood, killed destructive insects, and returned nutrients to the soil as ashes. It was a virtuous cycle, assuring healthy, resilient wild lands that depended on fires, but were not destroyed by them.

That changed with the aggressive fire suppression of the Smokey the Bear era and accelerated clear-cut logging. But as Stephens notes, we can revitalize the “dog hair” (as in, thick as the hair on a dog's back) forests we now have. We can re-create the vibrant, fire-resistant forests of the early 20th century. We know how to do it. We have the tools: chain saws, heavy equipment, and prescriptive fire. It's not that complicated.

But it will take political will and money. It won't require a Manhattan Project-style response — but it will require one similar to the Civilian Conservation Corps in scope and commitment. We need to put young men and women back into the woods in force, cutting trees and conducting controlled burns. By re-introducing fire into forest ecosystems, we can, paradoxically, protect them from fire. This will entail triage. We'll have to identify those areas that are most vulnerable to fire (e.g., interface communities). The first projects should be shaded fuel breaks, strips of thinned forests around highways and rural towns and residential developments. Following that, more ambitious projects could proceed on larger tracts.

Who pays? The state and feds must contribute, of course. But local communities, commercial timber companies, and private landowners must also cough up. In particular, the counties and interface residents must participate. So far, they've gotten a free ride. County planners have encouraged development in wild-land areas without thought to the implications of wildfire; after all, taxpayers have always picked up fire suppression costs. More suppression costs must be passed on to the counties so they are incentivized to discourage development in our wild lands, and homeowners must pay appropriately heavy premiums if they choose to build in the woods.

Stephens estimates we have about 30 years before it's too late — before our coniferous forests are gone forever, replaced with oak woodlands, brush fields, or grassy savannas. And even then, of course, the wildfires will continue. As we saw with the recent Middletown conflagration, hardwood forests and scrublands can burn just as ferociously as conifers. As long as homes intrude into the wild lands, their continued destruction is assured.

We can continue down the current path of increasing fires and escalating suppression costs, or we can invest in forest restoration. The first course is a death spiral. The second will reduce wildfires, preserve the essential character of our wild lands, provide tens of thousands of jobs to young Americans, yield economic benefits ranging from timber production to recreation, stabilize watersheds, and preserve wildlife diversity. Let's just hope we do the right thing.

Posted on Monday, September 21, 2015 at 10:20 AM
  • Author: Glen Martin
