Posts Tagged: climate change
Reposted from the UCANR Green Blog
UC Cooperative Extension researchers convey need for more climate change communication and curriculum tools
Reducing greenhouse gas emissions from natural and working lands is one of California's key climate change strategies. In particular, the potential for farm and rangeland soils to serve as carbon sinks has been getting a lot of attention lately in the national media — and during California Healthy Soils week, which wrapped up Dec. 7.
These are areas where UC Cooperative Extension, with its local presence across the state, is well-positioned to drive change. But as a recent survey of UCCE advisors, specialists and faculty found, while there is a good deal of climate work happening, there are also some significant obstacles.
The survey results — reported in an article by UCCE academics Ted Grantham, Faith Kearns, Susie Kocher, Leslie Roche and Tapan Pathak in the latest issue of California Agriculture — showed that while nearly 90 percent of respondents believe it is important to incorporate climate science into extension programming, only 43 percent currently do so.
Respondents pointed to a number of issues. One was "limited familiarity with climate science fundamentals." It's one thing to cite the overwhelming scientific consensus that climate change is real and is being driven largely by human activity; it is another to be able to respond quickly and convincingly to detailed questions from doubters. This list from Grist, for instance, details more than 100 common arguments raised by climate skeptics, many of which have non-trivially complex answers.
Another important issue cited by respondents was "fear of alienating clientele by talking about a contentious topic," a response that highlights the importance of personal relationships in UCCE's work, and the challenge of communicating an area of science that is highly politicized.
The authors conclude: "To further increase the capacity of UC ANR staff to support the needs of their clientele and the broader public, professional development around climate science fundamentals, communication, and adaptation strategies is critical." As an initial follow-up, the UCANR climate change program team (led by authors Grantham, Kocher and Pathak) is presenting a workshop and professional development meeting for extension professionals in February.
For more from California Agriculture, the research journal of UCANR, see the full issue with articles on mapping soil salinity in the San Joaquin Valley via satellite; choosing forage seed mixes for rangeland restoration; growing oilseeds in winter without irrigation; keeping dairy cows cool in the summer; breeding better carrots; and more.
Reposted from UC Merced News
Yosemite Valley in the western Sierra Nevada Mountains.
What if nature were to become a polluter, discharging millions of tons of planet-warming carbon into the atmosphere in much the same way as diesel-fueled trucks or coal-fired power plants?
This nature-as-polluter scenario might seem far-fetched, but it's well on its way to becoming reality, according to a recent study co-authored by UC Merced Professor LeRoy Westerling.
In a paper published recently in Scientific Reports — “Potential decline in carbon carrying capacity under projected climate-wildfire interactions in the Sierra Nevada” — Westerling and collaborators from the University of New Mexico and Penn State University used three climate models and data from the Intergovernmental Panel on Climate Change to examine how rising global temperatures and increasingly severe wildfires will affect Sierra Nevada forests.
Their conclusion: Changing conditions will turn today's Sierra Nevada forests into tomorrow's greenhouse gas emitters.
“Forests play an important part in regulating the levels of atmospheric carbon,” Westerling explained. “Forests are carbon sinks, essentially giant stockpiles of carbon. Forests are also active carbon consumers. They remove carbon dioxide from the air and convert it into biomass. This traps the carbon, which is no longer free to act as a greenhouse gas in Earth's atmosphere.”
Professor LeRoy Westerling
But projections from Westerling and colleagues suggest that this may change. According to their models, Sierra Nevada forests will experience both a dramatic loss of stored carbon and a substantial decline in their ability to remove CO2 from the atmosphere.
Rising temperatures are creating a warmer, drier Sierra Nevada climate. Westerling previously showed that these changes are leading to dramatic increases in the frequency, size and duration of wildfires. The new study suggests that these same changes will make it harder for forests to regenerate, leading to a loss of forest density, with plants better suited to the new climate eventually replacing trees.
“As trees are displaced, the Sierra Nevada will lose its ability to sequester carbon,” Westerling explained. “The plants that spring up in their place will be significantly smaller, making them less effective carbon sinks than the trees they replaced.”
But the carbon stored in forest trees has to go somewhere.
As trees are burned in more frequent wildfires, and as dead trees undergo decomposition, Westerling and his colleagues predict that as much as 73 percent of the carbon in Sierra Nevada forests will be released, resulting in a dramatic spike in atmospheric carbon. This will transform the Sierra Nevada from a carbon sink into a carbon emitter, making the nature-as-polluter scenario a reality.
Westerling and his collaborators note that their predictions are actually conservative. The effects might be more extreme than their models suggest.
“Our study does not account for a number of factors that might influence the dynamics of forest carbon,” Westerling said. “However, the factors we ignored are likely to accelerate the loss of forest. Our predictions likely underestimate the severity of actual effects.”
Though the predictions are alarming, the authors remain optimistic, hopeful that their findings can contribute to a larger conversation about environmental policy and promote avenues of research that lead to sustainable forest management.
Reposted from the UCANR Green Blog
To help California forest property owners adapt to the changing climate, UC Agriculture and Natural Resources (UC ANR) has produced a 13-page peer-reviewed paper that outlines actions owners can take to sustain their forests' value even when temperatures rise.
“Managers of forest land have always had to adapt to changing conditions – such as markets, urban encroachment, droughts and floods,” said Susie Kocher, UC Cooperative Extension forestry and natural resources advisor. “We wrote this paper to help forest managers better understand the evolving science of climate change and how they can help their forests adapt to the climate of the future.”
Forests are shaped by the climates in which they grow. According to climate scientists, the climate is now changing at a pace not seen for thousands of years. Nevertheless, the authors assure forest landowners that there are land management decisions they can make to ensure the resiliency of their resources, and perhaps even improve them.
“Some trees may grow faster under the warmer conditions we experience with climate change,” Kocher said, “especially those at highest elevation where there is adequate precipitation.”
The paper details the solid scientific evidence for the rise in global average temperatures over the past 100 years. Temperatures, it says, “will likely continue to rise in the future, with impacts on natural and human systems.”
The document provides specific recommendations for care of three common types of forest in California: mixed conifer, oak woodland and coastal redwood forests.
Mixed conifer forests – typically composed of white fir, sugar pine, ponderosa pine, incense cedar and California black oak – are susceptible to moisture stress caused by warmer temperatures and reduced snow and rain. The drier conditions make the trees more vulnerable to fire and insect attack.
The drought of 2010-2016 has already had a substantial impact on mixed conifer forests in the Sierra Nevada. Aerial detection surveys show that more than 102 million trees have died since 2010; more than 62 million died in 2016 alone.
The UC ANR climate change adaptation paper suggests reducing competition for water by thinning trees and managing for species and structural diversity. The authors suggest property owners consider the source of seedlings when planting new trees.
“Select seedlings adapted to a slightly lower elevation or latitude than your property,” Kocher said. “These would be more likely to thrive under the 3- to 5-degree warmer temperatures we expect in 50 years or so.”
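Kocher's rule of thumb can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below is illustrative only: it assumes the projected warming is in degrees Fahrenheit and uses a typical mountain lapse rate of roughly 3.5 °F of cooling per 1,000 feet of elevation gain; neither number comes from the paper itself.

```python
# Back-of-the-envelope sketch (assumed values, not from the UC ANR paper):
# translate a projected warming into the equivalent downslope shift, using
# a typical mountain lapse rate of ~3.5 F of cooling per 1,000 ft of elevation.
LAPSE_RATE_F_PER_1000FT = 3.5  # assumed typical value

def equivalent_elevation_drop_ft(warming_f):
    """Elevation decrease whose present-day climate matches the projected warming."""
    return warming_f / LAPSE_RATE_F_PER_1000FT * 1000

low = equivalent_elevation_drop_ft(3)   # ~860 ft for 3 degrees of warming
high = equivalent_elevation_drop_ft(5)  # ~1,430 ft for 5 degrees of warming
```

Under these assumed numbers, seed sourced roughly 900 to 1,400 feet downslope would already be adapted to a climate like the one projected for the planting site, which is consistent with the "slightly lower elevation" guidance quoted above.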
Oak woodlands are widely distributed and diverse in California, which gives them moderate to high capacity to adapt to climate change. Mature oaks are more resilient than young trees and seedlings.
One potential impact of climate change on oak woodlands is increasing precipitation variability and increasing spring rains. The moisture change could increase the spread and prevalence of Sudden Oak Death (SOD), a disease caused by the water mold Phytophthora ramorum, a pathogen introduced into California from outside the U.S. SOD is primarily a concern in Central to Northern California coastal areas with tanoaks.
“To reduce the spread of sudden oak death, land owners should prevent the movement of infected leaves, wood and soil,” according to the paper.
The primary concern for coastal redwood forests is the decline in fog. Fog frequency along the redwood coast is 33 percent lower now than in the early 20th century. Less fog and rain, combined with warmer temperatures, would leave the coastal areas where redwoods typically thrive drier. But that doesn't mean redwoods will disappear. Areas with deep soil and areas close to streams and rivers may provide refuge for redwood forests.
The new publication, Adapting Forests to Climate Change, can be downloaded free from the UC ANR Catalog. It is the 25th in the Forest Stewardship series, developed to help forest landowners in California learn how to manage their land. It was written by Adrienne Marshall, a doctoral student at the University of Idaho; Susie Kocher, UC Cooperative Extension forestry and natural resources advisor; Amber Kerr, postdoctoral scholar with the UC John Muir Institute of the Environment; and Peter Stine, U.S. Forest Service.
Reposted from UC Berkeley News
A controversial paper published two years ago that concluded there was no detectable slowdown in ocean warming over the previous 15 years — widely known as the “global warming hiatus” — has now been confirmed using independent data, in research led by scientists from UC Berkeley and Berkeley Earth, a non-profit research institute focused on climate change.
The 2015 analysis showed that the modern buoys now used to measure ocean temperatures tend to report slightly cooler temperatures than older ship-based systems, even when measuring the same part of the ocean at the same time. As buoy measurements have replaced ship measurements, this transition has masked some of the real-world warming.
After correcting for this “cold bias,” researchers with the National Oceanic and Atmospheric Administration concluded in the journal Science that the oceans have actually warmed 0.12 degrees Celsius (0.22 degrees Fahrenheit) per decade since 2000, nearly twice as fast as earlier estimates of 0.07 degrees Celsius per decade. This brought the rate of ocean temperature rise in line with estimates for the previous 30 years, between 1970 and 1999.
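The arithmetic behind that correction can be illustrated with a toy calculation. The sketch below uses synthetic numbers chosen to mirror the rates quoted above (a true warming of 0.12 degrees Celsius per decade and an assumed buoy cold bias of 0.1 degrees); it is not the actual NOAA record or method, just a demonstration of how an uncorrected instrument transition drags down an apparent trend.

```python
import numpy as np

# Synthetic illustration (made-up numbers, not the real sea-surface record):
# the ocean warms at a "true" rate of 0.12 C/decade over 2000-2015, while the
# observing network shifts from ships to buoys that read ~0.1 C cooler.
years = np.arange(2000, 2016)
true_sst = 0.012 * (years - 2000)                   # 0.12 C per decade
buoy_cold_bias = -0.1                               # assumed offset, C
buoy_fraction = np.linspace(0.1, 0.9, years.size)   # buoys gradually replace ships

# Blended record with no correction for the instrument change
observed = true_sst + buoy_fraction * buoy_cold_bias

# Least-squares linear trends, converted from per-year to per-decade
true_trend = np.polyfit(years, true_sst, 1)[0] * 10
biased_trend = np.polyfit(years, observed, 1)[0] * 10
```

With these assumed values, the uncorrected blend yields about 0.07 C per decade against a true 0.12 C per decade: the growing share of cooler-reading buoys manufactures a spurious slowdown, which is exactly the artifact the NOAA "cold bias" correction removes.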
This eliminated much of the global warming hiatus, an apparent slowdown in rising surface temperatures between 1998 and 2012. Many scientists, including the Intergovernmental Panel on Climate Change, acknowledged the puzzling hiatus, while those dubious about global warming pointed to it as evidence that climate change is a hoax.
Climate change skeptics attacked the NOAA researchers and a House of Representatives committee subpoenaed the scientists' emails. NOAA agreed to provide data and respond to any scientific questions but refused to comply with the subpoena, a decision supported by scientists who feared the “chilling effect” of political inquisitions.
“Our results mean that essentially NOAA got it right, that they were not cooking the books,” said lead author Zeke Hausfather, a graduate student in UC Berkeley's Energy and Resources Group.
Long-term climate records
Hausfather said that years ago, mariners measured the ocean temperature by scooping up a bucket of water from the ocean and sticking a thermometer in it. In the 1950s, however, ships began to automatically measure water piped through the engine room, which typically is warm. Nowadays, buoys cover much of the ocean and that data is beginning to supplant ship data. But the buoys report slightly cooler temperatures because they measure water directly from the ocean instead of after a trip through a warm engine room.
NOAA is one of three organizations that keep historical records of ocean temperatures – some going back to the 1850s – widely used by climate modelers. The agency's paper was an attempt to accurately combine the old ship measurements and the newer buoy data.
Hausfather and colleague Kevin Cowtan of the University of York in the UK extended that study to include the newer satellite and Argo float data in addition to the buoy data.
“Only a small fraction of the ocean measurement data is being used by climate monitoring groups, and they are trying to smush together data from different instruments, which leads to a lot of judgment calls about how you weight one versus the other, and how you adjust for the transition from one to another,” Hausfather said. “So we said, ‘What if we create a temperature record just from the buoys, or just from the satellites, or just from the Argo floats, so there is no mixing and matching of instruments?'”
In each case, using data from only one instrument type – either satellites, buoys or Argo floats – the results matched those of the NOAA group, supporting the case that the oceans warmed 0.12 degrees Celsius per decade over the past two decades, nearly twice the previous estimate. In other words, the upward trend seen in the last half of the 20th century continued through the first 15 years of the 21st: there was no hiatus.
“In the grand scheme of things, the main implication of our study is on the hiatus, which many people have focused on, claiming that global warming has slowed greatly or even stopped,” Hausfather said. “Based on our analysis, a good portion of that apparent slowdown in warming was due to biases in the ship records.”
Correcting other biases in ship records
In the same publication last year, NOAA scientists also accounted for changing shipping routes and measurement techniques. Their correction – giving greater weight to buoy measurements than to ship measurements in warming calculations – is also valid, Hausfather said, and a good way to correct for this second bias, short of throwing out the ship data altogether and relying only on buoys.
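As a rough illustration of that weighting idea, the sketch below computes a weighted mean for a single hypothetical grid cell. The readings and the `buoy_weight` parameter are invented for illustration; NOAA's actual weighting scheme is more sophisticated than this.

```python
# Hypothetical sketch of the weighting described above (made-up numbers):
# when a grid cell has both ship and buoy readings, count each buoy
# reading more heavily than each ship reading in the combined average.
def combine_readings(ship_temps, buoy_temps, buoy_weight=5.0):
    """Weighted mean of ship and buoy temperatures (C) for one grid cell.

    buoy_weight is an illustrative parameter, not NOAA's actual value:
    each buoy reading counts as buoy_weight ship readings.
    """
    total = sum(ship_temps) + buoy_weight * sum(buoy_temps)
    count = len(ship_temps) + buoy_weight * len(buoy_temps)
    return total / count

# A cell with two warm-biased ship readings and two buoy readings:
# the weighted mean lands much closer to the buoy values.
cell_mean = combine_readings([15.4, 15.5], [15.3, 15.3])
```

Down-weighting rather than discarding the ship data preserves the long pre-buoy record while letting the more reliable buoy measurements dominate wherever the two overlap.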
Another repository of ocean temperature data, the Hadley Climatic Research Unit in the United Kingdom, corrected their data for the switch from ships to buoys, but not for this second factor, which means that the Hadley data produce a slightly lower rate of warming than do the NOAA data or the new UC Berkeley study.
“In the last seven years or so, you have buoys warming faster than ships are, independently of the ship offset, which produces a significant cool bias in the Hadley record,” Hausfather said. The new study, he said, argues that the Hadley center should introduce another correction to its data.
“People don't get much credit for doing studies that replicate or independently validate other people's work. But, particularly when things become so political, we feel it is really important to show that, if you look at all these other records, it seems these researchers did a good job with their corrections,” Hausfather said.
Co-author Mark Richardson of NASA's Jet Propulsion Laboratory and the California Institute of Technology in Pasadena added, “Satellites and automated floats are completely independent witnesses of recent ocean warming, and their testimony matches the NOAA results. It looks like the NOAA researchers were right all along.”
Other co-authors of the paper are David C. Clarke, an independent researcher from Montreal, Canada, Peter Jacobs of George Mason University in Fairfax, Virginia, and Robert Rohde of Berkeley Earth. The research was funded by Berkeley Earth.
Reposted from the UCANR news
The findings suggest many models of wildfire prediction do not accurately account for anthropogenic factors and may therefore be misleading when identifying the main drivers of wildfires. The newest model proportionately accounts for climate change and human behavioral threats, allowing experts to more accurately predict how much land in California is at risk of burning through 2050: more than 7 million acres over the next 25 years.
Climate change affects the severity of the fire season and the amount and type of vegetation on the land, which are major variables in predicting wildfires. However, humans contribute another set of factors that influence wildfires, including where structures are built and the frequency and location of ignitions from a variety of sources — everything from cigarettes tossed on the highway to electrical poles blown down in Santa Ana winds. Because human development now reaches nearly every part of the landscape, humans are currently responsible for igniting more than 90 percent of the wildfires in California.
“Individuals don't have much control over how climate change will affect wildfires in the future. However, we do have the ability to influence the other half of the equation, those variables that control our impact on the landscape,” said Michael Mann, assistant professor of geography at George Washington University and lead author of the study. “We can reduce our risks by disincentivizing housing development in fire-prone areas, better managing public land, and rethinking the effectiveness of our current firefighting approach.”
The researchers found that models omitting the human influence on California wildfires overstate the influence of climate change. The authors recommend that future models consider climate change and human variables at the same time.
“There is widespread agreement about the importance of climate on wildfire at relatively broad scales. At more local scales, however, you can get the story quite wrong if you don't include human development patterns,” said co-author Max Moritz, UC Cooperative Extension fire ecology specialist whose lab is at the University of California, Berkeley. “This is an important finding about how we model climate change effects, and it also confirms that getting a handle on where and how we build our communities is essential to limiting future losses.”
Between 1999 and 2011, California reported an average of $160 million in annual wildfire-related damages, with nearly 13,000 homes and other structures destroyed in so-called state responsibility areas (fire jurisdictions maintained by the state), according to Mann. During this same period, California and the U.S. Forest Service spent more than $5 billion on wildfire suppression.
In a model from 2014 that examined California wildfires' destruction over the last 60 years, Dr. Mann estimated that fire damage will more than triple by mid-century, increasing to nearly half a billion dollars annually. “This information is critical to policymakers, planners, and fire managers, to determine wildfire risks,” he said.
The paper, “Incorporating Anthropogenic Influences into Fire Probability Models: Effects of Human Activity and Climate Change on Fire Activity in California,” was published Thursday in PLOS ONE.
Press release written by Emily Grebenstein, George Washington University, firstname.lastname@example.org, 202-994-3087