News Release

When Models And Satellites Mislead: NCAR Scientists Prescribe Caution In March 13 Nature


National Center for Atmospheric Research/University Corporation for Atmospheric Research

BOULDER--Computer models and temperature-gleaning satellites are useful tools in the quest to diagnose global change, but only when their limitations are well understood. This is the message conveyed by two scientists from the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, in two articles appearing in the journal Nature on March 13. One article provides new findings on an ongoing controversy involving the reliability of global temperature trends obtained via satellite. The other provides an overview of how to use--and how not to use--computer models that mimic the earth's atmosphere for research on climate change.

Satellite- Versus Surface-Derived Temperatures: Why The Disagreement?

A puzzling discrepancy between global temperature trends ascertained by surface instruments and those obtained by satellites is analyzed by NCAR's James Hurrell and Kevin Trenberth in the article "Spurious trends in satellite MSU temperatures from merging different satellite records." Since 1979, microwave sounding units (MSUs) have been deployed aboard polar-orbiting satellites of the National Oceanic and Atmospheric Administration. MSUs measure the microwave brightness of oxygen in the earth's atmosphere and from it infer temperatures across the globe at various heights.

MSU readings for the lowest several kilometers of the atmosphere have been averaged into a global record, with yearly trends calculated since 1989. These show a cooling of 0.03 to 0.05 degree Celsius per decade since 1979. More traditional global temperature averages taken near the ground show a rise of about 0.1 degree C/decade over the same period. The difference between the two trends has been the subject of spirited debate because of its implications for projecting and measuring global warming. (The projected rate of warming is typically around 0.2 degree C/decade.)
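For readers curious how such numbers are obtained, a "degrees C per decade" trend is typically the slope of a least-squares line fitted to yearly global-mean anomalies. The short Python sketch below illustrates the calculation with invented anomaly values; it is not the analysis from either paper.

```python
# Illustrative only: a decadal trend derived as the slope of a
# least-squares fit to yearly global-mean temperature anomalies.
# The anomaly series below is invented, not MSU or surface data.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1979, 1996)
# Hypothetical anomalies: a 0.1 C/decade warming plus weather noise
anomalies = 0.01 * (years - 1979) + rng.normal(0.0, 0.1, years.size)

slope_per_year = np.polyfit(years, anomalies, 1)[0]
print(f"fitted trend: {10 * slope_per_year:+.3f} degrees C per decade")
```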

In their Nature article, Hurrell and Trenberth argue that the MSU data, while useful for many purposes, are poorly suited for gauging long-term surface temperature trends. MSUs monitor the globe more thoroughly than surface reports, which are concentrated over land and far sparser over the oceans. However, each MSU lasts only a few years before being replaced by another instrument on a different satellite. According to the NCAR scientists, the transitions between satellites may be producing spurious temperature drops that mask an actual rise in global readings. "The surface and MSU records measure different physical quantities," write Hurrell and Trenberth, "so that decadal trends should not be expected to be the same." However, they add, "unreconciled discrepancies among the different records remain."

To study the matter further, the scientists focused on the tropics between 20 degrees N and S, where "noise" from short-term weather variations is lower than it is in temperate and polar zones. Hurrell and Trenberth compared simultaneous MSU records to each other, to sea-surface temperatures (SSTs), and to air temperatures simulated by an NCAR climate model using SSTs. They found that most of the difference between MSU and surface trends could be explained by two significant drops in MSU data for 1981 and 1991, years when satellite transitions took place.
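The mechanism they describe is easy to demonstrate in miniature. The following sketch (our illustration, with invented offsets and noise levels, not the authors' code) stitches together a warming series with small downward calibration jumps at the 1981 and 1991 satellite transitions; the merged record then yields a negative least-squares trend even though the underlying temperature rises:

```python
# Toy illustration of spurious trends from merging satellite records.
# Offsets and noise are assumed values, not results from the paper.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1979, 1996)
true_temp = 0.01 * (years - 1979)   # true warming: +0.1 C/decade

offsets = np.zeros_like(true_temp)
offsets[years >= 1981] -= 0.15      # hypothetical jump at 1981 transition
offsets[years >= 1991] -= 0.15      # hypothetical jump at 1991 transition
merged = true_temp + offsets + rng.normal(0.0, 0.05, years.size)

true_trend = 10 * np.polyfit(years, true_temp, 1)[0]
merged_trend = 10 * np.polyfit(years, merged, 1)[0]
print(f"true trend:   {true_trend:+.3f} C/decade")
print(f"merged trend: {merged_trend:+.3f} C/decade")  # comes out negative
```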

Some newspaper and magazine articles now cite only the MSU data or only the surface data in reporting on global temperature trends, without noting the other. Hurrell and Trenberth stress that both data sets are needed to unravel the mysteries of global climate. "The MSU data are excellent for analyzing year-to-year changes, but not necessarily for longer-term trends," says Hurrell.

Thoughts On Interpreting Climate Models And Their Results

Trenberth provides an overview of the strengths and weaknesses of global atmospheric models in his article "The use and abuse of climate models." He points out that humankind is now "performing a great geophysical experiment" by modifying the environment in a way that threatens to change the climate. Lacking a spare earth on which to run a true experiment, "we have to do the next best thing--try to understand the climate system well enough to build a good model of the planet earth system . . . a virtual model of the earth in a computer."

However, Trenberth notes, a climate model is only as realistic as the theoretical understanding behind it and the complexity allowed in it. Computer resources, while growing rapidly, still restrict the detail and sophistication of current models. NCAR's climate system model, for example, requires weeks of actual time for a single 100- or 200-year climate simulation. "Computing power is one key to future progress," says Trenberth. Another is to improve the representation of common processes such as cloud formation and ocean circulation in order to minimize the number of "flux adjustments"--shifts in energy, water, and momentum exchange that are artificially prescribed in order to make a model more stable. These adjustments run the risk of causing unforeseen and unrealistic side effects in the modeled climate.
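To make the idea of a flux adjustment concrete, consider a zero-dimensional energy-balance model, the simplest caricature of a climate model. In the sketch below (all parameter values are assumed for illustration and have nothing to do with NCAR's climate system model), a small bias in absorbed solar flux makes the model drift to the wrong equilibrium, and a prescribed flux adjustment artificially holds it near the intended state:

```python
# A zero-dimensional energy-balance sketch (all values assumed; this is
# not NCAR's climate system model). A 5 W/m^2 bias in absorbed solar flux
# makes the model drift cold; a prescribed "flux adjustment" of +5 W/m^2
# artificially restores the intended equilibrium near 288.6 K.
C = 4.0e8                    # surface-layer heat capacity, J m^-2 K^-1
S_BIASED = 235.0             # absorbed solar flux with a 5 W m^-2 bias
EPS_SIGMA = 0.61 * 5.67e-8   # effective emissivity x Stefan-Boltzmann constant

def run(flux_adjustment, years=50, T=288.0, dt=86400.0):
    """Integrate C dT/dt = S - eps*sigma*T^4 + F_adj with daily steps."""
    for _ in range(365 * years):
        T += dt * (S_BIASED - EPS_SIGMA * T**4 + flux_adjustment) / C
    return T

print(f"biased model, no adjustment:   {run(0.0):.1f} K")  # drifts to ~287 K
print(f"with +5 W/m^2 flux adjustment: {run(5.0):.1f} K")  # held near ~289 K
```

The adjustment makes the model's climate look right, but it papers over the underlying bias, which is why Trenberth warns of unforeseen side effects.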

In his article, Trenberth describes a strategy for carrying out climate experiments that removes much of the impact of flux adjustments and other potential sources of error. However, this strategy does not eliminate the possibility of complicated feedback effects. Among other sources of difficulty, clouds represent "probably the single greatest uncertainty in climate models," notes Trenberth. "The enormous variety of cloud types, their variability on all space scales . . . and time scales (microseconds to weeks) poses a special challenge."

To help gain confidence in model results, Trenberth advocates the use of such tools as sensitivity tests, to see how much a result varies with small changes in the input conditions or model procedures, and simplified models, which require less computer time, to check approximations and assumptions. He also suggests that the burden of proof for claims that model results are incorrect should be on the critic, not the modeler.
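A sensitivity test of this kind can be illustrated with the same sort of toy energy balance (again an assumed stand-in, not an NCAR model): nudge one uncertain parameter slightly and observe how far the answer moves:

```python
# A simple sensitivity test on a toy energy-balance model (assumed
# parameter values): perturb one input and watch the result shift.
def equilibrium_temp(absorbed_solar=240.0, emissivity=0.61):
    """Equilibrium T (K) solving absorbed_solar = emissivity * sigma * T^4."""
    sigma = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
    return (absorbed_solar / (emissivity * sigma)) ** 0.25

base = equilibrium_temp()
for frac in (-0.05, -0.01, +0.01, +0.05):   # perturb emissivity by 1-5%
    T = equilibrium_temp(emissivity=0.61 * (1.0 + frac))
    print(f"emissivity {frac:+.0%}: equilibrium shifts {T - base:+.2f} K")
```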

For policymakers hoping for guidance from computer models, Trenberth emphasizes the value of using pooled knowledge and results from a number of different models, such as those used in the estimates from the Intergovernmental Panel on Climate Change of a projected global warming from 1.3 to 2.9 degrees C by the year 2100. "Statements such as these, given with appropriate caveats, are likely to be the best that can be made because they factor in the substantial understanding of many processes included in climate models. Such projections cannot offer certainty, but they are far better than declaring ignorance and saying nothing at all."
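At its simplest, pooling model results means combining each model's projection into a central estimate and a range, as in this hypothetical sketch (the five values are invented, not the IPCC's):

```python
# Hypothetical pooling of projections from several models into a central
# estimate and a range; the five values below are invented.
projections_2100 = {        # warming by 2100 in degrees C, one per model
    "model_a": 1.4, "model_b": 1.9, "model_c": 2.1,
    "model_d": 2.4, "model_e": 2.8,
}
values = sorted(projections_2100.values())
mean = sum(values) / len(values)
print(f"pooled estimate: {mean:.1f} C (range {values[0]:.1f}-{values[-1]:.1f} C)")
```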

###

NCAR is managed by the University Corporation for Atmospheric Research under sponsorship by the National Science Foundation.


