A new study published late last week in the prestigious scientific journal Nature made headlines with the claim that the Earth's climate may be more sensitive than we previously thought.
The study, authored by a group based at Britain's Oxford University, indicated that the Earth's climate might -- and "might" is itself too strong a term -- warm by as much as 11 degrees Celsius (about 20 degrees Fahrenheit) under certain scenarios involving a doubling of the atmospheric concentration of carbon dioxide (CO2).
Coming on the heels of another report from Britain, Meeting the Climate Challenge, which stated that the "point of no return" for forestalling disastrous climate change would be reached within 10 years, the mainstream media made the sensational 11 degrees C warming scenario the lead subject.
What to make of this? As the author Michael Crichton (whose recent book State of Fear pillories undue environmental alarmism) jokingly told a Washington audience last week, "The mass media's dictum is to simplify and exaggerate; the same thing Walt Disney told his cartoonists." The unfunny thing is that this is exactly what happened with the study in Nature.
Indeed, the authors of the study more than likely did not intend to highlight the extreme scenario, which nearly doubled the high-end estimate from the Intergovernmental Panel on Climate Change's (IPCC) Third Assessment Report (TAR) of 6 degrees C (about 11 degrees F) by the year 2100.
The real point of the Nature article was more boring. The study created a database involving more computer-generated General Circulation Model (GCM) climate simulations than ever before. The authors refer to this collection as a "super-ensemble." They accomplished the task of creating it by allowing volunteers to download a computer program that: 1) installed the GCM on the volunteer's computer, 2) ran a model simulation, and 3) uploaded the results back to Oxford. This procedure allowed the scientists to take advantage of unused computing power available worldwide in order to advance science. That is the newsworthy fact -- the ability to take advantage of unused computer power worldwide -- not the results of running the model simulations.
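The download-run-upload procedure described above can be sketched in a few lines of code. To be clear, this is a hypothetical illustration of the general pattern, not the actual Oxford software: the function names, the toy "simulation," and the parameter values are all invented for the sake of the example.

```python
# Hypothetical sketch of the volunteer-computing pattern: a coordinator
# farms out parameter variants, each "volunteer" machine runs one
# simulation, and the returned runs form the "super-ensemble."
# The toy run_simulation() stands in for a real GCM run.

import random

def run_simulation(params, seed):
    """Stand-in for a GCM run: returns a fake 'warming' figure (degrees C)."""
    rng = random.Random(seed)
    return round(params["sensitivity"] * rng.uniform(0.8, 1.2), 2)

def volunteer(work_unit):
    """One volunteer machine: 1) receives the model and its parameters,
    2) runs the simulation, 3) uploads the result back to the coordinator."""
    return {"id": work_unit["id"],
            "result": run_simulation(work_unit["params"], work_unit["id"])}

# The coordinator prepares many parameter variants to distribute ...
work_units = [{"id": i, "params": {"sensitivity": 2.0 + 0.1 * (i % 5)}}
              for i in range(100)]

# ... and collects the completed runs into a "super-ensemble."
super_ensemble = [volunteer(wu) for wu in work_units]
print(len(super_ensemble), "simulations collected")
```

In the real project, of course, each run took days on a home PC rather than microseconds, but the division of labor -- central coordinator, many independent machines, results merged into one database -- is the same.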
Indeed, the results actually bolster a point made by scientists who are more skeptical of an impending climate disaster. Prominent "skeptics" have pointed out that the large majority of model-generated climate change scenarios point to a more modest warming on the order of 1.4 - 2.8 degrees C (2 - 5 degrees F) by 2100.
In the Nature article, too, the overwhelming majority of the "super-ensemble" simulations show a global warming of 2 - 4 degrees C within 15 years of the doubling of the atmospheric concentration of CO2. This tendency for the majority of the simulations to fall into the lower portion of the projected range is consistent with the skeptics' critique. What's more, despite the extreme range for future warming (1.9 - 11.5 degrees C) reported in the article, the global temperatures in the model stabilize after these 15 years.
As for the higher-end warming, most of the media reports neglected to mention that the GCM used for the study was fairly crude. The experimental design consisted of the equivalent of "hitting the model with a hammer," as The Economist observed, by imposing an instantaneous doubling of CO2 at a point 30 years after the start of the model run. In short, it is akin to having several volcanic eruptions or some other enormous, unlikely event dump vast quantities of CO2 into the atmosphere all at once. GCM results that demonstrate lower climate sensitivity generally raise the CO2 concentration gradually to better simulate reality. In addition, the study's doubled CO2 concentration was beyond even the level expected by 2100 (about 650 parts per million) under a "business as usual" scenario (that is, if nothing happens to limit CO2 emissions, such as new and better technology).
Finally, the media also failed to understand that the study's wider warming range is simply a result of using more computer model simulations -- a consequence of chaos theory.
The real atmosphere is a system that displays "chaotic" tendencies. Model simulations of any system that is considered "chaotic" are particularly sensitive to the conditions set for the simulations. There is even an acronym to describe the situation -- SDIC, for sensitive dependence on initial conditions. To scientists, that is like the warning label on a medicine: be cautious in how you use it. In the case of climate models, the initial conditions are the data used at the start of the model simulation. SDIC means that two sets of a system's initial conditions that differ only minutely can evolve along widely divergent paths. Thus, the more model simulations that are run, or the longer the simulations extend, or both, the greater the range of results that can be expected.
So, the real news of this study published in Nature consists of the things that have generally gone unreported: that the study was able to make use of unused computer power worldwide, that most model simulations showed fairly modest temperature changes, and that the models showed temperatures stabilizing after the warming took place.
Those things are a little more complicated to report than a cartoonish 20 degrees F warming, thus validating what Crichton jokingly called the media's prime dictum -- to simplify and exaggerate.
The author is professor of atmospheric science at the University of Missouri-Columbia.
Stainforth, D.A., T. Aina, C. Christensen, M. Collins, N. Faull, D.J. Frame, J.A. Kettleborough, S. Knight, A. Martin, J.M. Murphy, C. Paini, D. Sexton, L.A. Smith, R.A. Spicer, A.J. Thorpe, and M.R. Allen, 2005: Uncertainty in Predictions of the Climate Response to Rising Levels of Greenhouse Gases. Nature, 433, 403-406 (27 January issue).
Kerr, R.A., 2005: Climate Modelers See Scorching Future as a Real Possibility. Science, 307, 497.