TCS Daily


Witches and Weather

By Sallie Baliunas - January 1, 2005 12:00 AM

Editor's note: The following remarks were delivered at the Risk: Regulation and Reality Conference by Dr. Sallie Baliunas, EnviroScience host of Tech Central Station. The conference was co-hosted by Tech Central Station and was held on October 7, 2004 in Toronto, ON.

James Glassman: Thank you very much, David Gratzer. [Applause] Next we are going to hear from two outstanding scientists, Dr. Sallie Baliunas and Dr. Tim Patterson. Sallie is the EnviroScience host of Tech Central Station, and has been now for three or four years, I think. She is a research astrophysicist, a past contributing editor to New Astronomy and The World Climate Report. She has received numerous awards. In 1991 Discover Magazine profiled her as one of America's outstanding women scientists. She's also a past deputy director of the Mt. Wilson Observatory, and once took me on a tour of that fascinating place. She doesn't sleep at night so she also has time to serve as a technical consultant for the science fiction television series Gene Roddenberry's Earth: Final Conflict. Is that still on?

Sallie Baliunas: In syndication.

James Glassman: In syndication, right. Tim Patterson is a professor of geology at Carleton University in Ottawa. He is a Canadian leader of the International Geological Correlation Program, project Catenary [ph.] Land Ocean Interactions, and is principal investigator of a Canadian Foundation for Climate and Atmospheric Sciences project studying high-resolution Holocene climate records from anoxic fjords and coastal lakes in British Columbia, which is tremendously important, as I think he'll explain. He's also, by the way, an avid Star Trek fan, as all good scientists are. Please welcome Sallie Baliunas and Tim Patterson as we learn about complexities of global climate change. [Applause]

Sallie Baliunas: The most dangerous risk to take in the face of destructive forces of extreme weather is to ally with scientific ignorance. Science - as opposed to cultural folktales or equally irrelevant widespread perceptions - is the only tool that yields reliable information on weather and climate.

Two points are noted here. The first is that measures of current climate show no cause for the excessive fears spread in news stories or by activists. The second is a historical note on the consequences of sincere, institutionally legal attempts - made in an age without scientific facts - at weather remediation by human sacrifice during the Little Ice Age.

Weather consists of day-to-day conditions and events in a locality, and climate, by convention, is weather averaged over thirty or more years. Both vary over regional scope and in time.

The current ice age, called the Pleistocene, became severe approximately two million years ago. One trait of the ice age is its pattern of glacial and interglacial periods, the harsh and cold glacial periods persisting roughly 100,000 years, followed by moderate interglacial periods lasting only approximately 10,000 to 15,000 years. Around 10,000 years ago the cold abated, marking the onset of the present interglacial period, called the Holocene, as massive ice sheets at mid- to high latitudes shrank, subsequently raising sea levels and inundating the extended continental boundaries previously defined by the glacial conditions. The next glacial period is expected to return within several millennia.

Researchers with new and more precise techniques to measure past ecosystem change find surprisingly sharp past fluctuations in ecosystems and their local climates, even during the present interglacial. Studying periods prior to the late 20th century's large increase in the air's concentration of carbon dioxide and other greenhouse gases produced by human activities may provide accurate information on natural climate fluctuations, and hence, lead to improved forecasts of the enhanced greenhouse effect.

Discussions on enacting caps on carbon dioxide emissions, such as the Kyoto Protocol, arise from forecasts by computer simulations of climate a century into the future. Simulations contain substantive uncertainties and unknowns; they are essential scholarly tools, but they cannot accurately reproduce major features of climate. Taken in ensemble, simulations of the enhanced greenhouse effect, calculated under socio-economic futures fixed by convention but poorly known, indicate warming trends that are linear, with a middle value of about 2.5 C by 2100 averaged over the globe.

But measurements and analyses of relevant climate parameters suggest so far a much smaller enhanced greenhouse effect than the computer simulations do.

One gauge of the enhanced greenhouse effect comes from recent trends in surface temperature. Since approximately the middle of the 19th century, thermometer readings have been made in many locations across the world and combined into a record of globally-averaged temperature (Chart 1).



Chart 1
- The annually-averaged change in surface temperature compiled from instruments and estimated across the globe (www.cru.uea.ac.uk/cru/info/warming), from 1856 through 2003. The zero point is arbitrary and has been computed over the 30-year period 1961-1990; no ideal average temperature for the earth is known. The instrument record begins at the end of a relatively cold period. Best estimates have been applied to correct for the urban heat island effect.

One problem with the estimated temperature record is that suitable measurements sample less than twenty percent of the surface of the globe (for example, the high-latitude southern oceans are undersampled). Another difficulty is the quality of records needed to make an accurate estimate of the warm bias that accrues over time as urbanization intensifies around many thermometer stations. The net averaged warming across the globe at the end of the 20th century, compared to the latter half of the 19th century, appears to be approximately 0.6 C, with uncertainties presently estimated at one or two tenths of one degree C. Localities exhibit different temperature trends, with a few showing net cooling, as the world's temperature field is complex.
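As an illustrative aside - a minimal sketch, not the actual procedure of the group that compiles the record - the anomaly convention described in the Chart 1 caption (departures from a 1961-1990 baseline average) can be written in a few lines of Python. The series values, function name and dictionary layout below are hypothetical placeholders for demonstration only.

# Minimal sketch: converting an annual global-mean temperature series into
# anomalies relative to a 1961-1990 baseline, as described for Chart 1.
# The demo numbers are made-up placeholders, not the actual station data.

def to_anomalies(series, base_start=1961, base_end=1990):
    """Return {year: temperature - baseline mean} for a {year: temperature} dict."""
    base = [t for yr, t in series.items() if base_start <= yr <= base_end]
    if not base:
        raise ValueError("series contains no years inside the baseline period")
    baseline_mean = sum(base) / len(base)
    return {yr: t - baseline_mean for yr, t in series.items()}

if __name__ == "__main__":
    # Hypothetical annual means in degrees C (placeholder values only).
    demo = {yr: 14.0 + 0.005 * (yr - 1856) for yr in range(1856, 2004)}
    anomalies = to_anomalies(demo)
    print(f"1998 anomaly: {anomalies[1998]:+.2f} C relative to 1961-1990")

Whatever baseline is chosen shifts the zero line but not the shape of the curve, which is why the caption stresses that the zero point is arbitrary.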

Interpreting the surface temperature record requires looking at the timing of the emission of human-produced greenhouse gases, plus other factors, both artificial and natural, that could cause temperature change.

Over the past 200 years the carbon dioxide concentration, the main component of the enhanced greenhouse effect, has risen about 30 percent. Most of the carbon dioxide emission, predominantly from human activities, occurred within the last 50 years. While there is a warming trend in the last decades of the 20th century, coinciding with and possibly caused at least in part by the enhanced greenhouse effect, there is a prior warming trend of equal magnitude early in the 20th century that apparently was not primarily caused by the enhanced greenhouse effect.

If the recent warming trend, observed to be roughly 0.15-0.17 C per decade, is assumed to be caused entirely by the enhanced greenhouse effect, it is somewhat lower than projections from most computer simulations, indicating that the forecasts are still uncertain.
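For scale, here is a back-of-the-envelope comparison - an assumption-laden linear extrapolation, not a formal projection - of the quoted observed decadal rates against the roughly 2.5 C mid-range simulated warming by 2100 mentioned earlier:

# Back-of-the-envelope comparison (not a formal projection): extrapolating the
# quoted observed trend of 0.15-0.17 C per decade linearly over ten decades and
# comparing it with the mid-range simulated value of about 2.5 C by 2100.
observed_per_decade = (0.15, 0.17)   # degrees C per decade, as quoted in the talk
decades_to_2100 = 10.0               # simplifying assumption: ten decades, linear trend
simulated_mid_2100 = 2.5             # degrees C, mid-range ensemble value quoted above

low, high = (rate * decades_to_2100 for rate in observed_per_decade)
print(f"Linear extrapolation of the observed trend: {low:.1f}-{high:.1f} C per century")
print(f"Mid-range simulated warming by 2100:        {simulated_mid_2100:.1f} C")

On these simplifying assumptions the observed rate extrapolates to roughly 1.5-1.7 C per century, below the 2.5 C mid-range figure, which is the sense in which the observed trend is "somewhat lower" than most simulations project.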

But other contributions to the recent warming trend have been suggested by new research. One possibility is warming from the increasing atmospheric concentration of a specific aerosol produced by human activities, black carbon. Another is widespread landscape modification, for example, farming. If both those anthropogenic factors, along with the enhanced greenhouse effect and natural factors, contribute to the recent surface warming trend as importantly as initial research indicates, then the estimated amplitude allotted to the enhanced greenhouse effect in explaining the trend would diminish. These are still areas of active research, and conclusions, plus forecasts based on them, remain uncertain.

Regarding natural climate variability, it should be noted that the 19th century was the end of a well-documented, centuries-long cold period in areas of the world. Hence, the period of unusual cold at the start of the instrumental record may bias the casual observer to believe the second half of the 19th century displayed "normal" temperature, and the 20th century is "abnormal" in warmth.

One insightful test of estimates of the enhanced greenhouse effect comes from predictions made by the computer simulations of air temperature just above the surface. Simulations forecast increased temperature from the surface to a height of several kilometers (http://arxiv.org/abs/physics/0407074 and http://arxiv.org/pdf/physics/0407075). The air at those altitudes should already display an accelerated warming trend with respect to the surface if simulations of the enhanced greenhouse effect are correct (http://blue.atmos.colostate.edu/publications/pdf/R-271.pdf).

Measurements from weather balloons since the 1950s and NOAA satellites beginning in 1978 have yielded an independently-validated record of temperature integrated over the layer from the surface to approximately 5 km, the low troposphere (Chart 2).



Chart 2
- Monthly averaged change in temperatures for the layer of air from approximately the surface to a height of 8 km above the surface through June 2004 as measured by a series of NOAA satellites over most of the globe (www.ghcc.msfc.nasa.gov/MSU/msusci.html), and verified by good, independent balloon measurements (www.nsstc.uah.edu/atmos/john_pubs.html). The zero-point is arbitrary, and is computed over the entire record. Simulations of the enhanced greenhouse effect forecast a warming trend of approximately 0.25 to 0.35 C per decade, or accelerated warming compared to the surface. The well-validated temperature of the low troposphere shows a significantly smaller trend, +0.077 C per decade, through September 2004, which is several months more recent than the chart (http://vortex.nsstc.uah.edu/data/msu/t2lt/tltglhmam_5.1).

Temperatures higher than the record's average are tinted red; those below are blue. There is much variability from month to month, season to season and year to year. The dramatic and temporary influences of tropical Pacific conditions - like warming during the 1997-98 El Niño or cooling during La Niña conditions - and of cooling from large volcanic eruptions are present. But a linear trend fitted through the record, +0.077 C per decade, is smaller than that at the surface, contradicting the climate simulations of the enhanced greenhouse effect by at least a factor of three. It suggests that the incomplete simulations are predicting overly high warming trends, both for recent past decades and for future ones.
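For readers who want to reproduce a number like that +0.077 C per decade, a minimal sketch of the least-squares trend calculation follows. It uses synthetic stand-in data; the variable names, noise level and assumed monthly spacing are illustrative assumptions rather than the actual layout of the linked dataset.

# Minimal sketch: least-squares linear trend of a monthly anomaly series,
# expressed in degrees C per decade. Synthetic stand-in data are used here;
# the real satellite record would be loaded from the linked dataset instead.
import numpy as np

def decadal_trend(years, anomalies):
    """Return the fitted linear trend of the series in degrees C per decade."""
    slope_per_year, _intercept = np.polyfit(years, anomalies, deg=1)
    return slope_per_year * 10.0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(1979.0, 2004.75, 1.0 / 12.0)  # decimal years, monthly steps
    anomalies = 0.008 * (t - t[0]) + rng.normal(0.0, 0.2, t.size)  # ~0.08 C/decade plus noise
    print(f"Fitted trend: {decadal_trend(t, anomalies):+.3f} C per decade")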

Is the 20th century's climate unusual? This perspective from ice core information covers the last 17,000 years and pertains to Greenland (Chart 3), although features of the record are present in other regions. The record begins around the time of the coldest recent period of the glacial. During the termination of the glacial period, temperature fluctuated sharply, and the physical details of causal factors and climate responses are poorly known.



Chart 3
- Temperature and ice accumulation over the last 17,000 years, with climate and societal features noted, as developed from ice cores sampled in Greenland (courtesy Dr. Art Green).

Response by humans and ecosystems to the retreat of the glacial period and onset of a more stable and warm climate was swift. With the development of agriculture, human civilization expanded and sculpted extensive, artificial landscapes.

Compared to so great a change of the glacial termination, the last one thousand years look fairly calm. But significant fluctuations in local conditions did occur, driving notable ecosystem and human responses.

A broad period of equable climate reached parts of Western Europe as early as the 9th century C.E. and persisted in some areas through the 12th century, as documented by biological or geological indicators of past environmental change like cave formations, sediments, ice cores, sea floor debris and pollen in bogs. During this period, called the Medieval Warm Period (or Medieval Climate Optimum), peoples in Western Europe could, for example, grow familiar crops at more northerly latitudes or higher altitudes than had been possible in prior centuries.

By the 12th to 13th centuries, a series of harsher periods set in, some arriving seemingly abruptly. Economies had benefited from agriculture and sea trade; the onset of the climate deterioration eroded economies and shocked cultures. Called the Little Ice Age, it persisted in areas of Western Europe into the 19th century.

Climatologist Hubert H. Lamb describes evidence that the Little Ice Age was felt severely in Central Europe and in China, and brought not only a tendency for lower annual temperatures but also highly variable and unusual weather, including severe heat waves, harsh winters, late spring and early fall frosts, droughts and floods. Farms at high latitudes began to fail, bogs inundated pastures, and past agricultural patterns were upset. Farm output deteriorated, producing a rise in starvation, famines, disease and death.

Life expectancy in England, which had gained approximately ten years during the Medieval Warm Period, fell back, according to Lamb, to roughly 38 years by the mid-14th century, a decline exacerbated by the bubonic plague, which may have been initiated by one of the world's deadliest known weather disasters.

Inland rivers of China flooded during the winter of 1331-32, killing an estimated 7 million people. The ecosystem's rats, which were infested with fleas carrying the bacterium of the bubonic plague, fled north into Northern China and west to meet the crusaders, who carried the disease to populations not previously exposed to it. In Europe the plague's arrival can be pinpointed: in October 1347, the plague washed ashore with a ship of dying sailors who made harbor in Sicily. Between 1347 and 1351, approximately 20 to 25 million people in Europe were killed by the Black Death, triggered in part by an unusual flood and aided by weakened immunity in a population already suffering from malnutrition and disease brought by the poor harvests of the Little Ice Age.

A diarist, Giovanni Boccaccio, opined in the Decameron that in 1348

when into the notable city of Florence, fair over every other of Italy, there came the death-dealing pestilence, which, through the operation of the heavenly bodies or of our own iniquitous doings, being sent down upon mankind for our correction by the just wrath of God, had some years before appeared in the parts of the East and after having bereft these latter of an innumerable number of inhabitants, extending without cease from one place to another, had unhappily spread towards the West.

Without scientific knowledge, the sweeping tide of death was explained as mysticism or cosmic justice.

Unusual weather calamities continued to strike in the 14th and 15th centuries. One diarist noted that in Smolensk (Central Europe) in 1438 the starvation was so bad that "the wild animals ate people and people ate people and small children." Survival meant cannibalism.

The Little Ice Age's series of weather catastrophes had contributed by the end of the 15th century to an institutionalization of witch hunting, prosecution and execution in the name of precaution. Pope Innocent VIII in 1484 issued a bull granting inquisitors the power to rid society of "all heretical depravity." The bull elaborates that men and women, "give themselves to devils" to "cause to perish the offspring of women, the foal of animals, the products of the earth, the grapes of vines, and the fruits and the trees...and perpetrate many other abominable offenses." In other words, it was widely believed, and then institutionally vetted in legal systems, that heretics could inflict storms, floods or droughts that would destroy crops and herd animals, causing famine and death. Heretics also were believed to deal disease, epidemics and death more directly than through protracted mechanisms like weather disasters.

The belief that witches could produce disease and bad weather was an attractive and deadly idea that spread across cultures from Catholicism into Protestant sects and secular governments. While opponents in the 16th century's Reformation and the Catholic Counter-Reformation vehemently battled over whose religious views would better control society for the public good (and what that meant), they were resolutely unified in their belief that Satan was resurgent in the world and practiced witchcraft through willing human heretics. Protestant reformer Martin Luther (1483 - 1546) warned of the "epicureans [skeptics] and despisers of God who have given themselves over to Satan, such as the Wettermacheren [weather makers]."

Jean Bodin (1529 - 1596), as highly regarded in political and social theory of the day as was Machiavelli, reasoned that satanic practices such as brewing storm disasters are tantamount to treason against God, and are not only the highest heresy but also the most terrible risk facing mankind, because they invite God's wrath - the ultimate jeopardy - if the public remains apathetic toward witchcraft and fails to eradicate witches (Démonomanie des sorciers, 1580). The reasoning behind this is very simple and appealing, as stated in a later book from the early 1600s on the trial of a certain Mr. Darrell: "Why should we think that there are devils? If no devils, no God." A denial of weather cooking by sorcery equaled the highest and most dangerous heresy, the denial of God's existence.

Humane skeptics attacked the policy on its institutionalization of horrific torture and on the implausible and illogical choice by the devil of his supposed followers. The sixteenth-century skeptic Johann Weyer (1515 - 1588) argued that the institutions' victims were suffering from what we would today call mood disorders. Why would the devil choose such ineffective people for his work, when the devil supposedly had great power? But Johann Weyer's logic and humanity were quickly disposed of by institutionally-backed witch hunters like Bodin, who called for Weyer to be tried as a witch and added precautionary threats: "Any country that tolerates [witches and Weyer's heresy] will be struck by plagues, famines and wars."

One famous mass witch trial, the North Berwick Trial, involved storms allegedly inflicted by witches directly on James VI of Scotland (later also James I of England). The trial owes part of its genesis to James' marriage by proxy in August of 1589 to Princess Anne of Denmark. Great storms stranded Princess Anne's ship in Norway, and in October, James sailed to Norway, also experiencing terrible sea storms. Finally, in May of 1590, James and Anne arrived in Scotland. At the same time, James was steeped in political danger, and he became convinced that the storms through which he had sailed were part of an assassination attempt conjured up by sorcery.

Torture became a standard aspect of witch trials, because evidence of satanic possession was invisible, apart from external manifestations like storms, floods or crop failure. Weather-cooking suspects offered little persuasive physical proof. Added to the invisible nature of the evidence was the great perceived risk to society of the possible existence of weather and other sorcery. Both factors demanded that the utmost legal measures be taken to assure societal safety; hence, confession under torture not only was allowed but became a necessary tool with which to prove guilt. As great numbers of innocent suspects confessed under horrific tortures, the process - with elaborate, institutionalized instructions and legal circumscription - bred its own, self-serving evidence of success. James himself set legal precedent in the North Berwick Trial by expanding the use of torture to other innocents peripherally involved with suspects. With the torture of an innocent young female servant, James uncovered a supposed assassination plot against him, led by a schoolteacher, John Fian (also known as John Cunningham). Fian was charged with twenty counts of witchcraft, including "Conspiracy with Satan to wreck the ship carrying King James to Norway, on a visit to his future queen, by throwing a dead cat into the sea." And no one could doubt Fian's ability to muster meteorological forces, given that he had tossed a black cat into the sea.

Many countries in Europe held witch trials and executions during the 16th and 17th centuries. Historians like Wolfgang Behringer point to many reasons, with differences and similarities from one trial and one region to another, but the weather disasters of the Little Ice Age are a significant cause of the trials that is perhaps underappreciated by modern societies.

Chart 4 - The intensity (on a scale of 0-3) and times of occurrence estimated for strong flood events on the Pegnitz River in Central Europe. The blue curve is a 31-year smoothing of the flooding (R. Brázdil et al. 2002, PAGES News, 10, 21-23).



This chart (Chart 4) captures the history of storminess over the last 700 years along the Pegnitz River, a tributary of the Regnitz in Central Europe. Including the portion of the 20th century represented in the record, the most intense and frequent period of storminess arrived rapidly in the mid-16th century and persisted during the very coldest part of the Little Ice Age, approximately 1550 to 1700.

That seeming suddenness of a period of intense and frequent storms was noted by diarists, who were not far from meteorological fact. Common belief in and fear of weather cooking by witchcraft can be seen in this drawing (Chart 5) depicting an atmospheric disturbance brewed over a heated cauldron while rituals are performed around Satan, seated.



Chart 5 - From Züricher Chronik 1574 (http://www.zpr.uni-koeln.de/~nix/hexen/galerie/e-magie.htm), the upper-left portion depicts weather cooking in the presence of a group worshipping the devil. See also W. Behringer, p. 84, citing the drawing's reproduction in the 1568 description of a Berne trial; from the art collection of Johann Jacob Wick.

The dangerous idea that witches could bring natural disasters and disease writhed across cultures and socio-economic strata; King James, who, as mentioned above, had believed himself the victim of weather sorcery, had been highly educated in the knowledge of his day. Officials in Catholic and Protestant sects, along with secular governments, led or promoted witch trials - if not out of belief in the superstition, then as a way to retain power by quieting the public outcry to do something about the crises of bad weather. The assiduity of witch trials in two German dukedoms, one Catholic and one Protestant (Chart 6), demonstrates the crossing of religious boundaries, with theological agreement, in the application of the precautionary policy against witches.


Chart 6 - Mass witch trials in Kurmainz (Catholic) and Thüringen (Protestant) areas of Germany, including the period of the Thirty Years' War (H. Pohl, Hexenglaube und Hexenverfolgung im Kurfürstentum Mainz, 1988, p. 28ff, and Ronald Füssel, courtesy of http://www.zpr.uni-koeln.de/~nix/hexen/e-zelt3.htm)

Entries of diarists make it clear to historians like Behringer that weather extremes fomented witch trials. At the end of May in 1626 a killing frost descended upon Bamberg and Würzburg. A chronicler in a Franconian town reported that "all the vineyards were totally destroyed by frost within the prince-bishoprics of Bamberg and Würzburg, same as the dear grain which had already flourished...Everything frozen, which had not happened as long as one could remember. And it caused a big rise in prices...As a result pleading and begging began among the rabble, questioning why the authorities continued to tolerate the witches' and sorcerers' destruction of crops. Thus the prince-bishop punished these crimes, and the persecution began in this year."

According to Behringer, a modern researcher estimates that it had probably been 500 years since the last such spring killing frost, consistent with the populace's perception that so disastrous a frost was exceptional. Mass witch trials in Bamberg, Würzburg, Electorate Mainz and Westphalia in 1626, presumably triggered by weather cooking accusations after the killing spring frost, resulted in several thousand people executed.

One diarist, Johann Linden of Treves, commented clearly in 1590 about widespread belief in and institutional response to weather sorcery: "Everybody thought the continuous crop failure was caused by witches from devilish hate, so the whole country stood up for their eradication."

Tempered by science, technology and resources to forecast, prepare for and improve survival from extreme weather, society no longer believes in weather cooking by witches.

Despite advances in science, activists have promoted the Kyoto Protocol as a way to reduce future weather catastrophes. In terms of addressing the air's increased carbon dioxide content - no matter what one believes of the temperature response to the air's added greenhouse gas content - the Kyoto Protocol would be ineffective, because its prescribed emission cuts are far too small. Climate simulations find that full compliance with the Kyoto Protocol's caps would avert only a meaningless amount of warming - several hundredths of one degree Celsius by 2050 - equivalent to delaying the projected temperature rise by several years.

Moreover, serious economic estimates suggest that the Protocol's mechanisms and outcomes are costly. The plan in essence rations energy through a carbon cartel. Any meaningful reduction in the air's carbon dioxide content would require thirty or so cuts on the order of one Kyoto Protocol. Unless nuclear power plants of the 1,000 MW class are quickly built in numbers topping thousands worldwide, the size of the discussed carbon cuts would suppress energy use and economic growth worldwide - even in the poorest countries, where clean drinking water and immunizations against childhood diseases are still unaffordable for most people.

A risk for policy makers is selling a plan like the Kyoto Protocol that cannot keep its extravagant promises of eliminating weather disasters, a fear hard-wired in humans. One Kyoto Protocol will not solve the problem it is often claimed to address - the reduction of future carbon dioxide in the air - it will certainly not reduce extreme weather, and it may divert resources that could be used for increased survival through better forecasting and preparation. On the other hand, the U.K., for example, has been commendably clear in discussing its risk-avoidance goals, under which developed nations would theoretically emit net negative amounts of carbon to the air by approximately 2050.

In terms of extreme weather, the 20th century's storminess seems unexceptional. In the example of Western Europe, storminess was very severe four centuries ago. The solution to weather catastrophes - a fear hard-wired in humans - is not to implement ineffective policies out of fear and a vague concept of precaution, but to strive for scientific facts.

Books cited or recommended:

W. Behringer, Witches and Witch-Hunts, A Global History, 2004, Polity Press, 337pp.

W. Behringer, translated by J. C. Grayson and D. Lederer, Witchcraft Persecutions in Bavaria : Popular Magic, Religious Zealotry and Reason of State in Early Modern Europe, 1997, Cambridge Univ. Press, 476pp.

H. H. Lamb, Climate, History and the Modern World, 1982, Methuen Press, 387pp.

H. H. Lamb, Climatic History and the Future, 1985, Princeton Univ. Press, 835pp.

A. C. Kors and E. Peters, eds., Witchcraft in Europe: 400-1700, Univ. Pennsylvania Press, 451pp.

James Glassman: Please get in line if you have questions.

Male Audience Member: [Inaudible] the first one is for Sallie. [Inaudible] for the past 1,000 years, this plot shows nearly constant temperature for 900 years, and then temperature rises suddenly in the 20th century. I understand this hockey stick curve has fallen into disrepute. Could you speak about the situation please?

Sallie Baliunas: Sure, and maybe Tim also has some views on this. There's a mathematically-constructed figure that was shown, beginning a few years ago, for example, in the 2001 report from the United Nations Intergovernmental Panel on Climate Change. The chart marks temperature, averaged over the Northern Hemisphere, for several centuries prior to the beginning of the 20th century as being fairly flat, and then rising through the 20th century. The curve has been challenged on several grounds, including the compilation of underlying material and its mathematics (http://www.uoguelph.ca/~rmckitri/research/trc.html). One of the mathematical experts in tracking down errors is present today, Steve McIntyre. Errors have been documented, for example, in porting over other researchers' data and in calculations. Dr. Willie Soon and other colleagues (http://cfa-www.harvard.edu/~wsoon/1000yrclimatehistory-d/Energy+EnvironmentSoonetal2003.pdf and http://cfa-www.harvard.edu/~wsoon/1000yrclimatehistory-dJan30-ClimateResearchpaper.pdf) took an alternate approach that examines the environmental information obtained from ecosystem studies - like pollen in bog sediments and ice cores - in widespread locations across the world, without averaging disparate environmental indicators, each with its own set of uncertainties, over broad spatial expanses that are undersampled or not sampled. Those environmental indicators are highly variable. (It is a bias to think that temperature fully represents climate or ecosystems; climate, for example, encompasses parameters like precipitation, which may be more important to ecosystems than temperature.) On a location-by-location basis, the ecosystem indicators show neither the 20th century nor the latter half of the 20th century (when that period is included in the indicator) as being abnormal compared to other periods in the last millennium; further, there was a period of "unusual" but natural conditions early in the last millennium that sometimes exceeds, or is matched by, the state of the environmental indicators in recent times. Further, we were able to document anthropogenic change in some areas, most likely resulting from human landscape modification.

Dr. Tim Patterson: Some of you may not be familiar with the Mann et al. hockey stick, but it's sort of the poster child of this whole issue; it appeared in all the IPCC literature very soon after it was published - in the Quaternary geology community, I think, much to their chagrin. We seem to be existing in parallel universes here. There are literally thousands of papers that document the Little Ice Age, well documented, the Medieval Warm episode, and so on, but all this literature, accumulated for more than 200 years, seemed to just disappear with the publication of this one paper. And, so, how it ever became the dominant force that it did in IPCC policy making, I have no idea, but I'm very pleased to see that it's disappearing just as quickly as it came. So, again, it was something that was not accepted in the geological community to which I belong.

James Glassman: There was an interesting article in Der Spiegel recently, by a scientist who debunked this, although a scientist who is kind of a believer in Kyoto. Could you just explain that?

Sallie Baliunas: This scientist and his colleagues used a climate simulation to study the amount of climate change in the model compared to the mathematical representation of temperature. The conclusion is that the original representation underestimated the natural fluctuations of temperature, an indication of greater uncertainty than previously believed. The lead scientist of the climate simulation study took a harsh tone in the Der Spiegel interview, describing the earlier work as "rubbish," although I don't know the best translation of the word, which was given in German. More importantly, the lead scientist of the new study pointed out that the incorrect imagery of past temperature as a flattish hockey-stick shape has hindered the development of understanding natural variability, which is the backdrop against which a human effect has to be judged.

Male Audience Member: A second question from the audience to be read. This one is for you, Dr. Patterson. Considering how little carbon dioxide appears to drive climate change, what do you think of expensive endeavors to restrict the human production of carbon dioxide, such as the Kyoto protocol?

Dr. Tim Patterson: Well, I always find the Kyoto protocol kind of an amazing thing. Actually, I was a visiting Fellow at Queen's University in Belfast last year, and we had this interesting [indiscernible] precautionary principles, and so on. We were talking about some of the science that I'm talking about here, which was just evolving at the time. But they said, well, you know, there's maybe a slim chance it still might work, and it's a great way to clean up the world, and so on. But, you know what - all that money, and the science is not static.

A lot of the science that's behind the Kyoto protocol is 1980s and early 1990s science, because that was the time that the agreement was being formulated. Well, it's the 21st century, folks, and things are changing. Now we have a better understanding of how climate systems work. We have a better idea now of the influence of the sun. We are starting to find we can explain these things. It's time to move on.

One of my colleagues was saying one time, you know, that the Environment Minister of Canada, Christine Stewart [ph.], once said, well, even if Kyoto is bunk, isn't it a great way that we could transfer wealth to poor nations? Well, I can think of much better ways that we could do that, possibly. With all the money that would be spent on Kyoto in one year - for this strange carbon trading and all the things that are supposed to go on with it, which I have no understanding of whatsoever, or why you would do this - you could go into Africa, for example, and completely clean up all the water supply there in perpetuity. Think of all the millions of people that are in jeopardy there, that die just because they haven't got clean water.

So the science is changing; I think that's an outdated treaty, and I hope that the [indiscernible] do to it what should be done, because I think it's something that is past its sell-by date.

Sallie Baliunas: I just wanted to add some calibration points. The Kyoto Protocol cost, summed over ten years, is estimated at U.S. $2.3 trillion for the U.S., per the work of a Yale economist. And, remember, full implementation of the Kyoto Protocol would not stop carbon dioxide from increasing in the air over the next several decades, because its cuts are just too small. The response, correctly made, is that 30 Kyotos are needed. In terms of cost, I know, Jim, not simply to take $2.3 trillion for one Kyoto implementation and multiply by the estimated 30 Kyotos to get an estimate of the zero-risk carbon world; I expect that the total cost would have nonlinear components. The Kyoto Protocol seems an odd precedent, insisting on precaution over the current state of science, and - with few exceptions - not voicing the fact that its cuts are ineffective. Such an attitude implies that neither science nor solving the problem is important.

James Glassman: Thank you. We are running out of time, but go ahead, ask the question. I want to make one point, just to reinforce what Tim said, as you go to the microphone. Some of you may know this, but it didn't get enough publicity in my opinion: a few months ago - well, actually it started about a year ago - The Economist magazine and a government organization in Denmark put together a group of economists, including four Nobel Prize winners, to look at this very interesting question, which is, if you had lots of money, as, in fact, we do around the world, what would be the best use of it to improve the well-being of the world in general, and especially poor people?

They looked at 17 different programs. The programs that came out at the top were, as Tim said, cleaning up dirty water, attacking AIDS, malaria and so forth. Ranked 16 and 17, out of 17, were the Kyoto Protocol and another project for getting carbon out of the atmosphere. But the political power of this is amazing. I'll be in - Sallie and I both will be in - Buenos Aires in December at the latest of many, many U.N. conferences that are pushing this Kyoto regime.

Stephen McIntyre: I was referred to, kindly, by Tim and Sallie. I'm Steve McIntyre, as mentioned, and I've written a paper attacking the hockey stick (http://www.uoguelph.ca/~rmckitri/research/trc.html). One of the interesting experiences that I've had in this, and I think there is some broader policy significance, is that the author of the famous hockey stick study refuses to disclose his supporting calculations or his source code. It's my hypothesis, and I'm quite convinced of it, that he's simply made an error in his principal components calculations, and I can show that you can generate hockey sticks from red noise.

And, so, the two most distinctive features of his study, the principal components calculation and the hockey stick, are, in fact, intimately related. He refuses to disclose the computer code for his calculations. He refuses to disclose his supporting calculations. The journal that originally published his paper has refused to intervene to require him to disclose any of this information. The U.S. National Science Foundation, which funded the study, has refused to intervene to require him to disclose it. They say that his calculations are his personal property, that they are private property and have commercial value. The Canadian government refused to intervene to make a request to these people to disclose the information. The publishing journal refused to publish a criticism on the grounds that it was too technical. This is a science journal. So, I think for public policy to be based on material that is not subject to scrutiny is a pretty poor practice. I've got lots of experience in the mining industry and full disclosure is usually the best protection for the public, and I think this applies to science policy as well.

James Glassman: Thank you, very much, Dr. McIntyre. It's an honor to have you. I didn't realize that you were here. This is, the hockey stick is really a tremendous scientific scandal, and this is the man who first brought it to public attention. We are going to have to move on to our next speaker, but thank you very much, Tim, thank you, Sallie. [Applause]
