TCS Daily

How Might Hurricanes Change with Global Warming?

By Roy Spencer - October 7, 2004 12:00 AM

When watching science fiction, it is the plausibility of a story that allows us to imagine that what we are seeing is real. Sure, the aliens must speak English for us to understand them, and they usually have two arms and two legs, but we are fascinated by the possibility that what we are watching could be, or could someday become, reality.

Computerized climate models are a little like science fiction. They contain some of the physics we know of that describes how the atmosphere operates, they leave out other physics that computers are simply not yet fast enough to handle, and they (necessarily) ignore the things we haven't yet learned. Nevertheless, what comes out of the models fascinates us because, someday, those predictions might come true.

The recent hurricane action in Florida raises the question: how might hurricanes change with global warming? A recent study published in the Journal of Climate by researchers at NOAA's Geophysical Fluid Dynamics Laboratory used nine different low-resolution climate models to predict how those models' warmed climates would affect a high-resolution model that is used to "grow" hurricanes. For the warming experienced in response to 80 years of carbon dioxide increases at 1% per year, the average model response was an increase in hurricane maximum wind speeds of 6% and in precipitation of 18% within 100 km of the storm center (which is where most of the action is). This led the authors to conclude that "greenhouse gas-induced warming may lead to a gradually increasing risk (of) the occurrence of highly destructive category-5 storms".

While I will admit that, ultimately, such computer models are a necessary tool for answering questions about global warming, we must always keep in mind the wide variety of assumptions required to perform these modeling experiments. First, the assumed 1% per year increase in atmospheric carbon dioxide concentration, while widely used by modelers for its simplicity, is considerably above what has been experienced over the last 30 years. The resulting 2.2-fold increase in carbon dioxide over 80 years, as assumed in the models, would actually take about 280 years if we extrapolate the observed upward trend of the last 30 years forward in time. Of course, no one knows whether we'll even be using carbon-based fuels in another 100, let alone 280, years.
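The two growth rates being contrasted here can be checked with a few lines of arithmetic. The 2004 concentration (~375 ppm) and the observed linear trend (~1.6 ppm per year) used below are illustrative assumptions, not figures from the article:

```python
# The models' assumed 1%/yr compound growth over 80 years:
model_factor = 1.01 ** 80
print(f"1%/yr compounded for 80 years: {model_factor:.2f}x")  # about 2.2x

# Linear extrapolation of the observed trend. The starting
# concentration (~375 ppm) and growth rate (~1.6 ppm/yr) are
# assumed values, roughly matching early-2000s measurements:
start_ppm = 375.0
trend_ppm_per_yr = 1.6
target_ppm = 2.2 * start_ppm
years = (target_ppm - start_ppm) / trend_ppm_per_yr
print(f"Years to reach 2.2x at the observed trend: {years:.0f}")  # about 280
```

With these assumed numbers, the linear extrapolation lands close to the article's figure of roughly 280 years, versus the models' 80.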

Second, the model-predicted warming in the tropics is strongly tied to how those models handle moist convection (showers and thunderstorms). All models use greatly simplified schemes for how this convection transfers heat and moisture from the surface to the atmosphere. Any warming in the models moistens the troposphere (the layer where our weather occurs), and since water vapor is by far the most important greenhouse gas, this leads to further warming and moistening. It is not at all obvious that so strong a water vapor feedback will occur in response to carbon dioxide increases. An increase in precipitation efficiency (how readily clouds convert water vapor into precipitation) is one possible negative feedback, but it isn't understood well enough to include in models yet. Furthermore, a mixture of surface thermometer, weather balloon, and satellite data over the last 25 years suggests that the tropical atmosphere might not behave as simply as the models assume. The satellite and weather balloon data suggest little, if any, warming of the tropical troposphere during that time; the reason remains a mystery, since all models suggest that any surface warming should, if anything, be amplified with height.

On a more philosophical level, the paradigm within which climate modelers operate, which assumes the Earth to be in "radiative equilibrium", could itself be questioned. We typically assume that the Earth absorbs as much of the sun's energy as it emits back out to space in the form of infrared (heat) radiation. We actually can't measure this supposed balance from satellite instruments to an accuracy better than the level of "imbalance" (about 1%) that is expected to result from a doubling of atmospheric carbon dioxide. What if the Earth really isn't in radiative balance? Most of the mass of the ocean-atmosphere system that redistributes heat around the Earth is, somewhat surprisingly, near the freezing point: it's the deep ocean. What if the deep ocean has been sucking up excess heat for thousands of years, as a result of a change in climate after the last ice age? I'll admit this is just conjecture on my part, but in science the greatest discoveries are usually made when someone questions the most basic, long-held assumptions. And since we are talking science fiction anyway, my assumptions might be just as good as anyone else's.
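The "about 1%" imbalance figure can be sanity-checked with an order-of-magnitude calculation. Both inputs below are assumed textbook values, not numbers taken from the article: a global-average absorbed solar flux of roughly 240 W/m², and a commonly cited radiative forcing of roughly 3.7 W/m² for a doubling of CO2:

```python
# Order-of-magnitude check on the "~1%" imbalance expected from a
# CO2 doubling. Assumed textbook values, not from the article:
absorbed_solar = 240.0        # W/m^2, global-average absorbed sunlight
co2_doubling_forcing = 3.7    # W/m^2, standard forcing estimate

imbalance_fraction = co2_doubling_forcing / absorbed_solar
print(f"Fractional imbalance: {imbalance_fraction:.1%}")  # ~1.5%
```

With these assumed values the ratio comes out near 1.5%, i.e. on the order of the ~1% cited above, which is precisely the point: the expected signal is a percent-level perturbation on a ~240 W/m² flux that satellites cannot measure to that accuracy.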
