TCS Daily


Doomsday Trippers

By Kenneth Silber - May 19, 2003 12:00 AM

Martin Rees is a cosmologist at Cambridge University and holder of the honorary position of England's Astronomer Royal. He has written several books that focus on the universe at the largest known scales of space and time. His latest book, Our Final Hour, brings that large-scale perspective to bear on the here and now. Rees sketches out diverse dangers confronting humanity in the 21st century, including possible nuclear or biological terrorism, asteroid impacts, environmental calamities, and nanotechnology run amok.

The book has the subtitle "A Scientist's Warning: How Terror, Error, and Environmental Disaster Threaten Humankind's Future in This Century - On Earth and Beyond." In a noteworthy passage, Rees emphasizes just how high the stakes may be:

"It may not be absurd hyperbole - indeed, it may not even be an overstatement - to assert that the most crucial location in space and time (apart from the big bang) could be here and now. I think the odds are no better than fifty-fifty that our present civilisation on Earth will survive to the end of the present century. Our choices and actions could ensure the perpetual future of life (not just on Earth, but perhaps far beyond it, too). Or in contrast, through malign intent or through misadventure, twenty-first century technology could jeopardise life's potential, foreclosing its human and posthuman future. What happens here on Earth, in this century, could conceivably make the difference between a near eternity filled with ever more complex and subtle forms of life and one filled with nothing but base matter."

Notwithstanding this stark warning, the book presents generally well-balanced assessments of the various threats under discussion. Rees never really justifies the "fifty-fifty" figure, which is more a guess than an estimate. But he is clearly correct in arguing that, cumulatively, the dangers facing humanity in this century are substantial. Moreover, Rees makes the important point that space colonization would diminish the risks of any calamity being fatal for our species or civilization; it's a point that has been made by TCS contributing editor Glenn Reynolds, among others, and one that merits sustained repetition.

In Rees's view, humanity may be lucky to have survived the Cold War without a nuclear conflagration. However, his discussion of this subject is weakened by an unquestioning attitude toward arms-control shibboleths and "concerned scientists." He wonders whether resisting the Soviet Union was "worth it," given the risk of nuclear war. But this fails to recognize that acceding to Soviet domination, in addition to everything else it would have entailed, could have increased the nuclear risk. Saddam Hussein's career should have ended any notion that a government would never use weapons of mass destruction as a tool of repression or internal conflict.

Turning to current threats, Rees discusses the potential of bioterrorism and laboratory accidents, as well as the ongoing danger of nuclear proliferation. He notes the growing capability of small groups and disaffected individuals to cause vast harms. Should some areas of research be slowed or stopped to diminish the risks of misuse? Rees does not dismiss such an approach, but has doubts about its efficacy. As he points out, if a terrorist group is conducting dangerous research, it will be hard to develop countermeasures if no one else has relevant expertise.

Rees takes an appropriately cautious attitude toward predictions that superintelligent robots might take over in several decades, or that nanotechnology could result in an out-of-control conversion of matter into "gray goo." Still, he regards even extreme scenarios as worthy of attention. He notes concerns that certain physics experiments could cause catastrophe on a global or even cosmic scale, such as by ripping open the fabric of space. Such events, as Rees notes, may be extraordinarily unlikely or impossible, but they merit careful consideration because the downside, were one to occur, would be effectively infinite.

The book includes a discussion of the "Doomsday Argument" presented by physicists Brandon Carter and Richard Gott. The basic idea is that if humanity has a long future ahead of it, with an enormous human population yet to be born, it is statistically unlikely that we - you and I - would be alive at such an early period in the history of the species. It's as if one reached blindfolded into an urn filled with numbers and pulled out the number six; it's more likely that the urn has only, say, 10 numbers rather than 1,000.
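The urn analogy can be made concrete with a small Bayesian calculation (a sketch of my own, not drawn from the book; the function name and the equal 50/50 priors are illustrative assumptions). Drawing the low ticket number six shifts the odds heavily toward the smaller urn:

```python
# Illustrative Bayesian version of the urn analogy. Two hypotheses,
# given equal prior probability: the urn holds tickets numbered 1..10,
# or tickets numbered 1..1000. We draw ticket number 6 and update.

def posterior(draw, sizes, priors):
    # Likelihood of drawing a particular ticket from an urn of n tickets
    # is 1/n (and zero if the ticket number exceeds n).
    likelihoods = [(1.0 / n if draw <= n else 0.0) for n in sizes]
    unnormalized = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(unnormalized)
    return [u / total for u in unnormalized]

post_small, post_large = posterior(6, sizes=[10, 1000], priors=[0.5, 0.5])
print(round(post_small, 2))  # -> 0.99: the small urn is now ~99% probable
print(round(post_large, 2))  # -> 0.01
```

By analogy, if "you and I" hold a low birth-rank number among all humans who will ever live, the update favors the hypothesis that the total number is not vastly larger; whether that analogy is legitimate is exactly what the debate over the Doomsday Argument is about.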

Some scientists and philosophers have taken this as an indication that humanity probably does not have a long future (or at least not one with a large population). Whether the Doomsday Argument is compelling, or even makes sense, has been a matter of intense debate. Nonetheless, as Rees notes, the argument has shown a certain amount of resilience, and might be useful as one guideline in contemplating the human future.

Our Final Hour shifts fluidly between the material of current headlines and the more arcane subjects that cosmologists normally write about. Rees considers whether there is intelligent life elsewhere in the universe. (That question is, by the way, a complicating factor in the Doomsday Argument, since it is unclear whether only humans belong in the reference class.) If alien intelligence is commonplace, then humanity's future may matter little in the cosmic scheme of things (though it is still of some importance to the locals).

But it is also possible that intelligent life is extremely rare or even limited to Earth. If so, Rees points out, the outcome of the current dangerous century on Earth has enormous implications. The future of the universe may depend on us, believe it or not.