TCS Daily

No Nano Secrecy, Please

By Glenn Harlan Reynolds - April 24, 2002 12:00 AM

I've been hearing some disturbing things about nanotechnology lately. I can't swear that they're accurate, but I'm getting whispers and pointedly unhelpful "I can neither confirm nor deny" remarks that suggest the federal government is ever so quietly beginning an effort to shut down or limit civilian nanotechnology research.

Some work, presumably, would move into classified programs, where there already seems to be a lot going on. Other work would be redirected away from sensitive areas, or shut down altogether. Some civilian work might go on, but without publication in the open scientific literature, or without peer review by scientists who lack the appropriate security clearances.

As I say, I can't be certain that this is happening - if I could be, I suppose I'd hold a job that wouldn't let me write about it - but the indications are worrisome, and I very much hope that the reports of such an effort are wrong. The reason is that it would be a dreadful idea.

It's not that nanotechnology doesn't have important military implications. It does. And it's not as if nanotechnology doesn't raise the possibility - though rather far down the line - of nanoterrorism. It does. And I would be the last to say that we shouldn't be thinking about such problems today. But if government officials are trying to use their not-inconsiderable clout (for even in the nonmilitary area, the combination of grant-making and purchasing power gives them a lot of influence) to shape nanotechnology now, they run serious risks of being wrong.

Imagine no computers

Just imagine, for example, a similar effort devoted to computer technology, circa 1955. At that point it was pretty obvious that computers were going to be important. It wasn't nearly so obvious how they were going to be important. Yet worried government officials might have concluded that computer technology posed a potential threat down the line, offering enemies the opportunity to break codes, to encrypt things, to design nuclear weapons without the need for real-world tests, and so on.

To some degree they did foresee such problems, and in a very limited way they even tried to limit the availability of computers via export controls and the like. But imagine if they had tried to keep computer science itself in the domain of classified military projects. The result would have been failure: The Soviets were never all that good at making computers, but they were certainly good enough at it to put them to hostile uses. Meanwhile the U.S. economy, and American intellectual and social life, would have missed out on all the benefits offered by computers.

The point is that in a new technology's early stages, we just don't know enough to engage in heavy-handed regulation, because we can't foresee how it will develop. And at a later stage, the technology has usually developed beyond the point at which it can be shut down, if it's ever really possible to shut down a technology.

As I've written in an earlier article on this subject, previous efforts to shut down the development of technologies have been abysmal failures. The British Explosives Act of 1875, intended to prevent abuse of the then-new technology of high explosives, merely served to ensure that it was the Germans who developed missiles in World War II, not the British. And the effort to ban germ warfare appears to have resulted in huge stockpiles of extra-deadly bio-engineered smallpox, which the Soviet Union began producing as soon as the ink on the treaty against biological warfare was dry.

Bad guidance?

It may be, of course, that the accounts I'm getting are distorted, and that the "guidance" that the federal government is attempting is really something more akin to what is recommended in the Foresight Institute guidelines regarding molecular nanotechnology: a set of recommendations for sound engineering practices and other safeguards designed to discourage abuse while not giving up what Arthur Kantrowitz calls the weapon of openness. But that's not what I'm hearing.

I think it's more likely that some folks in the Pentagon see nanotechnology as a "breakthrough" technology of such importance that a monopoly, or at least a big lead, on the part of the United States could guarantee decades of unchallengeable military supremacy. In order to maintain such a monopoly, I fear that they might try to bring all of nanotechnology under federal control, so that the U.S. military and the industries supporting it are its only beneficiaries.

While I would rather see the United States have such a monopoly than, say, the North Koreans, I find this scenario disturbing, and more than a little frightening. Nanotechnology potentially offers not only military advantages and opportunities for pervasive surveillance networks, but also cures for everything ranging from cancer to old age.

Do we want to forgo those benefits for military advantage? Or worse yet, leave those benefits in the control of Pentagon bureaucrats? (There's actually a novel coming out based on this scenario: the Pentagon uses "black" nanotechnology to cure and rejuvenate political figures as a means of cultivating influence. It doesn't seem quite as farfetched as it did a few months ago.)

Benefits of openness

At any rate, with proposals to criminalize research into even therapeutic cloning already on the table, and with rumors of a nanotechnology clampdown, it looks as if we might be on the verge of a major retreat from technological openness. And I don't think this is a very good idea.

We beat the Soviet Union, after all, because we had a more vibrant, faster-learning society than theirs. We had - and have - such a society because of, not in spite of, our openness and freedom. We understood this in the 20th century. We had better not forget it in the 21st.

