

Tiny Troubles

By Glenn Harlan Reynolds - November 21, 2002 12:00 AM

Editor's note: An excerpt from a new paper on nanotechnology.

Some might suggest regulating nanotechnology research because it may lead to knowledge that they would rather not have, as Bill Joy argues. Such regulation is unlikely to succeed, not only for lack of consensus on what kinds of knowledge are undesirable, but also for fairly obvious First Amendment reasons. While the degree of First Amendment protection enjoyed by scientific research as such is not entirely clear, regulating research solely to ban the acquisition of knowledge seems a rather straightforward violation. But regardless of the knowledge that it may or may not yield, the government may certainly regulate research on safety grounds.

In the area of nanotechnology, this means ensuring that self-replicating systems (replicators) do not escape the laboratory and that, if such an escape did occur, the replicators would be unable to reproduce. Aside from obvious containment measures, such safety regulations might specify that important parts of the replicators' blueprints be externally provided and that replicators depend on elements not found in the natural environment.
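A minimal sketch may make the logic of these two safeguards clearer. The Python below models a hypothetical replicator that refuses to copy itself unless a blueprint fragment is supplied from outside and an artificial feedstock is present; every name in it is an invented illustration, not anything drawn from actual nanotechnology work.

```python
# Toy model of the two safeguards above: the replicator carries only part
# of its own blueprint, and it depends on a feedstock element that does
# not occur in nature. All names here are hypothetical illustrations.

REQUIRED_FEEDSTOCK = "synthetic_element_x"  # assumed non-natural element


class Replicator:
    def __init__(self, partial_blueprint):
        # The device stores only a fragment of its blueprint; the rest
        # must be supplied from outside the device.
        self.partial_blueprint = partial_blueprint

    def replicate(self, broadcast_fragment, environment):
        # Safeguard 1: without the externally provided fragment, the
        # blueprint is incomplete and replication cannot proceed.
        if broadcast_fragment is None:
            raise RuntimeError("blocked: blueprint fragment not provided")
        # Safeguard 2: without the artificial feedstock, replication
        # fails even if the full blueprint is somehow assembled.
        if REQUIRED_FEEDSTOCK not in environment:
            raise RuntimeError("blocked: required feedstock absent")
        # Offspring inherit only the partial blueprint, so they remain
        # just as dependent on the laboratory as the parent.
        return Replicator(self.partial_blueprint)


laboratory = {"carbon", "silicon", REQUIRED_FEEDSTOCK}
open_environment = {"carbon", "silicon", "oxygen"}

parent = Replicator(partial_blueprint="...fragment A...")
child = parent.replicate("...fragment B...", laboratory)  # succeeds in the lab

try:
    parent.replicate("...fragment B...", open_environment)
except RuntimeError as err:
    print(err)  # blocked: required feedstock absent
```

The point of the design is that escape confers no independence: an escaped device, and any offspring it managed to produce, would fail both checks outside the laboratory.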

Such an approach, in fact, is consistent with the "physical containment" and "biological containment" approaches taken to the regulation of recombinant DNA research. As Eric Drexler has written, however, the real problem with nanotechnology is not accident, but abuse.

Beyond the Lab

Regulation of nanotechnology must thus focus more on preventing deliberate destructive uses than on preventing accidents. This is likely to involve several complementary approaches, some more promising than others.

Access Limitation: Only licensed professionals regarded as dependable would be allowed to work with nanotechnology - or at least with those technologies deemed particularly risky, such as general-purpose self-replicating devices, which might be easier to reprogram in destructive ways. Such an approach would parallel the treatment of explosives and toxins and might offer some benefits, but the protection would be incomplete. Just as restrictions on high explosives can be evaded through expedients such as fuel-oil/fertilizer mixes, or through theft, bribery, or blackmail, so restrictions on nanotechnology access can be evaded or neutralized by determined offenders.

Export Controls: Such controls would attempt to limit the spread of nanotechnology to hostile or irresponsible nation-states. This approach has proven modestly effective in some areas. Nuclear programs, in particular, are comparatively easy to control because they require a large and conspicuous physical plant, need significant quantities of rare fissionable materials, and make use of equipment that, at least until recently, was specialized in nature and easy to control. Nanotechnology does not possess these characteristics and may be compared more accurately to less conspicuous biological programs.

Professional Ethics: The single most successful example of technology control in the last century was the regime established to regulate the use of recombinant DNA technology. What is particularly interesting about this approach is that it was largely "soft law," the product of professional self-regulation, culture, and expectations more than of harsh regulatory systems. Applying such an approach to nanotechnology has a number of advantages.

First, to the extent that the nanotechnology community in general is imbued with positive values, this approach produces a large number of "regulators" who can identify and respond to improper conduct that governmental authorities would be unlikely to notice. Second, to the extent that such an approach is regarded as morally binding by large numbers of people in the field, it is likely to be obeyed even under circumstances where formal legal controls would be unable to operate. Third, such attitudes are likely to be self-reinforcing, spreading not only among those initially imbued with them but also among their coworkers. While this approach is not sufficient in itself, it offers many advantages.

Inherent Safety: The various items of nanotechnology might be required to be inherently safe: resistant to accident, misuse, and abuse. For example, the "genome" of replicating nanodevices might be encrypted to make reprogramming more difficult and to ensure that "mutations" would lead to nonsense instructions. There might be limits on the number of generations through which a device could reproduce. Software could be configured so that changes would produce an audit trail, and certain types of programming or operations might be prohibited. To the extent that such protections are built into the most basic elements of nanotechnology, they would probably be extremely effective at preventing accidents, and helpful (though not insuperable) in preventing abuse.
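As a rough illustration of how generation limits, audit trails, and tamper-resistant "genomes" might interact, here is a toy Python sketch. The SHA-256 checksum stands in for the encryption the paragraph proposes, and every name and parameter is hypothetical.

```python
# Toy sketch of three "inherent safety" measures: a hard cap on the
# number of generations, an audit trail for every authorized change,
# and an integrity check that makes unauthorized "mutations" fatal to
# replication. The checksum is a stand-in for genome encryption.
import hashlib

MAX_GENERATIONS = 3  # assumed limit, chosen only for illustration


class SafeReplicator:
    def __init__(self, genome, generation=0):
        self.genome = genome
        self.generation = generation
        self.checksum = hashlib.sha256(genome).hexdigest()
        self.audit_log = []

    def reprogram(self, new_genome):
        # Authorized changes go through here and leave an audit entry.
        new_sum = hashlib.sha256(new_genome).hexdigest()
        self.audit_log.append(f"genome {self.checksum[:8]} -> {new_sum[:8]}")
        self.genome = new_genome
        self.checksum = new_sum

    def replicate(self):
        # Generation cap: lineages die out after MAX_GENERATIONS.
        if self.generation >= MAX_GENERATIONS:
            raise RuntimeError("blocked: generation limit reached")
        # Integrity check: a genome altered outside reprogram() no
        # longer matches its checksum, so it cannot copy itself.
        if hashlib.sha256(self.genome).hexdigest() != self.checksum:
            raise RuntimeError("blocked: genome integrity check failed")
        return SafeReplicator(self.genome, self.generation + 1)


device = SafeReplicator(b"assemble one filter membrane")
offspring = device.replicate()          # generation 1: allowed

device.genome = b"assemble weapons"     # tampering that bypasses reprogram()
try:
    device.replicate()
except RuntimeError as err:
    print(err)                          # blocked: genome integrity check failed
```

In this sketch, accidents (corrupted genomes, runaway lineages) are stopped outright, while a determined abuser would still have to defeat the integrity mechanism itself, which matches the paragraph's claim that such protections are stronger against accident than against abuse.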

Many enthusiasts believe that an open-source approach to nanotechnology architectures would be helpful in producing systems that are robust and resistant to abuse, though this may conflict to some degree with other control approaches. In evaluating these and other regulatory approaches, it will be important to maintain proper perspective.

Many of the gross dangers posed by nanotechnology - the runaway proliferation of hostile self-replicating devices, for example - will not really be all that new. Disease organisms, after all, are hostile self-replicating devices, and we have been dealing with their threat for some time, along with deliberate human modification of such organisms to enhance their virulence and deadliness. Indeed, crude biological weapons, and some that are not so crude, have been possessed by many nations for decades without being put to significant military use. It is also important to recognize that the choice is not a simple one between the accelerator and the brake.

Stopping nanotechnology through regulation is effectively impossible. The choice is not whether the technology will be developed at all, but rather how it can be developed: in a benign rather than a threatening fashion. Regulators must exercise as much care to avoid unintended consequences as scientists do, because regulation leads to Frankensteinian results more often than science does.

Scholars of administrative law have long recognized the existence of what Cass Sunstein calls "paradoxes of the regulatory state." Such paradoxes occur when regulation turns out to be self-defeating, something that happens more often than is generally understood. Sunstein has identified numerous examples of this phenomenon; following are a few that may be especially applicable to the regulation of nanotechnology.

Overregulation produces underregulation: When regulations are especially aggressive, administrators will tend not to enforce them. When statutes require especially stringent regulations, administrators will tend not to issue regulations at all. Extraordinarily strict rules on workplace toxins, for example, have led to the Occupational Safety and Health Administration's (OSHA) failure to address all but a tiny minority of chemicals believed to be toxic. The burden on OSHA, and on industry, would otherwise simply be too great. As Sunstein notes, "A crazy quilt pattern of severe controls in some areas and none in others is the predictable consequence of a statute that forbids balancing and tradeoffs."

Stringent regulation of new risks can increase aggregate risk levels: New technologies are usually safer than old ones, but for political reasons it is easier to impose new regulations on new technologies than on entrenched industries. This, paradoxically, can actually make things more dangerous. Requiring that new automobiles be much cleaner, and thus more expensive, has the effect of encouraging people to drive old automobiles longer. "The strategy of imposing costs exclusively on new sources or entrants," writes Sunstein, "will discourage the addition of new sources and encourage the perpetuation of old ones. The problem is not merely that old risks will continue, but that, precisely because of regulatory programs, those risks will become more common and last longer than they otherwise would."

To require the best available technology is to retard technological development: If you require companies to employ the best available technology, you create a disincentive for companies to research and develop new technologies, since they will then be forced to adopt the results whether they want to or not. "Perversely, requiring adoption of the [best available technology] eliminates the incentive to innovate at all, and indeed creates disincentives for innovation by imposing an economic punishment on innovators."

These paradoxes suggest particular regulatory approaches that should be avoided. More generally, they suggest that regulators should move incrementally, and with much caution, lest they aggravate the very problems they are trying to address. Experience demonstrates that such unintended consequences are not to be dismissed.

 
