Research and Risks

By Glenn Harlan Reynolds - July 31, 2002 12:00 AM

In last week's column about nanotechnology, biotechnology, and other advanced technologies, I wrote, "Trying to stand still might well prove the most dangerous course of action."

This may seem surprising. But experience suggests that it's true.

For an academic project I've been working on this summer, I've been reviewing the history of what used to be called "recombinant DNA research" and is now generally just called genetic engineering or biotechnology. Back in the late 1960s and early 1970s, this was very controversial stuff.

Not all the fears were irrational: we didn't know very much about how such things worked, and it was possible to imagine scary scenarios that at least seemed plausible. Indeed, such plausible fears led scientists in the field to get together, twice, at Asilomar in California to propose guidelines that would ensure the safety of recombinant DNA research until more was known.

Those voluntary guidelines became the basis for government regulations, regulations that work so well that researchers often voluntarily submit their work to government review even when the law doesn't require it -- and standard DNA licensing agreements even call for such submission. Self-policing was their key element, and it worked.

It wasn't that way when the DNA research debate started. Indeed, scientific critics such as Erwin Chargaff spoke of Frankenstein and of "little biological monsters," and compared the notion of scientific self-regulation to that of "incendiaries forming their own fire brigade." They warned that the harms that might result from permitting such research were literally incalculable, and that it thus should not be allowed.

Others took a different view. Physicist Freeman Dyson, who admitted that he had no personal stake in the debate, noted that "The real benefit to humanity from recombinant DNA will probably be the one no one has dreamed of. Our ignorance lies equally on both arms of the balance... The public costs of saying no to further development may in the end be far greater than the costs of saying yes."

Harvard's Matthew Meselson agreed. The risk of not going forward, he argued, was the risk of being left open to "forthcoming catastrophes," in the form of starvation (which could be addressed by crop biotechnology) and the spread of new viruses. Critics like Chargaff pooh-poohed this view, saying that the promise of the new technology to alleviate such problems was unproven.

Meselson and Dyson have been vindicated. Indeed, Meselson's comments about "forthcoming catastrophes" were made (though no one knew it at the time) just as AIDS was beginning to spread around the world. Without the tools developed through biotechnology and genetic engineering, the Human Immunodeficiency Virus could not even have been identified, and treatment efforts would have been limited. Had we listened to the critics, in other words, it's likely that many more people would have died. Meanwhile the critics' Frankensteinian fears have not come true, and the research that was feared then has become commonplace, as this excerpt from John Hockenberry's DNA Files program on NPR illustrates:

JOHN HOCKENBERRY: In those early days [Arthur] Caplan says people were concerned about what would happen if we tried to genetically engineer different bacteria.

ARTHUR CAPLAN: The mayor of Cambridge, Massachusetts, at one point said he was worried if there were scientific institutions in his town that were doing this, he didn't want to see sort of Frankenstein-type microbes coming out of the sewers.

JOHN HOCKENBERRY: Today those early concerns seem almost quaint. Now even high school biology classes like this one in Maine do the same gene combining experiments that once struck fear into the hearts of public officials and private citizens...

This experience suggests that we need to pay close attention to the downsides of limiting scientific research. This is especially true at the moment because, arguably, we're in a window of vulnerability where many technologies are concerned. For example, researchers at SUNY-Stony Brook recently synthesized a live poliovirus using a DNA synthesizer and a genome sequence downloaded from the Internet. This wasn't really news from a technical standpoint (I remember a scientist telling me in 1999 that anyone with a synthesizer and a computer could do such a thing), but many found it troubling.

But it's troubling at the moment only because we know more about making viruses than about defeating them. In another decade or two, depending on the pace of research, developing a vaccine or a cure may be just as easy as synthesizing the virus itself. That being the case, doesn't it make sense to progress as rapidly as possible, to minimize the time span in which we're at risk? It does to me.

Critics of biotechnology feel otherwise. But their track record hasn't been very impressive so far.
