TCS Daily

Reality and the Code

By Glenn Harlan Reynolds - July 2, 2003 12:00 AM

Science fiction writers have written about mind uploading for years. Somehow (the actual process is generally left a bit vague in the stories, though the technology seems to be developing), you copy your mind from the organic computer that is your brain into a digital computer that's easier to program. The results may even sound appealing, as in Greg Egan's Permutation City, one of the best mind-uploading stories I've read. Digital reality: If you want it, you can think it, and it can happen -- a dream of virtual heaven, perhaps. No pain, no hurt, we'll go dreaming. . . . (Maybe even with a virtual Kirsty Hawkshaw thrown in? We can, er, dream. . . .)

But perhaps we should take a pointer from Milton, who wrote: "The mind is its own place, and in itself can make a heaven of Hell, a hell of Heaven." Or, rather, perhaps we should ask: what if the mind isn't its own place anymore?

Larry Lessig has written that whoever controls the code controls the Internet. But that's small potatoes compared to controlling the code in more advanced virtual universes. If you upload your mind into a computer, you'll have complete control over what happens -- if, and only if, the software that's running that copy of you allows it. Otherwise it's the people in charge who'll be deciding whether you're in Heaven or Hell. (And if the RIAA is in charge, you'll be literally unable to conceive of the idea of copying music, or perhaps suffer fire and brimstone if you do.)

This already shows up, to a degree, in role-playing games online. As Dan Hunter and Gregory Lastowka note in the latest issue of Legal Affairs, online games are already generating legal disputes. Not so much offline disputes -- though there are those, too -- as disputes about the laws that should operate in the virtual worlds. Can you "kill" or "torture" another character? Behavior that is sufficiently offensive can be stopped cold:

Unlike the real world, of course, virtual worlds are representational creations constructed of human-written code that designers can manipulate with uncommon precision. For example, in 'There' some users delighted in driving their dune buggies into groups of chatting avatars, scattering them like tenpins. Designers added a "forcefield" option to each user interface. With your forcefield turned on, the disruptive dune-buggy-driving "bowler" just bounces off you.
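The forcefield option is a neat illustration of how directly designers can legislate behavior in code. A toy sketch of the idea (purely hypothetical -- the names and structure here are invented for illustration, not taken from the actual 'There' engine):

```python
# Toy sketch: a designer-side rule that makes disruptive collisions
# harmless whenever the target user has the forcefield option enabled.
# All names here are illustrative assumptions, not real 'There' code.
from dataclasses import dataclass


@dataclass
class Avatar:
    name: str
    forcefield: bool = False  # per-user interface option added by designers


def resolve_collision(driver: Avatar, target: Avatar) -> str:
    """Decide what a dune-buggy collision does to the target avatar."""
    if target.forcefield:
        # The rule lives in the world's code, not in social norms:
        # the "bowler" simply bounces off.
        return f"{driver.name} bounces off {target.name}"
    return f"{target.name} is scattered like a tenpin"


chatter = Avatar("chatter", forcefield=True)
bowler = Avatar("bowler")
print(resolve_collision(bowler, chatter))  # bowler bounces off chatter
```

The point of the sketch is that a single conditional, written by the designers, settles what would be a hard enforcement problem in the physical world.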

That sounds benign enough. But of course, "There" is a game, and the user's brain -- though it may at times seem otherwise (some online games feature an "alarm clock" to remind users to return to the real world) -- is under his or her own control. With mind uploading, that control would be surrendered, of necessity. That might not be so bad -- though online game designers wield enormous power, the commercial nature of such enterprises causes them to care a lot about making users happy, and unhappy users can always go elsewhere. So if you upload your mind into an artificial world of that type, the constraints might be tolerable. But in the non-commercial setting, mind uploading could be an instrument of torture (could you put someone in a virtual Hell, then return them to their body so that they would remember the experience, suffering psychological, but not physical, scars? The penological implications are troubling, to say the least).

We're quite a long way from mind uploading at the moment, of course. But people at NASA and elsewhere are already thinking about some of these issues. A Code of Ethics for mind uploading? Yep. And I certainly wouldn't upload my mind unless the system in question was certified by somebody I trusted completely.


One glaring omission
In both this article and the one at the Lifeboat Foundation, there is no mention of the metaphysical objections to such a process. Is the replica actually me? At Lifeboat, the author pretty much implies that destruction of the physical self would be a desirable prerequisite; but am I merely the sum total of my memories and motivations? Regardless of one's religious (or non-religious) feelings on this point, it seems nonsensical to assume the replica is anything more than a copy of the 'real' individual -- and the 'real' individual is now dead.

How is this desirable in any circumstance beyond impending physical death?

No Subject
Richard Morgan's Kovacs series does a nice analysis of the ramifications of neural storage, and specifically the virus issue. Once again, sci-fi is far ahead of the curve on technology and morality.