TCS Daily

On the iCouch

By John Ford - December 14, 2007 12:00 AM

A review article has been published speculating on the utility and future of computer-aided cognitive-behavioral therapy (CCBT). Basically, this amounts to software that does psychotherapy. It is remarkable that such systems are currently in use.

For a long time, strides have been made in developing "expert systems" to aid in the diagnosis of medical problems from patient data keyed into a computer. I've even reviewed (unfavorably) a layperson's version of such a system. Of course, much more elaborate and impressive programs have been developed.

Such programs, however, are a far cry from what is being discussed here. No specialty in medicine is more "touchy feely" than psychiatry. No specialty is more intimate, and no specialty is more predicated on the integrity of the doctor-patient relationship. I would have thought, perhaps wrongly, that in no other therapeutic intervention is the "human touch" more important than in psychiatry.

What Drs. Marks, Cavanagh, and Gega have documented is that in certain specific situations, patients typing into a computer will do as well as those actually interacting with a mental health professional. It's hard to know what to make of this. Is this because psychotherapy itself has so little efficacy, or because the human touch is essentially overrated?

So many questions can be raised from this evolving technology:

1. Do such programs simply seek to be informational? In other words do they just probe the patient to identify patterns of known psychiatric disorders and assist the patient in helping himself?

2. Do these programs actually simulate a human interaction? In other words, does the software, in its responses, attempt to convey a sense of caring and concern to the user? To what extent do patients actually participate in the maintenance of such an illusion?

3. What kinds of provisions are there within the various programs to identify and warn medical personnel of potentially lethal conditions such as suicidality or homicidality? This leads to my next question.

4. What kinds of privacy protections are there for the patients that use these programs (some are simply accessed from the internet)? What kind of expectation of privacy will patients have?

5. Will such programs replace or phase out large numbers of mental health professionals and will such programs become the "new norm"?

6. What kind of provisions will be made for patients who are resistant to such "innovations"? Will insurance companies and managed care organizations refuse to fund psychotherapy with real people, citing data suggesting that it offers no additional benefit?

7. Will scientific studies (presumably based on patient surveys and clinical outcome measures) truly capture the important and subtle psychic changes that may occur when patients confront the fact that in the final analysis, they've been taking advice from a cold, calculating machine?

I think this list is a pretty good start but I would also ask this question: Suppose these programs really DO work? This is tantamount to saying that the therapeutic intervention of a doctor sitting with his patient, holding his hand, and being truly concerned may be less important than any of us imagined. What does this say about us as doctors or patients?

One last thought. The great mathematician Alan Turing tried to define artificial intelligence. He proposed that a computer program would be truly intelligent if it was so artfully constructed that a human being interacting with it (i.e., by typing) couldn't tell whether its responses arose from the computer or from a person typing in the next room.

Such a well-defined and operational goal seems increasingly within reach. Should we be concerned?

John S. Ford, MD, MPH is Assistant Professor of Medicine, David Geffen School of Medicine at UCLA. You can find more of his writing here.



I think it's a good idea and we shouldn't be concerned. They will also have about the same success rate, roughly 30%. Apparently that's what it is for all forms of therapy, including witch doctors, shamans, etc. But those in the profession will say that it's just a placebo. We could then answer: yeah, just like yours, only the iCouch will be cheaper.

Well, it would seem to avoid the transference issues.

Placebo effect.
Dietmar got it right: therapy is basically a placebo. People talk to themselves until they figure out their problems and do something about it. The next evidence I see that a licensed therapist actually does any good will be the first.

Suicidal people may need drugs, homicidal people may need confinement. For these decisions, we may still need a human touch.

The vast majority of people just need to think about their problems a bit and listen to themselves talk it out. A computer can do that as easily as a human, and for much less cost.

how to programme the computer for it
They could start by using the content of a book I read once called, I think, 'Games that Therapists Play'. It showed how those guys can bluff their way through by just asking back the same questions, or saying stuff like "ummmmm", all sorts of tricks they have. It's about the easiest thing to fake, so certainly computers can do it, and will have the same success rate.
