TCS Daily

Politics, Decision Theory and Contradictory Complaints

By Arnold Kling - April 18, 2007 12:00 AM

"...the tales of woe and wrongdoing are blurring together -- at times in almost contradictory fashion, as when Mahar laments the excess of care, then, with whiplash-speed, segues into a condemnation of withholding treatments."
-- Ezra Klein, in an otherwise favorable review of Money-Driven Medicine, by Maggie Mahar

It is very common to move with "whiplash speed" from condemning one type of mistake to the opposite type of mistake. Doing so shows a lack of understanding of basic decision theory. We can see this in regard to health care spending, terrorism watch lists, and subprime mortgages.

In each case, you have to make an either-or decision. The patient either gets an MRI or does not. The individual either gets put on a terrorism watch list or does not. The mortgage application either is approved or it is not.

Consider the decision about the MRI. Without the MRI, the doctor would guess to go with treatment plan A. Beforehand, no one can be certain whether or not the MRI will provide information that causes the doctor to switch to plan B. The MRI can either be valuable or not. And the MRI can either be ordered or not. This leads to four possibilities.

                          Order the MRI      Forego the MRI
MRI would be valuable     good decision      type I error
MRI would not matter      type II error      good decision

Usually, the more costly mistake is to fail to order an MRI for a patient whose results would have affected the treatment plan. The more costly mistake is called a type I error.

On the other hand, if the MRI does not turn up information that affects the treatment plan, then the MRI is a waste of time and money. This is a type II error.

Statistically, we are bound to observe both kinds of errors. Suppose that out of every 10,000 patients who come in with back pain, an MRI would affect the treatment plan for 10 of them. Suppose that 2000 of them, including 2 of the 10 for whom the MRI is valuable, receive MRIs. In that case, we will have

                          Order the MRI      Forego the MRI
MRI would be valuable     2                  8
MRI would not matter      1998               7992

In this example, there are 8 type I errors: of the eight thousand patients not receiving MRIs, 8 would in fact have benefited from one. There are 1998 type II errors: 1998 of the two thousand who get an MRI end up with the same treatment plan they would have had without it.
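The counts follow directly from the totals in the example. A short script (a sketch, with the article's numbers hard-coded) confirms the arithmetic:

```python
# Counts for the MRI example: 10,000 back-pain patients, an MRI would
# change the treatment plan for 10 of them, and 2,000 MRIs are ordered,
# 2 of which go to patients for whom the scan matters.
patients = 10_000
valuable = 10              # patients whose plan an MRI would change
ordered = 2_000            # MRIs actually ordered
valuable_and_ordered = 2   # valuable MRIs that were actually ordered

type_i = valuable - valuable_and_ordered   # valuable MRI foregone
type_ii = ordered - valuable_and_ordered   # MRI ordered, plan unchanged
good = valuable_and_ordered + (patients - ordered - type_i)

print(type_i, type_ii, good)  # → 8 1998 7994
```

The four cells sum to the full 10,000 patients: 2 + 8 + 1998 + 7992.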

If we give more people MRIs, we reduce type I errors but increase type II errors. If we give fewer people MRIs, we reduce type II errors but increase type I errors.
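The trade-off can be sketched numerically. Assuming, purely for illustration, that MRIs are handed out with no information about who will benefit, the expected errors of each type move in opposite directions as more scans are ordered:

```python
# Expected errors under a simplifying assumption (illustration only):
# MRIs are allocated with no information about who will benefit.
def expected_errors(n_ordered, patients=10_000, valuable=10):
    p_valuable = valuable / patients
    type_i = valuable * (1 - n_ordered / patients)  # valuable scans missed
    type_ii = n_ordered * (1 - p_valuable)          # scans that change nothing
    return type_i, type_ii

for n in (1_000, 2_000, 5_000):
    type_i, type_ii = expected_errors(n)
    print(f"{n} MRIs ordered: {type_i:.0f} type I, {type_ii:.0f} type II")
```

At 2,000 scans this reproduces the 8 type I and 1998 type II errors of the example; ordering more scans shrinks the first number and grows the second.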

The Maggie Mahars of the world want to blame such errors on the fact that we have private-sector medicine. However, errors are inherent in medicine, because knowledge is imperfect and decisions must be made under uncertainty. Given the uncertainty, one cannot reduce errors of one type without increasing errors of another type. Most importantly, the existence of errors does not prove that the system is flawed.

The system may indeed be flawed in two ways. First, it may be possible to make decisions based on better information. This information could be statistical information. For example, if doctors are aware that 99.99 percent of the time an MRI does not matter, they may choose to forego an MRI.

Other information that could be helpful would be patient-specific. If the patient has had ever-increasing back pain that cannot be traced to any specific event, an MRI may be more likely to turn up a tumor or other abnormality than if the back pain can be traced to a recent injury.

The second way that the system may be flawed is that it is overly biased against one type of error. For example, if there is a shortage of equipment, then doctors will be biased against ordering an MRI, and there will be more type I errors. Alternatively, if the MRI will be paid for by insurance rather than by the patient, then both the doctor and the patient may be biased in favor of going for the MRI, and there will be more type II errors.

Terrorism Watch Lists

In the case of terrorism watch lists, a type I error might be failing to thoroughly search someone who intends to hijack or crash an airliner. A type II error would be doing a thorough search of an ordinary citizen. Our current policy is to commit massive numbers of type II errors, as all of us go through metal detectors, take off our shoes, show picture IDs, and so on. In theory, if we cut down on type II errors, we would be increasing the risk of type I errors.

It is common to hear people complain about finding themselves erroneously placed on terrorism watch lists. However, the only way to be completely certain that no one is being placed on a watch list erroneously would be to take everyone off the watch list. Presumably, that would expose us to more type I errors -- it would make life easier for terrorists.

As with medicine, the only way to reduce type I and type II errors is with better information. The more accurate our database of people linked to terrorism, and the more accurate the profiling of potential terrorists, the better job an agency can do in minimizing both types of errors. The less profiling and data mining we allow, the more the security agencies have to treat innocent civilians as potential terrorists, so that we all get searched more frequently and intrusively.

Subprime Mortgages

Recently, high default rates on subprime mortgages have been in the news. A subprime mortgage is a loan made to a borrower with an imperfect credit history and relatively little equity (a down payment of less than 10 percent). If the borrower subsequently defaults, that is a type I error. Conversely, if the lender turns down an applicant who would have actually repaid the mortgage, that is a type II error.
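The same trade-off can be shown with a toy approval rule. The credit scores and repayment outcomes below are hypothetical, invented for illustration, as is the assumption that lenders approve everyone above a score cutoff:

```python
# Toy mortgage example (hypothetical data): each applicant has a credit
# score and an outcome we can see only in hindsight.
applicants = [  # (credit score, would the borrower actually repay?)
    (720, True), (680, True), (640, True), (610, False),
    (590, True), (560, False), (530, False),
]

def lending_errors(cutoff, pool):
    # Type I: approved (score >= cutoff) but the loan defaults.
    type_i = sum(1 for score, repays in pool if score >= cutoff and not repays)
    # Type II: turned down (score < cutoff) but would have repaid.
    type_ii = sum(1 for score, repays in pool if score < cutoff and repays)
    return type_i, type_ii

print(lending_errors(620, applicants))  # tight lending: (0, 1)
print(lending_errors(550, applicants))  # loose lending: (2, 0)
```

Loosening the cutoff from 620 to 550 eliminates the type II error (the creditworthy borrower at 590 now gets a loan) at the price of two type I errors (the defaults at 610 and 560).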

Since the mid-1990s, lenders have been under increased pressure from regulators to improve access to home ownership. Banks and other lenders have been told to increase their loans to "underserved markets," to the point where at least one industry insider was complaining two years ago about "overserved markets."

The regulatory pressure on lenders caused a reduction in type II errors -- fewer risky borrowers were turned down. The rate of home ownership increased, despite soaring house prices.

Along with the reduction in type II errors came an increase in type I errors -- some estimates are that 15 percent of the subprime mortgages are in default. With "whiplash speed," politicians have shifted from blaming mortgage lenders for being too tight to blaming them for being too loose.

I do not wish to absolve the mortgage lending industry. There have been some very questionable practices, including out-and-out fraud. But it is important to understand the basic trade-off that when you try to turn down fewer good borrowers, you let more loans get through that are not going to perform.

The Moral

The moral of the story is that when people must make decisions without perfect information, they will make mistakes. The fact that decisions turn out wrong does not by itself condemn the decision-making process. The fact that doctors sometimes under-treat and sometimes over-treat does not prove that for-profit medicine is worse than socialized medicine. The fact that some terrorists elude detection and some people get put on terrorist watch lists by mistake does not prove that watch lists have no value. The fact that some potentially good mortgage borrowers get turned down and other borrowers go into default does not prove that regulators could do a better job of making mortgage lending decisions.

To make improvements in these areas requires more sophisticated analysis. Given the level of information, the only way to improve decision-making is to ensure that the costs of different errors are properly taken into account. In our health care system, with over 85 percent of the cost of medical services paid for by third parties, there is reason for concern about over-treatment.
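One way to take the costs of different errors into account is an expected-cost rule. The sketch below uses hypothetical numbers for the probability a scan matters, the harm of a missed diagnosis, and the price of the scan:

```python
# Cost-weighted decision rule (hypothetical numbers, for illustration):
# order the scan only when its expected benefit outweighs its price.
def order_mri(p_valuable, cost_of_missing, cost_of_scan):
    # Expected cost of skipping = chance the scan would have mattered
    # times the harm of missing it; compare against the scan's price.
    return p_valuable * cost_of_missing > cost_of_scan

print(order_mri(0.001, 100_000, 800))  # 0.1% chance it matters -> False
print(order_mri(0.050, 100_000, 800))  # 5% chance it matters -> True
```

When a third party pays, the patient's effective cost_of_scan falls toward zero and the rule tips toward ordering every scan, which is the over-treatment bias described above.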

The best improvements come from introducing better information into the decision-making process. Better information can help reduce both kinds of errors. More research on medical protocols, such as a recent heart stent study, would help doctors make better decisions. Better information on terrorist links and profiles would lead to better security practices. Improvements in the use of credit histories and property databases have helped mortgage lenders make better loan decisions.

Arnold Kling is the author of Learning Economics and Crisis of Abundance: Rethinking How We Pay for Health Care.

