TCS Daily


The Risky Business of Understanding Risk

By Henry I. Miller - February 3, 2004 12:00 AM

Americans are more and more risk-conscious. We buy muscular SUVs, spend billions annually on dietary supplements, and try to cut down on saturated fats. Often, we learn about risks by relying on the media to interpret medical research and other information that purports to disclose what is bad (or good) for us. Some people even make decisions about risk on the basis of what they learn from the likes of Jay Leno and David Letterman: That's risky business, if you ask me.

Tad Friend wrote in The New Yorker, "It often seems that there is only one show on television, 'Dateline NBC: 48 Hours of 20/20, PrimeTime Thursday,' and that this show endlessly repeats one basic story: The Thing That Went Terribly Wrong." Stories about the risk of this or that, which are now often the bread and butter of media reporting, give new meaning to the old TV news maxim, "If it bleeds, it leads." It has become, "If someone alleges that it bleeds, it leads."

To be fair, risk analysis is complex and difficult. Ideally, the risks faced by a group of people exposed to a particular substance, activity or lifestyle are assessed by the "scientific method" -- careful experimentation carried out according to certain agreed-upon rules. But too often, the media characterize risks simply through anecdotal reports or a kind of superficial shorthand: "Do cell phones cause brain tumors?" "Are designer genes in your lunch? Details at 11!"

Stories about health care products are a case in point. By their very nature, drugs, vaccines and medical devices often exert profound effects and are intimately involved in life-and-death situations, so it is not surprising that they present some risks. For that reason, all drugs and vaccines undergo rigorous testing and are approved by government regulators only if they show net benefit.

Assessing risk is a matter of probability, not precision. Solid scientific evidence is the result of painstaking research and analysis that are, nevertheless, fraught with some degree of uncertainty. In the end, decisions about the relative risks and benefits of drugs require complex judgments -- on the part of patients, physicians and government regulators, all of whom have a role in determining when such risks should be undertaken.

The Uses and Limits of Epidemiology

When we are concerned with the effects, beneficial or otherwise, of certain kinds of actions, one recognized way to identify risk is through epidemiological studies. These are the studies, often performed by researchers at institutions like schools of public health, that attempt to ascertain which foods or behaviors (such as green vegetables or exercise) alter the chance of contracting cancer or some other disease.

Epidemiologists look at groups of people and what they have in common, and using carefully designed protocols, tease out important correlations. Is moderate wine consumption correlated with any difference in the incidence of heart disease, for example, compared to abstinence? This kind of information about correlations between events is useful, but not conclusive; the next, more informative step in analysis tries to demonstrate "causation." People frequently fall into the tempting trap of what physicians call the post hoc ergo propter hoc ("after this, therefore because of it") fallacy, which is based upon the mistaken notion that simply because one event follows another, the first caused the second. Many events follow sequential patterns without being causally related -- for example, eating chicken soup and the disappearance of a cold, or the discovery of a fire and the arrival of fire engines.

Epidemiology is an important tool in determining what is and what is not a public health concern. For example, this branch of science has shown that smokers are more likely to suffer from lung cancer or emphysema than non-smokers. At the same time, epidemiology also has been a key factor in debunking a number of public health scares through the years. Although anecdotal media reports at one time implied that silicone breast implants, cellular phones and electromagnetic radiation from power lines cause disease, extensive epidemiological studies have indicated that there is, in fact, no correlation. In these cases, a rush to judgment in the news media -- and in the case of breast implants, by government regulators -- created public panic (and stimulated litigation) that years later was shown by sound scientific research to be unfounded.

The Emotional Dimension of Risk Perception and Response

It is difficult, especially for the non-expert, to be dispassionate and objective about technological risk, whether the issue is a new drug or a technology used to produce new genetic varieties of plants for food. This "emotional dimension" of concerns about technology's potential risk to public health or the environment is less readily addressed but can have a profound impact on consumers' acceptance of new technology.

As the government makes decisions about consumer products, attempts from various quarters to mislead and intimidate the public may distort the accurate assessment of risks, benefits and possible alternatives. This can lead to decisions that are harmful from both an economic and a humanitarian perspective. Understanding the emotional dimension can therefore help experts and non-experts alike confront the public's largely emotional responses, so that everyone can make more clear-headed decisions, free from manipulation.

Several subjective factors can cloud thinking about risks and have been prominent in various controversies:

Uncertainty, ambiguity, and (gasp!) statistics. Studies of risk perception have shown that people tend to overestimate risks that are unfamiliar, hard to understand, invisible, involuntary, and/or potentially catastrophic -- and vice versa. Thus, they overestimate invisible "threats" such as electromagnetic radiation and trace amounts of pesticides in foods, because these phenomena inspire uncertainty and fear sometimes verging on superstition. Conversely, they tend to underestimate risks whose nature is relatively clear and comprehensible, such as using a chain saw or riding a motorcycle.

Contributing to these emotions may be poor scientific literacy in general and unfamiliarity with the statistical aspects of risk in particular. For example, the American Cancer Society's report, Cancer Facts & Figures 2001, notes that obesity raises breast cancer risk in postmenopausal women by 50 per cent. How significant is that degree of increased risk? Should the overall level of risk be of concern?

The impact of an increase in risk -- 50 per cent in this example -- depends on the starting point: a doubling from 0.01 per cent to 0.02 per cent is very different in its real-world impact from a doubling from 40 per cent to 80 per cent. But in this example, what does the increased risk mean to an individual? It is useful to analyze this kind of risk conundrum in terms of the probability of not experiencing the event or disease in question. With respect to obesity and breast cancer, the overall probability of a woman contracting breast cancer after menopause is about eight per cent; therefore, the probability that she will not contract it is 92 per cent. If, as reported by the ACS, obesity increases the cancer risk by 50 per cent, to 12 per cent, the probability of not contracting breast cancer drops from 92 to 88 per cent -- a drop that most people would find considerable.
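To make the arithmetic concrete, here is a minimal sketch in Python, using the ACS figures cited above (the function and variable names are illustrative, not drawn from any source):

    # Convert a relative risk increase into absolute probabilities.
    def absolute_risk_change(baseline, relative_increase):
        new_risk = baseline * (1 + relative_increase)
        return new_risk, 1 - baseline, 1 - new_risk

    baseline = 0.08            # ~8 per cent baseline risk of postmenopausal breast cancer
    relative_increase = 0.50   # ACS: obesity raises that risk by 50 per cent

    new_risk, before, after = absolute_risk_change(baseline, relative_increase)
    print(f"Risk rises from {baseline:.0%} to {new_risk:.0%}")            # 8% -> 12%
    print(f"Chance of NOT contracting it: {before:.0%} -> {after:.0%}")   # 92% -> 88%

The same 50 per cent relative increase applied to a baseline of 0.01 per cent would shift the absolute risk by a vanishingly small amount, which is why the starting point matters so much.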

Information overload. At best, non-experts are likely to understand only a limited number of aspects of a risk analysis problem, so they are easily overloaded with data. Information overload of the public is a strategy often used by those who would elicit fear about or disparage new technology. In one short peroration on gene-spliced, or genetically modified (GM) foods, for example, an anti-technology activist might address a large number of complex and unrelated issues: the consumer's "right to know" via product labeling, the "vegetarian issue" of fish genes introduced into tomatoes, the safety and socioeconomic issues of a protein used to increase milk production in dairy cows, and the alleged environmental hazards of crop plants used to synthesize pharmaceuticals.

Activists regularly deluge the public with irrelevant, untrue, or partly true information that leaves the non-expert overwhelmed and bewildered, and this can lead to poor judgment and snap decisions.

We have seen this, perhaps without any intent to deceive, as Americans have been confronted by a profusion of conflicting studies on diet. This has caused a "nutrition backlash," marked by many people returning to unhealthy eating, according to a study in the Journal of the American Dietetic Association. The study, by researchers at the Fred Hutchinson Cancer Center in Seattle, Washington, found that over 40 per cent of respondents to a survey were confused by and tired of hearing about what foods they should or should not eat, and that, perhaps as a result, the diet of this group contained more fat and fewer fruits and vegetables than average.

Splitting and projection. A common response to fear and uncertainty is to split those involved in controversy into opposite camps -- us vs. them -- and to project onto them conspiratorial and iniquitous intentions. Psychologically, this is an attempt to reduce anxiety and re-impose certainty and clarity. These defense mechanisms may be activated especially easily when the "enemy" is painted as faceless, profit-hungry, multinational companies that will benefit handsomely from the sale of products. But such mechanisms are unproductive, because they polarize thinking, encourage one-sidedness and actually distort sound decision-making.

Yearning for a return to purity and innocence. This romantic, juvenile view, reflecting a wish to escape from complex realities and choices, can give rise to a kind of puritanical, anti-technological view of the world. Purity and simplicity become desired ends in themselves, to the exclusion of other goals such as maximizing individual choice and thinking quantitatively. Perhaps this is part of the motivation for the movement in Europe and America toward "natural products" like herbs as substitutes for conventional medicines, although their quality control is often deficient, few have been proven efficacious at all, and many have serious side effects. At best, only a handful of the hundreds of herbal supplements available have met the standards required for drugs.

Manipulation of environmental and health anxieties. Most Americans (including this writer) consider themselves to be "environmentalists," but the hidden agenda of many of those who have attempted the "greening" of Western societies and governments -- environmental organizations, certain political leaders, and the media -- appears to be their own self-interest. An unfortunate by-product of their misinformation campaigns is progressively more widespread acceptance of junk science. A deplorable example is the environmental movement's crusade to rid society of chlorinated compounds, which has extended even to opposing the chlorination of drinking water. By the late 1980s, environmental activists were attempting to convince water authorities around the world of the possibility that carcinogenic byproducts of chlorination made drinking water a potential cancer risk. Facing a revenue shortfall during a budget crisis, Peruvian officials used this supposed threat to public health as a justification to stop chlorinating much of their country's drinking water. That decision contributed to the acceleration and spread of Latin America's 1991-1996 cholera epidemic, which afflicted more than 1.3 million people and killed at least 11,000.

What is being lost to muddled thinking about risk is the ability to discriminate between plausibility and provability, between plausibility and reality.

Think Critically

What, then, can non-experts do to better understand and come to terms with the emotional aspects of risk?

First, be skeptical of language that is inflammatory but vague. For example, when activists raise questions about public health or environmental effects of a new technology, insist that certain questions be answered. "How different is this product from what has come before? What are its advantages and potential benefits? How extensively has it been tested? How likely, in fact, are the purported risks? What are the risks of not using the technology?"

Second, in order to distinguish genuine health or environmental concerns from scare tactics, seek answers from genuine experts who can convey information in a way that is scrupulously honest but also simple enough to be understood. Concrete examples, especially relevant historical analogies, are often useful.

Third, become active when issues of risk are discussed and debated. That's preferable to leaving the platform to the extremists.

Fourth, demand your right to have and to choose among new, innovative products in the marketplace, subject to scientific and sensible government regulation that is free from condescension and misinformation.

In the end, while fears of new technology and its associated risks may be inevitable, they must be tempered with knowledge. Recall Sherlock Holmes' admonition in A Scandal in Bohemia that "it is a capital mistake to theorize before one has data."

Henry I. Miller, a physician and molecular biologist, is a fellow at the Hoover Institution. He was at the NIH and FDA from 1977 to 1994. His next book, "The Myth of Frankenfood: How Protest and Politics Threaten the Biotech Revolution," will be published in the spring by Praeger Publishers.

