TCS Daily


Filter This Article

By Ryan H. Sager - April 1, 2002 12:00 AM

"Sex," "breast," "XXX" -- any of these words, embedded in the text of this Web page, could prevent you from reading this article if you were using a computer set up to filter out obscene content from the Internet (or peak your interest to read further if you were a twelve-year-old boy). Regardless of the general inaccuracy of filtering software, however, Congress has decided that starting later this year public libraries and schools must begin censoring the Web or risk losing federal funding for Internet access.

The law in question, called the Children's Internet Protection Act (CIPA), is the latest incarnation of the Communications Decency Act and the Child Online Protection Act, both of which were products of the late-nineties panic over the fact that the Internet had suddenly made all sorts of naughty pictures and videos and Flash animations available on a home theater near you.

Both CDA and COPA met with ignominious fates: CDA, passed in 1996, was tossed out after a quickie by the Supreme Court, which found that it violated the First Amendment; COPA, passed in 1998, has been tied up by an injunction for years and is expected to be snuffed out in a final ruling a few months from now.

Like the two previous laws, CIPA is now headed for a court challenge. Unlike its predecessors, however, CIPA has a fairly good chance of eventually seeing the light of day.

Since CIPA is not a criminal law and only affects funding (mainly funds given to libraries and schools for Internet access under the E-rate program), it may well pass muster with the Supreme Court, regardless of what happens in Philadelphia this week as the American Civil Liberties Union and the American Library Association pursue their case against the law in a federal courtroom.

In fact, there hardly seems to be a legal question here, since the Supreme Court has already made clear that the line between restricting speech and restricting the funding of speech is a real and important one. In an 8-1 decision in 1998, the court ruled that the National Endowment for the Arts could be legally required to follow "general standards of decency and respect" (even if it couldn't be legally required to do anything worthwhile). The same logic would seem to allow Congress to tie funding of Internet access to filtering, if it so chooses.

Of course, as is too often forgotten in political debates in America, just because something is constitutional doesn't necessarily mean it's a good idea.

There are plenty of non-constitutional reasons Congress should scrap this amazingly restrictive, one-size-fits-all regulation. While the general thrust behind the law is defensible -- we probably don't want perverts sitting around libraries looking at www.pagebillclintongoestowhenhillarysoutoftown.com, and we probably also don't want our kids to find out what's supposed to happen when they type in www.whitehouse.com -- there has to be a better way.

Filtering is simply not the answer, at least not when dealing with adults. It is an inherently problematic technology, as it attempts to have software sort through mountains of Web sites and make largely subjective judgments based on keyword searches and more complex, but still automated, processes. For instance, can a computer program tell the difference between Playboy.com and a site about breast health? Probably, though not definitely. And what about a site with information about sexually transmitted diseases? A sex advice column? An article about Internet pornography?
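To see why, consider a minimal sketch of the keyword matching such filters rely on. Everything here -- the word list, the threshold, the scoring -- is hypothetical, invented for illustration rather than drawn from any actual product, but it captures how a page about breast health can trip the same wire as a pornographic one:

    # A naive keyword filter of the sort described above. The blocked-word
    # list and threshold are made up purely to illustrate the problem.
    BLOCKED_WORDS = {"sex", "breast", "xxx"}

    def is_blocked(page_text, threshold=1):
        """Block a page if it contains at least `threshold` flagged words."""
        words = page_text.lower().split()
        hits = sum(1 for w in words if w.strip(".,!?") in BLOCKED_WORDS)
        return hits >= threshold

    print(is_blocked("XXX pics! Sex, sex, sex!"))              # True
    print(is_blocked("Early detection of breast cancer"))      # True -- a false positive
    print(is_blocked("The library board meets on Tuesday."))   # False

Tuning the threshold just trades one error for another: set it strictly and the health site disappears; set it loosely and the porn slips through.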

A Stanford University linguist testifying at the trial in Philadelphia on Tuesday spent hours submitting around 300 sites to a software filter, and found that it rejected as "sex" and "pornography" the Web site of a California State University sex research center, a Canadian AIDS study and Salon.com.

There are human-based filtering systems, where screeners sift through Web pages and create safe lists of non-offensive sites -- but such undertakings are expensive and result in the creation of closely held proprietary lists, which nonetheless remain perpetually out of date. Furthermore, these lists are even more subjective than software-created lists, as they are rooted in the biases of those who compile them.
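The flip side of that approach is just as easy to sketch (the "safe list" entries below are, again, purely hypothetical stand-ins, not any vendor's actual list): anything a screener has not yet reviewed is blocked by default, which is exactly why such lists go stale as fast as the Web grows:

    # Sketch of a human-curated "safe list" filter. Real vendors' lists
    # are proprietary; these entries are hypothetical.
    SAFE_SITES = {"www.loc.gov", "www.nasa.gov", "www.si.edu"}

    def is_allowed(host):
        # Unreviewed sites are blocked by default, however innocent.
        return host in SAFE_SITES

    print(is_allowed("www.nasa.gov"))          # True
    print(is_allowed("www.new-library.org"))   # False -- no one has screened it yet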

With filtering as problematic as it is, libraries and schools should be allowed to take it or leave it as they see fit. Many would likely take it, at least for limited uses. A very reasonable solution, for instance, would be to install filtering software on computers used by minors, while leaving adults to use less restricted or unrestricted terminals. Such is the strategy of the Fort Vancouver library in Washington State, whose associate director testified last week.

The San Francisco Public Library is also considering using filtering technology in its children's areas, though it intends to eschew any filtering for adults. For them, San Francisco, true to the city's character, has a fairly laid-back policy: if someone complains that your surfing is offending them, you'll be asked to move to another terminal.

If nothing else, the tap on the shoulder could give librarians around the country something to do other than hiss "shhh!" at rowdy teenagers, and it should be a more than sufficient solution for most small-town libraries. Larger libraries and schools could use filtering to whatever extent they find necessary. There's simply no reason to think that local institutions won't act without Congress's prodding, and there's every reason to think that local solutions will best fit local sensibilities.

If there's a problem with people perusing porn in public, are towns and cities going to sit around waiting for Congress to act? Of course not. But simply allowing towns and cities to adhere to their own common sense doesn't give lawmakers anything to write home about -- thus, we get a bad law instead of what we really need: no law. Now that's obscene.