COMMENTS SUBMITTED TO THE NATIONAL TELECOMMUNICATIONS AND INFORMATION ADMINISTRATION, U.S. DEPARTMENT OF COMMERCE:

INTERNET PROTECTION MEASURES AND SAFETY POLICIES

Introduction

The Free Expression Policy Project (FEPP) is a two-year-old think tank on artistic and intellectual freedom. Its goals are to provide empirical research and policy development on difficult censorship issues, and to seek free speech-friendly solutions to the concerns that drive censorship campaigns. In these comments, FEPP will outline the serious educational problems inherent in Internet filtering technology, and will suggest more effective ways of addressing concerns about minors' access to the wide variety of content on the World Wide Web.

The issue of Internet filtering and other forms of "technology protection" has, in the past few years, consumed legislators, educators, advocates, study committees, and, most recently, courts. The recent National Research Council report, Youth, Pornography, and the Internet; the extensive fact-findings by the federal district court in American Library Association v. United States; and many other sources now provide thoughtful documentation of the myriad problems with Internet filtering, and discuss alternative, non-censorial approaches to educating youth about disapproved messages and other disturbing content online.

The Free Expression Policy Project has contributed to this discussion by publishing two in-depth reports, Internet Filters: A Public Policy Report and Media Literacy: An Alternative to Censorship; by submitting a white paper to the National Research Council; and by distributing other materials on our Web site (www.fepproject.org). With this submission, we enclose the white paper and the two policy reports.

Background: The Growth of Internet Filtering

Manufacturers began developing Internet filtering software in the mid-1990s, initially as a tool for parents to control their children's Web surfing. But the companies soon began marketing the software to schools and libraries as well, and the Clinton Administration actively encouraged this marketing in the wake of the Supreme Court's 1997 decision striking down the Communications Decency Act (CDA), Congress's first attempt to shield minors from online pornography.1 Filtering, at this time, was touted as a "less restrictive alternative" to criminal laws such as the CDA.

Even then, however, the over-blocking tendencies of Internet filters were known. With a rapidly expanding Web (approaching a billion pages by the late 1990s), filters had to rely on key words and phrases to identify sites that would generally be thought inappropriate for minors. Examples of clearly erroneous, and often ludicrous, blocking were legion: information on breast cancer, Beaver College, pussy willows, magna cum laude, and the Web site of Congressman Dick Armey.2
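
To make the mechanics concrete, the sketch below (a simplified illustration of our own devising in Python, not any vendor's actual code, with an invented word list) shows how blocking on target strings inevitably sweeps in exactly the kinds of innocent pages just listed, because the software matches characters, not meaning.

    # A simplified illustration of key word blocking; the word list is
    # hypothetical, not taken from any actual filtering product.
    BLOCKED_WORDS = ["breast", "beaver", "pussy", "cum", "dick"]

    def is_blocked(page_text: str) -> bool:
        """Flag a page if any target string appears anywhere in its text."""
        text = page_text.lower()
        return any(word in text for word in BLOCKED_WORDS)

    # Every one of these innocent pages is blocked:
    print(is_blocked("Support groups for breast cancer survivors"))  # True
    print(is_blocked("Admissions information for Beaver College"))   # True
    print(is_blocked("Planting pussy willows in a home garden"))     # True
    print(is_blocked("She graduated magna cum laude in 1998"))       # True
    print(is_blocked("The office of Congressman Dick Armey"))        # True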

Although some manufacturers today claim to have corrected these problems, and describe their operations as "artificial intelligence" rather than key word blocking, filtering still must, by necessity, rely on mechanical methods to identify potentially troublesome sites. Numerous studies and reports document vast amounts of both over- and under-blocking – for filters also fail to block many pornographic sites.3

Most filtering companies do have employees who engage in some, albeit necessarily brief, human review of Web sites. Their judgments, however, are subjective, and reflect the manufacturers' social and political views. Some filtering systems reflect conservative religious views.4 No filter can identify and block only illegal obscenity, or sexual material that is sufficiently "patently offensive," "prurient," and lacking in "serious value" to fit within the fluid but legally cognizable category of "harmful to minors."5

Despite these well-known problems, the majority of public schools had already adopted some form of Internet filtering even before Congress required them to do so as part of the 2000 Children's Internet Protection Act (CIPA). As a result, students now accessing the Internet at most schools are dramatically restricted in their research. Educational decisions by teachers and school librarians have been replaced by censorship decisions of private companies that generally do not disclose their operating methods, their political biases, or their lists of blocked sites.6 Moreover, filters exacerbate the "digital divide," because well-off students can get full Internet access at home, but those in lower economic brackets must rely on the restricted, often irrational operation of filtering software.

The National Research Council Report and Alternatives to Censorship

On May 2, 2002, the National Research Council (part of the prestigious National Academy of Sciences) released a 402-page report in response to a request from Congress back in 1998. Congress had asked for research and advice on "Tools and Strategies for Protecting Kids From Pornography and Their Applicability to Other Inappropriate Internet Content." The NRC's Computer Science and Telecommunications Board established a committee that held hearings, conducted research, and put out a call for white papers from anyone with information to contribute on this highly charged subject.

In response, the Free Expression Policy Project submitted a white paper that made three basic points. First, there is no evidence that minors are actually harmed if they seek out or stumble upon sexually explicit content on the Internet. Public concerns about unrestricted Internet access are based on widely shared (and by no means unimportant) views of proper moral education, not on demonstrated proof of harm. Second, there are profound problems with Internet filters as a solution to these social concerns, not only because filters block out vast amounts of valuable Internet content, but because technological quick fixes do nothing to educate youth about responsible and healthy sexuality. And third, there are non-censorial, educational ways of addressing concerns about youth access to sexually explicit content -- among them, media literacy, Internet Acceptable Use policies, and comprehensive sexuality education.7

In its report, the NRC essentially agreed with all three of these points. As to "protecting" minors, it noted that although "people have very strong beliefs on the topic," there is no consensus among experts about the impact of sexually explicit material. The committee said that there are some hardcore depictions "of extreme sexual behavior whose viewing by children would violate and offend" its "collective moral and ethical sensibilities, though this sentiment would not be based on scientific grounds." As to other explicit, but less extreme, sexual material, "protagonists in the debate would be likely to part company" on whether it is inappropriate or harmful.8

As for technological tools for "protecting kids from pornography and other inappropriate Internet content," the NRC report noted the drawbacks of filters. Identifying "inappropriate material" through filters, it explained, "must rely either on an automated, machine-executable process for determining inappropriate content or on a presumption that everything that is not explicitly identified by a human being as appropriate is inappropriate. An approach based on machine-executable rules abstracted from human judgments inevitably misses nuances in those human judgments, ... while the presumption-based approach necessarily identifies a large volume of appropriate material as inappropriate." Either way, misclassifications inevitably occur, even putting aside differences in human judgment as to what is appropriate. Moreover, "technology solutions are brittle, in the sense that when they fail, they fail catastrophically"; they are expensive; and most youngsters can circumvent filters if they want to.9
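
The two approaches the NRC contrasts can be sketched briefly (again a hypothetical illustration of our own, with an invented rule and invented site lists): a rule-based blocklist misses nuance in both directions, while a presumption-based allowlist blocks the entire unreviewed Web by default.

    # Rule-based approach: a mechanical rule abstracted from human judgment.
    RULE_BLOCKED_WORDS = {"sex"}

    def rule_based_blocked(page_text: str) -> bool:
        # Misses nuance: blocks a sex education curriculum (and even the word
        # "Middlesex"), while passing pornography that avoids the trigger term.
        return any(w in page_text.lower() for w in RULE_BLOCKED_WORDS)

    # Presumption-based approach: everything not human-approved is blocked.
    HUMAN_APPROVED_SITES = {"www.loc.gov", "www.si.edu"}  # illustrative list

    def allowlist_blocked(host: str) -> bool:
        return host not in HUMAN_APPROVED_SITES

    print(rule_based_blocked("Middlesex County sex education curriculum"))  # True
    print(allowlist_blocked("www.a-new-science-site.org"))  # True: unreviewed, so blocked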

Finally, the NRC report emphasized the importance of education -- especially in media literacy and sexuality -- as the only effective ways to address concerns about youth access to sexually explicit content. "Information and media literacy," it said, "provide children with skills in recognizing when information is needed and how to locate, evaluate, and use it effectively, ... and in critically evaluating the content inherent in media messages. A child with these skills is less likely to stumble across inappropriate material and more likely to be better able to put it into context if and when he or she does. Media literacy helps children and adolescents question whether sexually explicit images are realistic, and why they are being shown. A media-literate individual understands how to evaluate the truthfulness and reliability of a media message," and "the motivations and intent of the parties responsible" for it.10

Similarly, the NRC found, sexuality education "offers a useful context for interpreting sexually explicit material." European children, who receive "early, frequent, and comprehensive sex education in a way that is not typical in the United States," have been "exposed to nudity and explicit material at a relatively young age" and "do not show higher levels of sexual addiction or teen pregnancy."11

The committee compared these educational approaches to swimming. "Swimming pools can be dangerous for children. To protect them, one can install locks, put up fences, and deploy pool alarms. All of these measures are helpful, but by far the most important thing that one can do for one's children is teach them to swim."12

The NRC's emphasis on media literacy education was fleshed out in the Free Expression Policy Project's 2002 report, Media Literacy: An Alternative to Censorship. The report, which is enclosed with this submission, is a detailed survey of the history and current state of media literacy education, explaining why it is far preferable to "indecency" laws, Internet filters, or other censorship devices as a means of addressing concerns about youthful access to inappropriate content. The report includes five recommendations for public policies to advance media literacy.13


The Court's Findings in American Library Association v. United States

The Children's Internet Protection Act, passed late in 2000, mandates filtering software in all schools and libraries as a condition of several different types of federal aid, including e-rate discounts for Internet connections. A constitutional challenge was brought to the library provisions of CIPA, and in May 2002, a three-judge federal district court ruled them unconstitutional. The decision in American Library Association v. United States contained extensive fact-findings about the current operation of filtering software.

The court found that the major filters used by libraries (Cyber Patrol, SmartFilter, and the N2H2 company's Bess) both over-block vast amounts of valuable expression and fail to block substantial amounts of pornography. These deficiencies are inherent in the nature of filtering, in particular its reliance on "artificial intelligence" (key words or strings of words) to identify disapproved Internet sites. The court gave dozens of examples of over-blocking, from a Knights of Columbus site, misidentified by Cyber Patrol as "adult/sexually explicit," to a site on fly fishing, misidentified by Bess as "pornography."14

The court described in detail the process of "harvesting" and "winnowing" that filtering software employs. It explained how large portions of the Web are never even reached by the harvesting and winnowing programs. It noted that although all filtering companies deposed in the case insisted that they use "some form of human review" before blocking sites, some 10,000-30,000 Web pages enter the "work queue" each day, to be reviewed by staffs of between eight and "a few dozen" people. "Given the speed at which human reviewers must work to keep up with even a fraction of the approximately 1.5 million pages added to the publicly indexable Web each day, human error is inevitable." None of the staff are trained in legal categories such as obscenity. Filters block all pages on a site, no matter how innocent, based on a "root URL." Likewise, one item of disapproved content – for example, a sexuality column on Salon.com – will result in blockage of the entire site.15
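
The "root URL" practice the court describes can be sketched as follows (a minimal, hypothetical illustration; the specific page addresses are invented): because the block is recorded at the level of the host rather than the page, one disapproved item places an entire site off limits.

    from urllib.parse import urlparse

    # Blocking is recorded per host ("root URL"), not per page.
    blocked_hosts = set()

    def block_site_for_page(offending_page_url):
        blocked_hosts.add(urlparse(offending_page_url).hostname)

    def is_blocked(url):
        return urlparse(url).hostname in blocked_hosts

    # One sexuality column judged inappropriate...
    block_site_for_page("https://www.salon.com/sex/advice-column")
    # ...and every other page on the site is blocked with it:
    print(is_blocked("https://www.salon.com/politics/campaign-coverage"))  # True
    print(is_blocked("https://www.salon.com/books/reviews"))               # True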

Following the Supreme Court's decision in Reno v. ACLU, the three-judge court noted that the Internet is a "vast democratic forum," with expression on subjects "as diverse as human thought." Filters keep certain ideas and viewpoints out of this public forum, in contravention of basic First Amendment values. Even as a condition of funding, government cannot require libraries to censor expression in this way.16

Significantly, the court struck down CIPA as applied to both adults and minors. It found that there are a variety of "less restrictive" ways for libraries to address concerns about illegal obscenity on the Internet, and about minors' access to material that most adults consider inappropriate for them -- including Acceptable Use policies, Internet use logs, and supervision by library staff.17

The American Library Association case did not deal with the portion of CIPA that requires filters in schools, and those provisions are now in effect. But much of the reasoning, and all of the fact-finding, in this careful and lengthy court decision has application to educational institutions. Filters are mechanical devices that mindlessly block vast amounts of useful -- if sometimes controversial -- ideas and information. They are not even effective in blocking pornography. They are easily circumvented by sophisticated youngsters. They undoubtedly make blocked sites more attractive. And they inhibit, rather than advance, education.

Conclusion

America does a disservice to its youth when it imposes Internet filters instead of teaching responsible Web surfing, solid research techniques, and critical thinking skills. That said, there are differences among filters, which schools subject to CIPA should keep in mind.

First, CIPA requires only a filter that will block obscenity, child pornography, and "harmful to minors" material – i.e., material with "prurient" sexually explicit content. Of course, no filter can do this. But filtering products do offer a wide variety of subject matter categories. Schools should use only filters with a category that, at least in theory, is limited to "adult" or "pornography." No other category needs to be activated.

Second, some filters are more censorious than others; different filtering companies have different ideological preferences. Administrators should look for a company that is not associated with a particular religious or political perspective.

Third, some filters have client-side programs that permit customers to unblock particular sites. Although this does not address the problem that arises when students doing research simply do not know what sites are not showing up in their searches, it can help when a student (or teacher or librarian) complains that a particular site they are seeking has been blocked.
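
A client-side unblocking feature amounts to a local override list layered on the vendor's blocking decisions. The sketch below (again a hypothetical illustration; the site names are invented) shows the mechanism -- and why it helps only once someone discovers that a site is blocked.

    # The vendor's block list, as shipped; the names are hypothetical.
    vendor_blocked_hosts = {"www.example-health-clinic.org"}
    local_overrides = set()  # sites a teacher or librarian has unblocked

    def unblock(host):
        """Record a local decision to override the vendor's block."""
        local_overrides.add(host)

    def is_blocked(host):
        return host in vendor_blocked_hosts and host not in local_overrides

    print(is_blocked("www.example-health-clinic.org"))  # True: vendor block in force
    unblock("www.example-health-clinic.org")            # complaint received and acted on
    print(is_blocked("www.example-health-clinic.org"))  # False: local override applies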

Finally, those interested in working toward a repeal of CIPA should keep records of the problems with filters – expense, administrative burden, inappropriate and viewpoint-based blocking – and should collect evidence of the effectiveness of non-censorial alternatives. Ultimately, educators are in the best position to persuade both Congress and the public of the perniciousness of Internet filtering.

August 26, 2002

NOTES

1. For a description of these events, see Marjorie Heins, Not in Front of the Children: "Indecency," Censorship, and the Innocence of Youth (Hill & Wang, 2001), pp. 180-86. The Communications Decency Act essentially criminalized "indecent" or "patently offensive" communications online; see the Supreme Court decision, Reno v. ACLU, 521 U.S. 844 (1997).

2. See Free Expression Policy Project, Internet Filters: A Public Policy Report (fall 2001).

3. Internet Filters: A Public Policy Report summarizes the results of more than 70 such studies. See also the Commission on Child Online Protection (COPA), Report to Congress (Oct. 20, 2000), http://www.copacommission.org/report/; the federal district court decisions in Mainstream Loudoun v. Board of Trustees of the Loudoun County Library, 2 F. Supp. 2d 783 (E.D. Va. 1998) and 24 F. Supp. 2d 552 (E.D. Va. 1998); and Lawrence Lessig, "What Things Regulate Speech: CDA 2.0 vs. Filtering," 38 Jurimetrics 629 (1998).

4. See Nancy Willard, Filtering Software: The Religious Connection (Responsible Netizen, 2002).

5. See Ashcroft v. ACLU, 122 S. Ct. 1700 (2002).

6. See, e.g., Anemona Hartocollis, "School Officials Defend Web Site Filtering," New York Times, Nov. 11, 1999 (students studying the Middle Ages were barred from Web sites about medieval weapons, including that of the American Museum of Natural History, as well as from other educational sites such as Planned Parenthood, CNN, the Daily News, and sites discussing anorexia and bulimia).

7. Free Expression Policy Project, "Identifying What is Harmful or Inappropriate for Minors," white paper to the National Research Council (Mar. 5, 2001).

8. National Research Council, Computer Science and Telecommunications Board, Youth, Pornography, and the Internet (National Academy Press, 2002), Executive Summary. See also ch. 10 ("Perhaps the most important point for adults to keep in mind is that many children may be better able to handle exposure to inappropriate material than adults give them credit for. ... [M]ost of the older teenagers with whom the committee spoke reported that today much of the sexually explicit material they encountered online was not a big deal to them. Even younger teenagers -- in particular the teen Cyberangels who testified to the committee -- seem to have been exposed to such material without apparent harm").

9. Id., Executive Summary; ch. 11, 12.

10. Id., Executive Summary; ch. 10.

11. Id., ch. 6.

12. Id., Executive Summary.

13. Free Expression Policy Project, Media Literacy: An Alternative to Censorship (2002).

14. American Library Association v. United States, 201 F. Supp. 2d 401, 431-48 (E.D. Pa. 2002).

15. Id.

16. Id. at 469.

17. Id. at 480-84.

