ELECTRONIC FRONTIER FOUNDATION

INTERNET FREE EXPRESSION ALLIANCE


Joint Statement for the Record on
"Kids and the Internet: The Promise and the Perils"

Submitted to the National Commission on Libraries and Information Science by:

American Booksellers Foundation for Free Expression
American Civil Liberties Union
Computer Professionals for Social Responsibility
Electronic Frontier Foundation
Electronic Privacy Information Center
Journalism Education Association
National Campaign for Freedom of Expression
National Coalition Against Censorship
NetAction
Oregon Coalition for Free Expression

December 14, 1998

INTRODUCTION AND OVERVIEW

Undeniably, the Internet provides access to knowledge and expression in ways never before possible. It is a venue where "any person can become a town crier with a voice that resonates farther than it could from any soapbox," as the U.S. Supreme Court observed last year in its landmark decision in Reno v. ACLU. From remote learning classes to digital art museums to library collections and news from all over the world, Internet technologies provide an essential tool for learning and communication.

Recognizing the increased use and access provided by public libraries, the Commission has stated that it is holding this inquiry on use of the Internet in libraries to produce a report containing recommendations to assist library managers in addressing problems arising from public access Internet terminals in libraries where children may use them. The Commission has stated that "foremost of these problems is the potential for predation by pedophiles," and that its report will also deal with the concerns of parents about their children having access to inappropriate material, and privacy issues surrounding direct marketing efforts targeted at minors. We applaud the Commission for conducting this inquiry and believe that promoting safe and effective use of online resources is a laudable objective. We submit these comments not only to offer suggestions on protecting children's safety and privacy online, but also to put this debate in proper perspective and to avoid over-emphasis on exaggerated claims of abuse of online information. For example, we are not convinced that there is any correlation between library Internet access and predation by pedophiles. Sexual abuse of children, child pornography and obscenity are all illegal -- online and offline -- and are not constitutionally protected. In addition, existing laws prohibiting the dissemination of these materials are being enforced aggressively in cyberspace.

Moreover, we caution the Commission against drawing the conclusion that the online availability of constitutionally protected speech that some people find objectionable mandates the adoption of restrictive methods that contravene First Amendment principles. Indeed, the recent court decision in Mainstream Loudoun v. Loudoun County Library Board, holding that a library policy mandating the use of filtering software on all library terminals was unconstitutional, underscores the importance of not rushing to embrace overly restrictive solutions. The Mainstream Loudoun opinion (written by a judge who is a former librarian) found little or no evidence of harms presented by providing unfiltered access to constitutionally protected material in public libraries. The decision states, in pertinent part:

The only evidence to which defendant can point in support of its argument that the Policy is necessary consists of a record of a single complaint arising from Internet use in another Virginia library and reports of isolated incidents in three other libraries across the country. In the Bedford County Central Public Library in Bedford County, Virginia, a patron complained that she had observed a boy viewing what she believed were pornographic pictures on the Internet. This incident was the only one defendant discovered within Virginia and the only one in the 16 months in which the Bedford County public library system had offered unfiltered public access to the Internet. After the incident, the library merely installed privacy screens on its Internet terminals which, according to the librarian, "work great."

The only other evidence of problems arising from unfiltered Internet access is described by David Burt, defendant's expert, who was only able to find three libraries that allegedly had experienced such problems, one in Los Angeles County, another in Orange County, Florida, and one in Austin, Texas. There is no evidence in the record establishing that any other libraries have encountered problems: rather, Burt's own statements indicate that such problems are practically nonexistent. (See Burt Rep. at 253-55 acknowledging that an e-mail requesting information about sexual harassment complaints relating to Internet use that he sent to "several thousand" librarians did not yield a single serious response). Significantly, defendant has not pointed to a single incident in which a library employee or patron has complained that material being accessed on the Internet was harassing or created a hostile environment. As a matter of law, we find this evidence insufficient to sustain defendant's burden of showing that the Policy is reasonably necessary. No reasonable trier of fact could conclude that three isolated incidents nationally, one very minor isolated incident in Virginia, no evidence whatsoever of problems in Loudoun County, and not a single employee complaint from anywhere in the country establish that the Policy is necessary to prevent sexual harassment or access to obscenity or child pornography. (citations omitted)

In finding the Loudoun County mandatory filtering policy unconstitutional, the court provided important guidance that should be a starting point for the Commission in making recommendations about how best to provide guidance on Internet use in libraries. The court noted that:

In submitting this statement, we therefore seek to ensure that the Commission does not unnecessarily limit the vibrancy and openness of the Internet as a communication medium by embracing standards or techniques, such as blocking and filtering technologies, that may provide some members of the public with a false sense of security, while blocking access to valuable content.

Our testimony outlines:

  1. Reasons why libraries provide a crucial access point for many segments of our population, especially the poor, and why that makes providing unfiltered access so important.
  2. Why educational information and programs are both better public policy and constitutionally sound.
  3. Why blocking and filtering programs remove decision-making authority about material selection from librarians and parents.
  4. Why blocking software fails to block some speech that people may find objectionable, while blocking valuable and constitutionally protected speech.
  5. Suggestions for less restrictive alternatives that may better address concerns about privacy and safety online, provided in the final section of this statement.

We conclude that blocking and filtering software programs cannot possibly filter out all objectionable material and instead may provide communities with a false sense of security about providing access. We believe that no filter can offer the protections provided by education and training.

I. LIBRARIES PROVIDE THE ONLY ACCESS TO THE INTERNET FOR MANY INDIVIDUALS

While Internet communications are increasingly recognized as dynamic learning vehicles, it is also true that the gap between Americans with access -- either in their homes or libraries -- and those without such access has widened. A recent report by the National Telecommunications and Information Administration, "Falling Through the Net II: New Data on the Digital Divide," found that:

[d]espite th[e] significant growth in computer ownership and usage overall, the growth has occurred to a greater extent within some income levels, demographic groups, and geographic areas, than in others. In fact, the "digital divide" between certain groups of Americans has increased between 1994 and 1997 so that there is now an even greater disparity in penetration levels among some groups. There is a widening gap, for example, between those at upper and lower income levels....

Just as libraries have always been great equalizers, providing books and other information resources to help people of all ages and backgrounds live, learn and work, today libraries provide critical access to the wealth of information in the digital world. Indeed, the NTIA concluded that libraries and other community access centers will play a vital role in connecting many computer "have nots" until it is possible to make universal service for all households a reality. This finding has also been supported by a recent American Library Association and National Commission on Libraries and Information Science survey, which found that one in five public libraries serves populations with a poverty level of 20 percent or more and one in ten serves rural areas with poverty levels greater than 20 percent. Thus, safeguarding Internet access provided by libraries is an extremely important objective, not only because libraries have traditionally served as a forum for expressive activity, but because today they serve as the only access point to the vast world of online resources for large segments of our population.

II. EDUCATING USERS, NOT BLOCKING SPEECH, IS SOUND AND EFFECTIVE PUBLIC POLICY

Last year, the Supreme Court struck down the Communications Decency Act ("CDA"), which would have made it a crime to communicate "indecent" materials on the Internet. Reno v. ACLU, 521 U.S. ___, 117 S. Ct. 2329, 138 L. Ed. 2d 874 (1997). The Court found that the CDA violated the First Amendment and indicated that "the interest in encouraging freedom of expression in a democratic society outweighs any theoretical but unproven benefit of censorship." 138 L. Ed. 2d at 906. In its historic decision, the Supreme Court recognized that the Internet, as much as books and newspapers found in our public libraries, is entitled to the very highest level of First Amendment protection. While the Internet provides access to material that some parents and educators may find objectionable, protecting children's safety should not be equated with the use of filtering software in public libraries. The use of filtering software is inconsistent with constitutional mandates and good public policy: it blocks not only material that is legally obscene or "inappropriate" for minors, but a much wider spectrum of speech, and it is simply incapable of distinguishing between constitutionally protected and unprotected speech.

Clumsy and ineffective blocking programs are "quick fix" solutions to parental concerns. They provide a false sense of security that minors will be protected from all material that some parents may find inappropriate. At the same time, filtering software restricts access to valuable, constitutionally protected online speech on topics including safe sex, AIDS, gay and lesbian issues, news, and women's rights. Religious groups such as the Society of Friends and the Glide United Methodist Church have had online resources blocked by these imperfect censorship tools, as have policy groups like the American Family Association. This type of arbitrary censorship, when used in public libraries, is a blatant violation of the First Amendment.

Instead, parents and educators should be fully informed about the responsibilities they must shoulder and the potential abuses of which they should be aware. No software or restriction on content can fulfill those goals, but education of parents, students, educators, and librarians can give them all the tools to provide appropriate safeguards. We believe that:

To promote user safety and responsibility, the final section of this statement offers concrete steps that libraries may take to foster safe and effective use of online resources. The following section provides further detailed findings on the problems associated with the use of filters and blocking software in libraries.

III. BLOCKING AND FILTERING PROGRAMS REMOVE DECISION-MAKING AUTHORITY FROM EDUCATORS AND LIBRARIANS

In order to block Internet sites, a software vendor identifies categories of material to be restricted and then configures its software to block sites containing those categories of speech. Using its criteria, the software vendor compiles and maintains lists of "unacceptable" sites. Some blocking software vendors employ individuals who browse the Internet for sites to block, while others use automated searching tools to identify which sites to block. Some do both. However, all blocking software requires the exercise of subjective human judgment by the vendor to decide what speech is acceptable and what is unacceptable.
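
To make the process described above concrete, the sketch below (in Python) models a vendor-compiled blocklist. The category names and site addresses are hypothetical examples invented for illustration only; no actual vendor's categories, lists, or criteria are represented, and real products are considerably more elaborate.

    # Illustrative sketch of vendor-compiled blocking. The categories and
    # sites below are hypothetical; actual vendors keep their lists secret.

    # The vendor, not the librarian, defines the categories and fills them in.
    VENDOR_CATEGORIES = {
        "sexually_explicit": {"example-adult-site.com"},
        "violence": {"example-violent-site.com"},
        # Keyword-gathered sites can land here even when their content is innocuous.
        "questionable": {"example-health-education.org"},
    }

    def is_blocked(hostname: str) -> bool:
        """Return True if the vendor's (undisclosed) lists contain this site."""
        return any(hostname in sites for sites in VENDOR_CATEGORIES.values())

    # A library running the product can only ask "is this site blocked?";
    # it cannot inspect, audit, or edit the underlying lists.
    print(is_blocked("example-health-education.org"))  # True
    print(is_blocked("loc.gov"))                       # False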

Blocking software providers generally do not disclose their lists of unacceptable sites because they regard the lists as proprietary. As a result, it is impossible for parents, educators and librarians to make decisions about what sites should be blocked, and what content should be available. Many of these programs do not even provide users with the ability to unblock sites that are inappropriately restricted.

Hence, when blocking programs are used, librarians lack direct authority to determine what material is "inappropriate for minors," and may be virtually powerless to make this determination. Librarians do not participate in the original selection of material that is deemed "inappropriate," and cannot easily unblock restricted material. Such determinations are made by the software vendor, who regards the product of its determinations as a trade secret that cannot be disclosed to anyone, including the affected library.

Public libraries' use of filtering software is fundamentally inconsistent with the role of the library as a storehouse of information. The use of such programs eliminates the essential role of parents, librarians, and teachers and places decision-making in the hands of commercial software vendors. The American Library Association has opposed the use of filters in libraries because it recognizes that it is the domain of parents, not librarians, the government, or faceless software companies, to oversee their children's use of the library. See Resolution adopted by the ALA Council, July 2, 1997.

For these reasons, free speech groups (including the American Civil Liberties Union and a grassroots civil liberties organization, Mainstream Loudoun) challenged the decision of the Loudoun County, Virginia Library Board to implement a policy that required the use of blocking software at all library computer terminals. The library's Internet policy purported to require the blocking of access to materials that are "pornographic" or "harmful to juveniles." However, the plaintiffs found that the software chosen by the Loudoun libraries blocked far more than this vague category of speech. The ACLU, which represented web site and content providers in the suit, charged that the following sites were among those blocked by the filtering program:

The Safer Sex Page, operated by Chris Filkins;

Banned Books Online, created by John Ockerbloom;

American Association of University Women Maryland (AAUW Maryland);

Rob Morse, an award-winning columnist for the San Francisco Examiner;

Books for Gay and Lesbian Teens Youth Page, created by 18-year-old Jeremy Myers;

Sergio Arau, the popular Mexican artist and rock singer known as "El Padrino";

Renaissance Transgender Association, a group serving the transgendered community;

The Ethical Spectacle, created by author Jonathan Wallace.

The court therefore found that, by using blocking software to implement the policy, the library board had engaged in the digital equivalent of "removing books from the shelves" of the library in violation of the Constitution, even though the affected speech has value to both adults and minors. At the same time, the judge cited the experience of other Virginia librarians in suggesting that Loudoun County could use a variety of less restrictive means to keep children from accessing inappropriate material online.

IV. BLOCKING SOFTWARE RESTRICTS ACCESS TO VALUABLE, PROTECTED SPEECH

The use of blocking programs on library terminals used by students is also problematic because it may limit their ability to complete homework or research assignments on subjects ranging from sexual harassment, abortion, and medical issues to artwork and literature. Earlier this year, a California library that removed filtering software after public disapproval conceded that the filters presented an unconstitutional barrier to patrons seeking access to materials including legal opinions, medical information, political commentary, art, literature, information from women's organizations, and even portions of the American Civil Liberties Union's Freedom Network page on the World Wide Web.

The huge size of the Web makes it impossible for any individual to review all sites or to keep up with the exponential number of new sites that come online and change daily. Thus, developers of blocking software must rely on automated search tools to identify sites to block. These tools cannot block according to subject matter, and cannot evaluate pictures. They can only identify sites by searching for a particular word or string of words. As a result, these tools inevitably identify sites that do not contain the subject matter the producers of the software want to block.
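
The limits of purely mechanical word-matching can be seen in a short sketch (Python). The keyword list and sample passages below are hypothetical and are not drawn from any actual filtering product; they serve only to show that matching words says nothing about a page's subject matter.

    # Illustrative sketch of naive keyword matching, as described above.
    # The keyword list and sample texts are hypothetical examples.

    BLOCKED_KEYWORDS = {"sex", "breast", "nude"}

    def flagged(page_text: str) -> bool:
        """Flag a page if any blocked keyword appears anywhere in its text."""
        words = (w.strip(".,;:!?") for w in page_text.lower().split())
        return any(word in BLOCKED_KEYWORDS for word in words)

    # A medical page is flagged alongside explicit material...
    print(flagged("Early detection of breast cancer saves lives."))     # True
    # ...while an explicit page that avoids the listed words is not...
    print(flagged("Gallery of explicit photographs, captions omitted"))  # False
    # ...and no keyword list can evaluate what a picture actually shows.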

This result is not surprising. Several reports compiled by educators, public interest organizations, and other interested groups have concluded that filtering software inappropriately blocks valuable, protected speech and fails to block many of the sites it is intended to block.

One report by the Electronic Privacy Information Center (EPIC), "Faulty Filters: How Content Filters Block Access to Kid-Friendly Information on the Internet," found that filtered search engines reduced access to constitutionally protected and valuable content available on the World Wide Web. The EPIC Report reviewed the impact on Internet access of "Family Search," a "family-friendly search engine" from the Net Shepherd firm. The report compared the results of 100 Internet searches run through the popular search engine AltaVista with the results received when the same 100 searches were filtered through Family Search. The search terms included topics that students might be likely to research, such as "American Red Cross," "Thomas Edison" and "Bill of Rights." It concluded that the "family-friendly search engine ... typically blocked access to 95-99 percent of the material available on the Internet that might be of interest to young people." The report also determined that the search engine did not seem to restrict access to topics regarded as sensitive for young people any more than it restricted access to matters of general interest.
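
The arithmetic behind a figure like the one quoted above is straightforward; the short sketch below (Python) shows the percentage-reduction calculation using hypothetical hit counts that are not taken from the EPIC report itself.

    # Illustrative sketch: computing the share of results removed by a filter.
    # The hit counts below are hypothetical, not figures from the EPIC report.

    def percent_removed(unfiltered_hits: int, filtered_hits: int) -> float:
        """Share of results lost when the same query runs through the filter."""
        return 100.0 * (unfiltered_hits - filtered_hits) / unfiltered_hits

    # e.g. a query returning 8,000 unfiltered results but only 80 filtered ones:
    print(f"{percent_removed(8000, 80):.1f}% of results removed")  # 99.0%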

Many other public interest groups and news organizations have reported similar findings. We have attached to this submission (along with the EPIC report "Faulty Filters") the following reports:

"Censorship in a Box: Why Blocking Software is Wrong for Public Libraries," by the American Civil Liberties Union. Proposes five guidelines for libraries and schools looking for alternatives to clumsy and ineffective blocking software. The report also includes a two-page "Q&A" on blocking software and examples of sites that have been blocked by various products.

"Access Denied: The Impact of Internet Filtering Software on the Gay and Lesbian Community," by the Gay and Lesbian Alliance Against Defamation (GLAAD). This report concludes that most filtering products categorize and block all information about gays and lesbians in the same manner that they block sexually explicit and pornographic material. The report discloses that Cyber Patrol blocks two huge collections of user pages maintained by Internet Service Providers, the West Hollywood pages of Geocities and members.tripod.com. Cyber Patrol blocked large amounts of innocent content (fifty thousand Web pages in West Hollywood, and more than a million at Tripod) because some users maintained a few explicit pages. This report also shows that wrongfully blocked sites are often inaccurately described by Cyber Patrol.

"Censorship's Tools Du Jour: V-Chips, TV Ratings, PICS and Internet Filters," by the National Coalition Against Censorship.

"The Censor's Sensitivity," Declan McCullagh, Netly News, December 22, 1997. This article reports that the filtering product "X-Stop" restricts access to a variety of important and constitutionally protected information on various websites, including the sites of the American Association of University Women, the AIDS Quilt, and information about the Quaker religion.

As these reports and news articles demonstrate, use of filtering programs by public institutions can result in discrimination against speakers based on their viewpoints, and can restrict access to a variety of constitutionally protected and important speech, even as to minors. (These reports and others can also be found online at the Web site of the Internet Free Expression Alliance at <http://www.ifea.net>).

V. MORE EFFECTIVE MEASURES FOR PROMOTING SAFETY EXIST

Parents should be informed that blind reliance on filtering and blocking programs cannot effectively safeguard children from "inappropriate" material. They should be made aware of studies showing that blocking software may allow access to material that some parents consider objectionable while restricting access to otherwise innocuous or educational speech.

We further suggest that:

CONCLUSION

The advent of new forms of communication technology is always a cause for public anxiety and unease. This was as true for the printing press and the telephone as it was for the radio and the television. But the constitutional ideal is immutable regardless of the medium: a free society is based on the principle that each and every individual has the right to decide what kind of information he or she wants -- or does not want -- to receive or create.

We encourage the Commission to take the lead in fostering a discussion that will help local communities find the best answers for providing greater access to the Internet. Members of the Internet Free Expression Alliance and the Free Expression Network have concluded that blocking and filtering software programs cannot possibly filter out all objectionable material and instead may provide communities with a false sense of security about providing access. We believe that no filter can offer the protections provided by education and training. Thus, user education should not be underestimated as a critical tool for teaching safe, responsible, and rewarding use of cyber-communications, and we urge the Commission to endorse an educational approach.

