CONSTITUTIONAL PROBLEMS WITH THE COMMUNICATIONS DECENCY AMENDMENT:
A LEGISLATIVE ANALYSIS BY THE ELECTRONIC FRONTIER FOUNDATION

June 16, 1995

INTRODUCTION

On June 14, 1995, the United States Senate approved by a vote of 84-16 an amendment to the Senate's omnibus telecommunications-deregulation bill that raises grave Constitutional questions and poses great risks for the future of freedom of speech on the nation's computer-communications forums. Sponsored by Sen. Jim Exon (D-Nebraska), the amendment originated as an independent bill titled the Communications Decency Act of 1995 (CDA) and is intended, according to its sponsor, both to prohibit "the [computer] equivalent of obscene telephone calls" and to prohibit the distribution to children of materials with sexual content. As drafted, however, the legislation not only fails to solve the problems it is intended to address, but also imposes content restrictions on computer communications that would chill First-Amendment-protected speech and, in effect, restrict adults in the public forums of computer networks to writing and reading only such content as is suitable for children.

SPECIFIC PROVISIONS OF THE CDA

The Communications Decency Act would change the language of Title 47, United States Code, Section 223, a section that primarily does two things: 1) it prohibits "obscene or harassing" phone calls and other, similar, abusive uses of the telephone, and 2) it imposes regulation (promulgated and administered by the Federal Communications Commission) on telephone services that provide so-called "indecent" content and prohibits those services from providing legally obscene content. The amending language drafted by Sen. Exon and passed by the Senate substantially restructures and alters the provisions of this section in an effort to bring computer communications under the statute.
If the Senate-approved language becomes law, provisions in the amended statute will:

(a) Expand the scope of the statute from telephones to "telecommunications devices" (such as computers, modems, and the data servers and conferencing systems used by Internet sites and by commercial providers like America Online and CompuServe);

(b) Define as a criminal offense any communication that is legally obscene or indecent if that communication is sent over a telecommunications device "with intent to annoy, abuse, threaten, or harass another person";

(c) Penalize any person or entity who, by use of a telecommunications device, "knowingly ... makes or makes available" any content or material that is legally obscene; and

(d) Penalize any person or entity who "knowingly ... makes or makes available" to a person under the age of 18 any content or material that is "indecent."

The CDA outlines affirmative defenses for persons or entities who might otherwise be liable under the statute's criminal provisions. In spite of Sen. Exon's efforts to address in this revision the criticisms and constitutional issues raised by earlier drafts of his legislation, the language of the CDA as passed by the Senate is riddled with flaws that threaten the First Amendment rights both of online service providers and of individual citizens.

THE CDA WOULD CRIMINALIZE CONSTITUTIONALLY PROTECTED SPEECH.

None of the CDA's prohibitions of "obscene" communications raise any constitutional issues; it is well-settled law that obscene content is not protected under the Constitution. In contrast, the CDA's restrictions on "indecent" speech are deeply problematic.

What is "indecent" speech, and what is its significance? In general, "indecent" speech includes nonobscene material that deals explicitly with sex or that uses profane language. The Supreme Court has repeatedly stated that such "indecency," since it is not obscene, is Constitutionally protected.
Further, the Court has stated that indecent communications cannot be banned altogether from the view of the general public -- not even in broadcasting, the single communications medium in which the federal government Constitutionally holds broad powers of content control.

The section of the CDA dealing with "obscene or harassing" communications penalizes not only the sending of "obscene" communications, but also those that are "indecent." This prohibition of indecent content, even though limited somewhat in application by the section's intent requirement, is unconstitutional on its face. In _Sable_Communications_v._FCC_ (1989), a case involving dial-in phone-sex services, the U.S. Supreme Court held that, even though a ban on *obscenity* in "dial-a-porn" services is constitutional, a ban on *indecency* is not. Citing earlier holdings, the Court said that "[t]he government may not reduce the adult population to only what is fit for children."

What are some examples of "indecent" content? The most famous example is probably the George Carlin comedy monologue that was the basis of the Supreme Court case _FCC_v._Pacifica_Foundation_ (1978). In that monologue, Carlin discusses the "Seven Dirty Words" (i.e., certain profane language) that cannot be uttered in broadcast media. Other examples of "indecency" could include passages from John Updike or Erica Jong novels, certain rock lyrics, and Dr. Ruth Westheimer's sexual-advice column. Under the CDA, it would be criminal to "knowingly" publish such material on the Internet unless children were affirmatively denied access to it. It's as if the manager of a Barnes & Noble bookstore could be sent to jail simply because children were able to wander the store's aisles and search for the racy passages in a Judith Krantz or Harold Robbins novel.
The Supreme Court has consistently held, both before and after its landmark obscenity decision in _Miller_v._California_ (1973), that while sexual material and profane language can be regulated in some specifically defined contexts (e.g., the FCC can require that "indecent" content in broadcasting be limited to certain hours of the broadcasting schedule when children are somewhat less likely to be exposed), in general indecency is fully protected by the First Amendment. The Court has even recognized that profane language may be essential to political speech, since the emotional power of particular words may be as important as their intellectual content. As Justice Harlan commented in _Cohen_v._California_ (1971), a case in which a young man was prosecuted for wearing a profane anti-draft slogan on his jacket, "One man's vulgarity is another's lyric."

It's important to note that not every application of this part of the CDA would be unconstitutional. If the "obscene or harassing" offense language had been limited to instances in which the speaker intends to "threaten," for example, it would have raised no constitutional problems. (A threat of blackmail or physical violence is not protected speech.) But the CDA goes beyond threats or harassment -- it criminalizes the use of "indecent" language even when the speaker merely intends for his content to be "annoying," and this prohibition treads squarely on speakers' First Amendment rights. After all, the First Amendment was drafted to protect offensive, annoying, and disturbing speech -- there is little need for protection of pleasant and uncontroversial speech, since few people feel impelled to ban it. As Justice Douglas observed in _Terminiello_v._Chicago_ (1949), free speech "may best serve its high purpose when it induces a condition of unrest, creates dissatisfaction with conditions as they are, or even stirs people to anger."
For example, a citizen offended by the passage of the CDA who shouts an indecent comment at his U.S. Senator may very well intend to annoy the Senator -- nevertheless, such expression is protected under the First Amendment. It is constitutionally absurd that speech that would be protected if shouted on the street would turn the speaker into a felon if sent by e-mail.

BY GRANTING THE FCC REGULATORY CONTROL OVER THE CONTENT AND AVAILABILITY OF COMPUTER COMMUNICATIONS, THE CDA VIOLATES THE FIRST AMENDMENT.

Is it constitutional for Congress to declare that computer communications are a medium like broadcasting, where it is allowable for the FCC to impose content-related regulations? Clearly not. Before Sen. Exon's proposed changes to Section 223, the FCC had content control over only two specific types of communications media: (1) broadcasting media like TV and radio (and broadcasting-related technologies, such as cable TV), and (2) the narrow class of telephone-based commercial services that require the assistance and support of government-regulated common carriers. In all other communications media, the government has no constitutional authority to impose broad regulation of indecent content.

The justification for the federal government's special role in the regulation of broadcasting is twofold. The first rationale for such a broad regulatory role was the "scarcity of frequencies" argument, which appears in the Supreme Court's decision in _Red_Lion_Broadcasting_Co._v._FCC_ (1969). In that case, the Court held that there is a finite number of useful broadcasting frequencies, and that the scarcity of this important public resource requires that the airwaves be allocated and supervised by the federal government in ways that best serve the public interest. The second rationale for a special government role in broadcasting appears in _FCC_v._Pacifica_Foundation_ (the "Seven Dirty Words" case discussed above).
In this case the Court argued that broadcasting is an especially "pervasive" medium that intrudes into the privacy of the home, creating a constant risk that adults will be exposed to offensive material, and children to indecent material, without warning.

The justification for regulation of the telephone-based services is grounded in the government's special role in supervising common carriers. Since the telephone systems of this country, many of which amount to monopolies, are common carriers, they are appropriately under the jurisdiction of the FCC. It arguably makes sense for phone-sex services, which rely on the cooperation of common carriers, to fall under FCC jurisdiction as well.

*Neither the broadcasting rationales nor the common-carrier rationale supports government content control over computer communications.*

First, the new medium of computer-based communications -- which may take place over everything from large-scale Internet access providers and commercial conferencing systems to the PC-based bulletin-board system running in a hobbyist's basement -- isn't afflicted with "scarcity." Computing hardware itself is increasingly inexpensive, for example, and one of the basic facts of modern computer communication is that whenever you add a computer to the Internet, you *increase* the Internet's size and capabilities.

Second, computer-based communications aren't "pervasive" as that term is used in the Pacifica case. In the world of broadcasting, content is "pushed" at audiences by TV and radio stations and broadcasting networks -- audiences are primarily passive recipients of programming. In computer communications, in contrast, content is *pulled* by users from various locations and resources around the globe through the Internet or from the huge data servers maintained by services like Prodigy and America Online. Exposure to content is primarily *driven by user choice*.
For users with even minimal experience, there is little risk of unwitting exposure to offensive or indecent material.

Finally, online service providers aren't common carriers and don't want to be -- it is the nature of this kind of service that providers must reserve the right to make certain basic choices about content. In contrast, a common carrier like AT&T or BellSouth has to "take all comers." (If online service providers were treated as common carriers, we might imagine a day when the FCC requires that an NAACP-sponsored BBS carry racist messages from members of the Ku Klux Klan.)

It is indisputable that the narrow constitutional justifications for content regulation of two specific types of media do not extend to all media generally, even though all communications media -- including newspapers, magazines, books, films, and oral conversations -- create some risk that children will be exposed to indecent content. That general principle applies here as well: There is no Constitutional rationale for extending intrusive content-regulatory control to online communications. This means that the CDA's "shoehorning" of online communications into the jurisdiction of the FCC is itself unconstitutional. It is clear that Congress could not constitutionally grant the FCC the power to tell The _New_Yorker_ not to print profane language -- even though *children* might come across a copy of The _New_Yorker_. Surely it is equally clear that Congress cannot grant the FCC the authority to dictate how providers like Netcom and CompuServe handle content that contains such language.

COMPUTER COMMUNICATIONS POSE DIFFERENT PROBLEMS, AND REQUIRE DIFFERENT SOLUTIONS, FROM THOSE OF OTHER MEDIA.

Even if the federal government did have the constitutional authority to regulate indecency in computer communications, it would be required by the First Amendment to employ only the "least restrictive means" in doing so.
In the Sable case, the Court noted that there are less restrictive means than a total ban for protecting children from indecent content on phone-sex services. These include such measures as requiring various procedures to verify customers' ages and to deny services to minors. The Exon language creates an affirmative defense for online service providers who implement the same types of procedures that the FCC now requires of phone-sex services. But what works for phone-sex services clearly would not work for computer-communications services. In this fundamentally different medium, those FCC-enforced procedures are not a "least restrictive means" -- in fact, they are potentially among the most restrictive.

To take only one example: The language that penalizes anyone who "makes or makes available" indecent content to a minor would require an Internet access provider like Netcom to cease carrying virtually all of the "alt.sex.*" discussion forums of the distributed global conferencing system known as Usenet. This would be true even though the great majority of the content in those forums is First-Amendment-protected speech. Netcom would be compelled to take such action because the affirmative defenses proposed by the CDA would in practice function as little or no defense at all.

Suppose, for example, that Netcom tried to avail itself of legal immunity for transmitting indecency by, say, limiting access to the "indecent" Usenet newsgroups to Netcom subscribers age 18 or over. Since Netcom, as a typical Internet access provider, is also a Usenet distribution node, *the company would still face criminal liability.* Usenet nodes like Netcom don't just provide Usenet content to subscribers; they also forward that content to other parts of the Internet, and their doing so is necessary for Usenet to function.
Even if Netcom had minor-screening procedures in place for its own subscribers, if it continued to operate as a typical Usenet distribution node by passing "indecent" Usenet traffic through to the rest of the Net, it would "knowingly ... make available" that indecent content to minors elsewhere on the Net who aren't Netcom customers.

Note: this analysis is not meant to imply that *no* government regulation of computer communications would meet the "least restrictive means" requirement. As a practical matter, this medium is *uniquely suited* to measures that simultaneously protect sensitive users and children from offensive content and allow the full range of constitutionally protected speech on the Net. Since both the computers that users employ to read the Net and those that providers use to administer the Net are highly intelligent and programmable devices, it is relatively easy to design tools that individuals can use to filter offensive content and that parents can use to screen content for their children. The government's promotion of the development and implementation of such tools, if done in a way consistent with First Amendment guarantees, would likely qualify as a "least restrictive means."

Furthermore, there are constitutional reasons for favoring policies that empower individuals and families to make their own content choices. In _Wisconsin_v._Yoder_ (1972), the Supreme Court acknowledged that the right of parents to determine what is appropriate for their children is constitutionally protected. "Filtering" software tools could be the fundamental means for parents to preserve family values while allowing their children to explore global computer networks.

ADULTS CANNOT CONSTITUTIONALLY BE LIMITED IN PUBLIC FORUMS TO READING AND WRITING ONLY SUCH CONTENT AS IS "SAFE" FOR CHILDREN.

The effect of the CDA's provisions regarding indecent content and minors would be both dramatic and disastrous.
If enacted, the CDA would effectively turn all the public areas of the Net -- and all of the distributed global conferencing system known as Usenet -- into the equivalent of the Children's Room at the public library. Traditionally, every large public library has a Children's Room -- a confined area of the library with content deemed safe for children. Outside of the Children's Room, the rest of the library is geared toward, and available to, adults.

The Exon language would turn the Net as a whole into the *inverse* of the public library -- the global public spaces, including Usenet, would be limited to Children's-Room-standard content in order to be safe for children. Adult users would have to limit their talk about adult subjects (detailed discussions of sexual content in the work of James Joyce, explanations of Shakespeare's bawdy puns, or descriptions of proper techniques for safe sex, to name some examples) to confined, nonpublic (and probably non-global) subforums or "rooms." There would be no more wide-ranging debates with the full set of potential international participants about the significance of _The_Satanic_Verses_ -- after all, that book has indecent content. We'd have to be satisfied with the narrower range of participants we could lure to an "adult" room on CompuServe or AOL -- a small group of paying subscribers rather than a large population of discussants from commercial and noncommercial systems alike. The CDA would diminish and perhaps destroy the intellectual diversity and vibrancy of the Net.

CONCLUSION

The CDA represents the kind of "top-down," government-centered attempt to regulate online content that demonstrates a lack of understanding of the nature of this new medium. Legislative efforts like the CDA -- particularly when based on regulatory approaches designed for wholly different media -- are certain to create more practical and constitutional problems than they solve.
It is especially ironic that the Exon amendment, which would chill the development of online services and communities and "dumb down" the content of the Net's public spaces to a grade-school level, has been attached to a bill deregulating our communications industry and infrastructure. This deregulation has been presented as a boost to the pace of development of the very technology that supports the current broad range of content, services, and communities that populate the Net -- yet the CDA's practical effect would be to greatly diminish the very diversity of resources that is supposed to be the principal benefit of deregulation.

EFF believes that parents, not Congress or the FCC, have the primary responsibility and constitutionally protected prerogative to determine what is appropriate for their children to see. Furthermore, it has long been understood that government has no general authority to make outlaws out of adults for engaging in constitutionally protected public speech merely because minors in public spaces might be exposed to inappropriate content. As Supreme Court Justice Felix Frankfurter wrote in _Butler_v._Michigan_ (1957):

"The State insists that, by thus quarantining the general reading public against books not too rugged for grown men and women in order to shield juvenile innocence, it is exercising its power to promote the general welfare. Surely this is to burn the house to roast the pig. The incidence of this enactment is to reduce the adult population of Michigan to reading only what is fit for children."

And a legislative approach that was bad for the adult population of Michigan nearly 40 years ago is surely just as bad for the adult population of the Net today.
For More Information, Contact:

Electronic Frontier Foundation
Mike Godwin
Shari Steele
+1.202.861.7700 (voice)

For more information on the "Communications Decency Act" and other legislative attempts at Internet censorship, see:

http://www.eff.org/pub/Alerts/
ftp.eff.org, /pub/Alerts/
gopher.eff.org, 1/Alerts

******************************************************************

COMMUNICATIONS DECENCY AMENDMENT -- FULL TEXT OF FINAL LANGUAGE
PASSED BY THE U.S. SENATE ON JUNE 14, 1995

The following is the text of the Communications Decency Amendment, sponsored by Sen. Jim Exon (D-Nebraska), as passed by the U.S. Senate on June 14, 1995.

-------------------------------------------------------

This strikes all of Title IV of S. 652 and replaces it with the following:

Sec. ___ OBSCENE OR HARASSING USE OF TELECOMMUNICATIONS FACILITIES UNDER THE COMMUNICATIONS ACT OF 1934

Section 223 (47 U.S.C. 223) is amended --

(1) by striking subsection (a) and inserting in lieu thereof:

``(a) Whoever--

``(1) in the District of Columbia or in interstate or foreign communications

``(A) by means of telecommunications device knowingly--

``(i) makes, creates, or solicits, and

``(ii) initiates the transmission of, any comment, request, suggestion, proposal, image, or other communication which is obscene, lewd, lascivious, filthy, or indecent, with intent to annoy, abuse, threaten, or harass another person;

``(B) makes a telephone call or utilizes a telecommunications device, whether or not conversation or communication ensues, without disclosing his identity and with intent to annoy, abuse, threaten, or harass any person at the called number or who receives the communication;

``(C) makes or causes the telephone of another repeatedly or continuously to ring, with intent to harass any person at the called number; or

``(D) makes repeated telephone calls or repeatedly initiates communication with a telecommunications device, during which conversation or communication ensues, solely to harass any person at
the called number or who receives the communication; or

``(2) knowingly permits any telecommunications facility under his control to be used for any activity prohibited by paragraph (1) with the intent that it be used for such activity,

shall be fined not more than $100,000 or imprisoned not more than two years, or both.''; and

(2) by adding at the end the following new subsections:

``(d) Whoever--

``(1) knowingly within the United States or in foreign communications with the United States by means of telecommunications device makes or makes available any obscene communication in any form including any comment, request, suggestion, proposal, image, regardless of whether the maker of such communication placed the call or initiated the communications; or

``(2) knowingly permits any telecommunications facility under such person's control to be used for an activity prohibited by subsection (d)(1) with the intent that it be used for such activity;

shall be fined not more than $100,000 or imprisoned not more than two years or both.

``(e) Whoever--

``(1) knowingly within the United States or in foreign communications with the United States by means of telecommunications device makes or makes available any indecent comment, request, suggestion, proposal, image to any person under 18 years of age regardless of whether the maker of such communication placed the call or initiated the communication; or

``(2) knowingly permits any telecommunications facility under such person's control to be used for an activity prohibited by paragraph (1) with the intent that it be used for such activity,

shall be fined not more than $100,000 or imprisoned not more than two years or both.
``(f) Defenses to the subsections (a), (d), and (e), restrictions on access, judicial remedies respecting restrictions for persons providing information services and access to information services--

(1) No person shall be held to have violated subsections (a), (d), or (e) solely for providing access or connection to or from a facility, system, or network over which that person has no control, including related capabilities which are incidental to providing access or connection. This subsection shall not be applicable to an individual controlled by, or a conspirator with, an entity actively involved in the creation, editing or knowing distribution of communications which violate this section.

(2) No employer shall be held liable under this section for the actions of an employee or agent unless the employee's or agent's conduct is within the scope of his employment or agency and the employer has knowledge of, authorizes, or ratifies the employee's or agent's conduct.

(3) It is a defense to prosecution under subsection (a), (d)(2), or (e) that a person has taken reasonable, effective and appropriate actions in good faith to restrict or prevent the transmission of or access to a communication specified in such subsections, or complied with procedures as the Commission may prescribe in furtherance of this section. Until such regulations become effective, it is a defense to prosecution that the person has complied with the procedures prescribed by regulation pursuant to subsection (b)(3). Nothing in this subsection shall be construed to treat enhanced information services as common carriage.
(4) No cause of action may be brought in any court or any administrative agency against any person on account of any action which is not in violation of any law punishable by criminal penalty, which activity the person has taken in good faith to implement a defense authorized under this section or otherwise to restrict or prevent the transmission of, or access to, a communication specified in this section.

(g) No State or local government may impose any liability for commercial activities or actions by commercial entities in connection with an activity or action which constitutes a violation described in subsection (a)(2), (d)(2), or (e)(2) that is inconsistent with the treatment of those activities or actions under this section; provided, however, that nothing herein shall preclude any State or local government from enacting and enforcing complementary oversight, liability, and regulatory systems, procedures, and requirements so long as such systems, procedures, and requirements govern only intrastate services and do not result in the imposition of inconsistent rights, duties or obligations on the provision of interstate services. Nothing in this subsection shall preclude any State or local government from governing conduct not covered by this section.

(h) Nothing in subsection (a), (d), (e), or (f) or in the defenses to prosecution under (a), (d), or (e) shall be construed to affect or limit the application or enforcement of any other Federal law.

(i) The use of the term 'telecommunications device' in this section shall not impose new obligations on (one-way) broadcast radio or (one-way) broadcast television operators licensed by the Commission or (one-way) cable services registered with the Federal Communications Commission and covered by obscenity and indecency provisions elsewhere in this Act.

(j) Within two years from the date of enactment and every two years thereafter, the Commission shall report on the effectiveness of this section.

Sec. ___ OBSCENE PROGRAMMING ON CABLE TELEVISION.

Section 639 (47 U.S.C. 559) is amended by striking "$10,000" and inserting "$100,000".

Sec. ___ BROADCASTING OBSCENE LANGUAGE ON THE RADIO.

Section 1466 of Title 18, United States Code, is amended by striking out "$10,000" and inserting "$100,000".

Sec. ___ SEPARABILITY

"(a) If any provision of this Title, including amendments to this Title or the application thereof to any person or circumstance, is held invalid, the remainder of this Title and the application of such provision to other persons or circumstances shall not be affected thereby."

------------------------------