EFFector Vol. 13, No. 12 Dec. 22, 2000 editor@eff.org
A Publication of the Electronic Frontier Foundation ISSN 1062-9424
For more information on EFF activities & alerts: http://www.eff.org
[Note: If you recently attempted to join or donate via our Web form and it didn't work, please try again at http://www.eff.org/support - the problem has been fixed.]
Dear EFFector Reader,
Imagine making an anonymous, off-the-cuff criticism about your employer in an Internet chatroom, and then learning that your ISP has been served with a subpoena by that employer requiring it to reveal your identity.
Imagine direct marketers tracking your Internet browsing patterns and personal information and offering that information for sale to the highest bidder.
Imagine being sued by big players in the movie industry for linking to software on someone else's computer that you believe is perfectly legal.
Who can help when you find your civil liberties being threatened because of your use of technology? The Electronic Frontier Foundation.
Presently, only about 10% of our newsletter readers are members. If all of our non-member subscribers joined EFF today, at just the $20 student/low-income level, this would provide us the funding to hire FOUR more full-time professional staff members or to take on another major project or legal case. If all of our readers joined at the $65 rate, it would approximately double our entire funding base! While EFF does receive a few corporate and foundation grants, we are heavily dependent on your membership dues and individual donations to continue our work.
For the past ten years, EFF has been there to provide legal counsel and assistance to people just like you--users of new technologies who get caught on the front line where technology and law collide.
As our world becomes increasingly dependent on technological innovations, new threats to free speech, privacy, and free and open communications arise with alarming speed. EFF, a nonprofit, member-supported organization, is working every day to protect your rights in the digital world.
There's a new tool being used to discover the identity of anonymous Internet posters--the civil subpoena. Companies or individuals who want to know the identity of an anonymous poster have begun serving legal documents on the poster's service provider. After receiving the identity, the companies take matters into their own hands, often firing disgruntled employee posters and dropping all lawsuits.
Unlike criminal warrants, civil subpoenas do not require a showing of probable cause to be issued. In fact, in some jurisdictions, lawsuits do not even need to be filed for a civil subpoena to be issued. Revealing the identity of anonymous critics without requiring proof of legal wrongdoing gives companies the discretion to shut down protected speech.
EFF is currently working on two cases where we are opposing these civil subpoenas involving innocent posters being harassed by employers. We're also starting a webpage with tips for people who learn that their ISPs have been served with this type of subpoena.
When you browse the Internet, information about you is transmitted to and stored by many entities, often without your knowledge or permission. The Internet permits new types of marketing data to be collected--data that not only reveals what you purchase but even what you look at and how long you look.
Over the past year, EFF has advised the Federal Trade Commission on the privacy concerns of Internet users and on the inadequacy of permitting companies to self-regulate. We have opposed P3P and other proposed privacy-protection standards that actually permit increased monitoring of personal behavior. We also advised plaintiffs bringing a legal action against online advertiser Doubleclick as to the constitutional implications of Doubleclick's behavior.
The music and movie industries have been overly zealous about expanding copyright law in cyberspace. Over the past 200 years, a delicate balance has been reached between the rights of creators to compensation for their works and the rights of the public to use those works. Many of the public's rights have been embodied in the notion of fair use, which permits people to make certain uses of works without the copyright holder's consent.
The music and movie studios, which own the rights to a great deal of popular content, have mounted an all-out attack against Internet users: suing them for posting and linking to computer software that would enable DVDs to be played on computers running the Linux operating system, suing digital libraries that permit users to distribute potentially copyrighted works, and developing "standards" that they require all hardware manufacturers to implement if they want access to the content.
EFF is the only group that has consistently stood up to these well-funded bullies, defending an electronic publication that linked to computer code and organizing a boycott of the music industry's "Hack SDMI" challenge. Over the next year, we will continue to work to ensure that fair use survives in the digital world. Through its Campaign for Audiovisual Free Expression (CAFE), EFF will also be educating artists and students as to their rights regarding electronic publication and copying. We are also in the process of creating an "open audio license" that musicians can use when authorizing distribution of their works online.
Time and time again, EFF has been the first organization to see the legal implications of technology, and we've been in the trenches, fighting to prevent the erosion of personal freedom. But we can't do it alone. We need your support to keep us going.
As the end of the year approaches, please consider furthering your support of our efforts by making a special donation to fund our legal cases and educational work. Donations to EFF, including membership dues, are tax deductible (in the US).
To join EFF or give an additional year-end donation, please Web over to:
http://www.eff.org/support
EFF uses a secure server to protect your credit card data. We can accept Visa, Mastercard, American Express, plus PayPal and e-gold.
Thank you for your continuing support; it means a great deal to us.
With warm wishes for the coming year,
Shari Steele,
Executive Director
P.S.--With a gift of $65 or more, we'll send you a special tenth anniversary EFF t-shirt, and with a $100 or higher donation we'll also send you an EFF "raid" cap (styled like an FBI cap, but says "EFF"). You may of course opt out of these member benefits and ensure that 100% of your donation goes to EFF work.
Around the end of Oct. 2000, Sen. John McCain, Rep. Ernest Istook, various other legislators, and the White House cut a deal to include a controversial and misguided mandatory library content filtering "rider" on a major Labor, HHS & Education appropriations bill, H.R. 4577, which sat in House/Senate conference committee for months and was passed by Congress earlier in December. The bill is now before the President, who is highly unlikely to veto it.
Legislators McCain and Istook, among several others, have for three years pushed various versions of legislation to grant the FCC regulatory control over the Internet and to force public and private libraries (and schools) that receive any of several sources of federal funding to install Internet content filtering software, or else be denied a variety of vital federal funds (including ESEA Title III ["Focused On Technology"], LSTA, and E-Rate funds). Istook's version in the House and McCain's version in the Senate were attached to H.R. 4577 before the bill went to the conference committee. Both were removed along with all other "riders" (small bills attached to a large one in hopes that they'll pass as part of the major bill). While the concerns raised across the political spectrum about this legislation probably had little impact on the rider removal decision, many expected the censorware proposal to die at this point (until next year, at least). But the chairman of the conference committee offered the disputing McCain and Istook the opportunity to hammer out a joint version of the filtering language. This was done, and the new result was put back into the bill. After further refinements to satisfy the President and Vice President, passage into law is virtually guaranteed, since the larger funding measure has passed with this rider.
At this juncture, the "Children's Internet Protection Act" and "Neighborhood Children's Internet Protection Act" (two related provisions of the filtering legislation) will have to be challenged in court, on First Amendment and other grounds.
The legislation is broadly opposed by liberal, conservative and nonpartisan organizations, from the ACLU and the American Library Association to the Eagle Forum and the Christian Coalition. Congress's own Child Online Protection Act Commission rejected mandatory filtering in its recommendations to the legislature last month.
Despite some early religious-right support for the notion of censorware, conservative groups now raise virtually identical concerns with this legislation as their liberal counterparts. A right-wing coalition letter to key legislators stated, "[t]here is growing concern within the conservative community regarding the use of filtering systems by schools and libraries that deliberately filter out web sites and information that promote conservative values. There have been many reported incidents of schoolteachers and administrators targeting ... pro-life organizations with filtering software to prevent students from hearing alternative approaches to those issues." One begins to wonder just who, outside of a handful of legislators (and censorware marketers), believes in censorware any more.
For several years Congress has sought to impose some form of mandatory or "pseudo-voluntary" content filtering on all public libraries and public schools. The idea seems to sound nice to legislators and to a large segment of the general public, because they simply do not understand how the technology works (and, more importantly, how it fails to work).
The principal problems with the proposal are inherent in the software and services themselves. These include:
a) subjective filtering criteria, in which a software company (i.e. a government contractor, subject to the First Amendment) gets to decide broadly what is and is not available to some or all library patrons via library Internet terminals;
b) biased (typically politically-motivated) filtering decisions, in which software company employees or their consultants (who are again covered by First Amendment requirements because they are doing a job for the government), choose to block material that is not even covered by any stated filtering criteria of the product/service in question; such biases have blocked everything from EFF's own site to gay-rights news stories to Christian church Web pages;
c) harm to the First Amendment-protected right to read, in an unprecedented system in which unaccountable software companies deny access to materials that are constitutionally protected (including material that no court has ever deemed indecent, obscene, or harmful to minors, as well as content not restricted by any legal category at all, such as "intolerant" material);
d) mistaken blocking of innumerable sites as "pornographic", "violent", "intolerant" or otherwise "wrong", when in fact they contain no such content at all;
e) mistaken blocking of names, non-vulgar words, and other material due to bad keyword matching algorithms (a brief illustrative sketch follows this list);
f) overly broad blocking in which entire directory structures or entire Web sites with thousands of users/authors are wholly blocked for content only found on one page;
g) alteration of content in mid-stream, often in such a way as to either leave no indication that material has been censored, or to make the material nonsensical because material has been removed (e.g., in mid-sentence); this technique also raises issues of authors' copyright-derived rights to control the distribution of "derivative works", when their words are "sanitized" by filtering software;
h) provision of few (in many cases, no) options for selecting blocking criteria other than those pre-configured in the software; imposition of censorware would effectively force everyone to adhere to someone else's morality, in clear violation of the Freedom of Religion clause;
i) dismal ineffectiveness at actually doing what they are advertised to do (block out sexually explicit and certain other kinds of content); no filtering service or product on the market has anywhere near even a 90% effectiveness rate, resulting in a completely false sense of security, and a "solution" that fixes nothing at all;
j) blocking of materials that are constitutionally protected even for minors, as well as for adults;
k) imposition of technological censorship measures that have already been ruled unconstitutional, in the Mainstream Loudoun v. Loudoun Co. [VA] Library case.
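To make flaws (e) and (f) concrete, here is a minimal, purely hypothetical sketch (in Python) of the kind of logic such products rely on. The keyword list, hostname, and example pages below are invented for illustration and are not drawn from any actual filtering product.

# Hypothetical sketch of naive censorware logic, for illustration only.
# The keyword and host blacklists below are invented examples.
from urllib.parse import urlparse

BLOCKED_KEYWORDS = ["sex", "breast", "xxx"]        # crude substring list (flaw e)
BLOCKED_HOSTS = {"members.example-hosting.com"}    # whole servers blacklisted (flaw f)

def is_blocked(url, page_text):
    """Return True if this naive filter would refuse to show the page."""
    host = urlparse(url).netloc.lower()

    # Flaw (f): one offending page gets the entire host blacklisted, so
    # thousands of unrelated authors on the same server are blocked too.
    if host in BLOCKED_HOSTS:
        return True

    # Flaw (e): bare substring matching, with no sense of context, blocks
    # innocent words that merely contain a listed string.
    text = page_text.lower()
    return any(word in text for word in BLOCKED_KEYWORDS)

# All three of these innocent pages are blocked by the logic above:
print(is_blocked("http://www.example.org/local-history",
                 "A history of Essex and Sussex counties."))       # True ("sex")
print(is_blocked("http://www.example.org/health",
                 "Advice on breast cancer screening."))            # True ("breast")
print(is_blocked("http://members.example-hosting.com/~quilting",
                 "Photos from the county quilting fair."))         # True (host ban)

Real products use longer lists and fancier heuristics, but they rest on the same kind of context-blind pattern matching.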
Seth Finkelstein, the programmer principally responsible for the investigation of X-Stop filtering software and its flaws, vital to the landmark Mainstream Loudoun victory, observes: "The claims made by censorware vendors are technologically absurd and mathematically impossible. If people argue endlessly over what is art, how can a shoddy computer program ever have an answer? Imagine if a bigoted organization could, at the touch of a button, secretly remove from a school or library all books they deemed objectionable. That is the reality of censorware. This is book-burning on the Internet, by unaccountable blacklisters."
In short, censorware simply does not perform as advertised, and substitutes simple-minded algorithms and a faceless one-size-fits-all morality for complex, context-dependent and highly personal human judgement. It does not get the job done, and the cost to library patrons' freedom to read (and authors' rights, as well) is far too great to bear for such a broken so-called solution to a problem (minors' access to inappropriate material) that is, at heart, one of parental rule-setting and oversight, not federal government regulation.
There are additional political problems that arise with such a proposal, including:
1) It is an unfunded mandate that will ironically cost libraries more to implement than they will receive in federal funding in many cases (especially once all costs are included, such as software/service price, training, staff time dealing with complaints, consultant & system administration costs, and, of course, litigation).
2) It would usurp the responsibilities, and disregard the capabilities, of local libraries/library boards and state bodies to deal with these issues as local citizens demand. It would turn the Supreme Court-approved "community standards" content regulation system on its head, permitting the Federal government generally, and national and international corporations in great detail, to dictate what is and is not okay to read in city and county libraries.
3) It would impose a "one-size-fits-all" system of morality over the entire nation - precisely what the First Amendment exists to prevent - disallowing parental discretion and upsetting years of local efforts to set acceptable use policies and practices for libraries (over 90% of public libraries already have such policies in place).
4) It would turn librarians into snooping content police, thereby threatening both the integrity of the library profession and patron privacy.
5) It would hit hardest precisely those libraries that most need the withheld funding. Inner-city, rural and other low-income libraries would incur the most difficulty and expense to comply with the law, for the least returns, making it a lose-lose proposition.
6) It would use the definition of "harmful to minors" found in the Child Online Protection Act (COPA), which is currently under a federal injunction against enforcement on the grounds that it is most likely unconstitutional (pending the court's final decision).
7) It would "hard-code" into the law requirements for specific technologies that are both ineffective and likely to become obsolete within a very short timeframe (many believe they are already) - technologies incapable of anything remotely resembling human judgement. At the same time, it would disallow measures such as locally-determined acceptable use policies, family education, or future technologies, as alternatives.
8) Last, but by no means least, it poses a severe threat to children's privacy. The law would mandate the (ab)use of monitoring software (which will necessarily entail detailed logs) to track minors' Internet participation. While this is in and of itself draconian, the matter is far worse than it seems at first. Courts are already deciding (as in the recent James M. Knight v. Kingston NH School Administrative Unit No. 16 case) that students' Internet logs are matters of public record. It is both ironic and alarming that a law with "Children's Protection" in its title would do more to harm minors than protect them.
The issues, thus, go far beyond the more obvious freedom of expression concerns. A coalition letter to Congress from 17 educational organizations (including NEA, PTA National, and national principals' and school boards' associations) noted, "[w]hile nearly every school in the United States already supervises minors' online activity, promoting the use of technological monitoring software raises serious privacy and security concerns that have not been examined by Congress.... Federal filtering mandates disregard local policymaking prerogatives. Instead they require local decisionmakers to select among a few marketable national norms developed as business plans by filtering software companies."
Aside from the general concerns raised above about the legislation as a whole, there are many devils in the details. Some of the most troubling provisions of the bill are outlined below. Problems are listed as they first appear. Many recur later in the legislation, much of which is duplicative of previous sections, principally to make legal challenge more difficult. (I.e., if we challenge the library provisions and have them struck down, the school provisions still stand until separately and successfully challenged on their own, unless a broad enough case can be brought against all of the provisions at once.)
In Title I:
* The "DISABLING DURING ADULT USE" section imposes conditions that in effect require librarians to ascertain that an adult patron's use of library computers is for "bona fide research or other lawful purposes" before they are permitted to disable the filtering software. If something like this should be done at all (which is highly questionable), this is the job of a judge, not a librarian, and is a massive attack on patrons' privacy and right to read. Worse yet, filtering is not required to be disabled by adult request (even after these impossible criteria are met); disabling is only "permitted", non-bindingly. As if this were not bad enough, the language has a loophole that could easily exclude actual librarians from having authority to turn off filters at all, requiring the aproval of library administrators.
* The "GENERAL RULE" provision is worded such that NO ONE - not librarians, not even parents directly supervising their own children - may turn off the filters for a minor, no matter what it might be mis-blocking.
* The "GENERAL RULE" section also mandates that the software be able to block obscenity, child pornography and material harmful to minors. This is physically impossible - no software can determine what does or does not fall into these legal categories (only a court can), and cannot block even most let alone all of such material without blocking orders of magnitude more material than necessary (i.e. anything that *might* conceivably fall into such a category, and lots more besides). Censorware drags a very large net behind it.
* The "DEFINITIONS" section treats all persons under 17 years of age as if they were the same as 4-year-old children, making no distinction between maturity levels. The Supreme Court has already expressed grave concern with this legal concept, in reviewing "harmful to minors" laws. This new legislation raises this problem much more clearly than any previous laws.
* The "EFFECTIVE DATE" section gives libraries and schools only 120 days to comply with the impossible, or begin to lose funding unless they qualify for special extensions.
* The "OTHER MATERIALS" section permits (though does not require) libraries to block even more material (i.e., material that is not legally deemed obscene, harmful-to-minors or child-pornographic.) This is a recipe for outrageous amounts of needless litigation, and political attempts by censorious groups to seize control of library boards.
In Title II:
* Provision (iii) of the "INTERNET FILTERING" section appears to apply its requirements to private as well as public schools.
* The "CERTIFICATION WITH RESPECT TO ADULTS" section makes it clear that libraries are required to filter ALL library terminals even for adults (again, with an literally impossible requirement that the filters block certain legal categories that no software can accurately detect or identify). This section and the related one with regard to minors, require under no uncertain terms that libraries have and "enforce" policies to ensure that filters are on, used, and not bypassed. This turns librarians into spying Internet cops, violating both their own professional ethics and patrons' privacy. Resistant libraries will immediately be punished by the "FAILURE TO COMPLY WITH CERTIFICATION" clause: "Any [school or library] that knowingly fails to ensure the use of its computers in accordance with [the censorware mandate] shall reimburse all funds received in violation thereof."
In Title III:
* This additional section, the rather inexplicably named "Neighborhood Children's Internet Protection Act", requires stringent acceptable use policies (aspects of which are federally pre-ordained) for local school and library computer usage, in addition to, rather than as an alternative to, mandatory censorware.
* The deceptive "LOCAL DETERMINATION OF CONTENT" section has three major problems, the first of which is that the federal government is in fact establishing standards of what must be blocked even though the section title says it isn't. Secondly, this provision is a blanket encouragement for more conservative library and school districts to violate the First Amendment with impunity by blocking anything they want. Third, even the vague and lax restraints that there would be on federal dictating of content regulations are put on hold until mid-2001.
* The "STUDY" section is ironic and hypocritical in requiring an NTIA study "evaluating whether or not currently available commercial internet [sic] blocking and filtering software adequately addresses the needs of educational institutions...and...evaluating the development and effectiveness of local Internet use policies that are currently in operation after community input." This should have been done BEFORE, not after, considering mandatory censorware laws! The study would also make "recommendations on how to foster the development of" more censorware - highly questionable as something to be legitimately done at taxpayer expense.
* The "IMPLEMENTING REGULATIONS" section gives the Federal Communications Commission the authority and responsibilty of implementing the new law. This is probably the real, hidden purpose of the legislation - to give the FCC authority to regulate the Internet like it regulates (censors and permits oligopolistic control of) broadcasting. There is big and particularly anti-democratic corporate money lurking behind this measure.
The one and only good thing anywhere in this legislation is a requirement for expedited court review, similar to the review provision in the Communications Decency Act, which enabled the EFF/ACLU/CIEC legal effort to overturn the CDA on constitutional grounds rapidly, before much harm was done.
EFF's 2000 Internet Censorship Legislation Archive:
http://www.eff.org/Censorship/Internet_censorship_bills/2000
(includes ACLU, conservative coalition & education coalition letters to Congress, and "before and after" versions of the legislation.)
The Internet Free Expression Alliance:
http://www.ifea.net
(IFEA is an international coalition that opposes governmental imposition of filtering software and content labelling & rating systems.)
PeaceFire:
http://www.peacefire.org
(PeaceFire is a student-run organization that exposes the flaws in censorware and opposes suppression of student free speech rights.)
Last issue's call for nominations (CFN) for the EFF Pioneer Awards contained two errors. The first was stating that the ceremony would be held in both Massachusetts and Canada; the real location is Boston, MA. The second was an incorrect affiliation/attribution line for one of the judges (Barbara Simons), who was until recently the ACM president. Corrected text is included below, and can be pasted into any extant copies of the CFN, in case you are republishing it or intend to do so.
To make a nomination, please see:
http://www.eff.org/awards/pioneer.html
The Tenth Annual EFF Pioneer Awards will be presented in Boston, Massachusetts, at the 11th Conference on Computers, Freedom, and Privacy (see http://www.cfp2001.org ). The ceremony will be held on the evening of Thu., March 8, 2001, at the Boston Aquarium. All nominations will be reviewed by a panel of judges chosen for their knowledge of the technical, legal, and social issues associated with information technology, some of them Pioneer Award recipients themselves.
This year's EFF Pioneer Awards judges are:
EFFector is published by:
The Electronic Frontier Foundation
454 Shotwell Street
San Francisco CA 94110-1914 USA
+1 415 436 9333 (voice)
+1 415 436 9993 (fax)
http://www.eff.org
Editor: Stanton McCandlish, EFF Advocacy Director/Webmaster (editor@eff.org)
Membership & donations: membership@eff.org
General EFF, legal, policy or online resources queries: ask@eff.org
Reproduction of this publication in electronic media is encouraged. Signed articles do not necessarily represent the views of EFF. To reproduce signed articles individually, please contact the authors for their express permission. Press releases and EFF announcements & articles may be reproduced individually at will.
To subscribe to EFFector via e-mail, send message BODY (not subject) of:
subscribe effector
to majordomo@eff.org, which will send you a confirmation code and then add you to a subscription list for EFFector (after you return the confirmation code; instructions will be in the e-mail).
To unsubscribe, send a similar message body to the same address, like so:
unsubscribe effector
Please ask listmaster@eff.org to manually add you to or remove you from the list if this does not work for you for some reason.
Back issues are available at:
http://www.eff.org/effector
To get the latest issue, send any message to effector-reflector@eff.org (or er@eff.org), and it will be mailed to you automagically. You can also get, via the web:
http://www.eff.org/pub/EFF/Newsletters/EFFector/current.html
Please send any questions or comments to webmaster@eff.org