Declaration of Bruce Schneier

in Felten v. RIAA (Aug. 13, 2001)

Grayson Barber (GB 0034)
Grayson Barber, L.L.C.
68 Locust Lane
Princeton, NJ 08540
(609) 921-0391

Frank L. Corrado (FLC 9895)
Rossi, Barry, Corrado & Grassi, P.C.
2700 Pacific Avenue
Wildwood, NJ 08260
(609) 729-1333

(Additional Counsel listed on signature page)
Attorneys for Plaintiffs

IN THE UNITED STATES DISTRICT COURT FOR THE DISTRICT OF NEW JERSEY

EDWARD W. FELTEN; BEDE LIU; SCOTT A. CRAVER; MIN WU;
DAN S. WALLACH; BEN SWARTZLANDER; ADAM STUBBLEFIELD;
RICHARD DREWS DEAN; and USENIX ASSOCIATION, a Delaware
non-profit non-stock corporation,

     Plaintiffs,

vs.

RECORDING INDUSTRY ASSOCIATION OF AMERICA, INC.;
SECURE DIGITAL MUSIC INITIATIVE FOUNDATION; VERANCE
CORPORATION; JOHN ASHCROFT, in his official capacity as
ATTORNEY GENERAL OF THE UNITED STATES; DOES 1 through 4,
inclusive,

     Defendants.

Hon. Garrett E. Brown, Jr.
Case No. CV-01-2669 (GEB)
Civil Action

DECLARATION OF BRUCE SCHNEIER

I, BRUCE SCHNEIER, of full age hereby declare:

1. I am Chief Technical Officer of Counterpane Internet Security, Inc., a company I founded in 1999 to address the critical need for strong, cost-effective, and resilient network security. From 1993 to 1999, I was President of Counterpane Systems, a cryptography and security consulting company. In that capacity, I advised sophisticated clients such as Microsoft, Citibank, and the National Security Agency on information security products and markets. I have also designed and analyzed commercial hardware and software cryptographic systems since 1993.

2. I have taught technical and business courses related to the fields of cryptography and computer security at a variety of technical conferences since 1993. I hold a Master of Science degree in computer science from American University and a Bachelor of Science degree in physics from the University of Rochester.

3. In 1993, I designed the popular Blowfish encryption algorithm. It has achieved wide use because its design was open and public, it was unpatented, and it has not been broken since that time. Another encryption algorithm I designed, Twofish, was a finalist for the U.S. government's new Federal Advanced Encryption Standard (AES) in 2000.

4. I have authored six books, including "Applied Cryptography," which is the seminal work in its field. This book is now in its second edition, and has sold over 130,000 copies worldwide. In October 2000 I published the book "Secrets & Lies: Digital Security in a Networked World," which has already sold over 70,000 copies.

5. Since 1998 I have written the monthly e-mail newsletter "Crypto-Gram", which has over 60,000 subscribers. I regularly present papers at international conferences and am a frequent writer, contributing editor, and lecturer on the topics of cryptography, computer security, and privacy.

6. I previously served on the board of directors of the International Association for Cryptologic Research, and I am currently on the Advisory Board of the Electronic Privacy Information Center.

7. The disciplines of cryptography and computer security are offshoots of mathematics and of computer science (which is itself an offshoot of engineering), and security is a scientific field in its own right. Universities offer degree programs in cryptography and computer security, which are deeply intertwined fields; for simplicity I will refer to both in this declaration as "security."

8. There are dozens of annual academic conferences in security. It is a dynamic and lively academic discipline, encompassing researchers at all levels, from different backgrounds, working in different countries, and employed by universities, governments, and private industry.

9. Unlike many academic disciplines, security is inherently adversarial. Researchers who invent security systems are always competing with those who break security systems. Because of the nature of security, it is impossible to categorically state that a security system is secure. It may be secure against all known attacks, but there is no guarantee that a successful attack will not be invented tomorrow. Although security cannot be proven, it is quite possible to definitively show insecurity, by explaining how to break a system or by publicly demonstrating one's ability to do so. Since a single negative result (a successful break) shows that a security system is insecure, security can be demonstrated only by the absence of such results.

10. If, over a period of years, many security researchers attempt to break a security system and fail, researchers come to believe that it is indeed secure. This process is, by its very nature, vague and inexact. There is no well-defined threshold that a security system must pass in order to be widely believed by the scientific community to be secure. It is not enough for a security system to simply be available for analysis for a few years, since that is no guarantee that any researchers have actually analyzed it. It is not enough for a security system to be analyzed by a certain number of researchers, since that is no guarantee that the researchers were of a caliber capable of breaking the system, or that they tried hard enough to break it.

11. Security researchers are widely mistrustful of systems that have not undergone this kind of peer review. Occasionally systems are designed, developed, and deployed in secret. These systems have, without exception, been found to have flaws when they were eventually made public. Examples include Microsoft's PPTP protocol, which is part of Microsoft Windows; the WEP protocol, which is part of the 802.11 wireless Ethernet standard; the DVD Content Scrambling System; the Verance/SDMI watermarking system; and every proprietary digital cellular security protocol. Because it is impossible to prove a positive security result, there is no substitute for peer review in the creation of secure systems.

12. When a security researcher breaks a security system, he has learned something of scientific value and advances the science. He has learned something specific about the faults of the system he has just broken, but he may also have learned something about how to avoid the same faults in the design of future systems. In fact, the only way for a security researcher to learn how to design secure systems is to break existing systems. It is a common maxim in cryptography and computer security that "anyone can design a system that he himself cannot break." If someone designs a security system that he believes to be secure, the first question we should ask is how good he is at breaking security systems. The more security systems, and the more different kinds of systems, a researcher has broken, the better he will be at designing good security systems.

13. It is a regular practice in the science for a security researcher to learn from other people's breaks. In order for the academic disciplines of cryptography and computer security to advance, a researcher who breaks a security system needs to make his result available to the rest of the research community. As in any other academic discipline, this unfettered free exchange of ideas and research results is the means by which the entire field may benefit from one person's research, strengthening all of society's security as a result.

14. Oftentimes, in order to fully explain a security idea or break, it is necessary to publish computer code. Cryptography and computer security require absolute precision, and sometimes the easiest and most effective way to achieve that precision is through a computer program. Natural language has ambiguities that computer source code does not. A reader's ability to see the source code for himself is necessary in order to evaluate and build upon the researcher's work. For these reasons, many academic papers in cryptography and computer security either contain source code or provide links to a web page where source and/or executable code can be found. Several of my own books on cryptography contain pages and pages of source code intermixed with the text; it was the only way to communicate my ideas effectively.
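By way of illustration only (a hypothetical toy example, not drawn from the paper at issue in this case), the short Python program below describes both a trivially weak scrambling scheme and a known-plaintext attack that recovers its secret key. Expressed as code, the attack is specified exactly, with none of the ambiguity that a prose description alone might leave; the key and message values are purely illustrative.

    # Hypothetical toy example: a weak "scrambler" that XORs content with a
    # short repeating key, and a known-plaintext attack that recovers the key.

    def xor_scramble(data: bytes, key: bytes) -> bytes:
        # XOR each byte of the data with the repeating key (the same
        # operation also unscrambles).
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    def recover_key(ciphertext: bytes, known_prefix: bytes, key_len: int) -> bytes:
        # If the first key_len bytes of the plaintext are known (for example,
        # a standard file header), the key is simply ciphertext XOR plaintext.
        return bytes(c ^ p for c, p in zip(ciphertext[:key_len], known_prefix[:key_len]))

    if __name__ == "__main__":
        key = b"KEY1"                          # the secret the scheme relies on
        message = b"RIFF....WAVEfmt example"   # attacker knows the format header
        scrambled = xor_scramble(message, key)
        assert recover_key(scrambled, message, len(key)) == key

The same point holds for real research results: the published code is the unambiguous statement of the attack, and readers can run it to verify the claim for themselves.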

15. The academic freedom enjoyed by security researchers has resulted in spectacular advances in the fields of cryptography and computer security. Our encryption algorithms are much stronger today than they were twenty years ago because researchers evaluated and broke (or tried to break) many different encryption algorithms, widely publishing their results and learning from each success and failure. Our firewalls are more secure today than they were ten years ago for the same reasons. Intrusion detection systems, digital watermarking, public-key infrastructures, and every other security technology have benefited from researchers having unfettered access to systems and an unimpeded ability to publish their findings for others to learn from and build upon.

16. The DMCA attempts to prevent this kind of research, subjecting researchers to criminal or civil liability for the regular practice of this science. The DMCA limits access to security systems and the opportunity to study them, stifling scientific progress. Before the DMCA, it was neither necessary nor customary to get permission from the security-system owner or copyright holder in order to test a security system. Before the DMCA, it was neither necessary nor customary to get permission from the security-system owner or copyright holder in order to publish the strengths and weaknesses of a security system. This is simply how science has historically advanced in this field.

17. The requirement that a researcher obtain permission before analyzing a security system presupposes that the researcher knows from whom to obtain that permission. In reality, this may not be the case. The business world is complicated enough that it is often difficult to determine who the copyright owner or other rights holder of a system is. The researcher may think he knows who the rights holder is, but be mistaken. The ability to do mathematical research should not be contingent upon the ability to track down the legal ownership of the object studied.

18. I have reviewed a draft version of the academic paper written by Princeton University Professor Edward Felten and his research team. The topic studied, the efficacy of different digital "watermarking" schemes, is a typical and perfectly reasonable thing for security researchers to study. The scientists' paper is an interesting, enlightening, and perfectly reasonable explanation of their findings. The publication of this paper will further the science of computer security, and will allow future researchers to design more secure systems.

19. It would be customary for Prof. Felten or any of the researchers to present this paper at an academic conference, publish it in a scientific journal, or make it available on his or her personal web site. In fact, it would be customary within the scientific and academic communities for him or her to do several of these things.

20. Technology conferences, such as those run by the USENIX Association, regularly publish papers such as this one. They regularly include discussion of the results of testing technologies like those Prof. Felten and his research team have studied.

21. The fact that the music industry could use the DMCA to block publication of this paper has profound negative effects on the scientific research community and the fields of cryptography and computer security in general. Many researchers are not wealthy and do not have large universities or corporations providing for their legal defense, and the mere threat of a lawsuit creates a chilling effect on security research. After the enactment of the DMCA, scientists can no longer engage in their usual endeavors and explorations into security technologies; they must continually worry about who owns what security system and whether they must get permission to evaluate the security and publish their results. If they get permission to evaluate a system but not to publish their results, they will probably study something else instead.

22. The DMCA will also result in less secure systems for the public as a whole. The only way to demonstrate the security of a system is for many security researchers, over the course of years, to try to break it and fail. Security systems that are designed in secret are all of poor quality precisely because of the lack of peer review. The DMCA creates a world where security systems are designed in secret and the public is left unprotected. Moreover, under the DMCA security researchers are prohibited from evaluating systems without the permission of their owners. This state of affairs allows a system designer to produce insecure systems and use the DMCA to hide the insecurity from the scientific community and the public. Researchers cannot learn from the mistakes those designers make. We will not learn how to build more secure systems.

23. The robust and healthy advance of the sciences of cryptography and computer security is critical to our electronic future. If we are to turn computers and computer networks into serious business and social tools, we need strong security systems. The only way to build strong security systems is through open design and peer review. The only way to facilitate peer review is to allow unfettered access to security systems for analysis, and an unimpeded ability to publish results, both positive and negative. One cannot simply pretend that a system is secure by making it illegal to evaluate its security. A system is either secure or not, and everyone benefits when the truth is known.

I declare under penalty of perjury that the foregoing is true and correct and was executed at _________________ on this the ___ day of ________, 2001.

______________________________
Bruce Schneier
Grayson Barber (GB 0034)
Grayson Barber L.L.C.
68 Locust Lane
Princeton, NJ 08540
phone (609) 921-0391
fax (609) 921-7405
    
Frank L. Corrado (FLC 9895)
Rossi, Barry, Corrado & Grassi, PC
2700 Pacific Avenue,
Wildwood, NJ 08260
phone (609) 729-1333
fax (609) 522-4927
Gino J. Scarselli
664 Allison Drive
Richmond Hts., OH 44143
(216) 291-8601 (phone and fax)
    
James S. Tyre
10736 Jefferson Blvd., # 512
Culver City, CA 90230-4969
phone (310) 839-4114
fax (310) 839-4602
Cindy A. Cohn
Lee Tien
Robin Gross
Electronic Frontier Foundation
454 Shotwell St.
San Francisco, CA 94110
phone (415) 436-9333
fax (415) 436-9993
    
Joseph P. Liu
Boston College Law School
885 Centre Street
Newton, MA 02459
phone (617) 552-8550

Attorneys for Plaintiffs


