<?php

include("eff_setup2.php");

$smarty = new EFFSmarty;

$smarty->assign('title','General Comments on Analog Reconversion Discussion Group');

// if breadcrumb == true, then it fills in the right trail in the issue
// array
$smarty->assign('breadcrumb','false');

// example:
//$issue = array("Issues" => "/issues/", "Privacy" => "/issues/privacy/", "TIA" => "/issues/privacy/tia/");

//Creative Commons - If you need to turn OFF the CC license, set cc = false
//$smarty->assign('cc',"false");

$smarty->assign('issue',$issue);

$content  = '
<div id="featuretext">

<h1>Analog Reconversion Discussion Group</h1>
<h2>General Comments of Electronic Frontier Foundation</h2>

<br />

<p><strong>By Seth Schoen, Staff Technologist<br />
November 24, 2003<br />
<a href="ardg_eff_comments.pdf">Download PDF</a> (36k)</strong></p>

  <h3>Executive Summary</h3>
  <ul type="disc">
    <li>The &quot;analog hole&quot; question raises important public policy issues
      not discussed within ARDG.</li>
    <li>Technologies proposed to ARDG are likely inadequate to prevent copyright
      infringement, especially Internet-based copyright infringement.</li>
    <li>Both watermark and VBI-based technologies may be weak and easy to defeat;
      several technologies can be defeated using techniques that are already
      public.</li>
    <li>Assessing the efficacy of watermark technologies requires public peer
      review, but the public lacks enough information to make an informed judgment
      about watermark vendors\' claims presented to ARDG.</li>
    <li>Deploying a VBI technology may create new risks for hearing-impaired
      viewers and others who rely on closed caption information.</li>
  </ul>
  <h1><strong>Public Policy Issues</strong></h1>
  <p>Public policy issues and the larger context surrounding the &quot;analog
    hole&quot; question were not discussed at ARDG. This practice leaves important
    issues unaddressed. These issues are of fundamental importance to the constitutionally
    mandated balance between copyright holders\' interests and the public interest
    in U.S. copyright law.</p>
  <p>For instance, questions to technology proponents specifically about effects
    on lawful uses of copyrighted works suggested by Public Knowledge and the
    Center for Democracy and Technology were removed from the ARDG analysis matrix
    before it was circulated. As a result, technology vendors were never invited
    to consider these important questions publicly.</p>
  <p>One representative question concerns what happens when someone has the legal
    right to make a copy of an audiovisual work in a situation where a publisher,
    broadcaster, or other party has arranged for the work to be marked &quot;Copy
    Never&quot;. Will &quot;compliant&quot; hardware preclude the lawful creation
    of a copy or excerpt of such work? What recourse will the would-be user have
    if the technology does forestall a legitimate use?</p>
  <p>Such questions are particularly important because copyright holders have,
    in the past, trumpeted the existence of the &quot;analog hole&quot; as a
    means of protecting the public\'s rights to make copies for fair use and other
    purposes. When the effects of digital rights management technologies and
    legal anticircumvention measures on the public\'s traditional access rights
    were questioned in litigation and before the United States Copyright Office,
    entertainment interests pointed to the existence of the &quot;analog hole&quot; as
    a safety valve protecting the public interest. But this argument and these
    assurances will no longer hold water if the rights of the public to use and
    benefit from analog are substantially altered.</p>
  <p>Should the &quot;analog hole&quot; be utterly stopped up, the public may
    finally be unable &ndash; as a technical matter &ndash; to make many lawful uses of audiovisual
    works by any means whatsoever.</p>
  <h1><strong>General Inadequacy of Proposed Technologies</strong></h1>
  <p>It is
    not at all clear that the proposed technologies will be technologically capable
    of blocking the analog hole &ndash; even if all of them performed as advertised.</p>
  <p>It is
    difficult to see how the public\'s current level of access to video digitizing
    technologies can be taken away &ndash; &quot;put back in the tube&quot;, so to
    speak. Even redesigning a substantial amount of hardware (with a system which
    we assume for the moment is very cleverly designed) would have only limited
    effectiveness. Enormous numbers of video digitizers with unencrypted outputs
    are already deployed, the market for such hardware is large, and the current
    cost of these devices is low, at least at standard-definition resolutions.</p>
  <p>Some
    analog-to-digital conversion hardware is not specifically designed for video
    applications, but it may be possible to repurpose &quot;non-video&quot; devices
    so that they can usefully process video. The number of analog-to-digital
    conversion microchips in the world has long exceeded the planet\'s human population,
    and the ADC chips are still multiplying faster than we are.</p>
  <p>Digitizing
    hardware can also be fabricated from scratch from other components.<a
href="#_ftn1" name="_ftnref1" title="">[1]</a> While
    there is some dispute about the cost of creating an analog-to-digital converter
    with particular specifications, this technology is widely available, widely
    deployed, and widely understood. We have argued that hobbyists are likely
    to be able to design and create video digitizers. In addition, consumers
    will continue to be able to import digitizers from outside the U.S., where
    billions of dollars of goods sold every year contain analog-to-digital conversion
    capabilities.</p>
  <p>Technological
    controls may also be ineffective for a different reason. If, as entertainment
    interests frequently contend, the Internet is an extremely rapid and efficient
    distribution system for illegal copies, then even a small number of sources
    could illegally make works widely available, so that restricting the average
    person\'s ability to obtain unrestricted copies of a work may have little
    or no effect on the availability of unlawful copies of works on-line.<a href="#_ftn2"
name="_ftnref2" title="">[2]</a></p>
  <h1><strong>Limitations of Technical Approaches: Watermark-Based Technologies</strong></h1>
  <p>Several proposed technologies submitted to ARDG are based on mandated detection
    of some digital watermark at the point of analog-to-digital conversion. For
    the reasons outlined in more detail below, there is a paucity of information
    about whether these technologies will work as advertised (and good reason
    to believe that they will not).</p>
  <p>Watermarks, like other security systems, can only be evaluated scientifically
    when they are published for peer review. While no one has found a way to
    prove conclusively that a particular watermark is secure, it is possible
    for skilled reviewers to determine whether the state of the art would allow
    the creation of a straightforward attack against that watermark in a particular
    application.</p>
  <p>Digital watermarking is the subject of serious scientific analysis, and
    substantial scientific research. Much research about watermarking applications
    aims at evaluating systems by attempting to discover attacks against them.
    (Attacks against watermarks and watermark-detection systems may include,
    without limitation, devising technical means to remove or alter a watermark,
    or to hide a watermark from a detector so that the detector fails to detect
    the watermark correctly.)</p>
  <p>In the past, many watermarks (even those developed by highly trained scientists
    and engineers) were quickly defeated by others after they were published.
    The results of this process fill the proceedings of mathematics and computer
    security conferences and adorn the pages of peer-reviewed journals. They
    emphasize just how fundamental public peer review is in assessing security;
    the creators of security systems are often poor judges of how those systems
    would hold up in the real world. The iterative nature of the open scientific
    process frequently leads to major improvements and new discoveries in security.<a href="#_ftn3" name="_ftnref3"
title="">[3]</a></p>
  <p>The range of mathematical tools available to an attacker is significant.
    Yet dozens of watermarks have been defeated by extremely simple methods such
    as a tiny distortion of the picture, or a slight change in the speed of playback.</p>
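  <p>A toy model illustrates how fragile naive detection can be. The spread-spectrum
    correlation detector below is our own simplification for illustration, not any
    specific technology proposed to ARDG; it shows how even a one-sample shift of
    the signal can defeat detection without perceptibly changing the content.</p>

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy spread-spectrum watermark: a known pseudorandom pattern added to
# the signal, detected by correlating against that pattern.
pattern = rng.choice([-1.0, 1.0], size=4096)
marked = rng.normal(0.0, 10.0, size=4096) + 2.0 * pattern

def correlation(signal):
    # Normalized correlation with the known pattern.
    return float(np.dot(signal, pattern)) / len(signal)

# A one-sample shift (a tiny, imperceptible distortion) is enough to
# desynchronize this naive detector.
shifted = np.roll(marked, 1)

assert correlation(marked) > 1.0        # watermark detected
assert abs(correlation(shifted)) < 1.0  # detection fails after the shift
```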
  <p>When the SDMI Forum, which was considering audio watermark technologies
    for copy-control applications, published limited information about some of
    the proposed technologies, researchers were soon able to point out weaknesses
    in many of them.<a
href="#_ftn4" name="_ftnref4" title="">[4]</a></p>
  <p>Some of the researchers studying the technologies believed that too little
    information had been published to allow a truly informed assessment.</p>
  <p>Unfortunately, most of the techniques presented here have never been published
    or reviewed. The analysis matrices submitted by proponents generally indicate
    either that a technology has never been published or disclosed at all, or
    that it has been disclosed only to a few parties under a confidentiality
    agreement.</p>
  <p>As a result, the public evidence for these technologies\' efficacy is typically
    limited to vendor claims, with no independent analysis or verification. This
    is obviously a poor way of evaluating and selecting security systems. Even
    a well-intentioned security technology creator is rarely able to anticipate
    all of the sources of weakness in his or her own invention.<a href="#_ftn5" name="_ftnref5" title="">[5]</a> In some cases, various parties have
    evaluated certain technologies secretly under confidentiality agreements.
    (For example, the DVD-CCA watermark evaluation process allowed a few companies &ndash; but
    not the general public &ndash; to perform evaluations of proposed technologies.)
    By their nature, these confidentiality agreements preclude informed assessment
    by the public of the quality of the evaluation performed. They prevent us
    from repeating experiments and do not allow us to tell whether particular
    weaknesses have been considered, or how carefully. All we can tell is that
    unknown persons performed an unknown amount of unknown research, and later
    pronounced themselves satisfied. This provides an insufficient basis for
    public confidence.</p>
  <p>By contrast, the NIST AES competition, by which the U.S. government selected
    the new Federal standard data encryption algorithm, was conducted as a public
    competition. Submitters had to publish the details of their proposals at
    the outset; other submitters (and scientists from around the world) then
    analyzed the submissions for weaknesses. In many cases, significant weaknesses
    in proposals were uncovered as a result of this process. Several government-sponsored
    conferences saw the submission of dozens of papers, highlighting many flaws
    that had not been apparent to the original inventors of AES candidate technologies.
    Eventually, NIST was able to select an encryption technique (Rijndael) as
    the AES standard, FIPS 197. Today, AES enjoys a high degree of public confidence
    because of the transparency of the process and the substantiality of the
    peer review of candidate technologies. NIST would not have accepted AES proposals
    from entities that refused to submit to a public review process or kept their
    technologies secret.<a href="#_ftn6" name="_ftnref6" title="">[6]</a></p>
  <p>Because few proposed watermark technologies submitted to ARDG have been
    subject to public analysis, there has been no opportunity to detect and publicize
    erroneous claims by technology proponents (as routinely took place during
    the AES selection process).</p>
  <p>Princeton University watermark researcher Scott Craver, in a presentation
    to ARDG, noted that new watermark designs are frequently attacked successfully
    soon after publication, and that watermarks may have limitations making them
    unsuitable for use in copy-control applications. He concluded that, for controlling
    analog reconversion of audiovisual works, the &quot;state of the art favors
    analysis&quot; (i.e., the attempt to remove or obscure a watermark).<a href="#_ftn7" name="_ftnref7"
title="">[7]</a> Targeted attacks against
    particular watermarks are often available and effective.</p>
  <p>Moreover, regardless of its strength or security, the suitability of any
    digital watermark for restricting digitization at the point of analog-to-digital
    conversion in a personal computer is questionable at best. In a computer
    environment, many attacks try to conceal the presence of the watermark from
    the detector, rather than removing or altering the watermark itself. These
    attacks may not even depend for their success on the technical details of
    the watermark itself; as the proverb has it, &quot;all cats are gray at night&quot;.</p>
  <p>This attack can be mounted easily against a watermark detector present at
    an analog input to a PC. In one version, the watermarked signal is scrambled
    while still in analog form, before providing it to a video input on the PC.
    The scrambling is performed using analog components according to a reversible
    scrambling method whose details are known to the attacker. When the scrambled
    video is digitized, the watermark detector will be unable to see the watermark
    hidden within the scrambled video signal. After digitization is complete,
    and the attacker has a digital copy of the scrambled video saved on the PC\'s
    hard drive, the attacker simply reverses the scrambling in software to obtain
    an unrestricted clear copy.<a href="#_ftn8"
name="_ftnref8" title="">[8]</a></p>
  <p>Broadly speaking, this attack works because watermark designers try to create
    watermarks that cannot be removed without &quot;destroying the perceptual
    quality of the signal&quot; or &quot;making the video unwatchable&quot;.
    If a method of removing or hiding the watermark makes the video unwatchable,
    watermark designers typically assume that nobody would find a reason to apply
    that method. However, analog encryption schemes deliberately destroy the
    quality of a signal or deliberately make it unwatchable in a predictable
    and reversible way. Thus, the signal scrambling can be reversed after digitization
    has already taken place &ndash; making the video watchable again.</p>
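  <p>The attack can be sketched end to end. In this simplified model (our own
    construction, not a specific vendor scheme), the &quot;analog&quot; scrambling
    is simple inversion of intensities, and the detector is the kind of correlation
    test a watermark detector might apply at the point of digitization.</p>

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy correlation-based watermark detector at the A/D stage.
pattern = rng.choice([-1.0, 1.0], size=4096)
video = rng.normal(0.0, 10.0, size=4096)
marked = video + 2.0 * pattern

def detect(signal):
    return float(np.dot(signal, pattern)) / len(signal) > 1.0

# Step 1: scramble in "analog" form (here, inversion of intensities)
# before the signal reaches the digitizer and its detector.
scrambled = -marked

# Step 2: the detector at the input sees only noise, so digitization
# is allowed to proceed.
assert detect(marked)
assert not detect(scrambled)

# Step 3: once a digital copy exists, reverse the scrambling in
# software to obtain an unrestricted clear copy.
recovered = -scrambled
assert np.allclose(recovered, marked)
```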
  <p>Because it need not preserve the appearance or watchability of an image,
    analog scrambling can alter any feature of a video signal. It can even, like
    the World War II-era SIGSALY system, incorporate an analog key to control
    the scrambling process.<a href="#_ftn9"
name="_ftnref9" title="">[9]</a> The
    output of the scrambling process will then appear to be random noise or static
    from the point of view of a watermark detector &ndash; but someone in possession
    of the scrambling key can reverse the process, and can reconstitute the original
    signal. While analog encryption is not widely known today (because of the
    greater convenience of digital encryption for security applications), it
    was developed in considerable detail in the pre-transistor era and can likely
    be implemented for this application at relatively low cost using only analog
    components.</p>
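  <p>A keyed variant can also be sketched. The additive-noise construction below
    is our own illustration of the principle of shared keying material, not a
    description of SIGSALY itself.</p>

```python
import numpy as np

# Keyed scrambling: both parties derive the same pseudorandom noise
# from a shared key and add or subtract it. Illustrative only.
def scramble(signal, key_seed):
    noise = np.random.default_rng(key_seed).normal(0.0, 100.0, signal.shape)
    return signal + noise

def descramble(signal, key_seed):
    noise = np.random.default_rng(key_seed).normal(0.0, 100.0, signal.shape)
    return signal - noise

video = np.linspace(0.0, 1.0, 256)
hidden = scramble(video, key_seed=42)

assert not np.allclose(hidden, video)              # looks like noise
assert np.allclose(descramble(hidden, 42), video)  # key holder recovers it
```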
  <p>Ingemar Cox and Jean-Paul M. G. Linnartz may have been the first to describe
    this process, in their 1998 paper &quot;Some general methods for tampering
    with watermarks&quot;.<a href="#_ftn10"
name="_ftnref10" title="">[10]</a> Cox
    and Linnartz describe the attack as follows:</p>
  <p>[C]opy protection based on watermarking content has a further fundamental
    weakness [in addition to several discussed earlier]. The watermark detection
    process is designed to detect the watermark when the video is perceptually
    meaningful. Thus, a user may apply a weak form of scrambling to copy protected
    video, e.g. inverting the pixel intensities, prior to recording. The scrambled
    video is unwatchable and the recorder will fail to detect a watermark and
    consequently allow a copy to be made. Of course, on playback, the video signal
    will be scrambled, but the user may then simpl[y] invert or descramble the
    video in order to watch a perfect and illegal copy of a video. Simple scrambling
    and descrambling hardware would be very inexpensive [&hellip;] One way to avoid
    such circumvention for digital recording is to only allow the recording of
    content in a recognized file format. Of course this would severely limit
    the functionality of the storage device.<a href="#_ftn11" name="_ftnref11" title="">[11]</a></p>
  <p>It does not appear to be feasible to defend against this attack simply by
    improving the strength of watermark technologies against targeted attacks.
    This attack is not, strictly speaking, a question of the strength or weakness
    of particular watermarks. Instead, it is a limitation on the applicability
    of watermarking technology for particular purposes.</p>
  <h1><strong>Limitations of Technical Approaches: VBI Signaling Technologies</strong></h1>
  <p>Several schemes proposed to ARDG use the vertical blanking interval (VBI)
    of an analog video signal to embed digital labels in a predictable and standardized
    way. Because the vertical blanking interval does not contain video picture
    data, it is possible to use it to convey a limited amount of digital data
    out-of-band with respect to the picture. The best-known application for the
    VBI data is closed captioning (CC). The VBI data could also include copy-control
    labels, and, as some presenters observed, some standards already permit (but
    do not require) the use of portions of the VBI in certain video interfaces
    to convey copy-control labels. This approach is also vulnerable to straightforward
    attacks.</p>
  <p>Although some VBI signaling proposals are combined with watermarks,
    VBI signaling is importantly different from watermarking. Watermarking attempts
    to hide a mark within a signal; VBI signaling does not attempt to hide the
    label at all. In every digital VBI signaling scheme, the copy label is in
    a publicly documented format at a publicly documented location within the
    VBI. This implies that removing or altering the copy label can, as a technical
    matter, be done using only public information.</p>
  <p>Copy labels in the vertical blanking interval can be removed or altered
    either accidentally or deliberately. Some existing products strip out the
    entire VBI portion of certain video signals. Because the copy labels appear
    at a single known location in the video signal, it is technically easy to
    remove them using known techniques. (CGMS-A proponents note this: for instance,
    they observe that &quot;[b]lanking or stripping those lines from the VBI
    that contain CGMS-A and RCI would be technically the easiest way to attack
    CGMS-A Plus RC&quot; and suggest that doing so &quot;is not difficult with
    low-cost specially purposed boxes or circuitry&quot;<a href="#_ftn12" name="_ftnref12" title="">[12]</a>.)</p>
  <p>Indeed, products available today, both lawfully and unlawfully, can likely
    be used to remove or alter the contents of CGMS-A labels or other copy-control
    labels in the vertical blanking interval. For example, devices for the insertion
    and editing of closed-caption data may allow line-by-line editing of the
    contents of all VBI lines in a particular video format. Some devices strip
    the entire VBI, or particular lines, inadvertently or in order to impair
    copy-control applications.</p>
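  <p>Because the location and format of such labels are public, stripping them
    requires no secret knowledge. The sketch below makes the point concrete; the
    line numbers and byte contents are purely illustrative, not any actual
    standard.</p>

```python
# A video field modeled as a mapping from VBI line number to line data.
# Line numbers and contents are illustrative, not a real standard.
def strip_lines(field, lines_to_blank):
    # Blanking the documented label lines uses only public information.
    return {n: (b"" if n in lines_to_blank else data)
            for n, data in field.items()}

field = {
    20: b"copy-control label",  # publicly documented location
    21: b"closed-caption data",
    22: b"picture data",
}
stripped = strip_lines(field, {20})

assert stripped[20] == b""                     # label removed
assert stripped[21] == b"closed-caption data"  # captions untouched
```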
  <h1><strong>Accessibility for the Disabled: Consequences of VBI Signaling Technologies</strong></h1>
  <p>Currently,
    hearing-impaired people are concerned about the preservation of closed-caption
    data in video signals. Using the VBI, and especially VBI lines shared with
    closed-caption data, as a location for signaling in a widely deployed copy-restriction
    system may create unintended consequences for hearing-impaired and other
    users of closed captions.</p>
  <p>Suppose
    a general decides to locate a military operation next to a hospital. It is
    illegal to bombard a hospital, so the general may hope to protect the military
    operation by placing it near the hospital.<a href="#_ftn13" name="_ftnref13"
title="">[13]</a> Even though this tactic
    may tend to protect the military operation, it simultaneously puts the hospital
    at risk. Where beforehand there was no incentive for fighting or bombardment
    in the vicinity of the hospital, the proximity of a military target means
    that the hospital could well become &quot;collateral damage&quot;. The military
    operation\'s presence creates risk for the hospital that was previously nonexistent.</p>
  <p>In this case, placing valuable data desired by consumers (CC data) adjacent
    to data they have an incentive to obliterate (copy-control labels) simultaneously
    diminishes the chance that the latter will be removed and increases the chance
    that the former will be removed. While ARDG participants suggested that regulations
    do, or will, protect against alterations to VBI data, it is worth considering
    the market impact with or without regulation. To return to our analogy, putting
    military operations near hospitals increases risks even in the presence of
    a legal standard (such as the Geneva Convention) forbidding the bombardment
    of hospitals. Today, there is generally no incentive to strip the VBI or
    any particular line in a video signal. (Indeed, consumers currently have
    good reason to prefer devices that preserve the VBI to devices that strip
    it.) If the VBI is widely used for an application that consumers would prefer
    be absent, they will for the first time have an incentive to develop, use,
    or purchase devices that strip some or all of the VBI. That could cause such
    devices to proliferate in the marketplace, to the detriment of the availability
    of CC data and everyone who relies upon it. The wisest course in order to
    ensure that all devices preserve the VBI data would be to avoid creating
    any new incentive to alter it.</p>

  <h1><strong>Remarks on Particular Technologies</strong></h1>

  <p>In addition to our submissions in the ARDG analysis matrices, we make the
    following brief observations about three particular technologies.</p>
  <h2>Macrovision Corporation</h2>
  <p>Macrovision Corporation proposes to use existing Macrovision analog copy-restriction
    technologies in various combinations as a means of signaling copy-control
    states.</p>
  <p>One consequence of this approach is that older devices that happen to be
    vulnerable to Macrovision signals will treat all marked materials as Copy
    Never. Federal law requires certain recording devices to exhibit this vulnerability,
    so, absent a change in the law, many new devices will continue to have this
    problem.<a href="#_ftn14"
name="_ftnref14" title="">[14]</a></p>
  <p>At the same time, the effectiveness of using any existing Macrovision analog
    copy-restriction technology to deter deliberate infringement may be limited,
    since means of removing Macrovision\'s video signal degradations are widely
    known. Indeed, Macrovision has published several such techniques in patents.</p>
  <h2>Dwight Cavendish Systems</h2>
  <p>Dwight Cavendish Systems claims to have a technology capable of interfering
    with the functionality of current digitizer hardware, without requiring such
    hardware to be redesigned.<a
href="#_ftn15" name="_ftnref15" title="">[15]</a></p>
  <p>To our knowledge, this technology remains unpublished and publicly unproven
    to date. Dwight Cavendish did not provide relevant technical details to ARDG.</p>
  <h2>VEIL Interactive Technologies</h2>
  <p>VEIL\'s proposal involves a feature called a Visual Rights Assertion Mark,
    or V-RAM. VEIL\'s explanation in the course of its presentation at ARDG suggests
    that the V-RAM has many properties in common with a traditional video watermark
    and therefore that our concerns about video watermarking schemes generally
    apply to VEIL\'s scheme as well.</p>
<hr size="1" /> 
<div id=ftn1>
  <p class=MsoFootnoteText><a href="#_ftnref1" name="_ftn1" title="">[1]</a> See, e.g., Andrew Huang, &quot;Myths
    and Misconceptions About Hardware Hacking&quot;, presentation to Analog Reconversion
    Discussion Group, May 28, 2003, available at <a href="http://www.cptwg.org/Assets/Presentations/ARDG/ARDGHardware_hack05-28-03.pdf">http://www.cptwg.org/Assets/Presentations/ARDG/ARDGHardware_hack05-28-03.pdf</a>.</p>
</div>
<div id=ftn2>
  <p class=MsoFootnoteText><a href="#_ftnref2" name="_ftn2" title="">[2]</a> See Peter Biddle, Paul England, Marcus
    Peinado, and Bryan Willman, &quot;The Darknet and the Future of Content Distribution&quot; (2002),
    available at <a href="http://crypto.stanford.edu/DRM2002/darknet5.doc">http://crypto.stanford.edu/DRM2002/darknet5.doc</a>.</p>
</div>
<div id=ftn3>
  <p class=MsoFootnoteText><a href="#_ftnref3" name="_ftn3" title="">[3]</a> See, e.g., Brief of <i>Amici Curiae</i> Dr.
    Steven Bellovin <i>et al.</i>, at 23-30, Universal City Studios v. Eric Corley, 273
    F.3d 429 (2nd Cir. 2001)  (No.
    00-9185) (discussing importance of public peer review to development of cryptography
    and computer security). Similar points are made by several authors on security
    engineering; see, for example, Bruce Schneier, <i>Crypto-Gram</i>, May 15,
    2002, available at <a href="http://www.schneier.com/crypto-gram-0205.html">http://www.schneier.com/crypto-gram-0205.html</a>. Schneier
    was also a party to the Bellovin <i>et al.</i> brief.</p>
</div>
<div id=ftn4>
  <p class=MsoFootnoteText><a href="#_ftnref4" name="_ftn4" title="">[4]</a> See Scott Craver <i>et al.</i>, &quot;Reading
    Between the Lines: Lessons from the SDMI Challenge&quot;, Proceedings of
    the 10<sup>th</sup> USENIX Security Symposium (August 13-17, 2001), available at <a href="http://www.usenix.org/events/sec01/craver.pdf">http://www.usenix.org/events/sec01/craver.pdf</a>.</p>
</div>
<div id=ftn5>
  <p class=MsoFootnoteText><a href="#_ftnref5" name="_ftn5" title="">[5]</a> See Bruce Schneier, <i>Crypto-Gram</i>,
    February 15, 1999, available at 
<a href="http://www.schneier.com/crypto-gram-9902.html">http://www.schneier.com/crypto-gram-9902.html</a>
    (discussing inability of inventors to evaluate their own inventions\' security
    properties, and the prevalence of inaccurate claims of security on the part
    of vendors).</p>
</div>
<div id=ftn6>
  <p class=MsoFootnoteText><a href="#_ftnref6" name="_ftn6" title="">[6]</a> See National Institute
    of Standards and Technology, &quot;Announcing Request for Candidate Algorithm
    Nominations for the Advanced Encryption Standard (AES)&quot;, 62 Fed. Reg.
    48051 (September 12, 1997) (explaining public review process and submission
    requirements including detailed technical disclosures); &quot;Specification
    for the Advanced Encryption Standard (AES)&quot;, Federal Information Processing
    Standards Pub. 197 (November 26, 2001) (codifying encryption standard selected
    as a result of that process); Schneier (explaining why public review process,
    including independent third party analysis, yielded a more secure AES with
    improved public confidence). For the AES conferences, at which flaws or potential
    flaws in candidate technologies were identified, see 
<a href="http://csrc.nist.gov/CryptoToolkit/aes/round1/conf1/aes1conf.htm">http://csrc.nist.gov/CryptoToolkit/aes/round1/conf1/aes1conf.htm</a>,
    <a href="http://csrc.nist.gov/CryptoToolkit/aes/round1/conf2/aes2conf.htm">http://csrc.nist.gov/CryptoToolkit/aes/round1/conf2/aes2conf.htm</a>, and
<a href="http://csrc.nist.gov/CryptoToolkit/aes/round2/conf3/aes3conf.htm">http://csrc.nist.gov/CryptoToolkit/aes/round2/conf3/aes3conf.htm</a>.</p>
</div>
<div id=ftn7>
  <p class=MsoFootnoteText><a href="#_ftnref7" name="_ftn7" title="">[7]</a> Scott Craver, &quot;What We Expect
    from Watermarking&quot;, presentation to Analog Reconversion Discussion Group,
    May 7, 2003, available at 
<a href="http://www.cptwg.org/Assets/Presentations/ARDG/watermarking5-7-03.ppt">http://www.cptwg.org/Assets/Presentations/ARDG/watermarking5-7-03.ppt</a>.</p>
</div>
<div id=ftn8>
  <p class=MsoFootnoteText><a href="#_ftnref8" name="_ftn8" title="">[8]</a> Many different scrambling techniques
    are available. One approach mentioned by Cox and Linnartz, <i>infra</i>,
    and Craver, <i>supra</i>, is inverting some feature of each pixel or group
    of pixels, such as its luminance. Another method might be the addition of
    a complicated periodic signal known to the attacker, or even the addition
    of a nonperiodic and essentially random signal. In the ideal case, the scrambled
    signal is totally uncorrelated with the original signal, so the watermark
    is completely undetectable. Some analog scrambling techniques may affect
    quality if they expand the bandwidth or dynamic range of the signal, but
    it has not to our knowledge been shown or argued that any significant loss
    of quality must occur. As we suggest below, the techniques of analog scrambling
    previously practiced before the digital era could be revived for this purpose;
    many old techniques would have useful properties for this application. Instead
    of concealing an analog voice recording\'s contents against eavesdropping
    by foreign agents, this system would conceal an analog video signal\'s watermark
    against detection by a watermark detector.</p>
</div>
<div id=ftn9>
  <p class=MsoFootnoteText><a href="#_ftnref9" name="_ftn9" title="">[9]</a> SIGSALY was a hybrid analog and digital
    system. Its keying material was made up of pairs of recordings of random
    thermal noise on identical phonograph records. SIGSALY and other systems
    of its era demonstrate that modern digital hardware and digital computers
    are not necessary in order to scramble an analog signal usefully &ndash; and reversibly.</p>
</div>
<div id=ftn10>
  <p class=MsoFootnoteText><a href="#_ftnref10" name="_ftn10" title="">[10]</a> Ingemar Cox and Jean-Paul M. G. Linnartz, &quot;Some
    general methods for tampering with watermarks&quot;, 16 IEEE Journal on Selected
    Areas of Communications 583 (1998), available at</p>
  <p class=MsoFootnoteText>
<a href="http://www.neci.nj.nec.com/homepages/ingemar/papers/jsac98.pdf">http://www.neci.nj.nec.com/homepages/ingemar/papers/jsac98.pdf</a>.
    See also Craver, &quot;What We Expect from Watermarking&quot;.</p>
</div>
<div id=ftn11>
  <p class=MsoFootnoteText><a href="#_ftnref11" name="_ftn11" title="">[11]</a> Cox and Linnartz, Section 6.4. The
    assumption that the result of this technique is necessarily an &quot;illegal
    copy&quot; is unwarranted.</p>
</div>
<div id=ftn12>
  <p class=MsoFootnoteText><a href="#_ftnref12" name="_ftn12" title="">[12]</a> Analysis Matrix submitted to Analog
    Reconversion Discussion Group by CGMS-A Plus RC proponents, answers to questions
    2.8 and 3.3.</p>
</div>
<div id=ftn13>
  <p class=MsoFootnoteText><a href="#_ftnref13" name="_ftn13" title="">[13]</a> Convention for the Amelioration of
    the Condition of the Wounded on the Field of Battle (Geneva, August 22, 1864),
    Art. 1. However, the Convention conditions the protection of hospitals on
    their use for a non-military purpose; their protection &quot;shall cease
    if the ambulances or hospitals should be held by a military force&quot;.</p>
</div>
<div id=ftn14>
  <p class=MsoFootnoteText><a href="#_ftnref14" name="_ftn14" title="">[14]</a> See 17 USC 1201(k) (requiring analog
    VCRs to &quot;conform&quot; to Macrovision technologies by refusing to record
    or &quot;exhibit[ing] a meaningfully distorted or degraded display&quot;).</p>
</div>
<div id=ftn15>
  <p class=MsoFootnoteText><a href="#_ftnref15" name="_ftn15" title="">[15]</a> Dwight Cavendish\'s presentation at
    the October ARDG meeting says its technology is &quot;Effective on legacy
    equipment [i]ncluding legacy capture cards&quot;. Its Analysis Matrix, in
    the answer to question 2.5, similarly asserts that the technology &quot;can
    [É] provide some control over legacy devices&quot;.</p>
</div>
</div>
';

global $REQUEST_URI;
$smarty->assign('content',$content);
$smarty->display('generic.tpl',$REQUEST_URI);

?>
