Commentary

Bad Facts Make Bad Law: The Perils of Filtering in a Post-Grokster World

By Marjorie Heins

Music-loving teenagers may have been disappointed by the Supreme Court's decision last year in MGM v. Grokster, condemning peer-to-peer file-sharing networks. But the full implications of that case are just beginning to be played out.

One of the most troubling such implications has to do with the deceptively simple "quick fix" of using filtering software to prevent access to copyrighted works. A decision from the federal district court last month, on remand from the Supreme Court in Grokster, dramatizes the filtering dilemma.

Grokster was the entertainment industry's challenge to the second generation of P2P technology, developed after first-generation Napster's demise. Napster had been found liable for vicarious and contributory copyright infringement because of its knowing facilitation of unlawful file-sharing by its users.

The industry argued that the second-generation networks were just as guilty as Napster. Grokster, StreamCast, and the other networks countered that they were different from Napster because they simply supplied a technology that could be used for both legal and illegal purposes; they didn't facilitate the sharing of copyrighted music or other digital files through centralized indexes.

Grokster and its co-defendants won in the lower courts. Because their technology was capable of substantial non-infringing uses, the courts said, they were protected by the Supreme Court's 1984 decision in Sony v. Universal City Studios. Sony held that the makers of video cassette recorders, a technology the industry had sought to outlaw, shouldn't be liable for unlawful uses of the VCR so long as the machine had substantial non-infringing uses, such as recording TV shows for later viewing. (The irony of Sony, of course, was that the same industry that wanted to stop the VCR later made a fortune renting and selling videos for use with just this technology.)

The lower courts in Grokster put aside, for later adjudication, questions about whether the defendants' conduct in their start-up days amounted to contributory or vicarious infringement. This conduct included eagerly courting Napster subscribers and selling advertising whose price increased according to the volume of (mostly illegal) files that were shared. Instead, the lower courts focused on the purely legal question of whether the networks should be held liable for a technology that was capable of substantial non-infringing uses, and the answer to that, under Sony, was "no."

The Supreme Court vehemently disagreed. Although reaffirming the Sony principle, the justices were, quite frankly, so offended by the defendants' behavior - as amply detailed in the briefs of the industry and its supporters - that they imported the concept of "inducement" from the common law into the law of contributory copyright infringement. Thus, they ruled that "distribution of a product can itself give rise to liability where evidence shows that the distributor intended and encouraged the product to be used to infringe."

The justices were so offended that they even held it against Grokster and StreamCast that they made no effort "to filter copyrighted material from users' downloads or otherwise impede the sharing of copyrighted files." This was the most chilling part of the Grokster decision. For filters are notoriously inaccurate, and are likely to block noncopyrighted works as well as copyrighted ones. In addition, filters cannot figure out whether a free-expression safeguard in the law, such as fair use, might apply to a copyrighted work. Fair use allows the reproduction and distribution of copyright-protected materials without the owner's permission, for such purposes as commentary, criticism, and scholarship.

Copyright filters, whether they operate by "acoustic fingerprint technology" or by identifying metadata, are likely to overblock. Metadata filtering blocks all files that have common words, whether or not they are copyrighted. Acoustic fingerprint technology can also make mistakes, and relies on what may be inaccurate information about what is copyright-protected in the first place. Neither of these mechanistic devices can take account of the nuanced, fact-specific balancing process that goes on in copyright cases to determine whether fair use applies.
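To make the overblocking problem concrete, here is a minimal sketch, in Python, of the kind of crude word-matching that metadata filtering involves. The file names and titles are invented, and no actual filtering product is claimed to work this way; the point is only to show the failure mode described above.

    import re

    # Hypothetical stand-in for a rights-holder's database of protected titles.
    COPYRIGHTED_TITLES = {"Yesterday", "Imagine", "Happy Birthday Variations"}

    def is_blocked(filename: str) -> bool:
        """Flag any file whose name shares a word with a protected title."""
        file_words = set(re.findall(r"[a-z]+", filename.lower()))
        return any(file_words & set(title.lower().split())
                   for title in COPYRIGHTED_TITLES)

    # A home rehearsal tape is blocked merely because its name contains
    # the everyday word "yesterday" (a false positive).
    print(is_blocked("my_band_rehearsal_yesterday.mp3"))  # True (overblocked)
    print(is_blocked("field_recording_rain.mp3"))         # False

Even this toy filter flags a user's own recording simply because its name contains a common word; nothing in the matching logic can ask, much less answer, the fair use question.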

The Supreme Court did say, in the now-famous footnote 12 of its Grokster decision, that "in the absence of other evidence of intent, a court would be unable to find contributory infringement liability merely based on a failure to take affirmative steps to prevent infringement." But what did this really mean? Either evidence is relevant or it is not. If filters are unreliable and block constitutionally protected communications, then failure to filter should not be relevant. And if the failure to filter becomes relevant once a copyright owner finds a scintilla of arguable evidence of intent to induce infringement (unearthed in discovery through internal emails, memos about engineering decisions, and the like), the only safe course will be for everyone to adopt filters, regardless of how inaccurate and overbroad they are, or how much damage they do to fair use and free speech.

These unfortunate implications have now been played out in the federal district court's September 2006 decision on remand in Grokster. The court granted "summary judgment" against StreamCast, the only remaining defendant from among the several P2P networks that the entertainment industry sued back in 2001. The court did so based on a broad - arguably overbroad - reading of the Supreme Court's Grokster decision, and in a way that is likely to force technology companies to install dubious filtering products in order to reduce their risk of liability for questionable things that some users do with their products.

Following the Supreme Court's cue, federal judge Stephen Wilson found ample evidence that StreamCast had induced copyright infringement by courting former Napster users, aiding them with downloads, and profiting through advertising sales. Added to these damning facts, said the judge, was StreamCast's "failure to prevent infringing use." Stretching the Supreme Court's reference to filtering, Judge Wilson said that "by implication," StreamCast "must at least make a good faith attempt to mitigate the massive infringement facilitated by its technology."

That is a big "implication." By using the word "must," Judge Wilson essentially imposed an affirmative requirement to filter. And he did so even as he acknowledged that metadata filtering blocks all files that share common words, whether copyrighted or not, and that there are significant issues of fact regarding acoustic fingerprinting as well.

These factual disputes clearly made the question of filtering inappropriate for summary judgment - a judicial shortcut that avoids trial only when no material facts are in dispute. Judge Wilson brushed aside the factual issues by saying: "Even if filtering technology does not work perfectly and contains negative side effects on usability, the fact that a defendant fails to make some effort to mitigate abusive use of its technology may still support an inference of intent to encourage infringement."

But this is a non sequitur. If the "effort to mitigate abusive use" consists of a technology that may shut down many legitimate and constitutionally protected uses of digital content, then failure to use it should not be held against software innovators.

Bad facts make bad law, and Judge Wilson, indignant at StreamCast's profiting from what has now been determined to be mostly illegal file-sharing, has created some very bad law on the issue of filtering.

One bad precedent, however, doesn't doom all further efforts to demonstrate the perils of filters. Researchers and advocates need to gather more information about the various products and the claims made by their manufacturers. Mechanical solutions may be an acceptable tradeoff when it comes to blocking spam and even more pernicious online perils such as viruses, but they are not an acceptable tradeoff when free speech, including fair use of copyrighted materials, hangs in the balance.

November 3, 2006

For background on the Grokster case, see "Two Defeats and a Silver Lining - The Grokster and Brand X Decisions."

For more on copyright and fair use, see "Will Fair Use Survive? Free Expression in the Age of Copyright Control," and the Fair Use Network Web site.

