Packingham and the Public Forum Doctrine: Implications for Copyright
June 22, 2017
The Supreme Court's recent decision in Packingham v. North Carolina struck down, as unconstitutional under the First Amendment, a state law making it a felony for registered sex offenders to access social media websites. The decision has wide-ranging potential implications for technology law, especially on matters of rights to access the internet, which are particularly important for marginalized and disenfranchised voices in our society. Below, Harold Feld reviews the Packingham decision and explores its implications for one area of law: the Digital Millennium Copyright Act's provisions regarding termination of internet access for accused copyright infringers. This post was originally published on Harold's personal blog, “Tales of the Sausage Factory,” on wetmachine.com.
On Monday, June 19, the Supreme Court issued two significant First Amendment decisions. Most of the press attention went to Matal v. Tam, a.k.a. the “Slants” case. But the far more significant case for my little neck of the woods was Packingham v. North Carolina. Because Packingham focused on criminal law, and did not have anything to do with the Washington Redskins keeping or changing their name, it garnered relatively little attention. But Packingham has much more importance for the future of the First Amendment online by recognizing the primary First Amendment right of subscribers to access broadband platforms and content. Indeed, Justice Kennedy’s paean to the internet as the modern public square echoes themes from the more “Madisonian” view of the First Amendment expounded by scholars such as Cass Sunstein (and prompted alarm from Alito, Roberts and Thomas in concurrence).
This has significant implications for all the crap the Digital Millennium Copyright Act (DMCA) has done to make it easy to kick users offline (and the whole future of “graduated response”/”3 strikes”) and the existing and fairly abusive notice and takedown regime (and efforts to extend it further). It may also have significant implications for the First Amendment argument over broadband, net neutrality, and the future of regulation of online platforms such as Facebook, but I will save that for Part II.
I unpack all this below…
So What Was the Packingham Case?
Packingham involved a test of North Carolina’s rather draconian law preventing anyone on the sex offender registry from ever using any kind of “social media.” The definition of a “social media website” was, as a number of folks (including my employer Public Knowledge) pointed out, so ridiculously broad that it included websites like the Washington Post.
In addition to the law being extremely broad, the facts of the case were extremely bad for enforcement. In 2002, at age 21, Lester Packingham had sex with a 13 year old girl, pled guilty, and did his time. He remained on the sex offender registration list and was therefore subject to the law. One day, Packingham persuaded a judge to dismiss a parking ticket. Packingham posted about his victory on Facebook. As luck would have it for Packingham, the North Carolina police had decided to crack down on registered sex offenders using social media. Packingham was arrested and convicted for having a Facebook account and posting about his parking ticket.
Importantly for the case, there was never any evidence that Packingham had ever again sought sex with a minor or had in any way used any internet services for any kind of activity related to sex with minors. The sole question presented was whether his 2002 conviction and subsequent registration on the NC sex offender list (which lasts for 30 years following release from prison) could make it illegal for him to access Facebook (or any other social media site, as defined by the NC statute). Period. This made isolating the First Amendment analysis a lot easier.
What Did the Supreme Court Say?
Kennedy wrote the opinion for a majority of five (as noted below, Alito, Roberts and Thomas declined to join the majority opinion, although they concurred in the judgment). In doing so, Kennedy had the chance to slip in a lot of his basic support for treating electronic access and common carriage under the public forum doctrine.
Hold up — What’s the Public Forum Doctrine?
The Public Forum Doctrine is a form of First Amendment analysis that puts great emphasis on protecting ways in which the public has traditionally gathered together to exchange ideas and information. If something is traditionally a public forum, even if it is privately held, it turns out to be hard to restrict people from continuing to use it as a public forum. States and the federal government can take steps to protect the rights of the public to use traditional public fora under the First Amendment (see, e.g., PruneYard Shopping Center v. Robins; U.S. v. Kokinda). Kennedy has always been very enthusiastic about the idea of the public forum, and has supported the idea that the “public forum” is not limited to something that traditionally has served the purpose of a public forum “since time out of mind” (like a public park), but that the concept evolves and expands with society as a whole.
In electronic media, the basic concept of the “public forum” doctrine shows up in its most expansive form in Red Lion Broadcasting Co. v. FCC, which is technically not in the least a public forum case but a “scarcity” case, and under Arkansas Educational Television Commission v. Forbes broadcast TV is mostly not a public forum anyway. (This is why First Amendment scholars hate media law — it’s kind of a mess.) Kennedy has written two very important decisions with respect to the vital importance of civic engagement in electronic media, Turner Broadcasting System v. FCC, 512 U.S. 622 (1994) (aka Turner I) and again in 520 U.S. 180 (1997) (Turner II). In addition, Kennedy wrote a very expansive theory of public forum doctrine under which, once something has been a common carrier, it becomes a public forum and it is a violation of the First Amendment to remove its common carrier status. See Denver Area Educational Telecommunications Consortium v. FCC (Kennedy, J., concurring in part and dissenting in part).
OK, Back to Packingham.
Kennedy jumps directly to public forum doctrine and the right of citizens under the First Amendment to access and participate in the public forum. To quote:
“A fundamental principle of the First Amendment is that all persons have access to places where they can speak and listen, and then, after reflection, speak and listen once more. The Court has sought to protect the right to speak in this spatial context. A basic rule, for example, is that a street or a park is a quintessential forum for the exercise of First Amendment rights. . . . While in the past there may have been difficulty in identifying the most important places (in a spatial sense) for the exchange of views, today the answer is clear. It is cyberspace—the ‘vast democratic forums of the Internet’ in general, Reno v. American Civil Liberties Union, 521 U. S. 844, 868 (1997), and social media in particular.”
With this as the starting point, Kennedy has no trouble striking down the NC law as unconstitutional interference with the ability of individuals to access and speak online via social media.
“In sum, to foreclose access to social media altogether is to prevent the user from engaging in the legitimate exercise of First Amendment rights. It is unsettling to suggest that only a limited set of websites can be used even by persons who have completed their sentences. Even convicted criminals—and in some instances especially convicted criminals—might receive legitimate benefits from these means for access to the world of ideas, in particular if they seek to reform and to pursue lawful and rewarding lives.”
The Court rejected the idea that the state could justify such a sweeping ban as necessary to prevent previously convicted sex offenders on the registry from using the internet to facilitate sexually assaulting minors. “For centuries now, inventions heralded as advances in human progress have been exploited by the criminal mind. New technologies, all too soon, can become instruments used to commit serious crimes. The railroad is one example, see M. Crichton, The Great Train Robbery, p. xv (1975), and the telephone another, see 18 U. S. C. §1343. So it will be with the Internet and social media.” While states could pass more narrowly tailored laws to prevent specific “conduct that often presages a sexual crime, like contacting a minor or using a website to gather information about a minor,” nothing justified a total ban on social media simply because a prior sex offender could use social media to commit another sexual assault on a minor.
A Somewhat Less Enthusiastic Concurrence.
The concurrence by Alito (joined by Roberts and Thomas) expressed considerable misgivings about what they called the “undisciplined dicta” with regard to the scope of whether social networking sites are public forums. Nevertheless, they concurred in judgment because the definition under the NC law was so broad that it included websites like the Washington Post, Amazon.com, and WebMD, and thus infringed on the First Amendment right of the subscriber to access speech and participate in speech online. “Placing this set of websites categorically off limits from registered sex offenders prohibits them from receiving or engaging in speech that the First Amendment protects and does not appreciably advance the State’s goal of protecting children from recidivist sex offenders,” concluded Alito.
So what does it all mean from my perspective? As always, it’s kind of hard to tell when a case comes down. But even the concurrence unambiguously recognized the right of subscribers to “receiv[e] or engag[e] in speech that the First Amendment protects” online. This has implications for places where the government has either compelled providers of these online fora to terminate access entirely, or proposes to give the providers of these fora an unlimited right to discriminate on the basis of viewpoint or economic incentive.
Implications for Copyright.
Lobbyists for Hollywood have made it a fundamental aspect of their advocacy that any technology or platform capable of making infringing copies is a Tool of Satan to which the untrusted hordes should have no access whatsoever. When access is grudgingly granted, it must only be under constant threat of expulsion from the internet as a whole. If you think I am exaggerating, I direct your attention to the BMG v. Cox case. In that case, copyright holders convinced a district court judge that because Cox refused to suspend the subscriptions of “repeat offenders” (read — people who were accused by a bot of downloading or posting infringing content but had not actually been found guilty of infringement of copyright by any court), Cox had forfeited its protection against secondary liability. As a result, by “enabling” subscribers (assuming the accusations against the subscribers were accurate) to infringe, Cox was liable for damages of $25 million. Again, that’s without actually proving that Cox’s subscribers actually did anything wrong. Cox was liable because the judge found its responsiveness to bot-generated takedown notices insufficiently enthusiastic, even though Cox tried to do everything in its power to stop infringement short of terminating subscriptions.
The DMCA was passed back in 1998, before internet access — let alone access to social networking sites — was universally recognized as having particular social significance. I won’t get into the whole mess which is the DMCA here. You can find a decent bibliography of criticism of the DMCA here. Instead, I will give a short and highly opinionated version so I can discuss the implications of Packingham.
DMCA Makes Intermediaries Responsible for Policing Copyright.
The content providers pretty much wanted to make everyone else in the universe responsible for “stopping piracy” because that would make life much easier for them. They also, unsurprisingly for profit maximizing firms, had zero concern about potential collateral damage to anyone else. So the DMCA creates a system where providers of internet access (Internet Service Providers) and other intermediaries like social networks can insulate themselves from liability for “secondary infringement” or actual infringement by complying with a bunch of “safe harbor” provisions. See 17 U.S.C. 512. To get safe harbor treatment, the law requires two things.
1. Entities covered by the safe harbor agree to the notice and takedown process covered by 512(g). This requires a provider to prevent access to any material identified as potentially infringing, regardless of whether it actually infringes. In theory, there is supposed to be a way to “counter notice” and require restoration of access pending a court determination. In practice, because the law penalizes covered entities that don’t hop to it on takedown requests but has no penalty for ignoring counter notices, covered entities generally just take stuff down when asked with no requirement of proof.
2. Section 512(i) requires that a covered entity must have a policy that “provides for the termination in appropriate circumstances of subscribers and account holders of the service provider’s system or network who are repeat infringers.” Or, to translate into English, to avoid liability, there has to be a way for a copyright holder to kick you off the internet (or off a social network) if you are a “repeat infringer.”
Packingham potentially impacts this in two major ways. First, I believe it makes Section 512(i) unconstitutional — at least as applied to ISPs. Second, it breathes considerable new life into the First Amendment challenges to the entire notice and takedown scheme as it exists today. At a minimum, it should make efforts to extend the notice and takedown scheme even further, such as proposed “takedown and stay down” regimes, downright unconstitutional.
Section 512(i)(1)(A) Probably Unconstitutional WRT ISPs, Less Clear for Social Media Sites.
Granted, Hollywood lobbyists and their wholly owned subsidiaries in Congress are capable of arguing with a straight face that copyright infringement is actually worse than child molestation and therefore the government purpose is sufficiently compelling to override all First Amendment concerns. And some judges, like the district court judge in the BMG v. Cox decision, would probably agree. (Read his opinion here to see if you agree.) But I’m doubtful that the majority of appeals court judges will agree. Whether or not one treats the majority opinion’s public forum analysis of social networks as “dicta” (which is legalese for “stuff in an opinion I don’t like so I don’t consider binding”), all eight Supreme Court justices agreed that subscribers have a First Amendment right to access information and speak online, and that the government cannot prohibit a person from accessing content that has nothing to do with preventing repeat offenses — even when the repeat offense is child molestation, and the evidence arguably supported that child molesters were particularly prone to repetition.
Sorry, if molesting minors doesn’t justify permanently kicking you off the internet, downloading three advance copies of Transformers: The Last Knight shouldn’t either. Congress cannot require ISPs to terminate subscribers accused of downloading pirated material (which is what Section 512(i) amounts to) any more than it can criminalize accessing the internet after being accused of downloading pirated material. Nor do I expect Big Content to prevail by arguing that getting you thrown off your ISP isn’t blocking you from accessing the internet, because of all the amazing broadband options you have to replace your loss of service. While America boasted thousands of dial-up ISPs in 1998 when the DMCA was passed, most folks today are lucky to have a choice of two landline providers capable of providing the reliable, always on broadband of sufficient quality to allow them to engage in all their protected First Amendment online activity.
Whether Packingham makes Section 512(i)’s requirement that all social media sites and other “covered entities” have termination policies for “repeat infringers” unconstitutional is somewhat less clear. Taking the majority analysis as actual opinion rather than “undisciplined dicta,” the answer is clearly yes for major social network sites and platforms including — wait for it — YouTube. It’s kind of hard to argue that the largest online video platform, whose videos include some of the most important raw footage of critical events and which has become a central location for debate, doesn’t qualify as the kind of online public forum Kennedy described. Nor does it make much sense to say access to Facebook and Twitter is protected under the First Amendment while access to YouTube isn’t.
On the other hand, I’m not sure the same analysis applies to cloud storage or other services that don’t share the attributes of a general public forum. And, of course, websites or services that are set up expressly to facilitate the exchange of infringing material don’t qualify for safe harbor protection anyway, so the hypothetical Doctor Evil Sing Along Piracy Exchange is already subject to liability.
Additionally, there is a question of whether the government interest in any specific case is compelling or not. Arguably, there is an interpretation of “repeat infringer” that doesn’t facially violate the First Amendment with regard to blocking access to a public forum to prevent ongoing infringement by a recidivist pirate. But that requires some serious as-applied balancing, not the BMG v. Cox analysis, which treated Cox’s efforts to deal with infringers through means short of suspending accounts as evidence of bad faith.
Notice and Takedown Regime Questionable Under the First Amendment as Currently Applied.
The First Amendment right all eight Justices agreed on included both the First Amendment right to find protected speech and the right to actually speak. As always, there is no right to “speak” by infringing copyright. But the notice and takedown regime occurs before there is any determination that a particular use infringes. There is a fairly substantial record at this point that many notice and takedown requests come from faulty/careless bots or can be used strategically to shut down criticism or time sensitive speech. In theory, the statute allows those subject to a notice and takedown request to file a counter notice and get something back up. In reality, the system is weighted insanely heavily toward those filing notices.
The way the current system tilts heavily toward taking down and away from anything resembling a balance is on display in BMG v. Cox. There, BMG employed a third party — Rightscorp — to monitor for infringement. As is the custom, Rightscorp employs automated software to do matching for possible illegal downloads. When Rightscorp finds a suspected illegal download, it sends an automated notification to the ISP to pass along to the customer accused of infringing. Rightscorp also included in the notice an option for the accused subscriber to settle by paying $20. Cox decided that the huge number of automated, virtually identical messages threatening subscribers and offering to drop charges for an immediate cash settlement were scams (what, mass emails claiming I’m in some sort of trouble and offering to settle if I pay a small fee are a scam? I shall ask my friend the Nigerian prince to hunt these villains down!). The BMG court rejected this defense and took Cox’s refusal to honor the bot takedown requests as evidence of bad faith, warranting $25 million in damages with no finding of any actual infringement.
Post-Packingham, I would argue that some greater balance and genuine effort to protect the First Amendment rights of subscribers is required. True, courts have in the past rejected both facial First Amendment challenges to Section 512(g) and as-applied challenges. Also, unlike the mandatory cut-off in 512(i), notice and takedown is more narrowly tailored to address the actual harm of infringement. Nevertheless, courts have generally rejected efforts to use copyright to prevent publication (so-called “prior restraint”) and required rights holders to seek injunctive relief — accompanied by sufficient proof to support the grant of relief. Significantly, outside the context of the DMCA, the Copyright Act puts the initial burden on the rights holder to prove that an injunction is needed. The DMCA assumes the temporary injunction standard is met with no evidentiary proof. Even in the best case scenario under the plain language of the statute, a covered entity must take the potentially infringing speech down until it receives a counter notice. And, as chronicled in the sources I linked to above, the law does not penalize providers for failure to restore access to the work even when a counter notice is filed, leading many providers to simply ignore counter notices.
All of this raises troubling First Amendment questions in the face of Packingham’s clear and enthusiastic embrace of the user’s First Amendment right to speak via social media and the internet as a whole. Packingham did not, in theory, change any of the applicable law (which in theory always recognized a First Amendment right to speak). But what Packingham arguably did do is put a big fat thumb on the speaker/subscriber side of the First Amendment scale. This does not make notice and takedown under the DMCA unconstitutional on its face, but it does re-invigorate arguments around the First Amendment and the overall chilling effect of Section 512(g) as practiced.
I caution that calculating the impact of a Supreme Court decision’s broader implications — as opposed to the immediate outcome — is a chancy business. I certainly don’t expect lawyers and lobbyists for Big Content to suddenly undergo a conversion and acknowledge the critical importance of balancing preventing piracy with protecting the user’s First Amendment rights. To the contrary, if history is a guide, we can expect lots of sneering and ranting about evil socialists trying to destroy our purity of essence through infringement blah blah blah Google evil blah blah Facebook evil. We can also expect that those in Congress and in the Copyright Office who have trusted in the arguments of these lobbyists with unwavering faith will continue to do so.
But courts have a tendency to pay attention to what the Supreme Court says, even if they treat the opinion as “undisciplined dicta” rather than core holding. The U.S. Court of Appeals for the 4th Circuit has before it Cox’s appeal from the District Court decision in BMG v. Cox. Although briefing in that case is now complete (Public Knowledge filed an Amicus Brief in support of Cox), Cox can still file a supplementary letter raising the Packingham issues. But even if the 4th Circuit declines to consider the impacts of Packingham, I expect it to emerge as a major influence in the ongoing debate on the conflict between the First Amendment and the ever expansive efforts of Hollywood to honor the motto of the DMCA: “with great power comes no responsibilities.”
About Harold Feld
Harold Feld is Public Knowledge’s Senior Vice President and author of “The Case for the Digital Platform Act,” a guide to what government can do to preserve competition and empower individual users in the huge swath of our economy now referred to as “Big Tech.” Former FCC Chairman Tom Wheeler described this book as, “[...] a tour de force of the issues raised by the digital economy and internet capitalism.” For more than 20 years, Feld has practiced law at the intersection of technology, broadband, and media policy in both the private sector and in the public interest community. Feld has an undergraduate degree from Princeton University, a law degree from Boston University, and clerked for the D.C. Court of Appeals. Feld also writes “Tales of the Sausage Factory,” a progressive blog on media and telecom policy. In 2007, Illinois Senator Dick Durbin praised him and his blog for “[doing] a lot of great work helping people understand how FCC decisions affect people and communities on the ground.”