Post Copyright Reform

The Trouble with Twitch’s Mass Takedown

November 12, 2020

On October 20, Twitch streamers woke up to find that thousands of their videos had been permanently wiped from the platform, without advance notice. Not only did Twitch blatantly violate the law (leaving it vulnerable to user lawsuits), but the incident also shows how utterly the Digital Millennium Copyright Act fails at protecting individual speech online.

A Tale of Two 512s

When copyright policy folks talk about “section 512” of the DMCA, we’re generally referring to Section 512(c). Crucially, 512(c) deals with platforms’ legal exposure to rightsholder suits. It outlines a series of steps that, when followed properly, mean that rightsholders cannot successfully sue those platforms for copyright infringement committed by those platforms’ users. To avail itself of this (rightsholder-facing) safe harbor, a platform must, among other things, accept takedown notices[1], process those notices promptly[2], and remove or disable access to the referenced material[3].

But that isn’t the only safe harbor in Section 512 — because rightsholders aren’t the only parties that might suffer harm. The drafters of the law recognized that a user whose speech is removed under the DMCA may also have a legal (albeit non-copyright) claim of harm. Disappearing content may lead to breached contracts, loss of revenue for creators on the platform, or any number of other (generally state law) damages. In order for a platform to avoid liability on that front, it must follow a second set of procedures whenever it removes content under the DMCA. It must accept counter-notices[4], and, when it receives a counter-notice, restore the content within 10 to 14 business days[5].

In short, the DMCA provides platforms with not one, but two liability shields. So let’s take a look at what Twitch did, and see if you can spot all the ways Twitch botched its response along the way.

What Happened?

In late spring, Twitch began receiving a large volume of DMCA takedown notices, some targeting content that had been up for months or even years. Rather than acting on these notices, Twitch sat on them for several months — possibly in an attempt to negotiate with the Recording Industry Association of America and with the International Federation of the Phonographic Industry (the RIAA’s international counterpart). We’re not sure what happened in the interim, but in mid-October, Twitch decided to act on several months of accumulated claims simultaneously.

On the morning of October 20, thousands of Twitch streamers woke up to emails informing them that content had been removed from their channels. The emails were vague, saying only that (a) the user’s channel had received a DMCA notice, and (b) Twitch had deleted the flagged content. Conspicuously missing from the email were indications of what had been flagged, when it had been flagged, what the content allegedly infringed upon, or any mention of a counter-notice procedure.

Twitch, it turns out, had gone through and permanently deleted every clip against which a claim had been filed. Those videos, according to Twitch’s official statements, have been permanently wiped from the platform’s servers, and cannot be replaced. They are gone forever. Because of this, there’s no way for users to file a counter-notice and have their content restored, as required by 512(g). Rather than actually complying with the law as written, Twitch instead informed streamers that works deleted during the mass purge would, generously, not count toward punishments under Twitch’s repeat infringer policy.

Streamers lost months, years, and sometimes entire channels’ worth of content that they created, in violation of the law; in exchange for this loss, they “generously” got a lecture about how they were getting off easy, and a warning not to do it again.

So What Protects Consumers Anyway?

In short: a few sentences of statute that platforms apparently think (or are being pressured to think) they can ignore at will.

The DMCA, as it’s written, has only thin procedural protections for users who are subject to a takedown notice. Those protections are embodied in 512(g) — and the reality is that a platform feels far more threatened by a copyright infringement lawsuit, with statutory damages of up to $150,000 per work infringed, than it does by the risk of a lawsuit filed by a few users who have to prove actual financial damage. Add in the problem of abusive notices, and you have a recipe for mass-scale user disenfranchisement. Twitch may not have had procedures in place to deal with this volume of notices, but Twitch’s users and creators shouldn’t be the ones to pay the price.

For all its flaws, the DMCA recognized that platforms are an intermediary standing between two separate but legally cognizable interests — users and copyright holders. Courts and content industry lobbying have watered down user protections for years. If we’re going to move forward at all, we need to give users a meaningful recourse to fight back against abuse — and to hold platforms accountable for throwing their user base under the bus.

[1] § 512(c)(1)(A)(iii)

[2] § 512(c)(1)(C)

[3] § 512(c)(1)(C)

[4] § 512(g)(2)(B)

[5] § 512(g)(2)(C)


Image credit: Wikimedia Commons


About Meredith Filak Rose

As Senior Policy Counsel, Meredith focuses on copyright, DMCA, intellectual property reform, and governance issues, as well as telecommunications regulatory matters. Prior to working at Public Knowledge, Meredith worked on consumer policy issues at the Federal Communications Commission, the Trans-Atlantic Consumer Dialogue, and Knowledge Ecology International. Meredith received her J.D. and A.B. from the University of Chicago. When not in the office, she’s an avid video gamer and desert hiker.