While it's no secret that the DMCA has had a noticeable chilling effect on a number of different classes of innovators during the last decade
, it's still disheartening to hear of creative uses of content that have been squelched by big copyright holders. Earlier this month, on the film blog The House Next Door
, film critic Matt Zoller Seitz wrote of fellow House Next Door contributor Kevin B. Lee's recent tussle with YouTube
. Lee had been posting video essays on YouTube that offered critical assessments of Hollywood films, both recent and classic. As part of his essays, Lee often included clips, of varying length, from the films that he discussed. Over the years, Lee had occasionally received DMCA takedown notices via YouTube and, not knowing any better, had chosen not to contest them. On January 12th, however, he received his third and final notice and, in accordance with YouTube's "three strikes" policy, his account was locked and all 140 of his video essays were made instantly unavailable.
While it's too late to view Lee's video essays, based on the information available about them online, we can reasonably surmise that they would have qualified as legal under the doctrine of fair use
. Generally speaking, if a video offers a critique of an existing work, it may incorporate clips from that work; in fact, the law specifically mentions criticism and comment as protected uses. In practical terms, this usually means that as long as the new work offers some comment on the original source and does not use clips of excessive length, it is generally considered a fair use.
So, if Lee's videos seem to be legal under copyright law, why were they taken down from YouTube? While it's not entirely clear, the assumption is that the DMCA takedown notice that YouTube received was generated by an automated system, not a human. Techniques like digital watermarking
and digital fingerprinting
can identify even short video clips online and are widely used. With such systems, copyright holders can automatically identify uses of their content on the web and send DMCA takedown notices to the sites hosting that content. Whether or not the use in question is legal is never considered by the computer filing the notice--the burden of proof falls on the user, and only if he both understands that he can file a counter-notice and chooses to do so. Guilty until proven innocent, if you will.
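Production fingerprinting systems hash perceptual features of the audio and video themselves, but the core matching idea can be illustrated with a toy sketch. The example below (all names and the frames-as-strings representation are hypothetical simplifications, not any vendor's actual algorithm) hashes overlapping windows of "frames" and flags a clip when enough of its windows appear in a reference film's index:

```python
import hashlib

def fingerprint(frames, window=5):
    """Hash every overlapping window of frames to build a fingerprint set."""
    prints = set()
    for i in range(len(frames) - window + 1):
        digest = hashlib.sha256("".join(frames[i:i + window]).encode()).hexdigest()
        prints.add(digest)
    return prints

def likely_match(clip, reference, window=5, threshold=0.5):
    """Flag the clip if enough of its windows appear in the reference index."""
    clip_prints = fingerprint(clip, window)
    ref_prints = fingerprint(reference, window)
    if not clip_prints:
        return False
    overlap = len(clip_prints & ref_prints) / len(clip_prints)
    return overlap >= threshold

# A short excerpt lifted from the "film" is flagged; unrelated footage is not.
film = [f"frame{i}" for i in range(100)]
excerpt = film[40:60]                        # 20 "frames" taken from the film
unrelated = [f"other{i}" for i in range(20)]
print(likely_match(excerpt, film))    # True
print(likely_match(unrelated, film))  # False
```

Note what the sketch makes obvious: the matcher answers only "does this clip come from our film?"--it has no notion of how the clip is being used, which is precisely why fair-use criticism gets swept up alongside outright piracy.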
In the event that a user files a counter-notice--essentially a claim that the work in question was, in fact, legal--the copyright holder who filed the initial claim has 10-14 business days to take legal action against the alleged infringer. If the copyright holder does not do so, the work must be restored within 10-14 business days of receipt of the counter-notice.
So, as we've seen, a studio can use an automated system to send takedown notices demanding that any uses of its content be removed--whether or not those uses are legal. Worst case, the content disappears for 10-14 days and then reappears, provided the alleged infringer has filed a counter-notice. Only at that point would a human at the studio be likely to review the content in question and decide whether it's worth pursuing some sort of legal action.
However, if the user in question does not fully understand the notice-and-takedown system--and it's fairly safe to assume that the average user does not--a counter-notice might never be filed and the content, legal or not, may never be restored. This effectively provides rights holders with a mechanism whereby they can have content removed from the web on a mere suggestion that it violates copyright.
In theory, however, behaving in such a manner could land a copyright owner in hot water. Section 512(f) of the DMCA states
that anyone who "knowingly materially misrepresents" that "material or activity is infringing" will be "liable for any damages, including costs and attorneys' fees, incurred by the alleged infringer". This raises a question that has yet to be answered: whether the use of such automated notification systems is (or should be) legal, given that a computer is hardly qualified to determine whether a work constitutes fair use.
Legal questions aside, the real irony here lies in the fact that fair use remixes--like Lee's--only serve to increase the public's interest in a work and likely have a positive
impact on sales. Take the recent example of famed comedy troupe Monty Python, who saw a 23,000% increase in DVD sales when they made many of their sketches freely available for viewing on YouTube
. That tired old trope, which states that making content available online in any way, shape or form hurts sales, only becomes less true with every passing year. It's high time that Hollywood abandon such antiquated arguments and embrace the myriad promotional opportunities that user-generated content offers.
And what of Kevin B. Lee, the would-be film critic? After his story received a fair amount of media attention, he was contacted by YouTube with instructions on how to file a counter-notice, even though his account had been disabled. His account has since been reinstated
, though his most recent video essay is still "under review" by the site and as such, all of his videos remain unavailable. In the wake of his experience, Lee posted a list of lessons that he learned through this process
, one that reads like a set of best practices for online film critics. It's unfortunate that, thanks to the DMCA, critics must heed warnings like "Don’t Take Services for Granted and Be Prepared to Defend Yourself" but, at the moment, that's just the way it is.