Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs that result. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Vimeo Sued For Encouraging Infringement Via 'Lipdubs' (2009)

from the inducement? dept

Summary: Vimeo is a video hosting site founded in 2004 as an offshoot of CollegeHumor. IAC acquired the company early on, and Vimeo positioned itself as a more creator-focused alternative to YouTube. In its early years, one common type of content on Vimeo was the so-called "lipdub": music videos, often made by groups of people, lip-syncing to popular songs.

Vimeo itself effectively launched this trend with its staff doing a lipdub of the Harvey Danger song "Flagpole Sitta." At the end of the video, you hear one Vimeo employee say, "We just made a million dollars, people!" And, indeed, the phenomenon helped establish Vimeo's place in the market. Soon there were many other lipdubs all over Vimeo.

However, lipdubs also caught the attention of the music industry, which noticed that the songs in these videos had not been licensed. In 2009, EMI subsidiary Capitol Records sued Vimeo for copyright infringement. Like many web services hosting user-generated content, Vimeo relied on the safe harbors of the Digital Millennium Copyright Act (DMCA) in the US, which hold that a service meeting certain conditions cannot be held liable for infringing works uploaded by its users.

Capitol Records argued that Vimeo was not protected under the DMCA for a variety of reasons, starting with the fact that it had uploaded its own lipdub, which effectively encouraged others to upload similar videos. The complaint noted that the DMCA only protects services from liability for user uploads, not uploads made by the company itself. It also claimed that because Vimeo employees had "liked" or commented on many other lipdubs, often speaking approvingly of the videos, the company had so-called "red flag knowledge" of the infringing content, which might also strip it of the DMCA's safe harbor protections.

Decisions to be made by Vimeo:

  • Should employees have created their own lipdub to kick off this kind of trend? If they did, should they have first licensed the music?
  • How should the company handle copyright-covered music?
  • Should employees like or comment on users' videos when those users are likely uploading music for which they do not hold the rights?
  • Should the company continue to rely on the DMCA or seek out licenses?
  • Should the company explore alternatives for moderation beyond just taking down videos based on notices?

Questions and policy implications to consider:

  • The songs included in lipdubs often ended up getting significant public recognition and renewed attention. Given that promotional value, should these videos be seen as beneficial rather than infringing?
  • Should a lipdub be considered fair use?

Resolution: The Capitol Records v. Vimeo lawsuit went on for many years, bouncing around the courts. In an initial ruling, the district court judge noted that while some videos were likely protected under the DMCA's safe harbors, many others were not (including those uploaded by the company itself).

On appeal, however, the Second Circuit found that the mere fact that some employees within a company had clearly seen the videos in question did not mean the company had "red flag knowledge," and that the DMCA's safe harbors still applied:

The mere fact that an employee of the service provider has viewed a video posted by a user (absent specific information regarding how much of the video the employee saw or the reason for which it was viewed), and that the video contains all or nearly all of a copyrighted song that is "recognizable," would be insufficient for many reasons to make infringement obvious to an ordinary reasonable person, who is not an expert in music or the law of copyright.

Originally posted to the Trust & Safety Foundation website.
