Techdirt’s think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs involved in those decisions. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: NASA Footage Taken Down By YouTube Moderation (2012)

from the moderating-the-public-domain dept

Summary: NASA’s historic landing of a mobile rover on the surface of Mars created many newsworthy moments. Unfortunately, it also generated some embarrassing takedowns of NASA’s own footage by YouTube’s copyright flagging system, ContentID.

NASA uploaded its own footage of the landing — footage that is in the public domain, since U.S. copyright law says that works created by the U.S. government cannot be copyrighted and can be used by anyone without restriction. Unfortunately, the use of this public domain footage in news broadcasts created by Scripps (owner of multiple TV stations, many of them ABC affiliates) set a chain of events in motion that ultimately saw NASA’s footage taken down by YouTube.

The problem was YouTube’s ContentID. Once Scripps uploaded its broadcast, ContentID began scanning the site for matches. ContentID is built on the assumption that if content is uploaded as a reference by a private entity, that entity holds a legitimate copyright over it — even if the content is in the public domain, freely shareable by all. Shortly thereafter, NASA’s footage began disappearing while Scripps’ broadcasts utilizing the same footage stayed live.
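To make the failure mode concrete, here is a minimal, hypothetical sketch of a ContentID-style matcher that treats every reference upload as authoritatively owned. This is not YouTube’s actual code; the names and data structures are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Reference:
    claimant: str     # whoever uploaded the reference file
    fingerprint: str  # simplified stand-in for an audio/video fingerprint

# Hypothetical reference database: Scripps' broadcast, which contains
# NASA's public domain landing footage, is registered as a reference.
references = [Reference(claimant="Scripps Local News", fingerprint="mars-landing")]

def check_upload(uploader: str, fingerprint: str) -> str:
    """Naive ContentID-style check: any match against a reference is
    treated as infringement, because the system assumes the claimant
    owns whatever it registered as a reference."""
    for ref in references:
        if ref.fingerprint == fingerprint and ref.claimant != uploader:
            # No public domain check happens here; this gap is what let
            # NASA's own upload get flagged.
            return f"takedown: matches reference claimed by {ref.claimant}"
    return "ok"

print(check_upload("NASA", "mars-landing"))                # takedown, despite PD status
print(check_upload("Scripps Local News", "mars-landing"))  # ok
```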

This unfortunate outcome is common enough that NASA reps are used to having to contact YouTube to get their own public domain uploads restored.

Decisions to be made by YouTube:

  • Should this collateral damage to footage uploaded by government agencies continue to be an acceptable outcome of automated moderation?

  • Should YouTube consider a “three strikes” system that punishes false copyright claimants, rather than just the users wrongly targeted by ContentID matches?

  • Should YouTube require more input and detail from copyright claimants in order to head off errors like these?

  • How should ContentID deal with material that is in the public domain?

Questions and policy implications to consider:

  • Does erring on the side of copyright holders (even when the rightsholders are wrong) decrease the chance of YouTube being involved in copyright litigation?

  • Is it possible to backstop newsworthy events involving government agencies (and their recordings) with more human moderators? Or is it impossible to decide what’s newsworthy or not given the immense amount of footage uploaded to the site every day?

  • Is overblocking — including public domain works — a reasonable price to pay to avoid infringement?

Resolution: While these sorts of mistaken takedowns have become less frequent, they still occur. The most recent removal of NASA footage from YouTube was the result of a (likely automated) copyright claim by National Geographic, which had incorporated NASA’s recordings into its own production. Given YouTube’s dominant position in the video marketplace — which sees it processing hundreds of hours of uploads every minute — it seems unlikely a solution that pleases every competing stakeholder will ever arrive.

Originally published to the Trust & Safety Foundation website.



Comments on “Content Moderation Case Study: NASA Footage Taken Down By YouTube Moderation (2012)”

Anonymous Coward says:

No it won’t, certainly not while YouTube continues to rely on chronically flawed bots rather than actually trying to sort the problem out. However, the law change that’s desperately needed is one that seriously penalises false takedowns by copyright frauds! While there’s no punishment for doing this, it’s just gonna keep happening, taking control of the net ever closer to the entertainment industries!

n00bdragon (profile) says:

Re: Re: Re: Re:

It’s not impossible for the NASA YouTube channel to post copyright-infringing material. They could upload full Disney movies to their channel if they wanted and that wouldn’t make them public domain. Assuming NASA is "less likely" to commit copyright infringement than anyone else on the site is treating them in a privileged manner. Everyone should be treated fairly, not just the big important names that can get an article written about them.

Anonymous Coward says:

Re: Re: Re:2 Re:

By that definition, ContentID/YouTube by its very nature already treats some users as privileged. All I suggested was adding NASA to that list, just in the other direction. Instead of saying "if anyone is reposting this video, it is infringing Scripps Local News’ copyright," it would be "if anyone is reposting this video, it’s OK because it is likely public domain." Not sure what the butthurt is over that.

If NASA starts posting Disney videos then we could reassess and Disney can send YouTube an official DMCA notice.

This comment has been deemed insightful by the community.
Tin-Foil-Hat says:

Self reporting

Perhaps private entities should have a way of self-reporting public domain footage incorporated in their videos, with a penalty for noncompliance when a creator’s video is removed in error. Using time ranges, ContentID could then exclude anything within the declared area (a rough sketch of the idea is below).
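A minimal sketch of that time-range exclusion, assuming a hypothetical setup where claimants declare public domain spans within their reference files (all names here are invented for illustration):

```python
# Hypothetical sketch: a claimant self-reports which spans of its
# reference file are public domain, and matches falling wholly inside
# those spans are ignored rather than flagged.

# Scripps-style reference: the NASA footage occupies seconds 10-70 of
# the broadcast and is self-reported as public domain.
pd_ranges = [(10, 70)]

def should_flag(match_start: float, match_end: float) -> bool:
    """Return False for any match that falls entirely within a
    self-reported public domain range; flag everything else."""
    for pd_start, pd_end in pd_ranges:
        if pd_start <= match_start and match_end <= pd_end:
            return False  # match is wholly within declared PD footage
    return True

print(should_flag(15, 60))  # False: inside the declared PD range
print(should_flag(0, 80))   # True: extends beyond the PD range
```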

Content creators should be able to ban problem users from incorporating the footage.

GHB (profile) says:

infringement or not, that is itself based on context

While it is easy to spot infringement when content gets reuploaded wholesale (like unedited material), a match alone does not mean infringement. Heck, not all copying is illegal. I found one solution to this question:

How should ContentID deal with material that is in the public domain?

Have a whitelist database (a rough sketch in code follows the footnote):
(1) A user submits something believed to be in the public domain (call it Is_this_PD?) to this database.
(2) Moderators check whether Is_this_PD is actually public domain. They may ask the user for additional information or evidence to verify that it is PD.
(3) If it passes (PD confirmed), it gets added to the whitelist database.
(4) When any new video (or any existing video on the next “content ID re-scan”[1]) uses footage that matches the PD database, the matching portion should be exempt from ContentID blocking, monetization claims, or any other action against the video, its uploader, or its viewers.

[1] According to the EFF, videos can get flagged during or after upload. So even after a video is uploaded successfully, the system can be updated down the line (through algorithm changes or new reference files from rightsholders) and “re-scan” the video.
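As a rough illustration of steps (1) through (4), here is a hypothetical sketch of the whitelist lookup. The names and structures are invented for illustration, not an actual YouTube interface:

```python
from enum import Enum

class PDStatus(Enum):
    PENDING = "pending"      # step (1): submitted, awaiting review
    CONFIRMED = "confirmed"  # step (3): moderators verified PD status
    REJECTED = "rejected"    # step (2): review failed

# Hypothetical whitelist database mapping fingerprints to review status.
pd_whitelist = {"nasa-mars-landing": PDStatus.CONFIRMED}

def filter_matches(matches):
    """Step (4): drop any ContentID match whose fingerprint is confirmed
    public domain, so it cannot trigger blocks or monetization claims."""
    return [m for m in matches if pd_whitelist.get(m) != PDStatus.CONFIRMED]

# A re-scan finds two matches; only the non-PD one survives filtering.
print(filter_matches(["nasa-mars-landing", "disney-movie"]))  # ['disney-movie']
```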

ECA (profile) says:

Really?

"Should YouTube consider a "three strikes" system that punishes false copyright claims, rather than just those wrongly targeted by ContentID matches? "

HOW about 1000?
How about 10,000?
Just NAME a number and stick to it.
The Odds are that Most False claims from the corps will hit it in 1 year or less.

Billions of notices every year, and what are the odds that ALL of the major agencies will hit 10,000 false claims?
AND let’s not just count YT. Google is big enough to see a very large share of the false hits. Count them.

DannyB (profile) says:

Something YouTube should require

YouTube should require identification of two parties: (1) the copyright owner, (2) the party who is requesting the takedown.

Those parties should be disclosed to whoever posted the YT video which is being taken down.

It is only fair that the person being falsely accused of copyright infringement be able to identify (and possibly sue) whoever is falsely accusing them.
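A sketch of what that requirement might look like as a record attached to each takedown, using hypothetical field names rather than any actual YouTube API:

```python
from dataclasses import dataclass

@dataclass
class TakedownRequest:
    copyright_owner: str   # party claiming ownership of the work
    requesting_party: str  # party actually filing the takedown
    video_id: str

    def disclosure(self) -> str:
        """Text shown to the uploader whose video was taken down, so a
        falsely accused user knows exactly whom to dispute (or sue)."""
        return (f"Video {self.video_id} was taken down at the request of "
                f"{self.requesting_party} on behalf of {self.copyright_owner}.")

req = TakedownRequest("National Geographic", "NatGeo rights agent", "mars-2012")
print(req.disclosure())
```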

catsmoke (profile) says:

A reasonable price

Is overblocking — including public domain works — a reasonable price to pay to avoid infringement?

Don’t we already have an answer to this question, in the realm of justice in public policy?

“It is better for one hundred guilty people to go free, than for one innocent person to be punished.”

Perhaps I fail to ken fine details of the intersection between a moral and a financial “price to pay,” but one of those two nodes dwarfs the other unto insignificance.
