Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs those decisions involve. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Facebook Removes A Picture Of A Famous Danish Mermaid Statue (2016)

from the the-nudity-issue dept

Summary: For over a century, Edvard Eriksen's bronze statue depicting The Little Mermaid as she becomes human has stood on a rock along the water in Copenhagen, Denmark. The statue was designed to represent the Hans Christian Andersen fairy tale and has become a tourist attraction and landmark.


Image used here under CC BY-SA 3.0 License, taken by Avda-Berlin.

In 2016, Danish politician Mette Gjerskov used Facebook to post a link to her own blog post on the TV2 website, which included an image of the statue. Facebook automatically displayed the image alongside the link, and the company then took the link down. The explanation Facebook provided was that the image had “too much bare skin or sexual undertones.”

Gjerskov highlighted the absurdity of the situation, calling it “ridiculous” that the image caused Facebook's moderators to block the link. Many people appeared to agree, and as the story began to get more attention, Facebook quickly backtracked and admitted the removal was in error. It restored the link.

Many of the news reports on the story concluded with Facebook's reversal, but the image never actually returned to Facebook. Under Danish copyright law, the statue is still protected by copyright (until 2029, 70 years after Eriksen's death), and his estate has been fairly aggressive in demanding licensing and royalty payments. Because of that, TV2, which hosted Gjerskov's blog, chose to remove the image that triggered the takedown in the first place, not to appease Facebook's moderation but to avoid a copyright dispute with the Eriksen estate, even though copyright in the statue itself is distinct from copyright in photographs of the statue.

Decisions to be made by Facebook:

  • How do you write rules regarding nudity that take into account art or cultural landmarks?

  • Is taking down an entire link because of an image automatically embedded via the Open Graph protocol the best solution? Would it make more sense to remove just that image while leaving the link, or to show a different image? (A sketch of how such preview images are typically discovered follows this list.)
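
Facebook has not published the exact pipeline involved in this case, but link previews of this kind are generally built by crawling the linked page and reading its Open Graph og:image meta tag. The following is a minimal, hypothetical Python sketch (standard library only; the page HTML and image URL are invented for illustration) of how a preview crawler might discover the image that a nudity classifier or human reviewer then sees attached to a post, even though the poster never uploaded it directly.

```python
# Hypothetical sketch: how a link-preview crawler might find the Open Graph
# image declared by a linked page. This is not Facebook's actual implementation.
from html.parser import HTMLParser
from typing import Optional


class OGImageParser(HTMLParser):
    """Collects the content of every <meta property="og:image" ...> tag."""

    def __init__(self) -> None:
        super().__init__()
        self.og_images: list[str] = []

    def handle_starttag(self, tag, attrs) -> None:
        if tag != "meta":
            return
        attr_map = dict(attrs)
        if attr_map.get("property") == "og:image" and attr_map.get("content"):
            self.og_images.append(attr_map["content"])


def extract_og_image(html_text: str) -> Optional[str]:
    """Return the first og:image URL declared in the page, if any."""
    parser = OGImageParser()
    parser.feed(html_text)
    return parser.og_images[0] if parser.og_images else None


# Invented example page resembling a blog post that embeds a photo of the statue.
sample_html = """
<html><head>
  <meta property="og:title" content="Blog post mentioning The Little Mermaid" />
  <meta property="og:image" content="https://example.com/mermaid-statue.jpg" />
</head><body>Post text...</body></html>
"""

if __name__ == "__main__":
    # The crawler would attach this image to the shared link's preview card;
    # any image-based moderation then runs on a file the poster never chose.
    print(extract_og_image(sample_html))  # -> https://example.com/mermaid-statue.jpg
```

A system built this way faces exactly the choice raised above: once an embedded preview image trips a nudity rule, it can drop only that image, substitute another image declared by the page, or, as happened here, take down the whole shared link.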

Questions and policy implications to consider:

  • The line between artistic works that depict nudity and works that violate a nudity policy or include “sexual undertones” is often a very subjective judgment call. How can companies craft rules that are enforced consistently across a diverse set of moderators, often with different cultural backgrounds and experiences?

  • A strict policy against nudity is likely to capture many artistic works. Is that a reasonable trade-off for websites that seek to be family-friendly?

  • How does copyright law intersect with other kinds of content moderation challenges?

Resolution: As noted in the case study, the link was restored after Facebook admitted its error, but TV2 took the image off its website (and, thus, out of the link preview on Facebook) over copyright concerns. Facebook's policies already allow many forms of artistic nudity, but mistaken removals for nudity still appear common, given the huge number of review decisions made every day. The statue has continued to be a cultural landmark in Denmark and is often used for making political statements, leading to more photographs of it being shared. Recently it was vandalized both to promote democracy in Hong Kong and to protest racism.

Originally posted to the Trust & Safety Foundation website

Companies: facebook


Comments on “Content Moderation Case Study: Facebook Removes A Picture Of A Famous Danish Mermaid Statue (2016)”

Samuel Abram (profile) says:

Artistic Nudity versus pornography.

I realize Content Moderation at Scale is Impossible™, and that’s why there are these moderation snafus listed. However, I just find it interesting that Nintendo has the fully nude renaissance masterpieces (as well as their fun counterfeits) in their Animal Crossing: New Horizons game and Nintendo, as you know, is a family-friendly company, so I think they could clearly see that The Birth of Venus by Sandro Botticelli and David by Michelangelo are nude in an artistic fashion and not in a pornographic one.

That being said, if David were fucking Botticelli’s Venus doggystyle, then we’d know what’s up. Considering rule 34, it’s probably on the internet.

