Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they entail. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Facebook Removes Image Of Two Men Kissing (2011)

from the rights-and-responsibilities dept

Summary: In the spring of 2011, two men were on a first date at the John Snow pub in London. They were apparently thrown out of the pub after another patron complained that the two men were kissing each other in the corner. The story of the men being thrown out of the pub for kissing began to go viral on social media, followed by a plan for a protest at the pub in question. In a show of support for the protest, many people on social media posted images of two men kissing each other as well.

The Dangerous Minds Facebook page wrote about the protest and illustrated the post with a promotional image from the BBC of two male characters from the popular soap opera EastEnders kissing. The image had become well known in the UK a few years earlier, when the scene in question prompted complaints to the BBC; the broadcaster responded by noting that the relationship between the two men was treated no differently than the many relationships between a man and a woman depicted on the show.

Soon after this, Facebook removed the image, telling Dangerous Minds that it "violated Facebook's Statement of Rights and Responsibilities."

"Hello,

Content that you shared on Facebook has been removed because it violated Facebook's Statement of Rights and Responsibilities. Shares that contain nudity, or any kind of graphic or sexually suggestive content, are not permitted on Facebook.

This message serves as a warning. Additional violations may result in the termination of your account. Please read the Statement of Rights and Responsibilities carefully and refrain from posting abusive material in the future. Thanks in advance for your understanding and cooperation.

The Facebook Team"

Given that the image was posted to promote a protest over two men being thrown out of a pub for kissing, its removal sparked further anger, now directed at Facebook. Many pointed out that plenty of images of men and women kissing could be found all over Facebook, and those were not being removed.

Decisions to be made by Facebook:

  • Is a picture of two people kissing "graphic or sexually suggestive content" in violation of the rules?
  • Should the identities of the people within the images matter in making content moderation decisions?
  • Should the reason for the posting of an image (e.g., as part of a larger protest) matter in determining if the image violated the rules?
  • How well trained is the content moderation staff to recognize relationships that might differ from their own?

Questions and policy implications to consider:

  • Should Facebook be deciding which kinds of legal relationships are acceptable to display and which are not?
  • As seen with the complaints to the BBC, there remains a segment of the population that will vocally protest non-heterosexual relationships. Are there rules, policies, or training that can be put in place to prepare content moderation teams for the different types of reports they may receive regarding such content?

Resolution: As the story drew more mainstream press attention, Facebook quickly apologized, saying the image did not violate any policies and that the takedown was a mistake.

"The photo in question does not violate our Statement of Rights and Responsibilities and was removed in error. We apologize for the inconvenience."

No further explanation of how the image came to be taken down in the first place was given.

Originally posted to the Trust & Safety Foundation website.

Companies: facebook


Comments on “Content Moderation Case Study: Facebook Removes Image Of Two Men Kissing (2011)”

13 Comments
That Anonymous Coward (profile) says:

"The photo in question does not violate our Statement of Rights and Responsibilities and was removed in error. We apologize for the inconvenience"

It is not an inconvenience when they can nuke an entire account from orbit over an "oopsie".

The desire to protect the secret sauce of how these decisions are made & reviewed stands in the way of exploring how they managed to screw up in the first place.

How a kiss between 2 men can "contain nudity, or any kind of graphic or sexually suggestive content" I really don’t get.
It might upset some people but ummm fuck ’em?
The same people who scream about seeing 2 men kissing, really seem to enjoy 2 women kissing… but rarely admit that to their wives.

The fact there is no way to tell what kicked this off, is troublesome as well.
Did that chick who runs 1 million moms send out a conservative battle cry to get it reported enough times?
Do reports outweigh common sense?

Does it play into this that many platforms seem to ignore that they already offer tools so the easily offended aren’t exposed to content they don’t want to see?
The response is always to nuke the content getting complaints then wait to see if there is an outcry before examining the situation.

If regular people/media make more noise than the conservative brigade then we should take a look.

If the people are doing the same thing as a heterosexual couple that wouldn’t even get noticed, perhaps that is a check that should be applied rather than pretending that the act of 2 men kissing is somehow way more sexual and graphic.

That Anonymous Coward (profile) says:

Re: Re: Re:

Well the "public" keeps demanding that they DO SOMETHING FASTER whenever they stamp their foot & complain.

"The company is like a big lovable teddy bear with autism. Everything is taken literally"

Maybe stop asking the autistic teddy bear to protect your feels & accept you can’t control other people but you can choose not to see what they post without demanding the platform keep everyone from seeing it.

PaulT (profile) says:

Re: Re: Re: Re:

"Well the "public" keeps demanding that they DO SOMETHING"

Some part of the public, but don’t let the fact that some "karens" and repressed types are louder than everyone else fool you into thinking it’s a mainstream opinion. Since the above happened, gay rights to marriage and other things have been enshrined in law, and the reaction seems to have been that they just turned their hatred on to the trans community instead.

PaulT (profile) says:

Re: Re:

"The same people who scream about seeing 2 men kissing, really seem to enjoy 2 women kissing… but rarely admit that to their wives."

A lot of them enjoy going on Grindr whenever they’re "travelling" as well, but will keep up the pretence of being offended by things like this. It’s the "doth protest too much" rule – the more someone is vocally opposed to seeing gay people existing in public, the more likely they are to have some fabulous skeletons in their closets.

Nothing wrong with gay people being gay or showing affection for each other, but the people who complain about it the loudest seem to usually be overcompensating for something.

PaulT (profile) says:

Re: Re: Re: Re:

Yeah, as I said, I apply the "doth protest too much" standard here. Most people don’t really care what people get up to in private, and if it’s a public display of affection they’ll react the same whether it’s a gay or straight couple necking.

But, all too often the people who obsess over this stuff are usually the ones caught with some barely legal gigolo, and the pattern is hard to ignore. Nothing wrong with that if you keep it to yourself, but the people who are hardcore demanding that something is unacceptable between consenting adults and needs to be outlawed, they usually have some skeletons hanging around.

Anonymous Coward says:

typical of Facebook! staff does this sort of thing, in my opinion, because they personally don’t like or agree with non-heterosexual relationships. the bigger issue is that Facebook should not be able to put itself above the law, and as the law is changing in more countries every day to recognize non-heterosexual relationships, it now has to be accepted, regardless of whether Facebook staff or anyone else likes it or agrees with it.

That Anonymous Coward (profile) says:

Re: Re:

You think this is crazy, you should see the "rejected" photos from Instagram.
2 male pron stars wearing identical speedos, identical poses, one happens to have bodyhair and OMG OUR COMMUNITY STANDARDS!!

An easy rule of thumb that would avoid this sort of stupidity would simply be for them to look at the picture and imagine it was a heterosexual couple doing the exact same thing… would you pull it then? If no, it’s fine.

That Anonymous Coward (profile) says:

Re: Re: Re: Re:

Yep, and the rejections on Insta make even less sense than the FB rejections.

There is something wrong inside there, but because no one outside can know anything about it the only review is by the same nodding heads who approved the system in the first place without any considerations outside of their heterosexual world view.

PaulT (profile) says:

Re: Re: Re:2 Re:

"There is something wrong inside there, but because no one outside can know anything about it"

Actually, they outsource a lot of their moderation.

"the only review is by the same nodding heads who approved the system in the first place without any considerations outside of their heterosexual world view"

That’s more complicated. While these companies set certain standards, it’s known that individual moderators often make decisions based on personal biases rather than company policy. Because they’re contractors for external companies, they can get away with doing so for a little while, and it’s just another example of how difficult the moderation at scale issue really is.
