Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Facebook Attracts International Attention When It Removes A Historic Vietnam War Photo Posted By The Editor-in-Chief Of Norway's Biggest Newspaper (2016)

from the the-terror-of-content-moderation dept

Summary: Tom Egeland, a Norwegian author of a number of best-selling fiction books, posted the well-known photo “The Terror of War” to Facebook. The historic photograph (taken by Vietnamese-American photographer Nick Ut) depicts a naked Vietnamese girl running from a napalm attack during the Vietnam War.

Ut’s iconic photo brought the horrors of the war in Vietnam to viewers around the world. But it was not without controversy. Given the full-frontal nudity of the child depicted in the image, the Associated Press pushed back against Ut, citing its policy against publishing nudity. In this case, because the nudity involved a child, the photo met more resistance than usual. Ultimately, the AP decided to run the photo, resulting in a Pulitzer Prize for Ut in 1973.

Despite the photo’s historical significance, Facebook decided to suspend Tom Egeland’s account. It also deleted his post.

Facebook’s decision was based on its terms of service. While the photo was undeniably a historical artifact, the platform’s moderation efforts were not attuned to that history.

A notice sent to Egeland pointed out that any displayed genitalia would result in moderation. And given the platform’s obligation to report Child Sexual Abuse Material (CSAM) to the government, leaving up a photo of a naked prepubescent child posed problems the platform’s algorithms couldn’t necessarily handle on their own.
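
To make concrete why such automated matching is context-blind, here is a minimal illustrative sketch, in Python, of hash-based image flagging. This is not Facebook’s actual pipeline; production systems use robust perceptual hashes (such as PhotoDNA) matched against vetted databases, and the hash function, threshold, and blocklist value below are all hypothetical.

    # Illustrative sketch only: a toy average-hash matcher.
    # Real systems use robust perceptual hashing (e.g. PhotoDNA)
    # against vetted hash databases; all values here are hypothetical.
    from PIL import Image

    def average_hash(path: str, size: int = 8) -> int:
        """Downscale to a size x size grayscale grid, threshold on the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for px in pixels:
            bits = (bits << 1) | (1 if px > mean else 0)
        return bits

    def hamming(a: int, b: int) -> int:
        # Number of differing bits between two hashes.
        return bin(a ^ b).count("1")

    # Hypothetical blocklist of hashes of known disallowed images.
    KNOWN_HASHES = {0x81C3E7FF7E3C1800}

    def should_flag(path: str, threshold: int = 5) -> bool:
        h = average_hash(path)
        # The match carries no notion of who posted the image or why:
        # a historic news photo matches exactly like an abusive repost.
        return any(hamming(h, k) <= threshold for k in KNOWN_HASHES)

The point of the sketch is that a hash match carries no information about poster, caption, or intent; that contextual judgment is exactly what human review of Ut’s photo had to supply.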

The decision to remove the post and suspend the author’s account resulted in an open letter from Norwegian journalist Espen Hansen, editor-in-chief of the newspaper Aftenposten. The letter — addressed to Facebook founder and CEO Mark Zuckerberg — asked what negative effects moderation efforts like these would have on a “democratic society.”

Decisions to be made by Facebook:

  • Should automatic moderation that aids law enforcement be overridden when context shows posts are not attempting to sidestep rules put in place to prevent Facebook users from being subjected to abusive content?
  • What value is placed on context-considerate moderation? Does it add to or detract from the company’s ability to meet its financial obligations to shareholders?
  • Does it serve users better to be more responsive — and helpful — when context is a primary consideration?

Questions and policy implications to consider:

  • Is the collateral damage of negative press like this offset by Facebook’s willingness to be proactive when removing questionable content?
  • Is it more important to serve private users than the numerous governments making moderation demands?
  • Do inexact or seemingly incoherent responses to controversial content raise the risk of government intervention?

Resolution: Despite the letter from a prominent Norwegian journalist, Facebook refused to reinstate the photo. Instead, it offered boilerplate stating its objection to “nude genitalia,” while noting that it did make “allowances” for “educational, humorous, and satirical purposes.” Ut’s photo apparently did not make the cut. Facebook asked Aftenposten, Egeland, and/or Hansen to “pixelate” the iconic photo before reposting. Aftenposten’s Hansen responded with his own pointedly modified version of the photo.

Unfortunately, Facebook did not see the pointed humor of Hansen’s modification. Facebook’s deletion of the original — as well as its suspension of author Tom Egeland’s account — remained in force. While public shaming has had some effect on moderation efforts by social media companies, Facebook’s stance on nudity — especially the nudity of minors — prevented it from backing down in the face of negative publicity.

Companies: facebook


Comments on “Content Moderation Case Study: Facebook Attracts International Attention When It Removes A Historic Vietnam War Photo Posted By The Editor-in-Chief Of Norway's Biggest Newspaper (2016)”

32 Comments

This comment has been flagged by the community.

Anonymous Coward says:

NOW warmongers / globalists / zionists know to hide evidence.

THEY near entirely control the media and the message is always pro-war.

Where were the daily reports on civilian casualties in Iraq?

Where are the gruesome pictures of Palestinians, men, women, even children shot by a new kind of fragmenting bullet just for getting near the Israeli / zionist apartheid wall?

[No "funny" screen name because serious topic and seems easier to get in today without.]

This comment has been deemed insightful by the community.
Tom, Dick & Harry (profile) says:

Re: NOW warmongers / globalists / zionists know to hide evidence

Where were the daily reports on civilian casualties in Iraq

Behold:

… and more (just do the search yourself)

THEY near entirely control the media and the message is always pro-war.

Disregarding the above, there is more open speech now than 40, 20, or 10 years ago. Historically, major newspapers and cable networks controlled media dissemination, but now everyone has a voice: we can make our own websites, run our own social media (the fediverse, Mastodon), or choose to use the popular platforms.

Alas, we are selective in choosing history.

This comment has been flagged by the community.

Anonymous Coward says:

Where are YOU on Iraq, Afghanistan, Palestine, Maz?

You almost never even state your true opinions. Only when you must, as with your advocacy of corporations controlling speech.

I’ve been reading here for years, and never a peep from Techdirt about the savagery of American or Israeli forces abroad, even though almost every day you’re complaining about American police as if you’re concerned for the victims of violence. — That’s really just to attack the rule of law / build cred with your leftist fellow travelers.

But when it’s done on a mass scale, you’re a tacit warmonger.

Your selection of topics shows who you really are. You prattle every day on minor "moderation" while ignoring millions of Iraqis dead, millions more wounded, the whole country destroyed, and similarly Palestinians, hundreds dead, thousands wounded, imprisoned by the rabid Zionists who invaded Palestine.


This comment has been deemed insightful by the community.
Tom, Dick & Harry (profile) says:

Re: Where are YOU on Iraq, Afghanistan, Palestine, Maz?

[. . .] never a peep from Techdirt about the savagery of American or Israeli forces abroad, even though almost every day you’re complaining about American police as if you’re concerned for the victims of violence.

Welcome to Techdirt, a site about technology, copyright, politics, content moderation, speech, and so on.

There are many negative forces in the world (as you stated). There are many good forces in the world (like the humanitarian aid workers in troubled nations). That being said…

Why are you here, trolling?

I feel this applies: https://xkcd.com/2368/

It says: "Your point that the world contains multiple problems is a real slam-dunk argument against fixing any of them."

This comment has been flagged by the community.

Scary Devil Monastery (profile) says:

Re: Just small sample of what you NEVER even mention:

"Forbidden Weapons and Israeli War Crimes…"

So? I think that in the rare cases we’ve had reason to bring Israel into this website about technology and its social impact, there have been quite a lot of voices opposing quite a few of Israel’s policies and its more radical nationalist extremists.

We could bring attention to the very dark signal it sends that people like Avigdor Lieberman and Netanyahu have received political office at all – because people like those are horrifying irrespective of whether they wear a yarmulke or not.

We could question whether the "most moral army in the world" deserves that hype, given discoveries of extremist views on military bases in the region.

And we could question the settler issue, which not infrequently involves allowing armed religious fanatics to build themselves an instant bunker within sniping range of Palestinian farmsteads.

What we won’t do is to holler "ze jews!" without nuance the way you keep doing. That only serves to inform us that your rhetoric would be far better served in Stormfront than around here.

And as for "american war crimes" not being discussed, it’s the same thing – as you would know if you’d been literate enough to follow the debates whenever Snowden was mentioned on this forum.

This comment has been flagged by the community.

Scary Devil Monastery (profile) says:

Re: And now fanboys can accuse of "anti-Semite" when Palestinian

Maybe if you learned to type coherent sentences we’d actually know what you referred to.

Although I doubt it because what you usually type out here hasn’t indicated you’re able to comprehend factual reality very well.
In fact, your usual offerings leave me in doubt as to whether you’ve even mastered object permanence yet.

This comment has been flagged by the community.

Shel10 (profile) says:

Content Moderation

Content Moderation – Just a polite way of saying Censorship. The term is justified by saying community or societal mores are being applied. But how is this different from defacing historical paintings?

The Vietnam War was terrible, and the photographer was simply trying to visualize the horrors for the folks back home and the rest of the world.

A World War II veteran was asked by his family what he saw and did. His reply: "General Sherman once said war is hell, and he was right. I survived by the Grace of God. I’m home, and that’s all you need to know."

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Content Moderation

Content Moderation – Just a polite way of saying Censorship.

We do not do or say that in this company, like various churches, clubs and other organizations like social media say, is not the same as you will not do or say that anywhere, enforced by state violence. The first allows you to go elsewhere to say what you want to say; the second does not.

This comment has been deemed insightful by the community.
Stephen T. Stone (profile) says:

Re: Re: Re:2

To be fair, it took me a couple of reads to parse it right. The sentence could use at least some quotation marks for emphasizing the words being “spoken”. Lemme clean it up a bit all around:

“We don’t do or say that here” (like various churches, clubs, and other non-government organizations all say) is not the same as “you won’t do or say that anywhere” enforced by either violence or the law. The first allows you to go elsewhere to say what you want to say; the second does not.

This comment has been flagged by the community. Click here to show it.

Jorges says:

Re: Re: Re:3 “We don’t say that here”

" At any given moment there is an orthodoxy, a body of ideas which it is assumed that all right thinking people will accept without question.
It not exactly forbidden to say this, that or the other but it is “not done” to say it..
Anyone who challenges the prevailing orthodoxy finds himself silenced with surprising effectiveness.
A genuinely unfashionable opinion is almost never given a fair hearing, either in the popular press or in the highbrow periodicals. "
(George Orwell "Animal Farm" )

Peter (profile) says:

You are kind of late - about 4 years and 6 weeks ...

The incident happened early September 2016; Facebook reinstated the photo within hours, following a public outcry, and updated its guidelines a few weeks later.

While an update would be interesting, to see if and how the new guidelines work, it would have been nice to clarify that the article is about ancient history and not current events. Putting 2016 in the headline is not enough when the article ends with a suggestion that the situation is ongoing.

https://petapixel.com/2016/10/24/facebook-updates-censorship-policy-avoid-another-napalm-girl-issue/

Google search (see dates on results): Sept 9 (censored), Sept 10 (reinstated), Oct 26 (new Facebook guidelines). All four years ago – 2016.

Tom, Dick & Harry (profile) says:

Re: You are kind of late - about 4 years and 6 weeks ...

The incident happened early September 2016

Yes, it is stated in the title of the article. It’s a case study of a past content moderation decision that is still useful to learn from in the present day.

Title of article (emphasis added):

Content Moderation Case Study: Facebook Attracts International Attention When It Removes A Historic Vietnam War Photo Posted By The Editor-in-Chief Of Norway’s Biggest Newspaper (2016)

Mike Masnick (profile) says:

Re: You are kind of late - about 4 years and 6 weeks ...

This is part of our content moderation case studies series, in which the vast majority of the case studies are for older stories:

https://www.techdirt.com/blog/contentmoderation/

We’ve done dozens of them, and many of them are old. Indeed, a key reason we focus on older ones is to see how they play out. It is indicated in the title of the post that it’s from 2016 and at the top we note: "Techdirt’s think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website."
