Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs that result. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Understanding Cultural Context To Detect Satire (2020)

from the she's-a-witch,-burn-her dept

Summary: During the somewhat controversial Senate confirmation hearings on the nomination of Judge Amy Coney Barrett to the Supreme Court, a few moments gained extra attention, including a confrontation between Senator Mazie Hirono and the nominee over statements Barrett had made in the past regarding LGBTQ rights. Hirono, who had separately called the hearings themselves illegitimate, was then criticized by traditionally right-leaning media for what they felt was overly aggressive questioning.

The satirical site The Babylon Bee, which frequently satirizes Democrats, published a piece loosely parodying a famous Monty Python sketch in which villagers in a medieval town try to determine whether someone is a witch, including by weighing her to see if she weighs the same as a duck. The Babylon Bee took that sketch’s premise and ran a satirical article claiming that Hirono demanded that Barrett be weighed against a duck.

Facebook removed the article, saying that it was “inciting violence.” The Babylon Bee appealed the decision, only to be told that, upon a further “manual” review, Facebook had decided that its original analysis stood and that the article “incites violence.”

Decisions to be made by Facebook:

  • How do you handle moderation that requires understanding both current political controversies and historical cultural references?
  • How do you distinguish actual satire from that which only pretends to be satire?
  • How do you determine what is actually likely to incite violence?

Questions and policy implications to consider:

  • How can rules against “inciting violence” be written to take into account satire and cultural references?
  • Is it reasonable to expect that content moderators will understand cultural references as satirical?
  • How much should a platform be expected to take into account the target audience of a particular website?

Resolution: As a tweet from The Babylon Bee’s CEO, Seth Dillon, about the takedown started to go viral, leading to another round of news coverage in traditionally right-leaning publications, Facebook eventually apologized and said that both the original moderation decision and the manual review were a mistake.

"This was a mistake and we apologize that it happened. Satire can be difficult for our systems to identify, but we've restored the article and their ability to monetize," a Facebook spokesperson told Fox News.

As often happens in these situations, Dillon insisted that this explanation was implausible, apparently believing that everyone would recognize the cultural references his site’s article used as satire.

"Why did it have to take getting the media involved to fix this? And why did it happen in the first place?" Dillon asked in response to Facebook. "This was not just an algorithm flagging an article in error. Yes, that happened. But then a manual review took place and the ruling to penalize us was upheld. I notice they left that part out."

Originally posted to the Trust & Safety Foundation website.



Filed Under: amy coney barrett, content moderation, mazie hirono, monty python, satire
Companies: babylon bee, facebook


Reader Comments

  1. This comment has been flagged by the community.
    Buck Lockdowns, 30 Dec 2020 @ 7:03pm

    Re: Why do you not POINT UP that the manual review failed?

    What protection does an ordinary user have from this mega-corporation, then? Have any advice on legislation that would force Facebook to tune their automatic system and make obvious corrections? Or do you simply leave all users at the whim of mega-corporations? I ask because if you ever again get the attention of a legislator, they'll want something more than a sketch of events.

    Your "Case Study" is useless without how to correct, mere filler for your tiny waning site.

