Content Moderation At Scale Is Impossible: Facebook Still Can't Figure Out How To Deal With Naked Breasts

from the this-again? dept

Like a teenaged heterosexual boy, Facebook appears to have no clue how to deal with naked female breasts. Going back over a decade, the quintessential example used to show the impossibility of coming up with clear, reasonable rules for content moderation at scale is Facebook and breasts. In the early days, as Facebook realized it needed to do some content moderation and had to establish a clear set of rules that could be applied consistently by a larger team, it started with a simple “no nudity” policy — and then, after that raised questions, it was narrowed down to define female nipples as forbidden. As a wonderful episode of Radiolab detailed last year, questions kept getting raised about just how specific the rules needed to be (each paragraph here is a different speaker, but since Radiolab doesn’t supply transcripts, I’m not entirely sure who’s speaking):

So, for example, by then, nudity was already not allowed on the site. But they had no definition for nudity. They just said “no nudity.” And so the site integrity team — those 12 people at the time — they realized that they had to start spelling out exactly what they meant.

Precisely. All of these people at Facebook were in charge of trying to define nudity.

The first cut at it was “visible male and female genitalia.” And then “visible female breasts.” And then the question is “well, okay, how much of a breast needs to be showing before it’s nude?” And the thing that we landed on was, if you could see essentially the nipple and areola, then that’s nudity. And would have to be taken down.

This might have seemed like a straightforward rule… until mothers posting breastfeeding photos started complaining — as they did after a bunch of their photos got blocked. Stories about this go back at least to 2008, when the Guardian reported on the issue after a bunch of mothers protested the company, leading Facebook to come up with this incredibly awkward statement defending the practice:

“Photos containing a fully exposed breast, as defined by showing the nipple or areola, do violate those terms (on obscene, pornographic or sexually explicit material) and may be removed,” he said in a statement. “The photos we act upon are almost exclusively brought to our attention by other users who complain.”

More public pressure, and more public protests, resulted in Facebook adjusting its policy to allow breastfeeding, but photos still kept getting taken down, leading the company to have to keep changing and clarifying its policy, such as in this statement from 2012.

When it comes to uploaded photos on Facebook, the vast majority of breastfeeding photos comply with our Statement of Rights and Responsibilities, which closely mirrors the policy that governs broadcast television, and which places limitations on nudity due to the presence of minors on our site. On some occasions, breastfeeding photos contain nudity – for example an exposed breast that is not being used for feeding – and therefore violate our terms. When such photos are reported to us and are found to violate our policies, the person who posted the photo is contacted, and the photos are removed. Our policies strive to fit the needs of a diverse community while respecting everyone’s interest in sharing content that is important to them, including experiences related to breastfeeding.

In the Radiolab episode they pointed out that photos of babies sleeping after having breastfed were getting taken down because the baby’s head was no longer blocking the nipple.

In 2014, Facebook clarified its policies on nipples again:

“Our goal has always been to strike an appropriate balance between the interests of people who want to express themselves with the interests of others who may not want to see certain kinds of content,” a Facebook spokesperson told the Daily Dot. “It is very hard to consistently make the right call on every photo that may or may not contain nudity that is reported to us, particularly when there are billions of photos and pieces of content being shared on Facebook every day, and that has sometimes resulted in content being removed mistakenly.

“What we have done is modified the way we review reports of nudity to help us better examine the context of the photo or image,” the spokesperson continued. “As a result of this, photos that show a nursing mother’s other breast will be allowed even if it is fully exposed, as will mastectomy photos showing a fully exposed other breast.”

Right. And then, just a few months later, people started protesting again, as more breastfeeding photos were taken down.

Again in the Radiolab program, they then discuss how this got even more confusing, as some people started posting “breastfeeding porn”: photos that appeared to show breastfeeding, but not of infants. So Facebook modified the rule to say the individual being breastfed had to be an infant. But how does Facebook determine who is and who is not an infant? We’re right back to the definitional problem. The original test Facebook put in place was “does the kid look old enough to walk?”, which raises other problems, since many kids breastfeed long after they can walk. Facebook had to keep amending and changing the rule. It eventually allowed one (just one) nipple/areola to be shown if it appeared related to breastfeeding… and then, after some time, a second one could be shown.

But as Radiolab documented, every time you set a definition, a new exception comes up. In the midst of the breastfeeding mess, this happens:

Literally every time this team at Facebook would come up with a rule that they thought was airtight–ka-plop–something would show up that they weren’t prepared for, that the rule hadn’t accounted for.

As soon as you think, “yeah, this is good” like the next day something shows up to show you, yeah, you didn’t think about this.

For example, sometime around 2011, this content moderator is going through a queue of things–accept, reject, accept, escalate, accept–and she comes upon this image: the photo itself was a teenage girl, African by dress and skin, breastfeeding a goat — a baby goat. And the moderator throws their hands up and says “what the fuck is this?”

And we Googled breastfeeding goats and found that this was a thing. It turns out it’s a survival practice: according to what they found, this is a tradition in Kenya that goes back centuries, that in a drought, a known way to help your herd get through the drought is, if you have a woman who’s lactating, to have her nurse the kid, the baby goat, along with her human kid. And so there’s nothing sexual about it.

… And theoretically if we go point by point through this list: it’s an infant–it sort of could walk, so maybe there’s an issue there–but there’s physical contact between the mouth and the nipple. But (obviously) breastfeeding as we intended it, anyway, meant human infants. And so, in that moment, what they decided to do was remove the photo. And there was an amendment, an asterisk, under the rule stating “animals are not babies.” So in any future cases people would know what to do.

This then raised new problems and so on and so on.
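To see why this never converges, it helps to think of the policy as code. What follows is a minimal sketch, purely my own illustration (the field names and rule structure are invented, not anything Facebook has published), of how a “simple” nudity rule accretes exceptions, one per incident in the story above:

```python
# Purely illustrative sketch of how a "simple" nudity rule accretes
# exceptions over time. All field names and the rule structure are
# hypothetical; this is not Facebook's actual policy engine.

from dataclasses import dataclass

@dataclass
class Photo:
    shows_nipple_or_areola: bool    # the original definition of "nudity"
    is_breastfeeding: bool          # exception added after the 2008 protests
    subject_is_human_infant: bool   # "animals are not babies"
    looks_old_enough_to_walk: bool  # the (flawed) proxy for "infant"
    exposed_nipples: int            # first one allowed, later two

def violates_nudity_policy(photo: Photo) -> bool:
    # Rule v1: no visible nipple/areola at all.
    if not photo.shows_nipple_or_areola:
        return False
    # Amendment: breastfeeding photos are allowed...
    if photo.is_breastfeeding:
        # ...but only of a human infant ("animals are not babies").
        if not photo.subject_is_human_infant:
            return True
        # ...and "infant" needs a proxy definition, which misfires on
        # kids who nurse long after they can walk.
        if photo.looks_old_enough_to_walk:
            return True
        # ...and at first only one exposed nipple was tolerated.
        if photo.exposed_nipples > 1:
            return True
        return False
    # Default: visible nipple/areola with no applicable exception.
    return True
```

Every branch is a proxy (breastfeeding, human, walking age, nipple count), and every proxy misfires on some case nobody anticipated, which then requires yet another branch. That is the article’s thesis in miniature.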

And so consider me not at all surprised that Facebook is still facing this very same issue. Late last week there were reports in Australia of some (reasonably) outraged people, angry that Facebook was taking down a series of ads featuring breast cancer survivors.

Facebook has come under fire from outraged breast cancer awareness groups after it banned online advertisements that featured topless survivors, claiming they violated the platform’s nudity policy.

The Breast Cancer Network of Australia (BCNA), in partnership with Bakers Delight, launched its annual Pink Bun Campaign yesterday to raise awareness and money for charity.

As the article notes, the ads showed “10 topless breast cancer survivors holding cupcakes to their chests”. In another article Facebook gives its reasoning, which again reflects much of the history discussed above:

Facebook said it rejected the ads because they did not contain any education about the disease or teach women how to examine their breasts.

It said since the ads were selling a product, they were held to a higher standard than other images because people could not block ads the way they could block content from pages they followed.

So, clearly, the rule has evolved over time so that there’s now some sort of amendment saying there needs to be an educational component if you’re showing breasts related to breast cancer (remember, years back Facebook had already declared that mastectomy photos are okay, and at least some of these ads do show post-mastectomy photos).

The charity in question is furious about this and calls the whole thing “nonsensical,” but it’s actually the opposite of that. It’s totally “sensical” once you understand much of the history, and the fact that Facebook keeps having to change and adapt these rules, often multiple times a month, to deal with the “new” cases that keep showing up that don’t quite match. And you could (and many do) argue that it’s “obvious” why these ads should be allowed, but that forgets that the company can’t just rely on something being “obvious.” It employs over 10,000 people who are in charge of making these decisions, and what’s obvious to one of them may not be obvious to another. And thus it needs clearly spelled out rules.

And those rules will never encompass every possible situation, and we’ll continue to see stories like this basically forever. We keep saying that content moderation at scale is impossible to do well, and part of that is because of stories like this. You can’t create rules that work in every case, and there are more edge cases than you can possibly imagine.

Companies: facebook


Comments on “Content Moderation At Scale Is Impossible: Facebook Still Can't Figure Out How To Deal With Naked Breasts”

71 Comments
Scote (profile) says:

" Like a teenaged heterosexual boy, it appears that Facebook has no clue how to deal with naked female breasts."

You can’t blame that entirely on Facebook. Rather, it is society that is erratic on the display of breasts, where male nipples are OK to display in public but female nipples, even if they look exactly like male nipples, are not. It’s not that moderation doesn’t scale; it’s that our society’s rules about nipples are ridiculous.

Anonymous Coward says:

Re: Re:

It’s not that moderation doesn’t scale; it’s that our society’s rules about nipples are ridiculous.

That is exactly the reason that moderation doesn’t scale. Once you get all 7.5 billion people to agree on what’s acceptable, then content moderation will be easy… because no moderation will be necessary.

bob says:

Re: Re: Re:

That’s a very comforting, snuggly feeling; I wasn’t sure if people would. Definitely FB tries, and any moderation team should get a lot of credit even if they aren’t producing the results others cry for.

But my attempt at playing with the subject matter, undoubtedly would tweak some people. We can’t all be like babies in life, you sometimes just need to latch onto it and suck out every drop of truth or you miss a key part of growing. It’s a good nurturing function that I wish more people would grab hold of and tease till they are completely satisfied.

But alas these things come in all shapes and sizes. Some don’t like to be pulled and squished the same way as others. But in the end, we all get longer with age.

John85851 (profile) says:

What about artwork

Then there’s the issue of whether nudity is acceptable in artwork. If it is, who will be the judge as to whether the artwork is "artistic"?

On a related note, I went to Florence recently and took pictures of Michelangelo’s David (in the Accademia dell’Arte) and of the Sistine Chapel (showing a nude Adam). Yet the photos are still up!
So either someone at Facebook recognizes the photos of classical artwork or the photos don’t have naked female breasts, or (more likely) no one complained that they were "indecent".

Anonymous Coward says:

like a dog chasing its tail

The central core of the problem could be Facebook’s progressivism. Why not just go by the FCC’s much more conservative traditions on what you can and can’t show regarding nudity on entertainment television broadcasts on public airwaves? That seems to be pretty much the way Facebook started out, since those rules have always been relatively simple and comparatively constant, and concepts such as "fairness" and "inclusiveness" have never been a major goal of the FCC’s rather staid "my way or the highway" censorship policies.

Perhaps it’s not unlike the way that more and more letters keep having to get added to "LGB" and ever more creative types of genders keep having to get added to the trans-gender definition. Because the moment someone finally figures it all out and establishes a completely and universally inclusive standard, someone will crawl out of the woodwork and complain "but what about me? It’s so unfair!!!"

and the cycle continues …

TKnarr (profile) says:

Re: like a dog chasing its tail

Because it isn’t Facebook’s progressivism so much as its users’ progressivism conflicting with other users’ lack of progressivism. And unfortunately there’s no "Don’t show my content to conservatives." setting for the progressive users to use.

Most of the FCC’s rules are around how to deal with content in a medium defined to be child-safe (at least during certain times). The Internet in general is simply not child-safe, never has been, never should be. In fact it’s not adult-safe either (I’ve seen things that make Goatse look like a pleasant daydream), and the variation in users is so great that generating rules for content can only be done on the consumer’s end where they only have to satisfy one or a relative few users.

Stephen T. Stone (profile) says:

Re:

Public airwaves are “government ‘property’ ”; Facebook servers are not. Whereas the “moderation” of public airwaves will always trend towards conservative values due to the nature of that particular beast, private entities can have whatever values they want so long as they do not break the law.

Oh, and by the way: Your “joke” about queer people is sad. “Oh noes, we have to keep finding new ways of changing the language for reasons of inclusivity!” The issue there is…what, exactly? We alter our language patterns all the time; the word “thick” evolved into the memetic variant of “thicc” because enough people altered their own pattern and pushed the new variant into wider usage. The language surrounding sexual identity is constantly evolving toward a more “accurate”, more personally considerate vocabulary (e.g., “cisgender” instead of “normal”). If the evolution of that language bothers you, perhaps you should consider how badly the issue affects you…and whether that issue is the most important one in front of you at the moment.

Zgaidin (profile) says:

Which is, more or less, why large scale social media isn’t likely to last and probably isn’t good for us, or maybe we’re not good enough for it. You can’t cram a billion people from all over the world into one gigantic room (digital or otherwise) and not expect endless problems. It’s why Reddit will probably outlast Facebook. If I don’t like a specific topic, I don’t ever go to the subreddit for it. Why would I? That, in turn, allows each subreddit to mostly moderate its much smaller userbase as it sees fit. They never have to try to make one size fits all content rules because they didn’t cram a billion people in one big room. They made a bunch of rooms, let users make an endless supply of new rooms, and then let them wander freely between rooms. Meanwhile Facebook, by its very nature, can never escape the hunt for one size fits all, because it’s just one big room.

Anonymous Coward says:

Re: Re:

Meanwhile Facebook, by its very nature, can never escape the hunt for one size fits all, because it’s just one big room.

Not really, as various groups exist on Facebook, and people decide who to follow. It could and should be more flexible than Reddit when it comes to people associating with each other.

Zgaidin (profile) says:

Re: Re: Re:

I found out about this afterwards (shows you how long it’s been since I left Facebook). That’s a step in the right direction, but I doubt it’s particularly successful long term, at least so long as Facebook continues to try to create a globally family/child-friendly environment, since there’s no universally agreed-upon definition of that.

Stephen T. Stone (profile) says:

Re:

Which is, more or less, why large scale social media isn’t likely to last and probably isn’t good for us, or maybe we’re not good enough for it.

Pretty much both, yeah. Humanity was not ready — and may never be ready — for the kind of communications made possible by Twitter, Facebook, and their ilk. The ideal, at least for me, is an old-school forum with a cap on userbase size to prevent things from getting too out of hand in re: moderation and community bullshit (with an optional chatroom for “live” communications). Discord is about the closest modern equivalent, at least to my knowledge and usage of it.

Anonymous Coward says:

Re: Re: Re:

The millennials and younger should be the ones to make those sorts of decisions, as they are the ones growing up with global communications. The rest of us grew up with limited-range communications, and while some have adapted to global communications, others have gone over the top pushing their agendas.

Bamboo Harvester (profile) says:

Re: Re: Re:

You’ve hit on one of the two big problems – the sheer size of the community. Etiquette is a result of a culture defining the social norms it will tolerate, and those items which it will not.

When dealing with global communications, you’re going to have clashes, sometimes severe, of what the different groups find acceptable.

The other problem is the willingness to let a tiny fraction of a population dictate special rules for their particular case.

Which means the one-armed, left-handed, gravitationally challenged, brain damaged individual who "identifies" as a diseased goat’s penis (aka: Jhon) gets to sue everyone else for not conforming to "it’s" desire to define "normal".

BTW, "normal", just like "sane" is whatever 51% of the polled population says it is.

Anonymous Coward says:

And we Googled breastfeeding goats and found that this was a thing. It turns out it’s a survival practice: according to what they found, this is a tradition in Kenya that goes back centuries, that in a drought, a known way to help your herd get through the drought is, if you have a woman who’s lactating, to have her nurse the kid, the baby goat, along with her human kid. And so there’s nothing sexual about it.

TIL

If we didn’t live in a regressive society that always views nudity as inherently immoral/sexual in nature we’d be a lot better off (we could learn a lot from the "naturism" movement of nudism).

Uriel-238 (profile) says:

Nudity taboos meet the Sorites Paradox

Nudity is offensive when someone feels it is, but different people find different levels of nudity offensive. Case in point: naturist camps, in contrast to hijab mandates (and, in the West, the offense once taken at referring to "legs" rather than the more appropriate "limbs").

Which makes the subject of nudity an apt topic for the Sorites Paradox in its most common form: what is the threshold of a heap of sand? At what point is it one grain away from no longer being a heap?

For thresholds we feel rather than define in discrete terms, it’s going to vary from person to person. Judges know porn when they see it, but some judges see porn where others do not. Curiously, the art collections of the Vatican serve as an anthropological history of fine artists and offended clergymen, with different works in varying states of censorship. (Fig leaves were added later not to cover genitals but the blank spot where they were chiseled away.)

What it means here: Even if we outline terms of nudity that a computer can check for and follow (with trees of rules and exceptions and exceptions to exceptions) someone’s going to be offended that something is too much or too little and will disagree with the next person, both of whom will insist they represent a societal norm.

Maybe NSFW gates that can be set in personal settings to be all-on / all-off / choose by visited account / choose by picture? I’m just guessing that might be the best compromise.
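For what it’s worth, that gate is the straightforward part to sketch. Here is a minimal illustration, with all names (NsfwGate, Viewer, should_show) invented for the purpose; no platform’s actual API looks like this:

```python
# Sketch of the per-user NSFW gate described above. All names are
# hypothetical, invented purely for illustration.

from enum import Enum, auto

class NsfwGate(Enum):
    ALL_ON = auto()       # show everything, no gating
    ALL_OFF = auto()      # hide anything flagged NSFW
    PER_ACCOUNT = auto()  # decide once per followed account
    PER_PICTURE = auto()  # ask on every flagged image

class Viewer:
    def __init__(self, gate: NsfwGate):
        self.gate = gate
        self.allowed_accounts: set[str] = set()  # per-account opt-ins

def should_show(viewer: Viewer, poster: str, flagged_nsfw: bool):
    """True = show, False = hide, None = prompt the viewer to decide."""
    if not flagged_nsfw or viewer.gate is NsfwGate.ALL_ON:
        return True
    if viewer.gate is NsfwGate.ALL_OFF:
        return False
    if viewer.gate is NsfwGate.PER_ACCOUNT:
        # Show if the viewer already opted in to this account; otherwise ask.
        return True if poster in viewer.allowed_accounts else None
    return None  # PER_PICTURE: ask on every flagged image
```

Of course, this only relocates the problem: someone still has to decide what gets flagged NSFW in the first place, which is the same definitional trap the article describes.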

christenson says:

Gonna remain impossible, until...

We start asking and grouping users into what they want….
We allow those answers to change according to circumstance….
We recognize that CONTEXT matters…(exact same content is or is not "bad" depending on how it is framed)
We recognize that fame is also a criterion… think of Goatse…famous enough that even if I disapprove of him, he is a subject of general discussion and at least some of his pictures need to be available!

Ben (profile) says:

and internationally speaking?

So far the article and the discussion seem quite US-oriented (which is understandable given the nature of Techdirt and the broad distribution of the audience here). However Facebook has a much larger audience than just the US, so the suggestion, for example, that FCC rules could or should apply is entirely inappropriate. Indeed, much of what the FCC permits is deeply offensive when seen through Malaysian or Arabian or Mongolian eyes (just examples, meant as neither an inclusive nor an exclusive list of places where culture differs from the US), and there is no doubt such cultures would accept content USians would quail to see.
In the end, I think the only answer to content moderation at scale is to leverage the user base as a starting point, accepting that a) determined bad actors can maliciously ‘moderate’ content away, and b) occasionally things will be missed from your personal point of view.

Uriel-238 (profile) says:

Re: Sexualization of the breast

Humans are unique among primates in exhibiting breasts even when not lactating. It’s one of those things that makes zoologists flip out over sexy space-alien girls in space opera clearly designed for the human male gaze: human boobs are about as species-specific as a giraffe’s neck.

We’re pretty sure men have been sexualizing knockers since women have had them. So maybe when we were Australopithecus afarensis? It seems to correlate with walking upright concealing the vulva and nipple stimulation becoming part of foreplay.

We don’t consistently sexualize them in the bigger-is-better fetish stereotypical of the US, Australia and Japan. Rather, breasts indicate age and nubility by development and sag. The primal offspring-seeking male brain wants protuberant, minimally sagging mammae and a high hip-to-waist ratio.

It’s also why women in their teens and early twenties are often cast by Hollywood as foils to men twice their age in action and suspense thrillers. Her primary quality is breedable.

Uriel-238 (profile) says:

Re: Re: Re: "Way too much"

WTF?

Boobs and stuff get us horny because that’s what’s propagated the species for millions of years. I’m pretty sure those who failed to go overboard died off before we even left the oceans, let alone developed hands or breasts.

Now yes, we live in a culture that has been defined for fifteen hundred years by a church that controlled the laity by withholding sex except through relations it specifically condoned, and as a result of centuries of suppression we’re very hung up about sex. Hence we fear we lust too much, when medically and interpersonally we would be better off having more sex than we do.

We cover ourselves up for modesty’s sake because that’s a norm we’ve established. But in doing so, it only exacerbates our fixation on the physical qualities of others and our desire to see them. In contrast, the excitement quickly wears off in nudist and clothing-optional societies. In those societies we also have healthier, more realistic expectations of what normal human bodies look like.

So no, I think our norms should be way more relaxed than they are. We’d be healthier and not freak out over seeing other people’s bits. Fewer people would have to adhere to uncomfortable dress codes and Facebook would have fewer problems to solve.

bob says:

Re: Re: Re:2 "Way too much"

But in doing so, it only exacerbates our fixation on the physical qualities of others and our desire to see them.

This is what I mean by way too much. Sure people find them attractive, I am one of them. And yes that is a good thing. I was just stating some people take that fixation too far.

Anonymous Coward says:

"Too expensive" and "impossible" are not the same thing.

USENET survived just fine until those who monetized their USENET audiences no longer had an interest in the very free speech that built those audiences. It still exists today as mostly a common carrier, though few have any interest in promoting its use. We don’t have internet free speech because it appears we do not want it.

Large social-media networks should become common carriers like the phone company. An internet company may be private, but the internet airwaves are public, and once a company connects to those airwaves, it should not be allowed to censor without a court order.

Also those who see senators tying Section 230 protection to political neutrality are not claiming that it is mandated by 230, but rather it is their condition for continuing to extend that immunity.

Stephen T. Stone (profile) says:

Re:

Assume you own and operate a forum that is open to the public. (The niche/topic for the forum is irrelevant.) One day, someone joins your forum and immediately starts spamming White supremacist propaganda¹. None of their spam advocates for any illegal acts (thus making it legally protected speech), but all of it is at least distasteful. How would you feel if you could not immediately delete their postings and ban them from your forum because the government said “go get a court order first”?

¹ — Feel free to replace “White supremacist” with “homophobic”, “anti-Semitic”, “pro-Movie Sonic”, or any other adjective that describes the beliefs of an abhorrent group of people.

Anonymous Coward says:

Re: Re: Re:

Users can filter out the messages, and SPAM policies (like USENET’s rules) can be content-neutral. Just restrict their ability to post to X messages per day to reduce the burden on the system, and give individual users the tools to filter the message. This of course doesn’t solve the problem of OTHERS reading the content, which is what people don’t like. It’s one thing to say you don’t want to read something, quite another for you to say you don’t want ME reading it.

What we define as "hate" or "trolling" is subjective. That’s what makes moderation impossible. Should we unplug their telephones and refuse to deliver their mail next? The USPS used to moderate the mail until it was made illegal to do so. AOL used to have "guides" moderate its chatrooms until their volunteer status and free-account compensation ran afoul of wage laws.
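The "X messages per day" limit proposed above is the easy part to build. A minimal sketch, with invented names (PostLimiter, try_post) and in-memory storage; a real system would persist the counters:

```python
# Minimal sketch of a content-neutral daily post limit. Names and
# storage are invented for illustration only.

import time
from collections import defaultdict, deque

class PostLimiter:
    def __init__(self, max_posts_per_day: int):
        self.max_posts = max_posts_per_day
        self.history: dict[str, deque] = defaultdict(deque)

    def try_post(self, user_id: str, now: float | None = None) -> bool:
        """Record a post attempt; return True if it is within the daily limit."""
        now = time.time() if now is None else now
        window = self.history[user_id]
        # Drop timestamps older than 24 hours: a sliding window that is
        # content-neutral, since it never inspects what is being posted.
        while window and now - window[0] > 86_400:
            window.popleft()
        if len(window) >= self.max_posts:
            return False
        window.append(now)
        return True
```

Because it never looks at the content of a post, the limiter is content-neutral, which is also exactly why it cannot address complaints about what the posts actually say.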

Stephen T. Stone (profile) says:

Re: Re: Re:

SPAM policies (like USENET’s rules) can be content-neutral

How would a content-neutral automatic moderation policy that includes racial slurs differentiate between a post using the N-word as a racial slur and a post using the N-word as a discussion of the word itself/a news story where someone else said it? Until you can design a policy or automated system that can account for context and nuance, it will always find false positives and “censor” speech that would otherwise be unobjectionable.

restrict their ability to post to X messages per day to reduce the burden on the system, and give individual users the tools to filter the message

I agree that users should have more controls over what does and does not show up in their timeline. But that alone does not solve the problem of the person posting the content in the first place, which — according to your proposition — could not be removed without a court order, which can take time and money you might not be able to spare.

This of course doesn’t solve the problem of OTHERS reading the content, which is what people don’t like.

The problem would not necessarily be others reading it. The problem would be the inability to prevent it from showing up, or proactively delete it after it shows up, because the government would have said you could not do so without first going to court.

What we define as "hate" or "trolling" is subjective. That’s what makes moderation impossible.

Truly objective moderation is, has been, and always will be impossible. All moderation is subjective; the only difference between “rulesets” is who makes them and what the rules say is “outlawed” on a given platform. If you dislike the rules of one platform, you can leave it and go to another one — or make your own platform with blackjack and hookers and Futurama references.

Should we unplug their telephones and refuse to deliver their mail next?

Booting someone off Facebook is not the equivalent of denying them delivery of their mail.

That One Guy (profile) says:

Re: Re: Re:2 'You first'

You missed the best part of their comment: they advocated for a limit on the number of comments allowed to be posted per day, a limit that, as numerous comment sections have made clear, they would likely max out very quickly, leaving them unable to post unless they tried to evade the very rules they’re trying to foist on others.

Also worth pointing out that they completely ignored your question/hypothetical, but as that’s par for the course for them, I suppose it’s hardly surprising.

Anonymous Coward says:

Re: Re: Re:2 Re:

SPAM policies (like USENET’s rules) can be content-neutral
How would a content-neutral automatic moderation policy that includes racial slurs differentiate between a post using the N-word as a racial slur and a post using the N-word as a discussion of the word itself/a news story where someone else said it? Until you can design a policy or automated system that can account for context and nuance, it will always find false positives and “censor” speech that would otherwise be unobjectionable.

First they laughed at smokers who complained about the cigarette tax, and now they have a soda tax. Why stop at the N-word? Disability-based slurs are still mainstream. They’d have to be banned. Next up is poor-shaming ("winner" versus "loser"), etc. Content moderation is simply a SPEECH CODE. Since I’m against that, I say let users block it, and limit the ability to post to X per day if one feels overwhelmed, and buying another server isn’t an option.

restrict their ability to post to X messages per day to reduce the burden on the system, and give individual users the tools to filter the message
I agree that users should have more controls over what does and does not show up in their timeline. But that alone does not solve the problem of the person posting the content in the first place, which — according to your proposition — could not be removed without a court order, which can take time and money you might not be able to spare.

With free speech, there is no "problem" with any legal speech that can’t be solved by blocking and post-limiting (Google can afford a high limit of posts obviously). The "problem" with free speech is that other people can talk back and no one can commandeer it.

This of course doesn’t solve the problem of OTHERS reading the content, which is what people don’t like.
The problem would not necessarily be others reading it. The problem would be the inability to prevent it from showing up, or proactively delete it after it shows up, because the government would have said you could not do so without first going to court.

What is the problem with free speech "showing up?" I think it’s the opposite: proof that we are hearing all voices. Someone who doesn’t like a TV show can change their channel, but to change MY channel they need it censored.

What we define as "hate" or "trolling" is subjective. That’s what makes moderation impossible.
Truly objective moderation is, has been, and always will be impossible. All moderation is subjective; the only difference between “rulesets” is who makes them and what the rules say is “outlawed” on a given platform. If you dislike the rules of one platform, you can leave it and go to another one — or make your own platform with blackjack and hookers and Futurama references.
Should we unplug their telephones and refuse to deliver their mail next?
Booting someone off Facebook is not the equivalent of denying them delivery of their mail.

It’s not quite mail denial, but it does influence public discourse to a significant degree. I still see these complaints more as financial than political. Phone calls cost money because they use system resources to call someone. The internet could function the same way, with moderation built into the equation.

Stephen T. Stone (profile) says:

Re: Re: Re:3

Content moderation is simply a SPEECH CODE.

Yes, and this is a problem…why, exactly? Few places, if any, allow every type of protected speech “no matter what”.

The "problem" with free speech is that other people can talk back and no one can commandeer it.

…which is a problem if someone runs a platform meant for use by a marginalized community — LGBT people, for example — and said community comes under attack from assholes who would love to marginalize that group even further. Post limits and client-side filters will not change human behavior in the way you think it will, and they will not discourage the assholes more than solid moderation can and will.

What is the problem with free speech "showing up?"

The problem is that, within the framing of your “moderation by judicial order” plan, any speech that a platform does not want to host — regardless of whether it is protected by law, regardless of how the userbase at large acts — would be forced upon that platform for as long as the platform lacks a court order saying “you can delete that specific instance of that specific speech” or “you can ban this user and remove all their posts” or whatever.

That even gets back to my original question, which you did not directly answer: How would you feel if, as a platform owner, you could not immediately delete content you absolutely did not want to host and ban the poster from your platform because the government said “go get a court order first”?

I think it’s the opposite: proof that we are hearing all voices.

And the problem with this mindset is thinking all voices deserve to be heard and treated with equal respect. Someone who sincerely believes in the Flat Earth theory, for example, does not deserve the same respect as everyone else. To act as if they do because of some ridiculous “view from nowhere” mindset is to fool yourself into thinking all speech is “created” equal.

it does influence public discourse to a significant level.

This does not change the fact that using a platform such as Facebook is a privilege, not a right, and that privilege can be revoked if you violate the terms of service. Your “moderation through court order” system cannot exist within this framework. It would metaphorically spit in the face of every law, statute, and court ruling that says a platform for third-party speech is under no legal obligation to host any specific speech from any specific third party.

Phone calls cost money because it uses system resources to call someone. The internet could function the same way, with moderation built into the equation.

Yes, because every platform for third-party speech becoming a (likely expensive) paywalled service because they need the money for lawyers who can handle filing motions for moderation decisions with the court would totally be okay with everyone~.

Anonymous Coward says:

Re: Re: Re:4 Re:

And the problem with this mindset is thinking all voices deserve to be heard and treated with equal respect.

Some animals more equal than others?

Equal access to public internet airwaves is not equal respect. My problem with censorship is that no one deserves that type of power.

Those who can’t stand the existence of the flat-earth society are welcome to block that content. Banning it would lead to a slippery slope where those who question all scientific dogma, like the Big Bang, are also banned. Same for other unpopular beliefs. It becomes groupthink.

Stephen T. Stone (profile) says:

Re: Re: Re:5

When discussing sociopolitical ideologies? Yes, racist beliefs are less equal to anyone who does not believe, say, “I’m a Christian and my Christian beliefs are you don’t do interracial marriage. … [W]hen it comes to all this stuff you see on TV, when you see blacks and whites together, it makes my blood boil because that’s just not the way a Christian is supposed to live.” The same goes for Flat Earthers, anti-vaxxers, and anyone who sincerely enjoyed Batman v Superman. Some views are so toxic, so harmful, so absolutely ridiculous on their face that we need not treat them as having credibility.

Client-side blocking of such content is all well and good, but it still does nothing about the content being there in the first place. And if you explicitly do not want to host content both you and your userbase find distasteful (which would be within your rights), telling everyone else that you cannot do anything about it because “the government says I can’t” will not appease people who think you have cowed to those who post that content. Getting rid of the content would prevent your platform from being associated with it; leaving it up because “free speech” would prevent your platform from being associated with any speech but the distasteful content. 8chan became associated with GamerGate (among other things) because 4chan kicked the Gaters out; that 8chan is both celebrated by its users for its “free speech” ideals and considered a breeding ground for the kind of bullshit you find in manifestos left behind by mass shooters is no coincidence.

Oh, and as for the “banning speech” thing: Any platform not owned by the government has every right to ban whatever speech it wants. Facebook could ban advocacy for the Flat Earth theory later today and nothing — not a single goddamn thing — could be done to legally force Facebook into hosting Flat Earther content. Using a third-party platform is a societal privilege, not a legal right. If’n you hate the rules of that platform, go find one with a ruleset you prefer. I hear 8chan is still a thing…

Anonymous Coward says:

Re: Re:

"Too expensive" and "impossible" are not the same thing.

They’re not. The problem is that vested interests in government and copyright enforcement keep trying to frame "impossible" as simply "too expensive" and insisting that the issue is everyone else not wanting to foot their bill.

Anonymous Coward says:

Rules are a lot simpler to define for traditional media. One reason is the cost of producing content. TV shows can cost millions of dollars per episode, and movies can be even more expensive. When there is that much money riding on shows, studios want to stay away from any gray areas in the rules. So with breasts, for example, shows will either cover up more than needed so that they don’t accidentally show anything, or they will have many naked breasts attached to characters having sex to justify a mature rating. Either way no one questions the rating the show was given.

With Facebook, it is so easy to post photos that I believe millions of them were probably posted by accident. While Facebook does enforce rules, the penalty of a few photos getting removed is small. People will choose what photos they upload based on their own morality instead of thinking about what Facebook wants, and some people will even deliberately post pictures for the purpose of showing that the rules of Facebook or its users are arbitrary.

Rekrul says:

"When it comes to uploaded photos on Facebook, the vast majority of breastfeeding photos comply with our Statement of Rights and Responsibilities, which closely mirrors the policy that governs broadcast television, and which places limitations on nudity due to the presence of minors on our site."

Um, I’ve seen full frontal female nudity on network TV at least twice. Both times it was in a miniseries. The first was back in the 70s or 80s. I forget what the miniseries was, but there was a nude native woman, her body partially covered with mud or some type of markings. The second time was in the 90s or early 00s in a miniseries about war. The scene was of women having their heads shaved in a concentration camp. I’ve also seen bare breasts fully exposed in documentary shows about breast cancer and reconstructive surgery.

So, the rules governing broadcast TV aren’t exactly clear either.

Uriel-238 (profile) says:

Re: I, Claudius

I, Claudius, starring Derek Jacobi, featured a surprising amount of nudity and was essentially the Game of Thrones of the seventies (albeit taking place in classical Rome). It was shown without censorship on PBS when I was first getting more buzz from sex than violence.

More poisonings than stabbings, and without the benefits of classical Hollywood budgets or CGI but still enjoyable.
