Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Tumblr's Approach To Adult Content (2013)

from the how-to-destroy-a-website dept

Summary: There are unique challenges in handling adult content on a website, whether it’s banned outright, selectively allowed, cordoned off behind content warnings, or (in some cases) actively encouraged. Tumblr’s early approaches to adult content are an interesting illustration of how user tagging and a site’s own discovery tools interact.

Tumblr was launched in 2007 as a simple “blogging” platform that was quick and easy to set up, but allowed users to customize it however they wanted and use their own domain names. One key feature that set Tumblr apart from other blogging platforms was an early version of social networking features — such as the ability to “follow” other users and then see a feed of the users you followed. While some of this was possible via early RSS readers, that approach was technologically clunky and lacked the social aspect of knowing who was following you, or of seeing both the followers and followees of accounts you liked. Tumblr was also an early pioneer of reblogging — allowing another user to repost your content with additional commentary.

Because of this more social nature, Tumblr grew quickly among certain communities. This included communities focused on adult content. In 2013, it was reported that 11.4% of Tumblr’s top domains were for adult content. In May of 2013, Yahoo bought Tumblr for $1.1 billion, with an explicit promise not to “screw it up.” Many people raised concerns about how Yahoo would handle the amount of adult content on the site, but the company’s founder, David Karp, insisted that they had no intention of limiting such content.

“We’ve taken a pretty hard line on freedom of speech, supporting our users’ creation, whatever that looks like, and it’s just not something we want to police…. I don’t want to have to go in there to draw the line between this photo and this behind-the-scenes photo of Lady Gaga and, like, her nip.” — David Karp

Yahoo CEO Marissa Mayer noted that the content on the site might prove more challenging for advertisers, but promised that they would employ “good tools for targeting” to help advertisers avoid having their brands appear next to adult content. However, she still supported allowing Tumblr to continue hosting such content.

“I think the richness and breadth of content available on Tumblr—even though it may not be as brand-safe as what’s on our site—is what’s really exciting and allows us to reach even more users.” — Marissa Mayer

A key part of how Tumblr managed this at the time was allowing its users to tag their content in a way that would indicate to others if there was adult content, while letting users set their own preferences for avoiding such content. Tumblr’s terms of service at the time explained how this worked:

Tumblr is home to millions of readers and bloggers from a variety of locations, cultures, and backgrounds with different points of view concerning adult-oriented content. If you regularly post sexual or adult-oriented content, respect the choices of people in our community who would rather not see such content by flagging your blog (which you can do from the Settings page of each blog) as Not Suitable for Work (“NSFW”). This action does not prevent you and your readers from using any of Tumblr’s social features, but rather allows Tumblr users who don’t want to see NSFW content to avoid seeing it. — Tumblr’s 2012 Terms of Service

Notably, those same terms did ban “sexually explicit videos” with a somewhat explicit reason for that ban: “We’re not in the business of profiting from adult-oriented videos and hosting this stuff is fucking expensive.”

Around the time of the Yahoo purchase, however, users began noticing a change. As shared in Tarleton Gillespie’s book “Custodians of the Internet”:

In May 2013, some Tumblr users noticed that blogs rated “adult” were no longer findable through the major search engines. A month later, Tumblr began using the ratings to selectively exclude posts from its own search tool. Posts from “NSFW” or “adult” blogs no longer appeared in Tumblr’s search results, even if the post itself was not explicit, and regardless of whether the search was explicit. Actually, it was even more complicated than that: if the searcher already followed the explicit blog, that blog’s posts would appear — if it was “NSFW.” If it was “adult,” the more explicit rating, those posts would not appear in the search results, even if the searcher already followed that blog. — Tarleton Gillespie

The end result of this was widespread confusion among Tumblr’s users and fans:

“Clear? No? It was an intricate and confusing arrangement, one that users had a hard time following and the company had a hard time explaining. The principle behind this intricate policy is not an unreasonable one: let users continue to post explicit pornography, while using the self-rating to shield users who do not want to encounter it. But enacting this principle meant codifying it in a series of if/then conditions that could be automated in Tumblr’s search algorithm. And what the policy meant in practice was that while an explicit blog’s existing followers could more or less still get to it, it would now be much more difficult for anyone new ever to find it, given that its posts would not appear in any search results.

In addition, there were other assumptions hiding in the new policy: that the rules should be different for mobile users than for users on their computers; that “logged-out” users (which includes users who have not yet signed up for Tumblr) should not encounter explicit blogs at all; and that explicit Tumblr blogs shouldn’t be appearing in search results on Google or Bing—or Yahoo. These represent somewhat different priorities, but get folded in with Tumblr’s apparent concern for balancing the right to share pornography and the right not to encounter it if you choose not to.” — Tarleton Gillespie
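The if/then conditions Gillespie describes can be sketched as a small decision function. This is purely an illustrative reconstruction of the policy as reported above — the function name, rating labels, and structure are assumptions, not Tumblr’s actual code.

```python
# Hypothetical sketch of Tumblr's 2013-era search-visibility rules,
# reconstructed from the policy description above. Names are illustrative.

def post_visible_in_search(blog_rating: str, searcher_follows_blog: bool) -> bool:
    """Decide whether a post from a blog with the given self-rating
    should appear in a logged-in searcher's results."""
    if blog_rating == "none":
        # Unflagged blogs remain fully searchable.
        return True
    if blog_rating == "nsfw":
        # "NSFW" posts appear only if the searcher already follows the blog.
        return searcher_follows_blog
    if blog_rating == "adult":
        # "adult" (the more explicit rating) is hidden even from followers.
        return False
    raise ValueError(f"unknown rating: {blog_rating}")
```

Even in this reduced form, the asymmetry between the two ratings — followers can still search one but not the other — hints at why users found the policy so hard to follow.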

Company Considerations:

  • How important to Tumblr is the culture and community that has built up around those who share adult content on the site? How does that play into product decisions regarding discovery of new content on the site?
  • How practical is user tagging for dealing with adult content on the site? How will the site handle it if there is a significant disagreement, such as someone posting content they insist is not adult content, but others feel is?
  • What sorts of technical/algorithmic rules should be put in place to deal with such adult content on the site? How effective is it to remove such results from search? What are the consequences of removing NSFW results from search?

Issue Considerations:

  • Adult content raises different kinds of challenges for social media websites. What kinds of policies should sites consider regarding such content, and how will it impact their userbase and communities?
  • Tumblr argued that photos including nudity were different from “sexually explicit videos.” Are these distinctions meaningful in a way that can be explained to a team of content moderators?

Resolution: For many years after this, Tumblr continued to allow adult content. In 2017, the company made one major change, rolling out its “safe mode” option, which removed “NSFW”-tagged content not just from search but also from user feeds. So even if a user followed a certain blog, while in safe mode they would no longer see content tagged as NSFW.

In 2017, Verizon purchased Yahoo, including Tumblr. A year later, it became apparent that Verizon did not take the same view as Yahoo had regarding allowing Tumblr to continue hosting adult content. In December of 2018, it was announced that Tumblr would be banning adult content on its servers.

Many people worried about what this would do to the site and the communities that grew up around it. Others pointed out that the ban would negatively impact queer and sex-positive communities. The end result was that traffic to the site diminished noticeably in the following months. In August of 2019, Verizon sold Tumblr to blogging company Automattic for just a few million dollars, well below the $1.1 billion Yahoo had paid for it in 2013.

Originally posted to the Trust & Safety Foundation website.

Companies: tumblr, verizon, yahoo


Comments on “Content Moderation Case Study: Tumblr's Approach To Adult Content (2013)”

5 Comments
Rico R. (profile) says:

Re: the algorithms...

Even as someone who’s asexual, I could tell the algorithmic censoring of NSFW content was doomed from the start. And it’s not even inactive users that are affected… Somehow, this old post of mine was flagged. It had no notes, and I appealed it. It’s still in limbo to this day, hence the screenshot!

GHB (profile) says:

Re: Re: the algorithms...

Agreed, this is just as bad as Dropbox setting public folders to private long after they were posted, for users who later went inactive. Not good for posterity. Even worse, they prohibited the Wayback Machine from saving those links (excluded them).

Scunthorpe problem, but NSFW images instead of swear words.

Anonymous Coward says:

I have been (and am still) using that site since 2013.
Some moved to platforms like Twitter after the porn ban, but those who stayed consider the site to have improved.
It’s a bit hard to express, but it seems the people who were overconsuming porn there were also the biggest “drama queens.”

As of now, it feels a lot more usable and a lot calmer than in the past.

As a side note, one reason so many people stayed after the site “collapsed” is that it’s one of the few social media platforms where algorithms have zero control over the content people see. The feed is chronological: if one user spams while another barely posts, their posts still appear in chronological order, unlike on sites like Facebook.
One has full control over what they see.
