Filters Suck Out Loud: Tumblr's Porn Filters Flag Tumblr's Examples Of Allowed Content

from the tumble-in-the-sheets dept

As you’ll recall, Tumblr recently decided to go the Puritan route, announcing that it would begin filtering “porn” from its platform. As we pointed out, this was bound to go hilariously wrong, with plenty of innocent content getting swept up in the auto-filters. There were already examples of this, ranging from pictures of cartoons to what look to be accidental photos people took on their couches. You may have thought at the time that no better example could be found for how dumb auto-filters like this tend to behave.

But Tumblr itself accidentally just provided such an example. Seeking to clarify what is and is not allowed, Tumblr posted a GIF of the kinds of images that would be allowed on the site: artwork, educational material, etc. It all went swimmingly… until others tried to post the exact same GIF to see what would happen.

As Gizmodo reported:

When Gizmodo posted the gif to Tumblr ourselves, it was immediately flagged as a potential violation and hidden by the platform’s filter. When the images shown in the gif were uploaded individually, two of the four examples—which appeared to show a breast ultrasound and a pro-choice protest, respectively—were similarly flagged and hidden.

And that really should tell you everything you need to know about how these types of filters completely poop the bed on their most basic function. That Tumblr did this to itself makes it all the sweeter, particularly since it did so as part of a message that was supposed to clarify things for users of its service.

And yet it’s not as though the filters will go away. Platforms are incentivized to err on the side of over-blocking and collateral damage by misguided legislation that seeks to solve a non-problem and instead only makes it worse. In other words: bang-up job all around, guys.

Companies: tumblr, verizon


34 Comments
Anonymous Coward says:

I just find it interesting from a software development angle.

At the absolute very least, you test to make sure that anything you tell a user to do works as intended. It should be the smoothest part of the application, if nothing else.

But given how Tumblr as a platform has been abused, there are probably three people working on it, all temp contractors.

Bergman (profile) says:

Re: Re:

My guess is that whoever was in charge of testing used their own Tumblr corporate account, so the system gave them a pass: the guy who can approve or reject content was the one posting the content in question.

You see this sort of thing all the time in things like Google Docs or other cloud services — the permissions and security on something are so transparent to the usual users that they forget it’s there. But when they try to share a link, the person they shared with doesn’t have permission. And instead of thinking ‘oh, I need to give him permission’ when the problem arises, the person instead thinks ‘he must be doing something wrong, it works fine for ME!’
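A minimal sketch of how this kind of bug hides from its own testers (all names and fields here are hypothetical, not Tumblr’s actual code): a visibility check that short-circuits for staff and owners means a tester on a corporate account never exercises the filter path that ordinary users hit.

```python
# Hypothetical sketch: a moderation check that short-circuits for staff
# and post owners, so a tester on a corporate account never sees the
# filter fire -- while every ordinary viewer does.

def is_visible(post, viewer):
    """Return True if `viewer` may see `post` after filtering."""
    # Staff and the post's owner bypass the filter entirely --
    # exactly the code path a corporate-account tester would take.
    if viewer.get("is_staff") or viewer.get("id") == post["owner_id"]:
        return True
    # Everyone else is subject to the (over-eager) classifier's verdict.
    return not post["flagged_by_filter"]

staff = {"id": 1, "is_staff": True}
user = {"id": 2, "is_staff": False}
post = {"owner_id": 1, "flagged_by_filter": True}

print(is_visible(post, staff))  # True: looks fine to the tester
print(is_visible(post, user))   # False: hidden for everyone else
```

The fix is the boring one the comment implies: test with an account that has no special permissions.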

Anonymous Coward says:

Since context is the overriding factor in determining if nudity is allowed or forbidden on Tumblr, there’s simply no way that any computer code will ever be able to accurately make that judgement. Perhaps we should go back to the Victorian Era (pre-National Geographic) standards, and ban certain body parts outright from public view. Or even the 20th century standard, when a person’s race and ethnicity was the determining factor for whether nudity was considered acceptable or obscene.
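To see why context-blindness is fatal, consider a toy filter (entirely hypothetical, and far cruder than anything Tumblr actually runs) that flags images by their proportion of skin-tone pixels. A clinical photo and a pornographic one can have identical pixel statistics; only the context differs, and the pixels carry no context.

```python
# Toy context-free filter: flag any image whose "skin-tone" pixel ratio
# exceeds a threshold. The ratios below are made-up illustrative values.

def flag(skin_ratio, threshold=0.3):
    """Return True if the image should be hidden, judging by pixels alone."""
    return skin_ratio > threshold

# Same pixel statistics, opposite intent -- the filter can't tell:
examples = [
    ("medical_exam", 0.45),  # topless patient in a clinical photo
    ("nude_painting", 0.45), # classical artwork
    ("landscape", 0.05),
]

for name, ratio in examples:
    print(name, "flagged" if flag(ratio) else "ok")
```

Both the exam photo and the painting get flagged, the landscape passes, and no amount of threshold-tuning fixes that: the information the filter needs simply isn’t in the input it sees.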

Anonymous Coward says:

Re: Re: Re:

How about just allowing everything?

Just wait until something gets flagged by that other content moderation agency, the police, which can serve you an injunction for the stuff that’s actually illegal, and be done with it. No hassling over subjective moral points of view, just following the law.

Anonymous Coward says:

Re: Re: Re:

That might have been the prevailing 1990s attitude, but since then demands for censorship and “deplatforming” have been increasing at a rapid pace, with the end nowhere in sight. We should expect a resurgence of the religious Right, whose national letter-writing/public-shaming pressure campaigns had largely withered away about the time the internet came into being. Sooner or later the radical Christians will regroup and start using the same 21st century tactics that the activist Left has employed so effectively to censor online speech they don’t like. Even traditionally ultra-liberal parts of Europe have been taking a hard turn toward censorship in just the last few years, as many Muslim immigrants don’t easily accept the decadence of the ‘infidel’ West.

Since the old saying “if you don’t like it, turn the channel” was never accepted by the Right and is no longer advocated by the Left, increasingly few people remain to promote the ‘live and let live’ attitude of tolerance. Tumblr’s picture bans might only cover nudity right now, but there will no doubt be pressure to expand that list to other controversial subjects like guns and hunting.

Stephen T. Stone (profile) says:

Re: Re: Re:

the old saying "if you don’t like it, turn the channel" was never accepted by the Right, and is no longer advocated by the Left, it seems increasingly few people remain to promote the ‘live and let live’ attitude of tolerance.

Yeah, ostensible leftist here: I prefer that logic over “shut it all down”. While I do believe some people do not deserve the privilege of using large-scale platforms such as Twitter to spread their bile across the Internet (e.g., Nazis and Nazi sympathizers), those same people have every right to build their own platform(s) and spread their message without worrying about being booted off that platform. And I have every right to block/ignore those platforms—as well as any third-party platforms that would allow their speech—to curate my own Internet experience.

Also: I do not advocate for deplatforming based on political ideology alone. I advocate for deplatforming based primarily on extreme expressions of hatred for (and advocacy for violence against) marginalized segments of the population. “Live and let live”/“agree to disagree” is for positions on which there can be reasonable disagreement. How much people should be taxed or what role the government should have in the institution of marriage are such positions; advocacy for slavery or expressed desires to eradicate gay people are not.

Uriel-238 (profile) says:

Re: Re: Re: Live and let live, and encourage more of it.

Yeah, I can’t count myself as a typical libtard cuck, but this is a conversation that often takes place regarding computer games that feature sensitive or distressing content, things like:

~ Sexual relationships in which sex happens
~ People responding realistically to getting hurt or shot
~ Portrayals of popular religions, their dogma and their followers
~ Children put in actual risk of life and limb
~ Portrayals of realistic political tragedies
~ Portrayals of real-world plagues and diseases
~ Realistic drug use

It’s a common problem. We Happy Few is currently rated-out in Australia (meaning it cannot be sold on console markets or in common marketplaces) and Binding of Isaac still remains banned from iTunes.

My position has been to bring it all. Most of our controversial games (even hate propaganda games) have been horrifically bad, but with time and acceptance eventually better games are made which actually contribute substance to discussions around those topics.

Tumblr is going to go the same direction that Sony Betamax did, and yet in a society where sexual content is easy to access, we’ll observe what porn is consumed by the public and talk about it, and that dialogue will change what we consume and how we consume it, probably for the better.

Uriel-238 (profile) says:

Re: The problem with nudity taboos...

Is that they creep. Victorians were so uptight they had to cover table legs lest they become of prurient interest to those who see them.

Saudi Arabia off and on debates whether a woman in full hijab should be allowed a single peephole, because two eyes are too much sexy.

The cycle can work virtuously too. As more nudity is acceptable, the public gets less aroused / offended by general nudity, and fancy lingerie business booms. Also we develop a better sense of (and attraction for) typical proportions, and stop looking for Jessica Rabbit.

Zgaidin (profile) says:

Re: Re: The problem with nudity taboos...

Precisely this. The sexualization of specific body parts is directly correlated to which body parts a society deems unfit to be seen socially. If, as in much of the world, female chests, all buttocks and all genitals must be covered in public, those are the body parts that become the object of fantasy, lust, and sexualization precisely because they are taboo.

OGquaker says:

Re: teats are not tits

My Uncle owned every copy since 1900 and all the titty I could find before 1969 (Georgia 394 U.S. 557) was black teat from the wholly-anglo-male N.G. WASP propaganda ‘Society’, under the rubric ‘science’! Can you imagine a complete lead article about New Orleans without a picture of a single American Black? Desperate and shucked and jived by N.G. nonsense, I moved to South Central Los Angeles.

Murdoch-Fox bought National Geographic in 2016; now the fig-leaf of ‘race’ as science is gone.

Lawny (profile) says:

Frankly...

When I saw the initial post explaining the rule change, the introduction of the bot, and the reasons behind it (which, for the record, had zero pictures in any form and no obscene language at all) get flagged as needing review for obscene content, I knew this was going to be a real pile.

There are cheaper ways to alienate your userbase and cause a public scandal.

All of this seemingly done because of the Apple App Store. Shows where the real money comes from.

Stephen T. Stone (profile) says:

Re:

The timetable on the implementation of those rules and that filter was likely the result of the App Store ban. The implementation itself had been planned for months, according to those in the know. If anything, Tumblr(/Oath/Verizon) planned to do this because it needed to make the site palatable for higher-end corporate advertisers.

Too bad the one thing that Tumblr needed to actually make the site worth a damn to advertisers (a large, diverse userbase) has decided to jump ship. My dashboard is not quite dead, but activity has definitely slowed down thanks to numerous people I follow—well, I suppose now it would be “followed”—having deleted their blogs. (I plan to delete my own as soon as 2019 rolls in.)

And all this because Americans think porn is the worst thing ever, yet take no issue with media that glorifies serial killers and celebrates violence. Fuckin’ morons, the lot of us.

Anonymous Coward says:

This shows filters don’t work. The new filters proposed by EU Article 13 will be a disaster; there will be massive over-blocking and needless censorship.

I think it’s entirely predictable that a porn filter would block an image of a topless woman getting a medical examination. There’s no filter in the world that can examine an image or video and tell whether it’s infringing, fair use, or part of a review (e.g., a film or TV review).

Filters simply do not work, and Tumblr is simply trying to block porn. Under EU law as it stands, it isn’t liable, and can’t be sued, for hosting an image that might be infringing.

Christenson says:

Re: Yet one more example

The other example of how filters cannot work is the now-dismissed lawsuit over posting of the Steele Dossier.

Assume, for the sake of argument, that not one word of the dossier is true.
If someone had made up that from whole cloth and said ‘Here, it’s true’, then it would be libel.
But, a bunch of top government officials were mulling it over, and the judge said unequivocally that publishing it fell under the “fair reporting privilege”. That is, the accompanying words were, in effect, “Government officials are mulling this:”

One sentence, out of hundreds, makes all the difference, because it is context. The same goes for images — a million pixels, and one sentence (“This is a picture of the girl we napalmed in Vietnam”) makes all the difference. In fact, the same goes for content we might all agree is horrible — say, the racist rants on the Daily Stormer. Studying how that works says nothing about whether we are a Nazi or the Opposition, a terrorist or a cop, or, probably better, a satirist.

It also makes a difference (for the exact same content) whether it appears in the National Enquirer or on Techdirt. And what’s in the comment section runs differently than the main Techdirt posts, too. Oh, and what it is adjacent to also matters: take that picture of your favorite icky body part, and consider whether it is accompanied by a hundred random pictures of other icky body parts, in which case it’s likely porn, or by a block of text explaining that this is what it is supposed to look like, with three or four examples of diseased versions or even typical variations.

Supposing I am Facebook, and I want to deal with this paradox automagically, a filter would have to begin with “questionable content” (yes, no, maybe, not too hard to decide correctly “what” is in an image) and then follow on to ask “why” was it posted?

As an attacker, of course, I then create undecidable examples…say, Trump getting it on with Stormy Daniels and captioning “this is what sex looks like, educate yourself!”.
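The two-stage “what, then why” pipeline described above could be sketched like this (purely hypothetical names and categories; not any real platform’s system). Stage 1 is the tractable part; stage 2 is where the undecidable examples live, since any caption can claim an educational purpose.

```python
# Hypothetical two-stage moderation pipeline: stage 1 decides *what* is
# in the content (roughly tractable); stage 2 must decide *why* it was
# posted (context), which is the undecidable part an attacker targets.

def stage1_what(image_tags):
    """Cheap content check: is there anything questionable at all?"""
    return "nudity" in image_tags or "violence" in image_tags

def stage2_why(context):
    """Context check: permit questionable content with an educational,
    newsworthy, or artistic frame. Any caption can game this."""
    return context in {"educational", "news", "art"}

def allowed(image_tags, context):
    if not stage1_what(image_tags):
        return True             # nothing questionable to begin with
    return stage2_why(context)  # otherwise context decides

print(allowed({"landscape"}, "none"))      # True
print(allowed({"nudity"}, "educational"))  # True -- and so is porn with
                                           # an "educate yourself!" caption
print(allowed({"nudity"}, "none"))         # False
```

The attacker’s move is exactly the last case flipped: supply the questionable content with a context label the filter must accept, and the pipeline waves it through.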

Rekrul says:

I’ve checked a few adult Tumblrs that are still left and while they have tons of missing pictures, every now and then there’s an explicit photo that hasn’t been censored.

Tumblr long ago implemented an adult filter. Some blogs would inform you that they contained adult content and that you needed to be signed in to see them. Which is itself stupid, since anyone can easily create an account and claim to be 18+.

Anonymous Coward says:

Lt Tumblr, he's dead Jim!

Just don’t visit it. Time to move on. Time to let Tumblr rest.

As Monty Python would have said: E’s not pinin’! ‘E’s passed on! Tumblr is no more! He has ceased to be! ‘E’s expired and gone to meet ‘is maker! ‘E’s a stiff! Bereft of life, ‘e rests in peace! If you hadn’t nailed ‘im to the perch ‘e’d be pushing up the daisies! ‘Is metabolic processes are now ‘istory! ‘E’s off the twig! ‘E’s kicked the bucket, ‘e’s shuffled off ‘is mortal coil, run down the curtain and joined the bleedin’ choir invisible!! This is an Ex-Website!!
