Everything Wrong In One Story: Data Silos, Privacy, And Algorithmic Blocking

from the nerding-harder-won't-solve-complex-problems dept

Facebook is probably not having a very good week when it comes to its privacy practices. Just days after it came out that, contrary to previous statements, the company was using phone numbers submitted for two-factor authentication as keys for advertising, the company admitted this morning to a pretty massive security breach in which its "view as" tool allowed attackers to grab the access tokens of other users and effectively take over their accounts (even if those users had two-factor authentication enabled).

This is, as they say, "really, really bad." It turned the "view as" feature, which lets you see how your own page looks to other users, into a "take over someone else's account" feature. That's a pretty big mistake to make for a product used by approximately half of the entire population of the planet. I'm sure there will be much more on this, but a few hours after the announcement, Facebook had another headache to deal with: numerous reports said that people trying to post articles about this new security mess from either the Guardian or the AP were getting those posts blocked, with Facebook's systems saying that the action looked like spam:

The block message read:

Action Blocked

Our security systems have detected that a lot of people are posting the same content, which could mean that it’s spam. Please try a different post.

If you think this doesn’t go against our Community Standards let us know.

It's not hard to see how this happened, of course. Often, when a ton of people all start linking to the exact same story, there's a decent chance it's a spam attack; I think even our own spam filter for the Techdirt comments takes something similar into account. With so many people posting that link to Facebook at once, it tripped an algorithmic alarm, which blocked the posts as possible spam. The block only lasted a little while, and both articles can currently be posted to Facebook again.
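To make that heuristic concrete, here is a minimal sketch, in Python, of the kind of duplicate-link rate check described above. It is not Facebook's or Techdirt's actual filter, and every name and threshold in it is made up; it simply illustrates why a genuinely popular news link and a coordinated spam run look identical to a volume-based signal.

import time
from collections import defaultdict, deque

WINDOW_SECONDS = 600       # only look at the last ten minutes
MAX_USERS_PER_URL = 500    # hypothetical threshold before a URL is treated as suspect

recent_posts = defaultdict(deque)   # url -> deque of (timestamp, user_id)

def looks_like_spam(url, user_id, now=None):
    """Return True if this URL is being posted by so many people it trips the filter."""
    now = time.time() if now is None else now
    posts = recent_posts[url]

    # Drop entries that have fallen out of the time window.
    while posts and now - posts[0][0] > WINDOW_SECONDS:
        posts.popleft()

    posts.append((now, user_id))

    # Count distinct users rather than raw posts, so one person reposting
    # does not trip the alarm on their own.
    distinct_users = {uid for _, uid in posts}
    return len(distinct_users) > MAX_USERS_PER_URL

The failure mode in this story falls straight out of that design: when the Guardian and AP pieces went viral, the only signal such a check sees is "many accounts, same URL, short window," which is exactly what a spam campaign looks like.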

Obviously, given that the content was about a big Facebook security breach, this looks fishy, even if there’s a perfectly “logical” explanation for how it happened. But this also gives us yet another opportunity to highlight how ridiculous it is for people to argue that algorithmic content moderation is a reasonable solution. It’s always going to mess up, especially when used at scale, and sometimes will do so in incredibly embarrassing ways, such as here.

And, of course, it provides yet another opportunity to highlight the problems of having just a few giant silos collecting and keeping so much data about people. Even if they are very good at security — and despite arguments to the contrary, Facebook has a strong security team — there are always going to be vulnerabilities like this, and companies like Facebook are always going to represent huge targets. This seems like yet another reminder that we need to be looking for more solutions to decentralize the web, and move away from giant silos holding onto all of our data.

Tragically, the powers that be are often looking at this the other way: trying to magically "force" big companies to "lock down" data, which only increases the value of, and demands on, the silo, while expecting magic algorithms to protect that data. If we're serious about protecting privacy, we need to start looking at very different solutions that don't involve letting the giant internet companies control all of this data all the time. Move it out to the ends of the network, let individuals control their own data stores (or partner with smaller third parties who can help with security), and then let those users choose when, how, and where to allow the large platforms access to that data (if at all). There are better solutions, but there seems to be little interest in actually making them work.
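For what it's worth, here is a minimal sketch of what "let individuals control their own data stores" could look like in code. It is purely illustrative; the class and method names are invented, and no existing protocol or API is being described. The point is simply that the user's store, not the platform, decides which fields are readable, by whom, and for how long, and can revoke that access at any time.

import secrets
import time
from dataclasses import dataclass, field

@dataclass
class Grant:
    token: str
    fields: set
    expires_at: float

@dataclass
class PersonalDataStore:
    data: dict
    grants: dict = field(default_factory=dict)   # token -> Grant

    def grant_access(self, fields, ttl_seconds):
        """User decides which fields a platform may read, and for how long."""
        token = secrets.token_urlsafe(16)
        self.grants[token] = Grant(token, set(fields), time.time() + ttl_seconds)
        return token

    def revoke(self, token):
        self.grants.pop(token, None)

    def read(self, token, field_name):
        """Platform-side read: only granted, unexpired fields come back."""
        grant = self.grants.get(token)
        if grant is None or time.time() > grant.expires_at:
            raise PermissionError("grant missing, revoked, or expired")
        if field_name not in grant.fields:
            raise PermissionError("no grant for field " + field_name)
        return self.data[field_name]

# The user grants a platform read access to their display name for one hour;
# the phone number stays off-limits, and the grant can be revoked at will.
store = PersonalDataStore({"display_name": "Alice", "phone": "+1-555-0100"})
token = store.grant_access({"display_name"}, ttl_seconds=3600)
print(store.read(token, "display_name"))    # "Alice"
store.revoke(token)

In a model like this, a breach of the platform yields short-lived tokens and whatever narrow fields were granted, not a warehouse of everyone's phone numbers.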

Companies: facebook


Comments on “Everything Wrong In One Story: Data Silos, Privacy, And Algorithmic Blocking”

24 Comments
By whatever 60's group that was. says:

I'm your vehicle, baby! Take you anywhere you want to go!

But this also gives us yet another opportunity to highlight

A Freudian reveal that pieces are only vehicles for your very few agenda items. And this one is not cutting Google.

how ridiculous it is for people to argue that algorithmic content moderation is a reasonable solution.

This flop is not "algorithmic", though, it's lousy programming, for Marketing's idea of yet another "feature". Problem is programmers do too much, unnecessary to start, and beyond ability too.

By the way, I got shadowbanned, banned, and even my IP blocked on a Linux site for almost exactly that opinion that Linux has become unusable crap. So, yes, I've been banned from better sites than this.

Anyhoo, algorithms too should be simple, stated clearly so known, and then will work fine. The Reg, for instance, has clear commenting guidelines that keep it civil, so no one wastes time complaining in either direction. Simple works. Foolishness of features without end is certain to flop. It's amazing that humanity has survived this long. One of these days, an unintentional Doomsday Device will end all.

Anonymous Coward says:

Re: Your an idiot, 5 seconds of search revealed

Ides of March

Hey well I’m the friendly stranger in the black sedan won’t you hop inside my car

I got pictures got candy I am a lovable man I’d like to take you to the nearest star

I’m your vehicle baby

I’ll take you anywhere you wanna go

I’m your vehicle woman

By now I’m sure you know that I love ya (love ya)

I need ya (need ya)

I want you got to have you child

Great God in heaven you know I love you

Well if you want to be a movie star I got the ticket to Hollywood

If you want to stay just like you are you know I think you really should

I’m your vehicle baby

But, like them, it won't win you any friends. If you had any, why would you spend your time spamming this site?

Anonymous Coward says:

Re: Re: Blame Algorithms !

‘Blame the Algorithms’ — not us wonderful humans and programmers !

this algorithm ploy is the modern, trendy parlance for saying “computer error” … to deflect blame from the people actually managing things.

hardware/software systems did not come from another galaxy– real people here controlled every aspect of their performance

(“the dog ate my homework”)

Anonymous Coward says:

Re: Re: Re: Blame Algorithms !

It is more like politicians and management blaming the programmers for not being able to do the impossible. Yes, it's a human problem: a combination of loud voices insisting that something be done, and politicians and management telling programmers to do something while ignoring the problems they were told about in using algorithms.

Just look at the EU passing laws that mandate algorithms to detect and block copyright infringement.

Anonymous Anonymous Coward (profile) says:

Secure your friendships by talking to them, face to face

I am so glad I never used any social media, including Facebook. I have friends, and don’t need more (not that they don’t crop up now and again), so using social media to expand my ‘social’ existence is not necessary. I do interact with people from all over the world, but I gained those connections through other means. And I interact with them through other means.

For others, it seems an imperative. So sad. Want to talk to friends and family? Then talk to them. Want to make more friends? Go out, be engaging, don't talk about politics, religion, or sex, but test for other mutual interests. Then, if some connection arises, find ways to interact and get to know one another. Friendship comes over time. It is not like love at first sight. One party might be interested, the other not. Kismet might happen, but should not be assumed.

I am not so sure about using third parties to secure any account. That would mean investigating and trying to ensure (something not likely with any one organization let alone two) that things are actually secure.

There is mention that Facebook has a strong security team, but there is no mention of Facebook policy (in this article) that might prevent them from doing their jobs. When Two Factor ID is given to third parties it sure seems like policy is not about security, but about something else.

trollificus (profile) says:

Re: Secure your friendships by talking to them, face to face

Thing is, people are not using “social media” to create relationships, make friends or expand the pool of people they can communicate with. They are only seeking, in the vast number of people online, personal validation for whatever bizarre constellation of beliefs, political positions and emotional distortions they might exhibit. Nobody is so totally wrong-headed, misinformed or stupid that they cannot find online a thousand people who are 100% in agreement with them.

A lot of people nowadays find that irresistible, especially compared to RL where people will say things like “Uh…that’s wrong.” and “No, I think your facts are outdated there.” Sheesh! Who signs up for THAT?

Anonymous Coward says:

Well said, Mike.

Unfortunately the tech monopolists and their government partners are completely opposed to letting go of our data. It doesn’t matter how much sense decentralisation makes for users/citizens. It is about power. Power is never given away. It is taken. They will fight to the bitter end before letting go.

You should write more about the better solutions you mentioned. What are the best prospects?

Anonymous Coward says:

Re: Algo is to blame

?? Why is this flagged? I've always thought that things called "computer errors" are just as likely to be human error. If the moderation algorithms are secret, and outside parties can't look in on how the human moderation works either, how can we tell what's being moderated by whom in any given case?

It’s a legitimate complaint.
