Damned If You Do; Damned If You Don't: ProPublica's Bizarre Reporting On WhatsApp Abuse Reports

from the come-on-guys dept

I’ve been struck over the years by how much reporting on technology involves attacking companies for what they do, often for totally contradictory reasons. Everything is viewed through the lens of assuming the worst possible intentions. And, yes, sometimes perhaps that’s deserved. Companies do act badly, and no one should give them the benefit of the doubt when they haven’t earned it. But sometimes it just gets ridiculous, as is clear in a recent ProPublica piece that attacked WhatsApp for its “report” feature. Now, I like ProPublica a lot and feel that they do some of the best investigative reporting around. But this was not that.

ProPublica itself has reported on how WhatsApp can be abused by those with nefarious intent, criticizing the company for failing to do anything about it. But this new article is basically the opposite. It’s attacking WhatsApp because it has a feature that allows users to “report” a message they received to WhatsApp. ProPublica dangerously, and incorrectly, used this feature to claim that WhatsApp (which offers end-to-end encryption) is somehow bad about privacy. The article’s title incorrectly declares: “How Facebook Undermines Privacy Protections for Its 2 Billion WhatsApp Users.” The (since edited) article contains this bullshit section:

Zuckerberg’s vision centered on WhatsApp’s signature feature, which he said the company was planning to apply to Instagram and Facebook Messenger: end-to-end encryption, which converts all messages into an unreadable format that is only unlocked when they reach their intended destinations. WhatsApp messages are so secure, he said, that nobody else — not even the company — can read a word. As Zuckerberg had put it earlier, in testimony to the U.S. Senate in 2018, “We don’t see any of the content in WhatsApp.”

WhatsApp emphasizes this point so consistently that a flag with a similar assurance automatically appears on-screen before users send messages: “No one outside of this chat, not even WhatsApp, can read or listen to them.”

Those assurances are not true. WhatsApp has more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore, where they examine millions of pieces of users’ content. Seated at computers in pods organized by work assignments, these hourly workers use special Facebook software to sift through millions of private messages, images and videos. They pass judgment on whatever flashes on their screen — claims of everything from fraud or spam to child porn and potential terrorist plotting — typically in less than a minute.

The false implication here is that WhatsApp (and Facebook) are lying about end-to-end encryption. Except, that’s bullshit. The people reviewing content are only reviewing content that has been reported. I don’t understand why this is hard for people to comprehend, but even with end-to-end encryption, one of those “ends” can forward the contents to someone else, who can then see them. And that’s all that’s happening here. When you “report” content in WhatsApp, it is the functional equivalent of forwarding the message.
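The mechanics are easy to illustrate. In the toy sketch below (purely illustrative; the class names, the XOR "cipher", and the relay model are all invented for this example, and real E2E protocols like Signal's are vastly more sophisticated), the server only ever relays ciphertext it cannot read. The recipient, who holds the key, decrypts the message at their end and can then forward the plaintext anywhere they like, including back to the server as an abuse report. At no point is the encryption in transit broken:

```python
# Toy model: end-to-end encryption protects messages *in transit*.
# Once the recipient decrypts a message, nothing stops them from
# forwarding the plaintext -- which is all a "report" button does.
from dataclasses import dataclass, field


def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Symmetric stand-in for a real cipher: same call encrypts and decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


@dataclass
class RelayServer:
    """The relay (the messaging service) never holds the key."""
    mailbox: list = field(default_factory=list)
    abuse_reports: list = field(default_factory=list)

    def relay(self, ciphertext: bytes) -> None:
        self.mailbox.append(ciphertext)  # stores only ciphertext


server = RelayServer()
shared_key = b"secret-session-key"  # known only to the two endpoints

# Sender encrypts; the server relays bytes it cannot read.
server.relay(xor_cipher(b"meet at noon", shared_key))

# Recipient decrypts at their end...
plaintext = xor_cipher(server.mailbox[0], shared_key)

# ...and chooses to forward the plaintext to the server as a report.
server.abuse_reports.append(plaintext)

assert plaintext == b"meet at noon"
assert server.mailbox[0] != plaintext  # the relay only ever saw ciphertext
```

The point of the sketch is that the reviewers described by ProPublica sit on the `abuse_reports` side of this picture: they see only what an endpoint voluntarily handed over, never what passed through the relay.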

If it’s “undermining” privacy (and it’s not), then the person reporting the content is the one undermining privacy, by forwarding the message and saying it might be problematic. This is actually quite a reasonable approach to dealing with questionable content on an encrypted messaging program, but ProPublica decided to spread a very misleading report suggesting that it undermines privacy.

Alec Muffett does a nice job dismantling the argument. As he notes, it’s really bad when journalists try to redefine end-to-end encryption to mean something it is not. It does not mean that recipients of messages cannot forward them or cannot share them. And, in fact, pretending that’s true, or insisting that forwarding messages and reporting them is somehow an attack on privacy, is dangerous. It undermines encryption by setting up false and dangerous expectations about what it actually entails.

Alex Stamos similarly has a great thread on the many problems in the article, but here’s the key bit:

But, really, this gets back to a larger point that I keep trying to make with regards to reporting on “privacy” violations. People differ (greatly!) on what they think a privacy violation really entails, and because of that, we get very silly demands — often from the media and politicians — about “protecting privacy” when many of those demands would do tremendous harm to other important ideas — such as harming competition or harming free speech.

And this is especially troubling when perfectly reasonable (and in fact, quite good) systems like WhatsApp’s “report” feature are portrayed incorrectly as “undermining privacy,” when what the feature is actually trying to do is help deal with the other issue the media keeps attacking WhatsApp for: enabling people to abuse these tools to spread hatred, disinformation, or other dangerous content.

Companies: facebook, propublica, whatsapp


Comments on “Damned If You Do; Damned If You Don't: ProPublica's Bizarre Reporting On WhatsApp Abuse Reports”

14 Comments
jilocasin (profile) says:

Poorly setting up user expectations

Perhaps the issue is one of not setting user expectations correctly. If you laud the fact that no one can read the contents of these messages, but forget to mention that you’ve set up an easy-to-use system to forward unencrypted copies of those selfsame messages to Facebook (not a company exactly known for respecting people’s privacy), you can see how that might freak out some people.

Ars Technica has an article covering the same subject. The interesting part in their reporting was how groups were abusing the system to get the AI to ban groups left and right.

More troubling, in my opinion, is speculation that WhatsApp may have the undisclosed ability to scan decrypted messages and automatically flag them for Facebook review.

The most troubling is the unencrypted metadata that Facebook appears to be storing and forwarding/reporting as it sees fit. From the above-mentioned Ars Technica article:

"Although WhatsApp’s "end-to-end" encryption of message contents can only be subverted by the sender or recipient devices themselves, a wealth of metadata associated with those messages is visible to Facebook—and to law enforcement authorities or others that Facebook decides to share it with—with no such caveat."

It appears that WhatsApp is a lot less secure than some competitors, such as Signal, and a lot less secure than Facebook likes to admit or users are led to believe.

Mike Masnick (profile) says:

Re: Poorly setting up user expectations

It appears that WhatsApp is a lot less secure than some competitors, such as Signal, and a lot less secure than Facebook likes to admit or users are led to believe.

I just don’t think that’s even remotely accurate, though. You can always forward decrypted messages or take a screenshot of them. This reporting makes people believe things that just aren’t true.

jilocasin (profile) says:

Re: Re: Poorly setting up user expectations

Mike, while it’s true that you can usually forward or at least screenshot messages on the destination device, how many people are aware of this?

You, me, probably lots of the folks who read this site, sure. Are little Jonny, Aunt Mable, and Uncle Bob aware? Probably not.

A commenter (Jim Salter) on the Ars Technica article summed it up when he posted:

"You’re absolutely correct—millions of teens every year have similar discoveries when they share "disappearing" videos on Snapchat."

Which is why I titled my original response “Poorly setting up user expectations.” I think this is especially important when you are talking about the capabilities of a security application.

ProPublica’s article may come across a little reactionary to folks like you and me, but if it helps raise awareness about the ways that WhatsApp (or any other application) isn’t secure, that can only be seen as a good thing. Excepting law enforcement and Facebook shareholders, of course.

PaulT (profile) says:

Re: Re: Re: Poorly setting up user expectations

"while it’s true that you can usually forward or at least screen shot messages on the destination device, how many people are aware of this?"

Why? Does people being aware of it change its existence or something?

"ProPublica’s article may come across a little reactionary to folks like you and me, but if it helps raise awareness about the ways that WhatsApp (or any other application) isn’t secure, that can only be seen as a good thing"

Except, that’s a lie. It IS secure, unless you take the deliberate step of bypassing the security by forwarding a message. Spreading FUD about Whatsapp will not help anyone, and may in fact harm if people decide to move to less secure solutions as a response.
