Facebook Acting Badly: Shuts Down Researchers' Accounts Over Claims Of Privacy Violations That Don't Stand Up To Scrutiny

from the oh-stop-it dept

Last year we wrote about Facebook threatening NYU researchers who had set up a browser extension that would allow users to voluntarily collect information about ads and ad targeting on Facebook — information that Facebook does not publicly reveal — and provide it to the researchers’ “Ad Observatory” project. As we noted at the time, Facebook’s threats definitely went too far, though you could see how they came about: one could argue there were technical similarities between what the NYU researchers were doing and what an academic from Cambridge did many years ago, work that eventually turned into the Cambridge Analytica scandal, which resulted (in part) in a massive fine from the FTC for privacy violations.

But, as we explained then, the NYU Ad Observer story is clearly distinguishable from the Cambridge Analytica story. The Ad Observatory project involves people voluntarily installing an extension in their own browsers, on their own computers, and choosing to share the information it collects. That’s not something Facebook should have any right to block.

Unfortunately, this week, Facebook took things up a notch and shut down the accounts of everyone associated with the project, effectively cutting off all their access to Facebook’s Ad Library and other tools they were using in their research. Facebook’s own explanation for this is… to claim that the project was compromising people’s privacy, though that appears to be bullshit (as we’ll get to in a moment). Here’s part of what Facebook said:

We told the researchers a year ago, in summer of 2020, that their Ad Observatory extension would violate our Terms even before they launched the tool. In October, we sent them a formal letter notifying them of the violation of our Terms of Service and granted them 45 days to comply with our request to stop scraping data from our website. The deadline ended on November 30, long after Election Day. We continued to engage with the researchers on addressing our privacy concerns and offered them ways to obtain data that did not violate our Terms. 

Earlier this year, we invited researchers, including the ones from NYU, to safely access US 2020 Elections ad targeting data through FORT's Researcher Platform. This offered the Ad Observatory researchers a more comprehensive data set than the one they created by scraping data on Facebook. The researchers had the opportunity to use the data set, which is designed to be privacy-protective, instead of relying on scraping, but they declined.

We made it clear in a series of posts earlier this year that we take unauthorized data scraping seriously, and when we find instances of scraping we investigate and take action to protect our platform. While the Ad Observatory project may be well-intentioned, the ongoing and continued violations of protections against scraping cannot be ignored and should be remediated. 

Collecting data via scraping is an industry-wide problem that jeopardizes people's privacy, and we've been clear about our public position on this as recently as April. The researchers knowingly violated our Terms against scraping, which we went to great lengths to explain to them over the past year. Today's action doesn't change our commitment to providing more transparency around ads on Facebook or our ongoing collaborations with academia. We'll continue to provide ways for responsible researchers to conduct studies that are in the public interest while protecting the security of our platform and the privacy of people who use it.

This is… nonsense. I challenge the idea that “data scraping” is a problem, first of all. I mean, that’s what the entire search engine business is built on. It’s all data scraping. The problem is when private information is somehow exposed. But that’s different. And here we’re talking about individuals installing an extension in their own browsers, to collect information that is sent to their own computers. That’s not Facebook’s data to control. Once it’s in my browser, it’s mine to make use of. Facebook’s argument here makes no sense.
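To make that concrete, here is a rough sketch, in TypeScript, of what a transparency extension of this kind conceptually does. To be clear, this is not the actual Ad Observer code (which is open source and worth reading directly); the selectors, field names, and endpoint below are invented for illustration. The point is that the script only reads ad markup Facebook has already delivered to the user's own browser, keeps a local archive the user can inspect, and sends only the ad records, not a profile of the user, to the researchers.

```typescript
// Hypothetical sketch of a transparency extension's content script. The
// selectors, field names, and research endpoint are illustrative only; the
// real Ad Observer code is open source and differs in its details.

interface ObservedAd {
  advertiser: string | null; // name shown on the sponsored post
  adText: string | null;     // visible text of the ad creative
  targetingHints: string[];  // e.g. text from "Why am I seeing this ad?"
  observedAt: string;        // ISO timestamp of when the user saw it
}

// Walk the feed markup already rendered in the user's own browser and pull
// out anything marked as sponsored. No private posts or friend data are read.
function collectSponsoredPosts(): ObservedAd[] {
  const ads: ObservedAd[] = [];
  document.querySelectorAll<HTMLElement>('[data-sponsored="true"]').forEach((el) => {
    ads.push({
      advertiser: el.querySelector('.advertiser-name')?.textContent ?? null,
      adText: el.querySelector('.ad-body')?.textContent ?? null,
      targetingHints: Array.from(el.querySelectorAll('.targeting-reason'))
        .map((n) => n.textContent ?? '')
        .filter((t) => t.length > 0),
      observedAt: new Date().toISOString(),
    });
  });
  return ads;
}

// Keep a local archive the user can inspect, then share only the ad records
// (not a profile of the user) with the research project's collection endpoint.
async function archiveAndShare(endpoint: string): Promise<void> {
  const ads = collectSponsoredPosts();
  if (ads.length === 0) return;

  // Local copy, so the user can always see exactly what has been collected.
  const { archive = [] } = await chrome.storage.local.get('archive');
  await chrome.storage.local.set({ archive: [...archive, ...ads] });

  // Voluntary upload to the researchers' server (hypothetical URL).
  await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(ads),
  });
}

void archiveAndShare('https://example.org/ad-observatory/submit');
```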

And, indeed, a few others have stepped in to point out the same thing. For one, Mozilla, makers of the Firefox browser, called bullshit on the claims pretty directly:

Facebook claims the accounts were shut down due to privacy problems with the Ad Observer. In our view, those claims simply do not hold water. We know this, because before encouraging users to contribute data to the Ad Observer, which we've done repeatedly, we reviewed the code ourselves. And in this blog post, we want to explain why we believe people can contribute to this important research without sacrificing their privacy.

Anytime you give your data to another party, whether Facebook or Mozilla or researchers at New York University, it is important that you know whether that party is trustworthy, what data will be collected, and what will be done with that data. Those are critical things to consider before you potentially grant access to your data. And those are also key factors for Mozilla when we consider recommending an extension. 

Before Mozilla decided to recommend Ad Observer, we reviewed it twice, conducting both a code review and examining the consent flow to ensure users will understand exactly what they are installing. In both cases the team responsible for this add-on responded quickly to our feedback, made changes to their code, and demonstrated a commitment to the privacy of their users. We also conducted an in-depth design review of Ad Observer, the results of which can be found here.

We decided to recommend Ad Observer because our reviews assured us that it respects user privacy and supports transparency. It collects ads, targeting parameters and metadata associated with the ads. It does not collect personal posts or information about your friends. And it does not compile a user profile on its servers. The extension also allows you to see what data has been collected by visiting the “My Archive” tab. It gives you the choice to opt in to sharing additional demographic information to aid research into how specific groups are being targeted, but even that is off by default.

You don't have to take our word for it. Ad Observer is open source, so anybody can see the code and confirm it is designed properly and doing what it purports to do.

And then the FTC’s Bureau of Consumer Protection weighed in with a letter to Mark Zuckerberg, also calling bullshit on Facebook’s suggestion that its privacy consent decree with the FTC somehow required it to shut down the Ad Observatory research:

I write concerning Facebook's recent insinuation that its actions against an academic research project conducted by NYU's Ad Observatory were required by the company's consent decree with the Federal Trade Commission. As the company has since acknowledged, this is inaccurate. The FTC is committed to protecting the privacy of people, and efforts to shield targeted advertising practices from scrutiny run counter to that mission.

While I appreciate that Facebook has now corrected the record, I am disappointed by how your company has conducted itself in this matter. Only last week, Facebook's General Counsel, Jennifer Newstead, committed the company to “timely, transparent communication to BCP staff about significant developments.” Yet the FTC received no notice that Facebook would be publicly invoking our consent decree to justify terminating academic research earlier this week.

Had you honored your commitment to contact us in advance, we would have pointed out that the consent decree does not bar Facebook from creating exceptions for good-faith research in the public interest. Indeed, the FTC supports efforts to shed light on opaque business practices, especially around surveillance-based advertising. While it is not our role to resolve individual disputes between Facebook and third parties, we hope that the company is not invoking privacy, much less the FTC consent order, as a pretext to advance other aims.

I do think that some of the pressure people put on Facebook about “protecting” privacy does more harm than good, backfiring in ways that actually give Facebook more control and less transparency. But this case looks like Facebook cynically using privacy as an excuse to shut down academic research it was uncomfortable with and to punish the researchers.

There are two simple answers here: give the academic researchers access to the data they need to do this research (Facebook’s claims of making the data accessible leave out the fact that not everything the researchers are asking for is actually in those data sets), and don’t block people from installing things on their own computers. That’s not Facebook’s job.

Companies: cambridge analytica, facebook, mozilla, nyu


Comments on “Facebook Acting Badly: Shuts Down Researchers' Accounts Over Claims Of Privacy Violations That Don't Stand Up To Scrutiny”


Anonymous Coward says:

Re: You have no evidence.

Lol at Blue, who always yells at everybody for being too credulous about big companies, now taking the side of Facebook against the academic researchers, the Mozilla experts who went over the code thoroughly, and FTC officials.

You’re not even internally consistent. The true hallmark of a troll who serves no purpose but to troll. It’s pretty sad that you’ve wasted so many years of your life just being an asshole who gets stuff consistently wrong online. What a sad, pathetic life you must lead.


Anonymous Coward says:

Yet again SIX YEAR ZOMBIE pops out in Geigner piece.

Here because blocked in Geigner’s piece.

raugturi: 3 (<0.5) near SIX YEAR GAP to 2nd; Apr 19th, 2015 https://www.techdirt.com/user/raugturi

Good correlation has stood for years, with no one contradicting, not even a token other explanation than the obvious conclusion that Geigner WRITES ’em to try and fill out particularly his few comments.


Escaped Leopard Spotted says:

Re: Yet again SIX YEAR ZOMBIE pops out in Geigner piece.

So long as the site remains tiny — and you clearly can’t increase number of readers — then anywhere I can get in is good ’nuff.

Why don’t you answer the ZOMBIE phenom, Maz? LONG GAPS, rare commenters, been going four years now!

Anonymous Coward says:

Deanonymized Data Isn't

It is well known that "anonymized" data may often be deanonymized, especially if given qualifiers like local geography. Given FTC involvement, and both sides of the aisle being moronic tools who think they are actively helping the other win elections, I can understand jumpiness and overzealousness. That is the whole fucking point of FTC fines, to change behavior!
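The underlying point is worth making concrete. Here is a toy sketch, with entirely invented data, of how a handful of quasi-identifiers (zip code, birth year, gender) can link a nominally anonymized record back to a named person once it is joined against another dataset that shares those fields:

```typescript
// Toy illustration of re-identification via quasi-identifiers. All of the
// data below is invented for the example.

interface AnonymizedAdRecord {
  zip: string;
  birthYear: number;
  gender: string;
  adTopic: string; // what the person was targeted with; no name attached
}

interface PublicRecord {
  zip: string;
  birthYear: number;
  gender: string;
  name: string; // e.g. from a voter roll or other public dataset
}

const anonymized: AnonymizedAdRecord[] = [
  { zip: '02138', birthYear: 1954, gender: 'F', adTopic: 'health insurance' },
];

const publicRoll: PublicRecord[] = [
  { zip: '02138', birthYear: 1954, gender: 'F', name: 'Jane Doe' },
];

// Joining on the shared quasi-identifiers links the "anonymous" record back
// to a person whenever that combination is rare enough to be unique.
const reidentified = anonymized.map((a) => ({
  ...a,
  name:
    publicRoll.find(
      (p) => p.zip === a.zip && p.birthYear === a.birthYear && p.gender === a.gender,
    )?.name ?? 'unknown',
}));

console.log(reidentified); // [{ zip: '02138', ..., adTopic: 'health insurance', name: 'Jane Doe' }]
```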
