Bizarre Magistrate Judge Ruling Says That If Facebook Deletes An Account, It No Longer Needs To Keep Details Private

from the that-doesn't-make-any-sense dept

There have been a bunch of slightly wacky court rulings of late, and this recent one from magistrate judge Zia Faruqui is definitely up there on the list of rulings that make you scratch your head. The case involves the Republic of Gambia seeking information on Facebook accounts that were accused of contributing to the ethnic genocide of the Rohingya in Myanmar. This situation was — quite obviously — horrible, and it tends to be the go-to story for anyone who wants to show that Facebook is evil (though I’m often confused by how people seem more focused on blaming Facebook for the situation than the Myanmar government that carried out the genocide…). Either way, the Republic of Gambia is seeking information from Facebook regarding the accounts that played a role in the genocide, as part of its case at the International Court of Justice.

Facebook, which (way too late in the process) did shut down a bunch of accounts in Myanmar, resisted demands from Gambia to hand over information on those accounts, noting, correctly, that the Stored Communications Act likely forbids it from handing over such private information. The SCA is actually pretty important in protecting the privacy of email and messages, and is one of the rare US laws on the books that is actually (for the most part) privacy-protecting. That’s not to say it doesn’t have its own issues, but the SCA has been useful in the past in protecting privacy.

The ruling here more or less upends interpretations of the SCA by saying that once an account is deleted, it’s no longer covered by the SCA. That’s… worrisome. The full ruling is worth a read, and you’ll know you’re in for something of a journey when it starts out:

I come to praise Facebook, not to bury it.

Not quite what you expect from a judicial order. The order lays out the unfortunately gory details of the genocide in Myanmar, as well as Facebook’s role in enabling the Myanmar government to push out propaganda and rally support for its ethnic cleansing. But the real question is how all of this impacts the SCA. As the judge notes, since the SCA was written in 1986, it certainly didn’t predict today’s modern social media, or the questions related to content moderation, so this is a new issue for the court to decide. But… still. The court decides that because an account is disabled… that means the communications are no longer “stored.” Because [reasons].

The Problem Of Content Moderation

At the time of enactment, Congress viewed ECS and RCS providers as mail/package delivery services. See Cong. Rsch. Serv., R46662, Social Media: Misinformation and Content Moderation Issues for Congress (2021), https://crsreports.congress.gov/product/pdf/R/R46662. This view failed to consider content moderation; mail/package delivery services have neither the ability nor the responsibility to search the contents of every package. Yet after disinformation on social media has fed a series of catastrophic harms, major providers have responded by taking on the de facto responsibility of content moderation. See id. “The question of how social media platforms can respect the freedom of expression rights of users while also protecting [users] from harm is one of the most pressing challenges of our time.” …

This Court is the first to consider the question of what happens after a provider acts on its content moderation responsibility. Is content deleted from the platform but retained by the provider in “backup storage”? It is not.

That obviously seems like a stretch to me. If the company still retains the information then it is clearly in storage. Otherwise, you’ve just created a massive loophole by saying that any platform can expose the private communications of someone if they first disable their account.

The court’s reasoning, though, gets at the heart of the language of the SCA and how it protects both “any temporary, intermediate storage of a wire or electronic communication incidental to the electronic transmission thereof” and “any storage of such communication by an electronic communication service for purposes of backup protection of such communication.” It says the first bit can’t apply because these communications had reached their “final destination” and were no longer temporary. And it can’t be “backup” since the original content had been deleted, and therefore there couldn’t be any “backup.”

Congress’s conception of “‘backup’ necessarily presupposes the existence of another copy to which this [backup record] would serve as a substitute or support.” Id. Without an original, there is nothing to back up. Indeed “the lifespan of a backup is necessarily tied to that of the underlying message. Where the underlying message has expired . . . , any copy is no longer performing any backup function. An [ECS] that kept permanent copies of [deleted] messages could not fairly be described as ‘backing up’ those messages.”

But… I think that’s just wrong. Facebook retaining this data (but blocking the users from accessing it themselves) is clearly a “backup.” It’s a backup in case there is a reason why, at some future date, the content does need to be restored. Under the judge’s own interpretation, if you back up your hard drive, but then the drive crashes, your backup is no longer your backup, because there’s no original. But… that’s completely nonsensical.

The judge relies on (not surprisingly) a case in which the DOJ twisted and stretched the limits of the SCA to get access to private communications:

Nearly all “backup storage” litigation relates to delivered, undeleted content. That case law informs and supports the Court’s decision here. “Although there is no binding circuit precedent, it appears that a clear majority of courts have held that emails opened by the intended recipient (but kept on a web-based server like Gmail) do not meet the [backup protection] definition of ‘electronic storage.’” Sartori v. Schrodt, 424 F. Supp. 3d 1121, 1132 (N.D. Fla. 2019) (collecting cases). The Department of Justice adopted this view, finding that backup protection “does not include post-transmission storage of communications.” U.S. Dep’t of Just., Searching and Seizing Computers and Obtaining Electronic Evidence in Criminal Investigations, 123 (2009), https://www.justice.gov/sites/default/files/criminal-ccips/legacy/2015/01/14/ssmanual2009.pdf. The Gambia argues for following the majority view’s limited definition of backup storage. See Sartori, 424 F. Supp. 3d at 1132; ECF No. 16 (Pet’r’s Resp. to Surreply) at 5–6. If undeleted content retained by the user is not in backup storage, it would defy logic for deleted content to which the user has no access to be in backup storage.

As for Facebook’s argument (which makes sense to me) that the very reason it retained the accounts shows the data is a backup, the judge just doesn’t buy it.

Facebook argues that because the provider-deleted content remains on Facebook servers in proximity to where active content on the platform is stored, both sets of content should be protected as backup storage. See Conf. Tr. at 76. However, the question is not where the records are stored but why they are stored. See Theofel, 359 F.3d at 1070. Facebook claims it kept the instant records as part of an autopsy of its role in the Rohingya genocide. See Conf. Tr. at 80–81. While admirable, that is storage for self-reflection, not for backup.

The judge also brushes aside the idea that there are serious privacy concerns with this result, mainly because the judge doesn’t believe Facebook cares about privacy. That, alone, is kind of a weird way to rule on this issue.

Finally, Facebook advances a policy argument, opining that this Court’s holding will “have sweeping privacy implications – every time a service provider deactivates a user’s account for any reason, the contents of the user’s communications would become available for disclosure to anyone, including the U.S. government.” … Facebook taking up the mantle of privacy rights is rich with irony. News sites have entire sections dedicated to Facebook’s sordid history of privacy scandals.

So… because Facebook doesn’t have a great history regarding the protection of privacy… we can make it easier for Facebook to expose private communications? What? And even if it’s true that Facebook has made problematic decisions in the past regarding privacy, that’s wholly separate from the question of whether or not it has a legal obligation to protect the privacy of messages now.

Furthermore, the judge insists that even if there are privacy concerns, they are “minimal”:

The privacy implications here are minimal given the narrow category of requested content. Content urging the murder of the Rohingya still permeates social media. See Stecklow, supra (documenting “more than 1,000 examples . . . of posts, comments, images and videos attacking the Rohingya or other Myanmar Muslims that were on Facebook” even after Facebook apologized for its services being “used to amplify hate or exacerbate harm against the Rohingya”). Such content, however vile, is protected by the SCA while it remains on the platform. The parade of horribles is limited to a single float: the loss of privacy protections for de-platformed content. And even that could be mitigated by users joining sites that do not de-platform content.

Yes. In this case. But this could set a precedent for accessing a ton of other private communications as well, and that’s what’s worrying. It’s absolutely bizarre and distressing that the judge doesn’t bother to think through the implications of this ruling beyond just this one case.

Prof. Orin Kerr, one of the foremost experts on ECPA and the SCA, notes that this is both an “astonishing interpretation” and “stunning.”

The entire ruling is concerning — and feels like yet another situation where someone’s general disdain for Facebook and its policies (a totally reasonable position to take!) colored the analysis of the law. And the end result is a lot more dangerous for everyone.

Companies: facebook


Comments on “Bizarre Magistrate Judge Ruling Says That If Facebook Deletes An Account, It No Longer Needs To Keep Details Private”

Anonymous Coward says:

Re: Re:

The thing is, a reasonable person wouldn’t care what these definitions are and doesn’t need to understand tech at all. They simply understand that their stuff is stored somewhere, and that whether they or the service provider suspends or deletes their account, anything remaining is still their private stuff and should be protected as such.

This comment has been deemed insightful by the community.
TaboToka (profile) says:

How do you know for sure? How do you really know?

This line caught my eye:

your backup is no longer your backup, because there’s no original

For a philosophical discussion, what’s the difference between the two?

Let’s say you have two text or image files, one a backup of the other. How can you tell which one is the original? From the directory timestamp? From the location where each is stored?

If you move both files to another volume – hell, both to the same volume – and reset their directory timestamps, how can you tell which one is the backup and which one is the original?
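A minimal sketch of the point (purely illustrative; the filenames are hypothetical and assumed to already exist on disk): once the bytes are copied and the timestamps are reset, nothing intrinsic to either file marks one as the original and the other as the backup.

```python
import hashlib
import os
import shutil

# Hypothetical files, purely for illustration.
shutil.copy2("original.txt", "backup.txt")   # byte-for-byte copy

# Reset both files to the same access/modification time.
same_time = (1_600_000_000, 1_600_000_000)
os.utime("original.txt", same_time)
os.utime("backup.txt", same_time)

def digest(path):
    """Content hash: identical bytes produce identical digests."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Identical hashes, identical timestamps: no property of the files
# themselves says which was the "original" and which the "backup."
print(digest("original.txt") == digest("backup.txt"))                      # True
print(os.stat("original.txt").st_mtime == os.stat("backup.txt").st_mtime)  # True
```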

This isn’t an issue in meatspace. Making an exact duplicate of any physical thing is not possible for the foreseeable future (or ever, if you take into account the Observer effect), and that’s what the courts are used to dealing with.

That’s also why they fail so spectacularly to frame their rulings without putting their collective feet in their collective mouths.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

'If you can do it then so can we.'

That argument regarding privacy is beyond absurd. ‘Facebook doesn’t care about user privacy, so the courts and legal system don’t have to either’ in a single step both condemns Facebook and exonerates it, by arguing that once privacy has been ignored by one party the government no longer needs to care either. If anything, that leaves the government in a position where it wants Facebook and other companies to show as much contempt toward user privacy as possible.

On a more general note, the idea that once an account is deleted any data from it is free to grab is beyond disturbing. That makes the law an absolute joke by punching a massive hole in its protections, and it encourages people to keep accounts on services they might otherwise avoid in order to ‘protect’ their data from being grabbed by any third party that wants it.

PaulT (profile) says:

Re: If you're not on Facebook, don't get on.

That’s assuming it’s accurate in the first place. For all the complaints, I’ve been friends with people who use pseudonyms, people who use multiple accounts, even inanimate objects and dogs, for the whole time I’ve used the service.

The problem with FB isn’t so much the way they gather and use data; it’s the people who believe Facebook "sources" more than they do reputable sources. I’m not sure I know how to fix that any more than I know how to fix people who think that Fox, the Daily Mail and The Sun are factual sources – and that’s a problem that predates the internet, let alone Facebook.
