The DOJ Is Conflating The Content Moderation Debate With The Encryption Debate: Don't Let Them

from the it's-not-the-same dept

As we've detailed quite a bit over the past week, after years of failing to get backdoors mandated by warning about the "terrorism" bogeyman, the DOJ has decided to pick up the FOSTA playbook and instead start focusing on child porn -- or what "serious people" now refer to as Child Sexual Abuse Material (CSAM). It did this last week with an assist from the NY Times, which published an article full of (legitimately) scary stories, but somehow blamed the internet companies... because they actually report such content when they find it on their networks. I've seen more than a few people, even those who have generally been strong voices in the encryption debate and against backdoors, waver a bit on this particular subject and suggest that maybe there shouldn't be encryption on social media networks, because it might (as the narrative goes) help awful people hide their child porn.

Except... that's confusing a few different things. Mainly, it's mixing up the content moderation debate with the "lawful access" or "backdoors" debate. Yes, encryption makes it harder for the police to get in and see certain things, but that's by design. We live in a country with the 4th Amendment, in which we believe that it should be difficult for law enforcement to snoop deeply into our lives -- and that's always meant that some people will do and plot bad stuff out of the sight and hearing of law enforcement. Yet if you look at law enforcement over the past 100 years, it has many times more access to information about people today than it ever did in the past. The claim of "going dark" is laughable when you compare the information law enforcement can get today to what it could get even 15 or 30 years ago.

But, importantly, bringing CSAM into the debate muddies the waters by pretending -- incorrectly -- that in an end-to-end encrypted world you can't do any content moderation, and that there's simply no way for platforms to block or report certain kinds of content. Yet, as Princeton professor Jonathan Mayer highlights in a new paper, content moderation is not impossible in an encrypted system. It may work differently than it does today, but it's still very much possible:

Much of the public discussion about content moderation and end-to-end encryption over the past week has appeared to reflect two common technical assumptions:

  1. Content moderation is fundamentally incompatible with end-to-end encrypted messaging.
  2. Enabling content moderation for end-to-end encrypted messaging fundamentally poses the same challenges as enabling law enforcement access to message content.

In a new discussion paper, I provide a technical clarification for each of these points.

  1. Forms of content moderation may be compatible with end-to-end encrypted messaging, without compromising important security principles or undermining policy values.
  2. Enabling content moderation for end-to-end encrypted messaging is a different problem from enabling law enforcement access to message content. The problems involve different technical properties, different spaces of possible designs, and different information security and public policy implications.

You can read the whole thing, but as the paper notes, user reporting of such content still works in an end-to-end encrypted world, and so does hash matching against known content, if it's done on the client side before a message is encrypted. There's a lot more in there as well, but what you realize in reading the paper is that while law enforcement has now latched onto the CSAM issue as its hook to break encryption (in part, I've been told by someone working with the DOJ, because they found it "polled well"), it's an entirely different problem. This is yet another "but think of the children" argument, which ignores both the technical and the societal realities.
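To make the client-side hash matching idea concrete, here's a minimal sketch in Python. Everything in it is illustrative: real deployments use perceptual hashes (e.g., Microsoft's PhotoDNA) that match near-duplicate images rather than a plain SHA-256, and the function names and reporting hook here are made up for this example. The point is only to show where the check sits: on the sender's device, against the plaintext, before end-to-end encryption is ever applied.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hash list the platform ships to clients. Real systems use perceptual
# hashes (e.g. PhotoDNA), which match near-duplicate images; SHA-256
# stands in here only to keep the sketch self-contained and runnable.
KNOWN_BAD_HASHES = {sha256_hex(b"stand-in for a known abusive image")}

def matches_hash_list(attachment: bytes) -> bool:
    """Runs on the sender's device, against the plaintext.

    Because the check happens before encryption, the platform never
    needs a decryption key or any backdoor into the message itself.
    """
    return sha256_hex(attachment) in KNOWN_BAD_HASHES

def send_attachment(attachment: bytes, encrypt_and_send, report) -> None:
    if matches_hash_list(attachment):
        report(attachment)  # hypothetical platform-defined reporting path
    else:
        encrypt_and_send(attachment)  # normal end-to-end encrypted delivery

if __name__ == "__main__":
    send_attachment(
        b"an ordinary photo",
        encrypt_and_send=lambda a: print("encrypted and sent"),
        report=lambda a: print("matched the hash list; reported"),
    )
```

User reporting works for a similar reason: the recipient of an end-to-end encrypted message already has the plaintext, so they can forward abusive content to the platform, and schemes like the "message franking" used in Facebook Messenger let the platform verify that a reported message is genuine -- all without any ability to read messages that nobody reports.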


Filed Under: chris wray, content moderation, doj, encryption, going dark, jonathan mayer, william barr
Companies: facebook

