White Paper Points Out Just How Irresponsible 'Responsible Encryption' Is

from the a-hole-for-one-is-a-hole-for-all dept

In recent months, both Deputy Attorney General Rod Rosenstein and FBI Director Christopher Wray have been calling for holes in encryption that law enforcement can drive a warrant through. Neither has any idea how this can be accomplished, but both are reasonably sure tech companies can figure it out for them. And if some sort of key escrow makes encryption less secure than it is now, so be it. Whatever minimal gains in access law enforcement obtains will apparently offset the damage done by key leaks or criminal exploitation of a deliberately weakened system.

Cryptography expert Riana Pfefferkorn has released a white paper [PDF] examining the feasibility of the vague requests made by Rosenstein and Wray. Their preferred term is “responsible encryption” — a term that allows them to step around landmines like “encryption backdoors” or “we’re making encryption worse for everyone!” Her paper shows “responsible encryption” is anything but. And, even if implemented, it will result in far less access (and far more nefarious exploitation) than Rosenstein and Wray think.

The first thing the paper does is try to pin down exactly what it is these two officials want — easier said than done because neither official has the technical chops to concisely describe their preferred solutions. Nor do they have any technical experts on board to help guide them to their envisioned solution. (The latter is easily explained by the fact that no expert on cryptography has ever promoted the idea that encryption can remain secure after drilling holes in it at the request of law enforcement.)

If you’re going to respond to a terrible idea like “responsible encryption,” you have to start somewhere. Pfefferkorn starts with an attempt to wrangle vague law enforcement official statements into a usable framework for a reality-based argument.

Rosenstein’s remarks focused more on data at rest than data in transit. For devices, he has not said whether his preferred legislation would cover a range of devices (such as laptop and desktop computers or Internet of Things-enabled appliances), or only smartphones, as in some recent state-level bills. His speeches also leave open whether his preferred legislation would include an exceptional-access mandate for data in transit. As some commentators have pointed out, his proposal is most coherent if read to be limited in scope to mobile device encryption and to exclude data in transit. This paper therefore makes the same assumption.

Wray, meanwhile, discussed both encrypted messaging and encrypted devices in his January 2018 speech. He mentioned “design[ing] devices that both provide data security and permit lawful access” and asked for “the ability to access the device once we’ve obtained a warrant.” Like Rosenstein, he did not specify whether his “responsible solution” would go beyond mobile devices. As to data in transit, he used a financial-sector messaging platform as a real-world example of what a “responsible solution” might look like. Similarly, though, he did not specify whether his “solution” would be restricted to only certain categories of data—for example, communications exchanged through messaging apps (e.g., iMessage, Signal, WhatsApp) but not web traffic (i.e., HTTPS). This paper assumes that Wray’s “solution” would, like Rosenstein’s, encompass encryption of mobile devices, and that it would also cover messaging apps, but not other forms of data in transit.

Either way, there’s no one-size-fits-all approach. This is somewhat ironic given these officials’ resistance to using other methods, like cellphone-cracking tools or approaching third parties for data and communications. According to the FBI (in particular), these solutions “don’t scale.” Well, neither does either of the approaches suggested by Rosenstein and Wray, although Rosenstein limiting his arguments to data at rest on devices does suggest a somewhat more scalable approach.

The only concrete example given of how key escrow might work for end-to-end encrypted communications is the one noted above: a messaging platform used for bank communications. An agreement reached with the New York state government altered the operation of the banking industry’s “Symphony” messaging platform. Banks now hold encrypted communications for seven years and generate duplicate decryption keys, which are held by independent parties (neither the banks nor the government). But this analogy doesn’t apply as well as Wray thinks it does.

That agreement was with the banks about changing their use of the platform, not with the developer about changing its design of the platform, which makes it a somewhat inapt example for illustrating how developers should behave “responsibly” when it comes to encryption.

Applied directly, it would be akin to asking cellphone owners to store a copy of their decryption key with an independent party in case law enforcement needed access to the contents of their phones. If several communications platform providers are also involved, that means generating several duplicate keys. What the analogy does not support is what Wray and Rosenstein actually want: the duplication or development of decryption keys by manufacturers solely for the purpose of government access.
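For readers who want to see what “escrowing a key” actually amounts to, here is a minimal, purely illustrative sketch in Python (using the third-party `cryptography` package). Nothing in it comes from Symphony, the white paper, or any vendor’s design; it only shows that the mechanics are trivial, and that the escrow party’s private key becomes a master key for every device enrolled in the scheme.

from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
import os

# Hypothetical "independent party" keypair that would hold the duplicates.
escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_public = escrow_private.public_key()

# Per-device symmetric key that would normally never leave the device.
device_key = os.urandom(32)

# "Escrow" simply means handing over a wrapped duplicate of that key.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_copy = escrow_public.encrypt(device_key, oaep)

# Anyone who can get the escrow private key exercised (lawfully or otherwise)
# recovers the device key, so one private key now guards every escrowed device.
assert escrow_private.decrypt(wrapped_copy, oaep) == device_key

The hard part was never the wrapping; it’s that the single key doing the unwrapping, and the process for getting it used, become far more valuable targets than any individual phone.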

These officials think this solution scales. And it does. But scaling increases the possibility of the keys falling into the wrong hands, not to mention the increased abuse of law enforcement request portals by criminals to gain access to locked devices and accounts. As Pfefferkorn notes, these are problems Wray and Rosenstein have never addressed. Worse, they’ve never even admitted these problems exist.

What a quasi-escrow system would do is exponentially increase attack vectors for criminals and state-sponsored hacking. Implementing Rosenstein’s suggestion would provide ample opportunities for misuse.

Rosenstein suggests that manufacturers could manage the exceptional-access decryption key the same way they manage the key used to sign software updates. However, that analogy does not hold up. The software update key is used relatively infrequently, by a small number of trusted individuals. Law enforcement’s unlocking demands would be far more frequent. The FBI alone supposedly has been unable to unlock around 7,800 encrypted devices in the space of the last fiscal year. State and local law enforcement agencies, plus those in other countries, up the tally further. There are thousands of local police departments in the United States, the largest of which already amass hundreds of locked smartphones in a year.
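The asymmetry Pfefferkorn describes is easy to see in code. In this hedged sketch (Python again, using the `cryptography` package; it is not drawn from the paper), the update-signing private key is touched only when a release ships, and only the public half ever leaves the vendor. An exceptional-access key has neither property: it is a secret decryption key that would have to be exercised for every one of those thousands of requests.

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: the signing key is used rarely, by a handful of trusted people.
signing_key = Ed25519PrivateKey.generate()
verify_key = signing_key.public_key()   # this, and only this, ships on devices

update = b"firmware v2.1"
signature = signing_key.sign(update)

# Device side: verification needs no secret at all; leaking verify_key costs nothing.
verify_key.verify(signature, update)    # raises InvalidSignature if the update was tampered with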

Wray’s suggestion isn’t any better. In fact, it’s worse. His proposal (what there is of it) suggests it won’t just be phone manufacturers providing key escrow but also any developer offering end-to-end encrypted communications. This vastly increases the number of key sources. In both cases, developers and manufacturers would need to take on more staff to handle law enforcement requests. This increases the number of people with access to keys, increasing the chances they’ll be leaked, misused, sold, or stolen.

The large number of law enforcement requests headed to key holders poses more problems. Bogus requests are going to start making their way into the request stream, potentially handing access to criminals or other bad actors. While this can be mitigated with hardware storage, the attack vectors remain open.

[A]n attacker could still subvert the controls around the key in order to submit encrypted data to the HSM [hardware security module] for decryption. This is tantamount to having possession of the key itself, without any need to attack the tamper-resistant HSM directly. One way for an attacker to get an HSM to apply the key to its encrypted data input is to make the attacker’s request appear legitimate by subverting the authentication process for exceptional-access demands.
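Put differently, the HSM only guards the key bits; the software and paperwork gate in front of it decides whether a decryption actually happens. Here is a hypothetical sketch of that control layer (the names Request, verify_court_order, and hsm are invented for illustration and do not come from the paper):

from dataclasses import dataclass

@dataclass
class Request:
    paperwork: bytes
    ciphertext: bytes

def verify_court_order(paperwork: bytes) -> bool:
    # Placeholder check: this is the step a forged request has to get past.
    return paperwork.startswith(b"COURT ORDER")

def authorize_and_decrypt(request: Request, hsm) -> bytes:
    # The HSM's tamper resistance protects the key material itself...
    if not verify_court_order(request.paperwork):  # ...but this gate is ordinary software
        raise PermissionError("exceptional-access request rejected")
    # Any request that clears the gate gets plaintext back, forged or not,
    # which is why subverting authentication is "tantamount to having
    # possession of the key itself."
    return hsm.decrypt(request.ciphertext)

Harden the module all you like; the decision to use the key still lives in code and process that attackers can target.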

These are just the problems a key escrow system would create on the supply side. The demand for robust encryption won’t go away. Criminals and non-criminals alike will seek out truly secure platforms and products, taking their business to vendors outside the US government’s reach. At best, forced escrow will be a short-term solution with a whole bunch of collateral damage attached. Domestic businesses will lose sales, and other businesses will be harmed as deliberately introduced holes in encryption allow attackers to exfiltrate intellectual property and trade secrets, conduct industrial espionage, and engage in identity theft.

Wray and Rosenstein tout “responsible encryption.” But their arguments are completely irresponsible. Neither has fully acknowledged how much collateral damage would result from their demands. They’ve both suggested the damage is acceptable even if there is only a minimal gain in law enforcement access. And they’ve both made it clear every negative consequence will be borne by device and service providers — from the additional costs of compliance to the sales lost to competitors still offering uncompromised encryption. There’s nothing “responsible” about their actions or their public statements, but they both believe they’re 100% on the right side of the argument. They aren’t, and they’ve made it clear the wants and needs of US citizens will always be secondary to the wants and needs of law enforcement.


Comments on “White Paper Points Out Just How Irresponsible 'Responsible Encryption' Is”

Anonymous Anonymous Coward (profile) says:

Re: Re: Useful, but useless

The former is oxymoronic; the latter is a dream, one that is in their charter but beyond the ken of the current systems. Wray and Rosenstein believe, in their distorted sense of duty (duty to whom is a very real question), that ‘Responsible Encryption’ is a means to the execution of ‘Responsible Law Enforcement,’ regardless of the intended/unintended consequences. Those intended, but unstated, consequences are the most bothersome.

SteveMB (profile) says:

This is somewhat ironic given these officials’ resistance to using other methods, like cellphone-cracking tools or approaching third parties for data and communications. According to the FBI (in particular), these solutions "don’t scale."

This objection gives away the real agenda. Solutions that require significant investment of time and effort for each individual target work just fine for legitimate surveillance (i.e. surveillance authorized by warrant based on specific grounds for suspicion). The fact that these solutions "don’t scale" is simply not a problem… unless the government’s real intention is to conduct mass surveillance.

Hans says:

Problem of their own making

From the paper’s introduction:

“With the rise over the past few years in both communications and mobile device
encryption, authorities claim their ability to investigate crime and terrorism is ‘going dark.'” [1]

It’s important to remember that the “rise” they’re faced with is largely their own fault. Stellarwind and “upstream” collection, among others, were massive violations of citizens’ rights, which they thought they could get away with… until they were caught. This “rise” is merely a response. Had they appropriately upheld their pledges to defend the Constitution, we likely wouldn’t be here.

That One Guy (profile) says:

Re: Problem of their own making

The push for better security would still be there, as more security is a better thing on its own even beyond keeping those with badges from indulging their voyeuristic fetishes, but it likely wouldn’t be getting this much attention, both from the public and from the companies involved.

Once it became widely known that police and government agencies had no restraint when it came to grabbing anything they could, and that the public and companies who wanted to protect their data were therefore going to have to do it themselves, the ‘encrypt everything, as best as possible’ idea got a real kick in the pants.

Anonymous Coward says:

Law enforcement possession of keys is a disaster

“Law enforcement’s unlocking demands would be far more frequent. The FBI alone supposedly has been unable to unlock around 7,800 encrypted devices in the space of the last fiscal year. State and local law enforcement agencies, plus those in other countries, up the tally further. There are thousands of local police departments in the United States, the largest of which already amass hundreds of locked smartphones in a year.”

Yesterday, two Baltimore police officers were convicted on federal charges. They were part of a conspiracy (whose full size still isn’t known, but another dozen officers were implicated in trial testimony) which operated for years inside the BPD. They conducted armed robberies in which they stole drugs, guns, money and other things. They filed false charges. They carried “throwdown” guns to plant on people they shot. They utilized their access to investigatory data to target people. They shot people because they didn’t want to chase them. And in the middle of all this, detective Sean Suiter was shot and killed the day before he was scheduled to testify to the feds, and we still don’t know who killed him. AND someone in the state’s attorney’s office tipped them off, and we don’t know who that is either.

Bottom line is that it appears that much more of the BPD was and is involved. There are calls — justified calls — to shut down and disband the department. Why would anyone trust this barrel full of rotten apples with encryption keys?

ECA (profile) says:

sTUPID..

Anyone here understand that the Big companies have already installed a BACKDOOR?

The Verification system..
Sending you an email or sending you a phone call is enough to open most phones..
With a court order you can get access to either, and have it routed to the enforcement agency.
Most have never set up two-part authentication..where you need one other password/ID..

But most (I can't say all) of the incidents have BEEN WITH BASIC CELL PHONES THAT DON'T HAVE GPS.. And a SEMI SMART PERSON would think about carrying a device that could/DOES track most of their movements..
And note THAT most in the US/UK have recently been individuals that have little or no connections to groups.. (unless they found something they have not released) (anyone get info on the FBI tricks dept??)

The thing I've noticed over the years is that the enforcement agencies tend to NOT WANT CITIZENS to assist them.. THEY want to do the work, and be able to say, "THEY DID IT". And if they released info that SOME of the incidents were REAL terrorist attacks from OUTSIDE the USA… HOW many of us would be out with guns to PROTECT OUR NATION??

Uriel-238 (profile) says:

The police ARE the "wrong hands"

US law enforcement agencies are at this point notorious for going to great lengths to bypass Fourth Amendment protections, either by evidence laundering (id est parallel construction) or by cooperation with conviction-eager judges.

Police access to crypto back doors will lead to fishing expeditions, and it will lead to state-assisted industrial espionage. Neither the US government nor its agencies, including the Department of Justice, is a neutral party. They’re not even unified parties: departments are allied to the interests of their administrators and will lie to Congress and to overseeing agencies to cover their ulterior motives.

Anonymous Coward says:

Re: They can get by with some help from their friends.

By now it isn’t even a question whether they have the knowledge available to them or not. They probably started by asking their local IT guy who set up their email and have since moved on to security experts and agencies with specialists.
Combined, they have probably asked 1,000 people, who either came up with a completely unfeasible solution or (hopefully 99% at least) just plain rejected the possibility.
It is no longer about encryption or even terrorism, but a way for politicians to cover their ass.
It is a simple way to deflect blame away from them in any future incidents. Now they can just blame the tech industry for not cooperating, because “the perpetrators would surely have been caught without encryption.” “Brilliantly,” it works for almost any crime.
At the same time they score cheap points for doing something against terrorism, child abuse, and the like… Not many people have the knowledge to see through the ruse and are willing to actually call them out for trying to cause possibly the worst disaster of our time, when such call-outs will be labeled as support of terrorism and other heinous crimes.

The worst part is that whether or not the people suggesting “responsible encryption” are successful, they have almost every angle covered to come out on top.
The only way I can see a win for us in this is if a large part of the population starts to understand how insane, destructive, and irresponsible such suggestions are and gives the appropriate response… I just don’t see that happening.

dcfusor (profile) says:

Follow the money?

One wonders if, by stopping at the spokespuppet, we followed the money far enough. It’s become obvious that government is almost utterly compromised by money from big businesses. I can’t think of a law or regulation made since I was born > 64 years ago that didn’t somehow help the big guys/big IP portfolio at the expense of the entrepreneur.
Now, end-to-end encryption has at least the possibility of locking out even the major data slurpers. They don’t like that; they live on data for their marketing. Forcing them to have and store a key solves two issues for them – they can now peek (again) – and they get paid, inevitably, for doing it, either out of taxes or just by raising their prices. So, in the more obvious model, they use their influence with their .gov to make it so, or attempt to.

It’s pretty obvious that even with the very powerful tools LEO already has, they haven’t solved or prevented diddly in terms of terrorism, which is always their stated reason for this stuff. If they have, why the crickets? If their real reason is wanting to nip any serious dissent in the bud before things get organized, the whole point of doing so is that we wouldn’t notice if things were nipped in the bud early enough (“Earl was always a little off” and so on) – and the crickets now make sense.

What’s funny is I don’t even own any tinfoil.

Chip says:

Re: Follow the money?

I can’t think of a law or regulation made since I was born > 64 years ago that didn’t somehow help the big guys/big IP portfolio at the expense of the entrepreneur.

I Agree! The government "Fat casts" Banned DDT, "leaded" Gasoline, and ESPECIALLY delicious, delicous Leaded "paint chips" Just to line Corporate pockets!!

Every Nation eats the Paint chips it Deserves!
