Back Down The Rabbit Hole About Encryption On Smartphones

from the the-rule-of-law dept

Deputy Attorney General Rod Rosenstein wrote the disapproving memo that President Trump used as a pretext to fire FBI Director James Comey in May. But on at least one area of law-enforcement policy, Rosenstein and Comey remain on the same page—the Deputy AG set out earlier this month to revive the former FBI director’s efforts to limit encryption and other digital security technologies. In doing so, Rosenstein has drawn upon nearly a quarter century of the FBI’s anti-encryption tradition. But it’s a bad tradition.

Like many career prosecutors, Deputy Attorney General Rod Rosenstein is pretty sure he’s more committed to upholding the U.S. Constitution and the rule of law than most of the rest of us are. This was the thrust of Rosenstein’s October 10 remarks on encryption, delivered to an audience of midshipmen at the U.S. Naval Academy.

The most troubling aspect of Rosenstein’s speech was his insistence that, while the government’s purposes in defeating encryption are inherently noble, the motives of companies that provide routine encryption and other digital-security tools (the way Apple, Google and other successful companies now do) are inherently selfish and greedy.

At the same time, Rosenstein said those who disagree with him on encryption policy as a matter of principle—based on decades of grappling with the public-policy implications of using strong encryption versus weak encryption or no encryption—are “advocates of absolute privacy.” (We all know that absolutism isn’t good, right?)

In his address, Rosenstein implied that federal prosecutors are devoted to the U.S. Constitution in the same way that Naval Academy students are:

“Each Midshipman swears to ‘support and defend the Constitution of the United States against all enemies, foreign and domestic.’ Our federal prosecutors take the same oath.”

Of course, he elides the fact that many who differ with his views on encryption—including yours truly, as a lawyer licensed in three jurisdictions—have also sworn, multiple times, to uphold the U.S. Constitution. What’s more, many of the constitutional rights we now regard as sacrosanct, like the Fifth Amendment privilege against self-incrimination, were only vindicated over time under our rule of law—frequently in the face of overreaching by law-enforcement personnel and federal prosecutors, all of whom also swore to uphold the Constitution.

The differing sides of the encryption policy debate can’t be reduced to supporting or opposing the rule of law and the Constitution. But Rosenstein chooses to characterize the debate this way because, as someone whose generally admirable career has been entirely within government, and almost entirely within the U.S. Justice Department, he has simply never attempted to put himself in the position of those with whom he disagrees.

As I’ve noted, Rosenstein’s remarks draw on a long tradition. U.S. intelligence agencies, together with the DOJ and the FBI, have reflexively resorted to characterizing their opponents in the encryption debate as fundamentally mercenary (if they’re companies) or fundamentally unrealistic (if they’re privacy advocates). Steven Levy’s 2001 book Crypto, which documented the encryption policy debates of the 1980s and 1990s, details how the FBI framed the question for the Clinton administration:

“What if your child is kidnapped and the evidence necessary to find and rescue your child is unrecoverable because of ‘warrant-proof’ encryption?”

The Clinton administration’s answer—deriving directly from George H.W. Bush-era intelligence initiatives—was to try to create a government standard built around a special combination of encryption hardware and software, labeled “the Clipper Chip” in policy shorthand. If the U.S. government endorsed a high-quality digital-security technology that also was guaranteed not to be “warrant-proof”—that allowed special access to government agents with a warrant—the administration asserted this would provide the appropriate “balance” between privacy guarantees and the rule of law.
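The mechanism at the heart of the Clipper proposal, key escrow, is simple to sketch. In the actual scheme, each chip’s unit key was split between two government escrow agents; the toy Python below is my own illustration of that splitting step (not the real Skipjack/LEAF design), using XOR secret sharing so that neither share alone reveals anything about the key:

```python
import secrets

def split_key(device_key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two escrow shares via XOR secret sharing."""
    # share1 is uniformly random; share2 = key XOR share1.
    # Each share on its own is statistically indistinguishable from noise.
    share1 = secrets.token_bytes(len(device_key))
    share2 = bytes(k ^ s for k, s in zip(device_key, share1))
    return share1, share2

def recover_key(share1: bytes, share2: bytes) -> bytes:
    """Recombine both escrow shares to reconstruct the original key."""
    return bytes(a ^ b for a, b in zip(share1, share2))
```

Only by compelling both escrow agents (in theory, under a court order) could the government reconstruct a device’s key. The 1990s fight was over whether those escrowed shares could ever be stored, audited, and surrendered securely.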

But, as Levy documents, the government’s approach in the 1990s raised just as many questions then as Rosenstein’s speech raises now. Levy writes:

“If a crypto solution was not global, it would be useless. If buyers abroad did not trust U.S. products with the [Clipper Chip] scheme, they would eschew those products and buy instead from manufacturers in Switzerland, Germany, or even Russia.”

The United States’ commitment to rule of law also raised questions about how much our legal system should commit itself to enabling foreign governments to demand access to private communications and other data. As Levy asked at the time:

“Should the United States allow access to stored keys to free-speech-challenged nations like Singapore, or China? And would France, Egypt, Japan, and other countries be happy to let their citizens use products that allowed spooks in the United States to decipher conversations but not their own law enforcement and intelligence agencies?”

Rosenstein attempts to paint over this problem by pointing out that American-based technology companies have cooperated in some respects with other countries’ government demands—typically over issues like copyright infringement or child pornography rather than digital-security technologies like encryption. “Surely those same companies and their engineers could help American law enforcement officers enforce court orders issued by American judges, pursuant to American rule of law principles,” he says.

Sure, American companies, like companies everywhere, have complied as required with government demands designed to block content deemed illegal in the countries where they operate. But demanding that these companies meet content restrictions—which itself at times raises international rule-of-law issues—is a wholly separate question from requiring companies to enable law enforcement everywhere to obtain whatever information it wants regarding whatever you do on your phone or on the internet. This is particularly concerning when it comes to foreign governments’ demands for private content and personal information, which might include providing private information about dissidents in unfree or “partly free” countries whose citizens must grapple with oppressive regimes.

Technology companies aren’t just concerned about money. If they were, they would simply exclude digital-security measures, since it’s cheaper to exclude them than to invent and install new ones (such as Apple’s 3D face-recognition technology set to be deployed in its new iPhone X). Companies invest in security not just to achieve a better bottom line but also to earn the trust of citizens. That’s why Apple resists pressure, both from foreign governments and from the U.S. government, to develop tools that governments—and criminals—could use to turn my iPhone against me. This matters even more in 2017 and beyond—because no matter how narrowly a warrant or wiretap order is written, access to my phone and other digital devices is access to more or less everything in my life. The same is true for most other Americans these days.

Rosenstein is certainly correct to have said “there is no constitutional right to sell warrant-proof encryption”—but there absolutely is a constitutional right to write computer software that encrypts my private information so strongly that government can’t decrypt it easily. (Or at all.) Writing software is generally understood to be presumptively protected expression under the First Amendment. And, of course, one needn’t sell it—many developers of encryption tools have given them away for free.
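That point is concrete: strong encryption is something any programmer can write in a few lines and give away. As a sketch (my own illustration, not any shipping product’s code), a one-time pad (a truly random key as long as the message, XORed in byte by byte) is provably unbreakable so long as the key is random, kept secret, and never reused:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt with a one-time pad; returns (ciphertext, key)."""
    # With a fresh random key as long as the message, every possible
    # plaintext of that length is equally consistent with the ciphertext,
    # so no amount of computing power can single out the right one.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """Decrypt by XORing the same key back in."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))
```

The catch is key distribution, which is why practical systems use ciphers like AES instead. But the example shows why encryption code is, at bottom, just expression that anyone can write and publish for free.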

What’s more, our government’s prerogative to seek information pursuant to a court-issued order or warrant has never been understood to amount to a “constitutional right that every court order or search warrant be successful.” It’s common in our law-enforcement culture—of which Rosenstein is unquestionably a part and partisan—to invert the meaning of the Constitution’s limits on what our government can do, so that law-enforcement procedures under the Fourth and Fifth Amendments are interpreted as a right to investigatory success.

We’ve known this aspect of the encryption debate for a long time, and you don’t have to be a technologist to understand the principle involved. Levy quotes Jerry Berman, then of the Electronic Frontier Foundation and later the founder of the Center for Democracy and Technology, on the issue: “The idea that government holds the keys to all our locks, even before anyone has been accused of committing a crime, doesn’t parse with the public.”

As Berman bluntly sums it up, “It’s not America.”

Mike Godwin (@sfmnemonic) is a distinguished senior fellow at the R Street Institute.



Comments on “Back Down The Rabbit Hole About Encryption On Smartphones”

31 Comments
ThaumaTechnician (profile) says:

Thread is Godwinned even before the first comment.

Great to see a certified Internet celebrity posting here.

From Rosenstein’s speech: “On the morning of the attack, one of the terrorists exchanged 109 instant messages with an overseas terrorist.” A few words later: “Responsible encryption can involve effective, secure encryption that allows access only with judicial authorization.”

Yeah, you wouldn’t have had the authorization in time. And that’s setting aside the question of what happens if the terrorists use codewords in plaintext communications.

Besides, if the FBI and the CIA spent their time actually investigating, rather than _creating_ terrorists, lying to Congress, etc, etc, and maybe put those who started illegal wars on trial, I’d pay attention.

Anonymous Coward says:

So you're okay with the meta-data, GPS and cell-tower locations logged?

Meta-data and location are going to be easily available to gov’t (besides every commercial entity that wishes to buy it). By having a cell phone at all, you tacitly agree to be surveilled. Since there’s no way around that, you simply ignore it. — Or is the omission intentional?

But the criminals disguised as gov’t of the former United States of America are not the only invaders of privacy.

The everyday actuality is that Israeli-owned (and therefore, the state of Israel and Mossad) Amdocs gets nearly all American phone meta-data and is thereby able to find high-value targets:
https://en.wikipedia.org/wiki/Amdocs

You don’t complain about a foreign state having your meta-data. — Did you know about that? Does it worry or bother you at all?

Anonymous Coward says:

Re: So you're okay with the meta-data, GPS and cell-tower locations logged?

You also don’t mention Apple having and selling your meta-data. (You even imply you’re fond of Apple, a key indicator of a netwit who trusts corporations.)

By the way, the only information that can be trusted about the seven mega-corporations that Snowden named is that they give NSA “direct access”. They now and then put on shows of going to court to reveal some numbers, but we have no way of even vaguely checking those numbers, nor ANY statements corporations make about how or whether they protect your privacy. But Techdirt clearly believes that corporations can be trusted.

Anyhoo, you just rant narrowly on the last tiny bit of privacy through encryption. Yet since you’ve already surrendered 98% of your own phone privacy (and no doubt more by having dozens of other gadgets and online accounts), you’re more of a Judas goat than a privacy advisor.

Anonymous Coward says:

Re: Re: So you're okay with the meta-data, GPS and cell-tower locations logged?

You bring up a good point, but while it’s related, it’s an entirely different discussion. And, I might add, one that Techdirt has reported on previously, many times with disdain for the practice of corporations collecting and selling your metadata.

The author isn’t saying he’s ok with the collection of his metadata, it just isn’t relevant to this particular article.

Go do your research before you rant and rave about someone’s supposed hypocrisy and make a bunch of accusations that have no basis in anything remotely close to what we call reality.

Carlie Coats says:

poison pill / compromise suggestion

OK, if he wants to be that way, he’s claiming that "righteous" breaking of encryption is OK.
Then let’s address the "unrighteous" breaking too, trying to solve some of the problems we have
seen with "official" Fourth Amendment-related actions:

  • Disclosure of keys and non-warrant breaking of encryption are causes of civil action.
  • Statutory damages are $250,000 per individual breaking or disclosure. (e.g., breaking the
    encryption on 3 files shall be treated as 3 actions, not 1)
  • Perpetrators are responsible as individuals — sovereign immunity is not a defense
    nor justification.
  • Not knowing the law is not a defense (as it has been, e.g., in all those
    "photographing the police" camera-seizure cases)
  • In court actions, failure to respond to subpoenas (including "national security" claims)
    shall be treated as spoliation. Refusal to respond to subpoenas is justification for
    adding the refuser to the list of defendants, and shall constitute an additional offense
    for purpose of computing damages.
  • Statute of limitations is 25 years.
  • Loser pays attorneys’ fees.

Does he consider his "righteous" breaking sufficiently important to balance it with these
reasonable penalties for "unrighteous" breaking? If not, why not?

Anonymous Coward says:

Re: poison pill / compromise suggestion

Broken encryption is broken encryption. It does not matter what the reason is or whether it is just; it is still worthless broken encryption no matter how one looks at it.

Are these foolish people actually suggesting that we continue to conduct business as usual while knowing that the encryption has been compromised? What sort of liability would a business be accepting if they did this and would the government actually attempt to prosecute them for simply obeying what the government told them to do? That is perverse but not unexpected – sad.

kog999 says:

Re: What we need

“Silicon Valley nerds should be able to come up with encryption that is perfectly secure until a judge signs a warrant. They just haven’t tried yet”

That would be a good start. But once they have that tackled, they will need to work on having perfectly secure encryption until a law enforcement officer wants to look at the data. Only working if there is a valid warrant would be un-American.

Anonymous Coward says:

What if

OK, let’s pretend that there is some magical way to accomplish the backdoor/front door/golden key for encryption that some politicians imagine is possible.
In fact, let’s also pretend that this magical method only lets the government use it when the targeted party is actually guilty. Remember, we’re playing pretend.
Still following?

How would any of this help when the would-be bad guy uses unauthorized/illegal encryption?

Rekrul says:

The people calling for warrant-friendly encryption don’t appear to understand the problem because it’s computer-related. They could be lying (which wouldn’t surprise me), but let’s assume that they truly think computers are magical enough to do what they want.

Someone should ask them to come up with a plan for making a warrant-friendly lock that keeps the bad guys out, but that the police can open when they need to. Then they need to be asked what will keep whatever plan they come up with from being abused (other than the obviously impeccable character of all law enforcement officers) and how they will keep the bad guys from gaining the same access that the police have.

I’d be really interested to hear what they come up with.

Anonymous Coward says:

Why encryption matters

The FBI says it has about 7,000 encrypted devices (roughly 50% of the devices it has seized) that it can’t get into.
Consumer Reports says about 2.1 million phones were stolen in 2014. Applying the FBI’s percentage, that’s over 1 million people protected by encryption when their phones were stolen.
And the theft rate is dropping.
For all of law enforcement’s talk decrying encryption, the reality is that it appears to be reducing crime.
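The back-of-the-envelope math here holds up; the figures below are the comment’s own (the FBI’s roughly-50% encrypted share and Consumer Reports’ 2.1 million theft estimate), not independently verified numbers:

```python
stolen_2014 = 2_100_000   # Consumer Reports' estimate of phones stolen in 2014
encrypted_share = 0.50    # FBI's reported share of devices that are encrypted

protected = int(stolen_2014 * encrypted_share)
print(f"{protected:,} stolen phones likely had encrypted data")
# prints: 1,050,000 stolen phones likely had encrypted data
```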

That One Guy (profile) says:

Re: "We must destroy your security in order to protect your security."

Yeah, that’s one of the big things that really pisses me off. Not just the arrogant ‘We know better than the experts’ stance, but the fact that they are trying to screw the public over in the name of protecting the public. They claim that encryption that works is preventing them from solving crimes, completely ignoring all the crimes working encryption prevents.

Even assuming that every single encrypted phone they can’t access equals one unsolvable case, that is still a trivial number compared to the crimes that are prevented by criminals not having access to the contents of stolen phones. Having a phone stolen is bad; having email, banking, and social media credentials stolen would be vastly worse, and that is what they are arguing for more of.

Tanner Andrews (profile) says:

Rosenstein is certainly correct to have said "there is no constitutional right to sell warrant-proof encryption"

He is certainly wrong.

Initially, there is a right to write such a thing, as the US First Amendment provides for free expression. If I am smart enough to figure out how to do this, I may certainly express and demonstrate my view through the writing. That would include my view that I have really developed “warrant-proof” encryption.

There may be other views: other people working in the field of encryption may have doubts that my ROT-13 encryption is truly unbreakable. Those folks are also free to express their views. But, given the general competence of Federal warrant-seeking agencies, I feel pretty confident that ROT-13 would qualify.

The second prong, selling, does not negate the first. Courts routinely hold that publishers who charge for copies of their publication retain their First Amendment rights despite the commercial motive in publishing. Valentine v. Chrestensen was a bit of an outlier, if not flatly wrong even on its rather specialized facts.

Were these things to be otherwise, you could never have had the NY Times publishing “Pentagon Papers” information, because the newspaper publishes with the intent of collecting money and the motive of making a profit.

Wherefore:

V sneg va gur trareny qverpgvba bs Ebfrafgrva.
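For readers without a decoder ring: ROT-13 rotates each letter 13 places, so the same operation both encodes and decodes. Python’s standard library can unscramble the line above:

```python
import codecs

def rot13(text: str) -> str:
    # ROT-13 is its own inverse: applying it twice returns the input.
    return codecs.encode(text, "rot_13")

print(rot13("V sneg va gur trareny qverpgvba bs Ebfrafgrva."))
# prints: I fart in the general direction of Rosenstein.
```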

Anonymous Coward says:

“which might include providing private information about dissidents in unfree or “partly free” countries whose citizens must grapple with oppressive regimes.”

Complete bullshit. As if Americans were that free. Nothing could be further from the truth. Millions are poor. Tens of millions more are drowned in debt. Too many are hooked on drugs. With the biggest penitentiary population on Earth. Most are full of fear, hate, paranoia and hysteria towards their peers. But they are free, looool. And all just to be able to buy Chinese plastics anyway.

And oppressive governments? May I remind you who is oppressing people like Snowden, Assange, KDC, etc. And many millions around the world through all those wars.

Mike Godwin and Techdirt are pots calling out the kettles.

Wra says:

Technology companies aren’t just concerned about money. If they were, they would simply exclude digital-security measures, since it’s cheaper to exclude them than to invent and install new ones (such as Apple’s 3D face-recognition technology set to be deployed in its new iPhone X). Companies invest in security not just to achieve a better bottom line but also to earn the trust of citizens. That’s why Apple resists pressure, both from foreign governments and from the U.S. government, to develop tools that governments—and criminals—could use to turn my iPhone against me. This matters even more in 2017 and beyond—because no matter how narrowly a warrant or wiretap order is written, access to my phone and other digital devices is access to more or less everything in my life. The same is true for most other Americans these days.

Some D.C. circles have been pointing at Apple’s quiet acquiescence to China’s backdoor demands as evidence that Apple cares more about money than about principle.

I admit I’m curious why none of the journalists who say encryption backdoors would necessarily enable criminals and authoritarian government control emphasize how China is making its communications infrastructure vulnerable, and how Apple is staying silent on this.
