Still Not 'Going Dark:' Device Encryption Still Contains Plenty Of Exploitable Flaws

from the good-news-and-bad-news-(which-some-might-consider-good-news) dept

Law enforcement, especially at the federal level, has spent a great deal of time complaining about an oddity known only to the FBI and DOJ as “warrant-proof” encryption. Device users and customers just call this “encryption” and recognize that it protects them against criminals and malicious hackers. The federal government, however, sees device encryption as a Big Tech slap in the face. And so they complain. Endlessly. And disingenuously.

First off, law enforcement has access to a wide variety of tech solutions. It also has access to plenty of communications and other data stored in the cloud or by third parties that encryption can’t protect. And it has the users themselves, who can often be persuaded to allow officers to search their devices without a warrant.

Then there’s the protection being handed out to phone users. It’s got its own problems, as Matthew Green points out:

Authorities don’t need to break phone encryption in most cases, because modern phone encryption sort of sucks.

More specifically, even the gold standard for device encryption (Apple’s) still leaves some stuff effectively unprotected. Once a phone is unlocked after a period of rest (say, first thing in the morning), it enters an “AFU” (after first unlock) state in which crypto keys are kept in memory. They stay there until they’re erased, and most common phone use won’t erase them. And they’re only erased one at a time, leaving several sets resident in memory where cops (and criminals!) using phone-cracking tech can still access them.
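
To make the AFU idea concrete, here is a minimal sketch in Swift of the file protection classes iOS exposes to apps. The function names are illustrative, not from the report; the point is that a stricter class than the “available after first unlock” behavior described above does exist.

```swift
import Foundation

// A minimal sketch (illustrative helper names): iOS lets an app choose a
// Data Protection class per file.
func writeWithAFUProtection(_ data: Data, to url: URL) throws {
    // "Available after first unlock": the file's key is loaded at first
    // unlock and kept in memory while the phone stays powered on, the
    // behavior that makes a seized-but-locked phone exploitable.
    try data.write(to: url, options: [.completeFileProtectionUntilFirstUserAuthentication])
}

func writeWithCompleteProtection(_ data: Data, to url: URL) throws {
    // "Complete" protection: the file is only readable while the device is
    // actually unlocked; its key is evicted shortly after the screen locks.
    try data.write(to: url, options: [.completeFileProtection])
}
```

The complaint isn’t that the stricter class doesn’t exist. It’s that, as the report documents, so much of the data handled by Apple’s own built-in apps sits in the weaker one.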

A report [PDF] put together by Matthew Green, Maximilian Zinkus, and Tushar Jois highlights the exploitable flaws of device encryption efforts by Apple, Google, and other device manufacturers. And there’s not a lot of darkness going on, despite law enforcement’s protestations.

This reaction is best exemplified by the FBI’s “Going Dark” initiative, which seeks to increase law enforcement’s access to encrypted data via legislative and policy initiatives. These concerns have also motivated law enforcement agencies, in collaboration with industry partners, to invest in developing and acquiring technical means for bypassing smartphone security features. This dynamic broke into the public consciousness during the 2016 “Apple v. FBI” controversy, in which Apple contested an FBI demand to bypass technical security measures. However, a vigorous debate over these issues continues to this day. Since 2015 and in the US alone, hundreds of thousands of forensic searches of mobile devices have been executed by over 2,000 law enforcement agencies, in all 50 states and the District of Columbia, which have purchased tools implementing such bypass measures.

The research here shows there’s no need for legislative mandates or court orders to access most of the contents of suspects’ iPhones. There’s plenty to be had just by exploiting the shortcomings of Apple’s built-in encryption.

[W]e observed that a surprising amount of sensitive data maintained by built-in applications is protected using a weak “available after first unlock” (AFU) protection class, which does not evict decryption keys from memory when the phone is locked. The impact is that the vast majority of sensitive user data from Apple’s built-in applications can be accessed from a phone that is captured and logically exploited while it is in a powered-on (but locked) state.

This isn’t theoretical. This has actually happened.

[W]e found circumstantial evidence in both the DHS procedures and investigative documents that law enforcement now routinely exploits the availability of decryption keys to capture large amounts of sensitive data from locked phones. Documents acquired by Upturn, a privacy advocate organization, support these conclusions, documenting law enforcement records of passcode recovery against both powered-off and simply locked iPhones of all generations.

Utilizing Apple’s iCloud storage greatly increases the risk that a device’s contents can be accessed. Using iCloud to sync messages results in the decryption key being uploaded to Apple’s servers, which means law enforcement, Apple, and malicious hackers all have potential access. Device-specific file encryption keys also make their way to Apple via other iCloud services.

Over on the Android side, it’s a bigger mess. Multiple providers and manufacturers all run their own update services. Both phase devices out of ongoing support, leaving users confused about which devices still receive software updates and the latest encryption tech. Cheaper devices sometimes skip these niceties entirely, leaving low-cost options the most vulnerable to exploitation. And while the DOJ and FBI may spend the most time complaining about Apple, it commands only about 15% of the smartphone market. This means most devices law enforcement seizes aren’t secured by the supposedly “impenetrable” encryption Apple provides.

Google’s cloud services offer almost no protection for Android users. App creators must opt in to certain security measures. In most cases, data backed up to Google’s cloud services is protected only by encryption keys Google holds, not keys held by the user who uploaded it. Not only is encryption not much of a barrier, but neither is the legal system. A great deal of third-party data, like the comprehensive data sets maintained by Google, can be accessed with only a subpoena.

The rest of the report digs deep into the strengths and limitations of the encryption offered to phone users. But the conclusion remains unaltered: law enforcement does have multiple ways to access the contents of encrypted devices. And some of these solutions scale pretty easily. It’s not cheap, but it’s definitely affordable. While there will always be those who “got away,” law enforcement isn’t being hindered much by encryption that provides security to all phone users, whether or not they’re suspected of criminal activity.


Comments on “Still Not 'Going Dark:' Device Encryption Still Contains Plenty Of Exploitable Flaws”

23 Comments
Anonymous Coward says:

The insane cop proof security I put on my phone would prevent law enforcement from ever getting at the data on my phone.

And if my password is brute forced, the phone will wipe itself and reset.

When I do go to Canada’s Wonderland (I am still an amusement park and thrill ride junkie, despite my age), I dial up my phone’s security to that level, as there is no way to get to Toronto from the west coast without going through Michigan.

This way if my phone is ever seized in Michigan as part of asset forfeiture, the contents will be totally inaccessible.

Anonymous Coward says:

Re: Re: Re:

Just changing a few settings so the phone will not even start until you enter the password means that connecting a computer to the phone and trying to get anything will yield nothing.

I am merely using stuff built into the phone. It is just a matter of knowing which settings to tinker with, which varies from phone to phone.

Anonymous Coward says:

Re: Re: Re: Re:

Does that mean you use your password every time you do something with your phone, or do you have a simpler means of activation? Your security is only as strong as what it takes to operate your phone after it is switched on.

It does not matter how good the encryption is, if the keys are automatically enabled when someone logs into their system, as defeating the usually much weaker login is all it takes to access the contents.

Upstream (profile) says:

Re: Re: Re:

But won’t a sufficiently long and sufficiently random password make even NAND mirroring / brute-forcing prohibitively time-consuming? Scroll about halfway down the article to the bullet points for the tl;dr part. I know the article in the link is a few years old, and large-scale parallel processing is definitely a thing, but I don’t think computer processing speed has gotten fast enough to render the concept irrelevant, has it?
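
For a rough sense of the numbers, here is a back-of-the-envelope sketch in Swift. The guess rate is an assumption for a fully offline attack; real phones rate-limit guessing in hardware, which only strengthens the point.

```swift
import Foundation

// Rough arithmetic, not a benchmark: even if retry limits are bypassed,
// the passcode's entropy sets the cost of a brute force.
let guessesPerSecond = 1e9                    // assumed attacker speed
let pinSpace = pow(10.0, 6)                   // 6-digit PIN: one million combinations
let passphraseSpace = pow(62.0, 12)           // 12 random alphanumeric characters

print(pinSpace / guessesPerSecond, "seconds to exhaust a 6-digit PIN")          // ~0.001
print(passphraseSpace / guessesPerSecond / 3.15e7, "years for the passphrase")  // ~100,000
```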

Upstream (profile) says:

Ah Ha!

Thanks, Tim! This answers my previous question about how the cops are getting around phone encryption. It is not the encryption itself that is being broken, but rather a faulty implementation of the decryption process that is being exploited. This should be a quick and easy fix, if Apple and Google care to bother with it. Of course, one has to wonder if this faulty implementation was intentional. This may be yet another situation where Hanlon’s Razor is not appropriate.

Over on the Android side, it’s a bigger mess.

Of course, this is no surprise. It has always been thus.

Anonymous Coward says:

Re: Re: Re: Ah Ha!

Encryption always requires that the endpoints have the keys. Key length/complexity means that users cannot memorize keys, so they are either held on the phone or on removable media carried with the phone. This is a weakness of device encryption: protection is largely limited to the memorable passwords that grant access.
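
A minimal sketch of that pattern, in Swift with CryptoKit. The names and the SHA-256 shortcut are illustrative only, not Apple’s actual key hierarchy, which uses hardware-bound secrets and a slow key-derivation function.

```swift
import Foundation
import CryptoKit

// The device holds a strong random file key, but that key is only wrapped
// by something the user can remember, so the passcode is the real boundary.
func demo() throws {
    let fileKey = SymmetricKey(size: .bits256)           // random key kept on the device

    // Derive a wrapping key from the passcode. A real system would use a
    // slow KDF plus a hardware-fused secret; SHA-256 keeps the sketch short.
    let passcode = Data("123456".utf8)
    let salt = Data(repeating: 0x5a, count: 16)           // fixed salt, demo only
    let wrappingKey = SymmetricKey(data: SHA256.hash(data: passcode + salt))

    // Wrap (encrypt) the file key under the passcode-derived key.
    let rawFileKey = fileKey.withUnsafeBytes { Data($0) }
    let wrapped = try AES.GCM.seal(rawFileKey, using: wrappingKey)

    // Anyone who can guess "123456" can unwrap the 256-bit key, which is why
    // a short passcode, not the key length, limits the protection.
    let unwrapped = try AES.GCM.open(wrapped, using: wrappingKey)
    assert(unwrapped == rawFileKey)
}

try? demo()
```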

shocker says:

Disingenuous indeed . . .

What about Spectre and Meltdown?

To the best of my knowledge these were zero-day vulnerabilities that exposed kernel memory and have never actually been addressed. Yes, they came out with some patches and that shut everybody up, but if I remember correctly, the fix was only cosmetic. The only way to address the problem was at the manufacturing level.

I have long suspected that Intel and law enforcement have access to virtually any machine they want to get into.

Uriel-238 (profile) says:

Isn't the "going dark" complaint a cover

I had assumed for a while that the complaints by the FBI and law enforcement were a cover for their ability to get into any phone, and to discourage anyone from adding additional layers of security.

We may have to go back to a dongle system to have actually secure phones.

To be fair, it seems none of our government departments, including the DoD and DHS and its subdivisions, have robust security and Russian attackers are wandering through our secure networks like rambunctious children in Fantasyland.

Uriel-238 (profile) says:

As a note, if the police have it, the black hats do too.

The police rely on black-boxed technology, essentially cracking apps that use reliable exploits.

The black hats, especially cracking enthusiasts and industrial spies, have access to those exploits without the box, either working them manually or using a software version and a system to run it on. A Raspberry Pi if they’re saucy.

So if you have business secrets vulnerable to law enforcement, they’re probably vulnerable to rival companies and international interests as well.

Anonymous Hero says:

I’m having trouble understanding the point here.

TechDirt tends to support encryption and privacy, yet this article seems to make the point that law enforcement should not complain because they can just break into these phones without a warrant anyway.

So, should there be a warrant process? Or should we just leave it up to law enforcement to break into phones that use shit security?

Uriel-238 (profile) says:

Re: Locked phones and law enforcement

Techdirt has a long established opinion that weak encryption on phones is a bad thing. Neither law enforcement nor black-hat hackers should be able to easily crack open a phone and get access to its contents (which often includes permissions to personal data in the cloud, such as decades of email history).

But it is curious that our law enforcement has not stopped complaining about going dark even though they have tools to break into phones. They’ve even lied about the number of phones remaining sealed, if those phones are actually closed to them at all.

Government institutions have been lying to the people for decades (if not centuries) but since the aughts, their lies have been conspicuous and evident. And so it is with our law enforcement departments: we can’t trust what they are saying to reflect what is actually going on.

It’s long been evident that law enforcement departments are antagonistic to the public, all the while insisting they serve the public in some capacity. (There are rare exceptions, mostly affecting the top 10%-ish wealthiest segment of the population.) But even if they did serve the public, the ability to penetrate phone security and gain easy access to individual lives is a liability to established basic human rights, specifically the right against unreasonable searches described by the Fourth Amendment in the Bill of Rights.

nasch (profile) says:

Re: Re:

TechDirt tends to support encryption and privacy, yet this article seems to make the point that law enforcement should not complain because they can just break into these phones without a warrant anyway.

The point of this article is that they should not be pressuring companies to implement compromised security, because the story of law enforcement efforts "going dark" due to encryption is a false narrative.

Upstream (profile) says:

Do what you can, when you can

Always use all the security measures that are available to you.

Use phone disk encryption, even though, as has been discussed, there are holes in it. Also use phone SIM card encryption, if your phone allows it. This requires an additional password from you on each power-up, but all security measures require some small inconvenience. Don’t take your primary phone with you unless necessary. Don’t keep your primary phone powered up unless necessary. You can carry a $20 flip phone without even having a SIM card installed, and it can still call 911, although in many cases that might just make a bad situation worse. But 911 can call ambulances as well as LEOs, so there is that. Write important phone numbers on your arm (lawyer) or on edible paper (friends) if you are going where arrest might be likely (protest).

While these measures may not make it impossible for LE to break into your phone, they will make it extremely difficult, and will limit the amount of information available if they do break in.

Use full disk encryption on your home computers and laptops.

Use a VPN on your phone and computer. Riseup / Bitmask is a well regarded outfit. This will hamper over-the-air or on-the-wire snooping, as well as provide some measure of anonymity to end web sites.

Use Tor whenever you can, on phone and computer. It can be a bit slow, but often not too bad. This also will hamper over-the-air or on-the-wire snooping, as well as provide some measure of anonymity to end web sites, even more so than VPN.

Use a password manager, so you can have different, long, random passwords for all online accounts. This is basic infosec. KeePassXC is cross-platform, and well regarded. It is what TAILS uses. REMEMBER your master password!!

Maintain separate email accounts to compartmentalize some of your online activities.

Use Enigmail (or some other OpenPGP Public Key Encryption scheme) for all your email, and encourage all your email correspondents to use it, too.

Use TAILS, a privacy and anonymity oriented OS that can boot and run from a USB stick. It routes all communication through the Tor network.

These last two are what Edward Snowden used in his heroic information dumps to various media outlets about US spying on its citizens.

Some of these things are too simple to ignore (VPN, basic phone security). Some require a bit of effort and / or have a bit of a learning curve (Tor, TAILS). Some require others to participate in order to be effective (encrypted email). And some are just plainly a bit of extra hassle. But security does not come without at least some small cost, and chances are that you and most of your friends are well capable of employing them all. There is an abundance of information on the Internet about all of these. It is usually just initiative and the will that is lacking.

If you are into making New Year Resolutions, implementing some of these security measures might be a good one.
