Alternate Titles: Apple Now Looking To Close The Backdoor The FBI Discovered

from the real-talk dept

Yesterday the NY Times put out a story claiming that Apple Is Said to Be Working on an iPhone Even It Can’t Hack, with the underlying thrust being that this is a response to the big DOJ case against it, in which the court has ordered Apple to undermine key security features, which would then enable the FBI to brute force the (almost certainly weak) passcode used by Syed Farook on his work iPhone. But, here’s the thing: prior to that order and its details coming to light, many people were under the impression that the existing iPhones were ones that it “couldn’t hack.” After all, it was offering full disk encryption tied to the device where it didn’t hold the key.
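To see why a weak passcode is the real problem, it helps to run the numbers. Below is a back-of-the-envelope sketch in Swift, purely illustrative: it assumes the roughly 80ms per-attempt hardware key-derivation cost Apple has described for its devices, and that the court-ordered firmware removes the escalating delays and the ten-try auto-wipe (the passcode-space sizes are exact).

```swift
import Foundation

// Worst-case brute-force times once the escalating delays and the
// ten-try auto-wipe are disabled. Assumes ~80 ms of hardware key
// derivation per attempt (Apple's published figure).
let secondsPerAttempt = 0.08

let passcodeSpaces: [(label: String, combinations: Double)] = [
    ("4-digit PIN", pow(10, 4)),
    ("6-digit PIN", pow(10, 6)),
    ("6-char lowercase+digits", pow(36, 6)),
]

for space in passcodeSpaces {
    let worstCaseHours = space.combinations * secondsPerAttempt / 3600
    print("\(space.label): worst case \(String(format: "%.1f", worstCaseHours)) hours")
}
// 4-digit PIN: worst case 0.2 hours (about 13 minutes)
// 6-digit PIN: worst case 22.2 hours
// 6-char lowercase+digits: worst case ~48,373 hours (roughly 5.5 years)
```

A short numeric PIN falls in minutes once the rate limits are gone; the encryption itself never has to be broken.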

And thus, a key reality of this debate is that Apple already had a bit of a backdoor in its devices: it could update the code on a device without wiping the key, and that updated operating system could, theoretically, remove the key security protections that made the iPhone’s security workable. It’s just that the FBI found the backdoor.

It appears that what Apple is doing is what a few of us asked about as the details became clear: why can’t Apple build a phone that works the way many people assumed it worked prior to this court order, one that does not allow such a software update to take effect without first being approved by the end user?
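As a thought experiment, here is roughly what that policy could look like. This is a hypothetical sketch, not Apple’s actual boot or update chain; all the names are invented for illustration.

```swift
// Hypothetical update-acceptance policy (invented names): firmware
// only applies if the device is unlocked AND the user has explicitly
// approved that specific update from the device itself.

struct FirmwareUpdate {
    let version: String
    let approvedByUser: Bool  // set only via an on-device prompt after unlock
}

enum DeviceState { case locked, unlocked }

func mayApply(_ update: FirmwareUpdate, deviceState: DeviceState) -> Bool {
    // A locked phone never accepts new firmware, signed or not.
    guard deviceState == .unlocked else { return false }
    return update.approvedByUser
}

// Under this policy, the update the court ordered would be refused:
let coercedPatch = FirmwareUpdate(version: "unlock-helper", approvedByUser: false)
print(mayApply(coercedPatch, deviceState: .locked))  // false
```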

So, really, this is just Apple closing the backdoor that the FBI revealed. Nice work, FBI, for disclosing this vulnerability.

Of course, as this so-called “arms race” continues, the surveillance state apologists are coming out of the woodwork to insist that the law must stop what the technology allows:

“We are in for an arms race unless and until Congress decides to clarify who has what obligations in situations like this,” said Benjamin Wittes, a senior fellow at the Brookings Institution.

Or, Congress can leave things as they are, and Apple and others can continue to better protect the security of all of us. That seems like a good idea.


Comments on “Alternate Titles: Apple Now Looking To Close The Backdoor The FBI Discovered”

47 Comments
Anonymous Coward says:

“So, really, this is just Apple closing the backdoor that the FBI revealed. Nice work, FBI, for disclosing this vulnerability.”

And by disclosing the vulnerability, the FBI cut its own throat: it can no longer use it. Way to go. If you were a spy who found a way to spy without being detected, then revealing that method means losing it forever, because the hole will be plugged as soon as it’s known.

DannyB (profile) says:

Re: Re: Re:

Dear Apple: here is a checklist:

* Firmware must be approved by the user.
* That approval can only be given by first unlocking the phone and using it to confirm the update.
* Firmware updates to the phone OS should NOT be able to compromise security.
* Firmware updates to the security apparatus should require destruction of all keys — effectively wiping the phone.
* Make sure customers are fully aware: if you lose your password you have lost all your family photos forever. Make backups of things like this.

That would allow you to test new security-apparatus updates in your labs as many times as you want. But you would want the security-apparatus firmware to be effectively ‘baked in’. Or updating it must require, at a minimum, that you back up everything first and be willing to completely wipe the phone. A rough sketch of that policy follows below.
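A minimal, hypothetical illustration of the checklist (the type and function names are invented, not any real Apple API): any update to the security firmware destroys the key material, so a coerced “security patch” can never expose existing data.

```swift
// Hypothetical enforcement of the checklist above (invented names):
// security-firmware updates destroy the keys, so data encrypted
// under the old keys is unrecoverable afterward.

enum UpdateTarget {
    case phoneOS           // apps, UI, radios: must not touch key handling
    case securityFirmware  // key derivation, retry limits, wipe logic
}

final class KeyStore {
    // Stand-in for hardware-held key material.
    private var masterKey: [UInt8]? = Array(repeating: 0x42, count: 32)

    var hasKeys: Bool { masterKey != nil }

    func destroyAllKeys() {
        masterKey = nil  // once the key is gone, the ciphertext is gone too
    }
}

func applyUpdate(target: UpdateTarget, userApproved: Bool, keys: KeyStore) -> Bool {
    guard userApproved else { return false }  // checklist items 1 and 2
    if target == .securityFirmware {
        keys.destroyAllKeys()                 // checklist item 4: wipe on security update
    }
    return true
}

let keys = KeyStore()
_ = applyUpdate(target: .securityFirmware, userApproved: true, keys: keys)
print(keys.hasKeys)  // false: the "back up first" warning in item 5 is not optional
```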

Ninja (profile) says:

So the crack in the wall was already there, and now we’ve confirmed it can be done. That still doesn’t change the fact that this will set a bad precedent, and it confirms that companies will move to make their own access impossible while the phone is locked and encrypted. I suspect a good start is preventing any updates without user input. Actually, Micro$oft could learn a thing or two from this. Even if Apple can do it, that doesn’t actually mean we should crucify them; it will all depend on how they act now.

Anonymous Anonymous Coward says:

Re: Just when I thought I could have a reason to like Apple

Apple has always been that way. The first computer I purchased was an Apple IIc. When I first used an IBM PC, one of the first things I noticed was that Apple hid the operating system to a large degree, whereas DOS was fairly transparent.

Interesting how over time Microsoft has become more like Apple in that sense.

Max says:

Utterly disgraceful

Only goes to confirm that anybody, absolutely anybody with any power sooner or later ends up abusing it – even the maker of a “secure” phone itself. That backdoor should NEVER have been there in the first place. If a device is supposed to be impenetrable for everyone except its owner then THAT INCLUDES YOU TOO, APPLE! WHY DO THOSE IN POWER _ALWAYS_ CONSIDER IT SELF-EVIDENT THAT _THEY_ ARE OF COURSE ABOVE THE RULES?!?

Anonymous Coward says:

Can I scream

just a big fat fucking I TOLD YOU SO!

I am one of the ACs who was asking why Apple was not already telling the FBI that they could not hack their shit, and if they could hack it then their encryption was already fucking bunk!

The real news story all along should have been this headline…

“FBI is asking Apple to hack a phone and Apple’s response WAS NOT ‘we do not have the ability to decrypt our phones, sorry'”

Now that we got that out of the way… the judicial overreach should still be getting a very hefty slapdown for its tyrannical approach to so-called ‘Justice’.

Whatever (profile) says:

I think it’s incredibly naive to think that Apple wasn’t aware of this very “flaw”. In fact, considering that Apple has updated its OS and firmware a number of times since it started offering encryption, you can bet it was well aware of it.

For me, it looks much more like something Apple came to realize at some point in the past: a paradox of making the encryption “harder” while relying on a user-supplied PIN code as the access method. No matter how big the wall or how strong the lockbox, if you secure it with a 29-cent lock, that is the weakest point.

You are the weakest link, goodbye.

I also think Apple may have been keeping this one in their back pocket as a way of dealing with issues that might come up in a country like China. You can bet pretty solidly that the Chinese government made it abundantly clear to Apple that their position in the marketplace comes with great responsibility – read into that what you like, I guess.

Tim Cook’s massive aggro approach in the last week, the beyond-full-court press of horror stories and the end-of-personal-privacy narrative, seems to be there mostly to cover up and stop us paying attention to the basic concept that Apple’s encryption ain’t all that good or all that secure. Don’t look at the issue; let’s just scare you with this stuff over here instead.

Mike, it would be great if Techdirt took as critical a look at Apple as you have at the FBI and its approach. Apple doesn’t come off lily-white innocent in all of this either.

AJ says:

Re: Re:

“Mike, it would be great if Techdirt took as critical a look at Apple as you have at the FBI and its approach. Apple doesn’t come off lily-white innocent in all of this either.”

I like where you’re going with this, but I’m not so sure this would be the best issue to beat them over the head with. Apple may very well have decided not to patch this particular vulnerability because they were the only ones capable of exploiting it. I would suggest that Apple had a reasonable expectation that the government would never require them to create software that could exploit a vulnerability in their own products. So it was probably a known vulnerability, but deemed not worth the effort to fix. So, evil? No. Lazy? Maybe so.

Anonymous Coward says:

Re: Re:

Tim Cook’s massive aggro approach in the last week, the beyond-full-court press of horror stories and the end-of-personal-privacy narrative, seems to be there mostly to cover up and stop us paying attention to the basic concept that Apple’s encryption ain’t all that good or all that secure.

But it is secure enough that the FBI can’t break into it, unless Apple helps them. Unless you’re not paying attention to that little tidbit of information…

Whatever (profile) says:

Re: Re: Re:

“But it is secure enough that the FBI can’t break into it”

I think in part it’s a question of what all would have to be hacked to make it happen. Not only would the FBI have to develop the patch, they would also have to hack the updating process, which in itself is a whole lot of work. It is way more expedient and way safer to get Apple to do the work.
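To spell out why “hacking the updating process” is such a high bar: devices accept only update images carrying a valid signature from the vendor’s private key. Here is a toy sketch of that check using Swift’s CryptoKit; Apple’s real chain uses different algorithms plus per-device personalization, so treat this as illustrative only.

```swift
import CryptoKit
import Foundation

// Toy model of update signing: the device pins the vendor's public
// key and rejects any image whose signature doesn't verify. Without
// the private key, a third party cannot produce an acceptable image.

let vendorKey = Curve25519.Signing.PrivateKey()   // stand-in for Apple's signing key
let pinnedPublicKey = vendorKey.publicKey         // what ships burned into devices

let firmware = Data("patched firmware that skips retry limits".utf8)
let signature = try! vendorKey.signature(for: firmware)

// The genuine, vendor-signed image verifies:
print(pinnedPublicKey.isValidSignature(signature, for: firmware))  // true

// Any third-party tampering breaks verification:
var tampered = firmware
tampered.append(0xFF)
print(pinnedPublicKey.isValidSignature(signature, for: tampered))  // false
```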

I also think the FBI thinks that Apple already has such a patch made for countries like China. So it’s not exactly that it would take long to create.

Anonymous Coward says:

Re: Re:

Mike, it would be great if Techdirt took as critical a look at Apple as you have at the FBI and its approach.

So lemme make sure I understand your “logic:”

1. Apple having a product that you feel is insecure makes Apple a villain.

2. The fact that the FBI can’t break into it without Apple’s help also makes Apple the villain.

3. The FBI, in demanding that Apple develop something that weakens the security further, is just fine, despite the first point.

How do you reconcile such a stupid viewpoint in that head of yours?

Whatever (profile) says:

Re: Re: Re:

No, you don’t understand the logic at all.

1 – Apple has a product that is supposed to be super secure, but they have a fairly obvious attack vector open that they likely knew about for a long time.

2 – Apple’s initial response wasn’t “oh, a hole, let’s fix it” but instead “What is at stake here is can the government compel Apple to write software that we believe would make hundreds of millions of customers vulnerable around the world, including the US”. Rather than worrying about fixing the hole and issuing a patch, they seem more concerned with protecting the hole.

3 – Nobody is asking Apple to write code and release it to all of the public’s phones. Apple tightly controls the update process through intense levels of security (see the story about a week ago about replacement parts on an Apple phone). Nobody is asking Apple to break security for millions of Americans.

The level of arm-waving and histrionics coming from Apple makes me think they have something to hide, that’s all.

AJ says:

Re: Re: Re: Re:

1. That open attack vector can only be exploited if Apple creates a new OS. Since Apple is the only one capable of doing that, I’m not sure it really qualifies as a bug or exploit.

2. See number 1.

3. If they open the door, the floodgates will open. They will get thousands of requests from all over the U.S., not counting other countries. There are literally thousands of iPhones sitting in evidence rooms waiting for this very thing; they are already lining up to take advantage of it. This could cost Apple billions in PR losses. Even the FBI admits that this case could set a legal precedent. Once this hacked OS gets out into the wild, there is no telling what damage it could/will do to Apple and the public. Apple damn well better stand its ground. No one will ever trust that company again if they do this.

Why exactly do you want to see Apple destroyed so badly? Is the data on that phone really worth putting millions of people at risk?

http://abcnews.go.com/Technology/york-da-access-175-iphones-criminal-cases-due/story?id=37029693

http://www.theguardian.com/technology/2016/feb/25/fbi-director-james-comey-apple-encryption-case-legal-precedent

Anonymous Coward says:

Re: Re: I don't want Congress to decide...

Agree… “The People” really don’t give a fucking damn about what is going on, or they would not have elected these morons, or the morons that put them there, or given them power.

Congress has been largely ducking their responsibilities until manufactured public outrage by the press forces their hand.

Anonymous Coward says:

It's just that the FBI found the backdoor.

You’re finally starting to get it.

Now, who has the ability to change that firmware? It’s not just Apple. Apple uses the standard method the chipmaker designed, through the standard authorization channels that the carrier and protocols enable.

Castles on foundations of sand.

Dig deeper.

Anonymous Coward says:

Re: Re: It's just that the FBI found the backdoor.

Apple *IS* the chipmaker (well, the chip designer anyway).

From the A6 Teardown, I understand Samsung fabbed that processor?

The Apple A6—labeled APL0598 on the package marks and APL0589B01 on the inside—is fabricated by Samsung on their 32 nm CMOS process and measures 9.70 mm x 9.97 mm.

Anonymous Coward says:

It's just that the FBI found the backdoor.

There are very few manufacturers of baseband chips; Apple does not design or manufacture its own. They are moving in that direction, but even if they did, they don’t control the standards those chips must be designed to meet. Remote authority is built into the standard: the baseband must accept and execute commands delivered via the carrier. Eliminating exploits and/or improving the security of carrier authentication does not change this fundamental authentication architecture.

It’s a house of cards, black-box security setup, based on “trust us”.

Anonymous Coward says:

Re: It's just that the FBI found the backdoor.

Well, they CAN impose standards even if they cannot control them AND they can publicly embarrass and sue any company found supplying chips that did not conform to standards or had other nefarious shit put into it.

Like I had mentioned in a past article… the ONLY reason Apple was fighting is because they had something to hide from the public. Were they able to give the FBI what they wanted and keep it under the rug…??? Well… Apple would have sucked FBI dick so fast they would have thrown their neck out.

Let’s never pretend that any business has the interest of the public or its customers in mind… therefore we, as a public, must make it clear that public interest also matches business interest. Sadly, there is fat little chance of that, because the public itself is a fucking tool ripe for abuse and ignorance!

Anonymous Coward says:

apple can impose baseband standards.

No. No, they cannot. That’s the FCC’s realm.

They could segregate the system from the baseband, to limit what control and access it has over the phone. As far as I’m aware they have not done that.

As far as I’m aware (and I have done a fair amount of research into this, admittedly mostly on Android-based systems), NO phone currently available segregates the baseband co-processor. The baseband has full access to every aspect of the phone, and the main CPU/system/OS has no oversight or control over it.

That Guy says:

If the FBI wins this case and compels Apple to create and sign a firmware image that lets the FBI brute-force the passcode on an iPhone the FBI has confiscated, the FBI will next demand that Apple create and sign a firmware image which captures the passcode on an iPhone that is still in the hands of an FBI or NSA suspect. That new Apple evil-firmware will transmit the user’s passcode to the FBI every time the user unlocks the phone.

The FBI will also ask Apple to sign a firmware image, or perhaps just an app, which runs on an iPhone in the background to either capture the user’s passcode and send it to the government or, if Apple builds a secure path from the user’s fingers to the passcode/key-hash facility, to brute-force the passcode (using the phone’s own CPU and battery life) and then send it to the FBI and NSA, plus all the other governments and well-funded crooks in the world.

darren chaker (profile) says:

Re: Brute Force

Great point. I also see this as a case where, if the government is successful in forcing the manufacturer to thwart its own design, next on the hit list will be whole-disk encryption, encrypted email, file encryption, and all the things we thought were secure. As Bruce Schneier recently posted, there are almost 600 foreign encryption products; hence, if American-made products are not secure, people will simply look to foreign products for true security.

Possibly we can secure our border, where 2.5 million have crossed in the last 7 years (http://www.washingtontimes.com/news/2015/jul/20/number-of-illegals-levels-off-fewer-crossing-mexic/?page=all), to secure this country before assaulting the Fourth Amendment, encryption technology, and American ingenuity in the name of figuring out what’s on the phone of two people who never should have passed screening to get into the USA in the first place. What happened in San Bernardino is being used as hype to get backdoors.

Whatever (profile) says:

Re: Re:

“the FBI will next demand that Apple create and sign a firmware image which captures the passcode on an iPhone that is still in the hands of an FBI or NSA suspect.”

You are falling for the narrative, not reality. There is nothing in the law that would allow them to do that. You would have to get past all sorts of issues, such as CALEA (as Mike has pointed out) as well as the need for a warrant, etc.

Don’t fall for the scare tactics being used by Apple. Pay attention to the whole story and you will quickly realize Apple has something to hide. They want you to look away while they deal with it.

Anonymous Coward says:

Re: It's just that the FBI found the backdoor.

“It doesn’t really matter does it? I mean, the thing’s a complete fabrication….”

It’s not a fabrication, at least not entirely… it’s just not exactly what it purports to be about. The pretext about backdoors (known by both parties) is classified information, so the public argument can’t be about what it’s really about…

Everyone agrees the “debate” needs to happen; this is the best way they can do that. Debate the structural integrity of the building’s foundation, ignore the classified sinkhole below, and at the end of the day we’re all still sort of discussing whether the building is going to collapse, and who will bear responsibility if it does. IMO this is likely what Apple is more concerned with.

The industry standard is that big corp quietly cowers to big gov and enables surveillance and data collection. Good for the goose, good for the gander; back scratches and a grotesquely suspicious lack of antitrust litigation, consumer protection, and tax enforcement all around.

Win10 is only distinguishable from malware/trojans/spyware by a fallacious categorical error, and OEM Android is really not much better for non-technical users.

Apple is an outlier with their privacy stance, at the very least in their marketing; I would hope for more, but I wouldn’t place much faith in that notion at all. A legally informed read of their EULA would likely reveal the standard loophole-ridden Swiss cheese that amounts to “you have no rights or recourse, and we can do whatever we want.”

A proprietary, closed-source walled garden with “trust us” security being the last bastion of hope for security/privacy: that shouldn’t sit right with anyone.

If they really cared about privacy/security they’d segregate the baseband and go open source; then I’d sing their praises from the treetops after making a substantial investment in company stock.

Anonymous Coward says:

Re:

“What type of lousy spies do we have these days when they can’t even break into apple’s hq and get the key to use it themselves?”

That’s a good question… The better one would be why you’d assume we have lousy spies. We have the best spies in the world, and they have more and better capabilities than at any time in history.

Apple had over 650 vulnerabilities in 2015 alone (incidentally, worse than any other vendor).

I’d say there’s not even a snowball’s chance in hell they don’t have that key.

Anonymous Coward says:

the things we thought were secure.

“next on the hit list will be whole-disk encryption, encrypted email, file encryption, and all the things we thought were secure.”

If your intent is to covertly collect valuable intel, your target’s belief in their privacy/security is paramount. If they don’t have faith in the security/privacy of the device or method, they’re not going to trust it with any of their secrets. It’s not outlandish to consider that this fight may have been picked to lose: lose the battle, win the war.
