Dissecting And Dismantling The Myths Of The DOJ's Motion To Compel Apple To Build A Backdoor

from the dishonest-doj dept

While everyone’s waiting for Apple’s response (due late next week) to the order to create a backdoor that would help the FBI brute force Syed Farook’s work iPhone, the DOJ wasted no time in further pleading its own case, with a motion to compel. I’ve gone through it and it’s one of the most dishonest and misleading filings I’ve seen from the DOJ — and that’s saying something. Let’s dig in a bit:

Rather than assist the effort to fully investigate a deadly terrorist attack by obeying this Court’s Order of February 16, 2016, Apple has responded by publicly repudiating that Order. Apple has attempted to design and market its products to allow technology, rather than the law, to control access to data which has been found by this Court to be warranted for an important investigation. Despite its efforts, Apple nonetheless retains the technical ability to comply with the Order, and so should be required to obey it.

This part is only marginally misleading. The key point: of course Apple has designed a product that allows technology to control access because that’s how encryption works. It’s as if the DOJ still doesn’t understand that. Here’s a simple, if unfortunate, fact for the DOJ: there are always going to be some forms of communications that it doesn’t get to scoop up. Already we know that Farook and his wife destroyed their two personal iPhones. Why not just recognize that fully encrypted phones are the equivalent of that? No one seems to be whining about the destroyed iPhones and what may have been lost even though the very fact that they were destroyed, and this one was not, suggests that if there was anything important on any of his phones, it wasn’t this one. There are also things like communications between, say, a husband and wife in their own home. The DOJ can never get access to those because the two people are dead. Think of that like their brains were encrypted and their death made the key get tossed.

There are lots of situations where the physical reality is that the DOJ cannot recover communications. It’s not the end of the world. It’s never been the end of the world.

Apple, now (finally) trying to design encryption systems that make it so no one else can get in, sees this as the best way to protect the American public, because it means that their own information is much safer. It means fewer phones get stolen. It means fewer people are likely to have their information hacked. It means much more safety for the vast majority of the public. And I won’t even get into the fact that it was the US government’s own hacking of private data that pushed many companies to move more quickly towards stronger encryption.

The government has reason to believe that Farook used that iPhone to communicate with some of the very people whom he and Malik murdered. The phone may contain critical communications and data prior to and around the time of the shooting that, thus far: (1) has not been accessed; (2) may reside solely on the phone; and (3) cannot be accessed by any other means known to either the government or Apple. The FBI obtained a warrant to search the iPhone, and the owner of the iPhone, Farook’s employer, also gave the FBI its consent to the search. Because the iPhone was locked, the government subsequently sought Apple’s help in its efforts to execute the lawfully issued search warrant. Apple refused.

“May contain” is a pretty weak standard, especially noting what I said above. Furthermore, if there were communications with Farook’s victims, then shouldn’t that information also be accessible via the phones of those individuals as well? And if they already know that there was communication between the two, much of that data should be available elsewhere, in terms of metadata of a phone call, for example.

Apple left the government with no option other than to apply to this Court for the Order issued on February 16, 2016.

Actually, there are plenty of other options, including traditional detective work, looking for information from other sources or just recognizing that sometimes you don’t get every piece of data that exists. And that’s okay.

The Order requires Apple to assist the FBI with respect to this single iPhone used by Farook by providing the FBI with the opportunity to determine the passcode. The Order does not, as Apple’s public statement alleges, require Apple to create or provide a “back door” to every iPhone; it does not provide “hackers and criminals” access to iPhones; it does not require Apple to “hack [its] own users” or to “decrypt” its own phones; it does not give the government “the power to reach into anyone’s device” without a warrant or court authorization; and it does not compromise the security of personal information. To the contrary, the Order allows Apple to retain custody of its software at all times, and it gives Apple flexibility in the manner in which it provides assistance. In fact, the software never has to come into the government’s custody.

And here’s where the misleading stuff really starts flowing. It absolutely is a backdoor. Anything that makes it easier for a third party to decrypt data without knowing the key is a backdoor. That’s the definition of a backdoor. That it comes in the form of making it substantially easier to brute force the passcode doesn’t change the fact that it’s still a backdoor.
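To make the point concrete, here is a rough sketch, in Python, of what “substantially easier to brute force” looks like once the retry limit, the escalating delays and the ten-failures-and-wipe rule are stripped out. The `try_passcode` callback is purely hypothetical, a stand-in for whatever guess-submission interface the modified firmware would expose; it is not an actual Apple API:

```python
from itertools import product

def brute_force_pin(try_passcode, digits=4):
    """Walk the entire numeric passcode space.

    try_passcode(guess) -> bool is a hypothetical hook standing in for the
    electronic guess-submission interface the order asks for; with no
    wipe-after-10-failures rule and no escalating delays, nothing stops
    this loop from simply finishing.
    """
    for combo in product("0123456789", repeat=digits):
        guess = "".join(combo)
        if try_passcode(guess):
            return guess
    return None
```

A four-digit PIN is only 10,000 possibilities, so once the artificial limits are gone, the only thing standing between the FBI and the data is the per-guess cost of the key derivation, which is exactly why calling this “not a backdoor” is wordplay.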

And, yes, this impacts “every” iPhone. As Senator Ron Wyden correctly notes, if the precedent is set that Apple can be forced to do this for this one iPhone, it means it can be forced to do it for all iPhones. No, this single piece of code may not be the issue — though there are some concerns that even creating this code could lead to some problems if the phone connects to a server — but forcing a company to hack its own customers puts everyone at risk.

And yes, there is no legitimate way to describe this without claiming that it’s hacking Apple’s own customers. The whole point of the system is to get around the fact that they don’t have the key, and building a tool to disable security features and then allow a brute force attack on the passcode is very much exactly “hacking” Apple’s own customers. Sure, this one still requires a warrant, but once Apple is pushed to create that kind of code — and other companies are forced to build similar backdoors — the technology itself is being designed with extra vulnerabilities that will put many more people at risk. It’s not just about the DOJ seeing what’s on this damn phone.

The fact that Apple can retain control over the software is a total red herring. No one cares about that. It’s about the precedent of a court requiring a company to hack its own customers, as well as forcing them to create a backdoor that can be used in the future — even to the point of possibly requiring such backdoors in future products.

In the past, Apple has consistently complied with a significant number of orders issued pursuant to the All Writs Act to facilitate the execution of search warrants on Apple devices running earlier versions of iOS. The use of the All Writs Act to facilitate a warrant is therefore not unprecedented; Apple itself has recognized it for years. Based on Apple’s recent public statement and other statements by Apple, Apple’s current refusal to comply with the Court’s Order, despite the technical feasibility of doing so, instead appears to be based on its concern for its business model and public brand marketing strategy.

And the misleading bullshit gets ratcheted up a notch. First of all, we already went through why the “Apple helped us in the past” story is wrong. This is totally different. One is giving access to unencrypted information that Apple had full access to. The other is building a system to hack away security features in order to hack into an encrypted account. Very, very different. Second, the whole idea that better protecting its customers is nothing more than “a brand marketing strategy” is insulting. Shouldn’t the US government want the American public to be protected from criminals and malicious hackers and attacks? The best way to do that is with encryption. The fact that consumers are demanding that they be safer is not an “Apple marketing strategy”; it’s Apple looking out for the best interests of its customers.

And I won’t even dig deep into the fact that one of the big reasons why the public is clamoring for more protection these days is because the US government ran roughshod over the Constitution over the past few years to suck up all kinds of information it shouldn’t have.

Later in the motion, the DOJ again argues that there’s no “unreasonable burden” on Apple to hack its own customers. It trots out a similar line that was in the original application for the order, saying “what’s the big deal — we’re just asking for software, and Apple makes software, so no burden.”

While the Order in this case requires Apple to provide or employ modified software, modifying an operating system, which is essentially writing software code in a discrete and limited manner, is not an unreasonable burden for a company that writes software code as part of its regular business. The simple fact of having to create code that may not now exist in the exact form required does not an undue burden make. In fact, providers of electronic communications services and remote computing services are sometimes required to write some amount of code in order to gather information in response to subpoenas or other process. Additionally, assistance compelled under the All Writs Act has included providing something that did not previously exist: the decryption of the contents of devices seized pursuant to a search warrant. In United States v. Fricosu…, a defendant’s computer whose contents were encrypted was seized, and the defendant was ordered pursuant to the All Writs Act to assist the government in producing a copy of the contents of the computer. Here, the type of assistance does not even require Apple to assist in producing the contents; the assistance is rather to facilitate the FBI’s attempts to test passcodes.

Again, this is both ridiculous and extremely misleading. Creating brand new software — a brand new firmware/operating system — is fraught with challenging questions and potential security issues. It’s not just something someone whips off. If done incorrectly, it could even brick the device entirely, and can you imagine how the FBI would react then? This is something that would require a lot of engineering and a lot of testing — and still might create additional problems, because software is funny that way. Saying “you guys write software, so writing a whole new bit of software isn’t a burden” is profoundly ignorant of the technological issues. Update: If you want a long and detailed post from someone who absolutely knows how iPhone forensics works, and how incredibly involved creating this software would be, go read this blog post right now. In it, Jonathan Zdziarski notes that the DOJ is flat out lying in the way it describes what it’s asking Apple to do, which would be incredibly involved and would create all sorts of risks of the code getting out.

Second, the Fricosu case is quite different. That was compelling someone to give up their own encryption key — something that not all courts agree with by the way, as some view it as a 5th Amendment or 1st Amendment violation. That’s quite different than “write a whole new software thing that works perfectly the way we want it to.”

As noted above, Apple designs and implements all of the features discussed, writes and signs its operating system software, routinely patches security or functionality issues in its operating system, and releases new versions of its operating system to address issues. By comparison, writing a program that turns off features that Apple was responsible for writing to begin with would not be unduly burdensome.

This shows a profound technological ignorance. Yes, Apple updates its operating system all the time, but yanking out security features is a very different issue, and could have much wider impact. It might not, but to simply assume that it’s easy seems profoundly ignorant of how software and interdependencies work. Again, the DOJ just pretends it’s easy, as if Apple can just check some boxes that say “turn off these features.” That’s not how it works.

Moreover, contrary to Apple’s recent public statement that the assistance ordered by the Court “could be used over and over again, on any number of devices” and that “[t]he government is asking Apple to hack our own users,” the Order is tailored for and limited to this particular phone. And the Order will facilitate only the FBI’s efforts to search the phone; it does not require Apple to conduct the search or access any content on the phone. Nor is compliance with the Order a threat to other users of Apple products. Apple may maintain custody of the software, destroy it after its purpose under the Order has been served, refuse to disseminate it outside of Apple, and make clear to the world that it does not apply to other devices or users without lawful court orders. As such, compliance with the Order presents no danger for any other phone and is not “the equivalent of a master key, capable of opening hundreds of millions of locks.”

We discussed some of this above, but the issue is not the specific code that Apple will be forced to write, but rather the very fact that it will be (contrary to the DOJ’s claim) forced to hack their own phones to eliminate key security features, in order to allow the FBI to get around the security of the phone and access encrypted content. If the court can order it for this phone, then yes, it can order it for any iPhone, and that’s the key concern. Furthermore, again having Apple tinker with the software can introduce security vulnerabilities — and already this discussion has revealed a lot about how hackers might now attack the iPhone. I’m all for full disclosure of how systems work, so that’s okay. But the real issue is what happens next. If Apple looks to close this “loophole” in how its security works in the next iPhone update, will the court then use the All Writs Act to stop them from doing so? That’s the bigger issue here, and one that the DOJ completely pretends doesn’t exist.

To the extent that Apple claims that the Order is unreasonably burdensome because it undermines Apple’s marketing strategies or because it fears criticism for providing lawful access to the government, these concerns do not establish an undue burden. The principle that “private citizens have a duty to provide assistance to law enforcement officials when it is required is by no means foreign to our traditions.”

Again, this is a made up talking point. Protecting user privacy, as they demand it, is not a “marketing strategy.” It’s a safety and security strategy. You’d think, of all agencies, the FBI would appreciate that.

Anyway, you can go through the entire 35 page filing yourself, but these were the key points, and almost all of them are misleading. It should be interesting to see Apple’s response next week.

Companies: apple


Comments on “Dissecting And Dismantling The Myths Of The DOJ's Motion To Compel Apple To Build A Backdoor”

47 Comments
Richard (profile) says:

Apple messed up

If the data is retrievable by Apple writing code – then it is retrievable by the DOJ or the NSA writing code.

IF it is retrievable then Apple messed up when they designed the phone in the first place.

If Apple was working to open source rules then they could simply turn round and say to the DOJ “The information is available – you do it – if you can”.

The problem here is that Apple seems to have relied on Security by Obscurity.

Anonymous Coward says:

Re: Apple messed up

If the data is retrievable by Apple writing code – then it is retrievable by the DOJ or the NSA writing code.

Not really. Mobile device firmware updates — at least for iOS and Android — need to be digitally signed. The DOJ and NSA do not have Apple’s signing key. Even if law enforcement had the skills to write an operating system, the hardware would reject it as not being properly signed.
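For readers who want to see what “the hardware would reject it” amounts to in practice, here is a minimal sketch of signature-gated firmware installation using Python’s `cryptography` package. It illustrates the general idea only; Apple’s actual image format, key handling and secure boot chain are different and not public in this detail:

```python
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.exceptions import InvalidSignature

def firmware_accepted(image: bytes, signature: bytes, vendor_pubkey_pem: bytes) -> bool:
    """Install gate: accept the image only if the signature verifies against
    the vendor's public key. The device ships only the public half, so a
    third party without the vendor's signing key cannot produce an image
    that passes this check."""
    pubkey = serialization.load_pem_public_key(vendor_pubkey_pem)
    try:
        pubkey.verify(signature, image, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False
```

That is why the order has to run through Apple: only a build signed with Apple’s key is one the phone will boot.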

IF it is retrievable then Apple messed up when they designed the phone in the first place.

Not really. What law enforcement is looking to do is to have Apple make it easier for them to attempt to brute-force the PIN. Apple cannot retrieve the data any more than law enforcement can. About the only thing you can argue here is that the 10-failures-and-it’s-wiped rule could be burned into hardware, such that it could not be removed. That’s certainly possible, and Apple might do that in future devices.

Whatever (profile) says:

Re: Apple messed up

More or less correct.

“The key point: of course Apple has designed a product that allows technology to control access because that’s how encryption works.”

The problem is that Apple claims to have the secure chip thing for encryption, but for all that it depends on a pincode from the user as the true security – because the user has to be able to access the encrypted device, right?

The problem is that Apple’s security boils down to the pin code and two very primitive security concepts: limited attempts and a delay per attempt. Both of those features are set up in iOS or other firmware, and pretty much everyone (including people who worked on it) agrees that those things could be disabled without harming the secure key code.

Good encryption would require a pincode long enough to make brute forcing the system meaningless. Even at 8 digits (just numbers) the process would take a couple of years at 1 per second. Force it to be letters rather than numbers and the 8 letter passcode requires 208827064576 attempts. Since it must be manually entered on the device (ie, they cannot send thousands per second or operate in parallel) the security would be intact and there would be little or no discussion.
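A quick back-of-the-envelope check on those numbers (a sketch assuming one guess per second, no lockouts, and the worst case of trying the whole space):

```python
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def worst_case_years(alphabet_size, length, guesses_per_second=1):
    """Upper bound on a brute-force run: every possible passcode gets tried."""
    return alphabet_size ** length / guesses_per_second / SECONDS_PER_YEAR

print(f"{10 ** 8:,} numeric 8-digit codes -> {worst_case_years(10, 8):.1f} years at 1 guess/sec")
print(f"{26 ** 8:,} 8-letter codes -> {worst_case_years(26, 8):,.0f} years at 1 guess/sec")
```

So the 208,827,064,576 figure checks out, and even the all-numeric 8-digit case works out to roughly three years at one guess per second.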

Apple appears to have made it too easy to brute force, and put the security against brute force in firmware or OS code which can be overwritten under the right circumstances. That choice makes the FBI request and the court order possible, and Apple hates it.

Anonymous Coward says:

Re: Re: Apple messed up

Good encryption would require a pincode long enough to make brute forcing the system meaningless.

Apple is a consumer product manufacturer, and so usability concerns play a role in decision-making. Life is a series of trade-offs.

Since it must be manually entered on the device (ie, they cannot send thousands per second or operate in parallel) the security would be intact and there would be little or no discussion.

Part of what the DOJ is requesting is the ability to test passwords without manual input, to drive the time per test down to the ~80ms obtained from the PBKDF2 rounds.

Personally, I am surprised that they use as few PBKDF2 rounds as they do, such that it introduces only ~80ms delay.
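To get a feel for how the key-derivation cost sets the floor on each guess, here is a small timing sketch using generic PBKDF2 from Python’s standard library. It is not Apple’s exact construction (the real derivation is entangled with a device-specific hardware key), and the iteration count below is just an assumption picked to land in the same ballpark as the ~80ms figure:

```python
import hashlib
import os
import time

salt = os.urandom(16)
iterations = 200_000  # assumed count, tuned so one derivation costs roughly 80ms on commodity hardware

start = time.perf_counter()
hashlib.pbkdf2_hmac("sha256", b"000000", salt, iterations)
per_guess = time.perf_counter() - start

print(f"{per_guess * 1000:.0f} ms per guess")
print(f"4-digit PIN, worst case: {10 ** 4 * per_guess / 60:.0f} minutes")
print(f"6-digit PIN, worst case: {10 ** 6 * per_guess / 3600:.0f} hours")
```

At roughly 80ms per guess, a 4-digit PIN falls in under fifteen minutes and a 6-digit PIN in about a day, which is why removing the retry limits is the whole ballgame.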

Whoever says:

Laughable claim by the FBI

The claim that this order has nothing to do with any other phone is laughable.

Farook destroyed his personal phone. Why did he not destroy this phone? Obviously, it has no data that is useful to investigators. As pointed out in the article, he used the phone to communicate with work colleagues, so why not get the data off those phones?

The FBI knows all this, so why waste resources on this phone? The only reason is that the FBI thinks that this is a case where people will sympathise with them and this will help the FBI establish a precedent.

Chris Brand says:

Re: Laughable claim by the FBI

They even admit it – “make clear to the world that it does not apply to other devices or users without lawful court orders”. In other words it would apply to other devices and users with a court order.

I’m also curious about how you’d go about creating this software that only works on this one phone – either you write it and the first test is on the actual device (maybe deleting the data the FBI are after due to a bug), or you write it to run on some other phone (or all phones), test it there, then add in the “only on this particular phone” code.

Anonymous Coward says:

Re: Re: Re: Laughable claim by the FBI

Where an OS does a license check before installing upgrades, it does so by phoning home with some license key. Therefore if Apple does this, they would either have to modify their license database software, or use a ‘fake’ network to connect to a special version of that software.
The alternative is to install and run a special installer, which checks the device’s ID and then runs the rest of the install.
All approaches require Apple to do extra work on installing software to enable them to target a particular machine. Therefore targeting a specific machine is not particularly easy. The easiest way to do what is requested is to use a Mac to install the software via USB to the targeted phone.

z! (profile) says:

Besides “accidentally” bricking the phone, another option for Apple is to take a vvveerrrrrryyy lllloooooonnnnnnggggg tttiiiiimmmmmeeeeee to create/test the software, and then send the FBI a colossal bill for the services (although it would be better to get the money up front).

And if they -do- brick the phone, what can the FBI actually do besides wag a finger at Apple?

Rob (profile) says:

Fifth Amendment, and Corporations rights of Individuals, too! ;)

The US Supreme Court’s Hobby Lobby decision, which used Citizens United as precedent, would allow Apple to respond to the federal court that it has individual rights, too. And if a person doesn’t have to turn over their passphrase to decrypt their device, then a corporation doesn’t have to decrypt devices either – not that Apple can do this anyway; they technically CAN’T without a previous backdoor installed. This case exposes the lack of technical knowledge of both FBI agents and federal judges.

Anonymous Coward says:

Apple went to designing an encrypted phone because if they didn’t, customers in the global economy wouldn’t want their phones. Apple is a business, not a government branch. The government has already done serious harm to Cisco through the photos that were released by the Snowden revelations.

China won’t allow Windows later than XP because of the spying issue with Microsoft and the government. That’s a whole country that won’t be considering Cisco’s equipment. I haven’t heard the government stepping up to say they will cover Cisco’s loss due to their own actions.

Nor in all this do I hear the FBI saying that they know that if this is done it will cause a major hit to Apple’s bottom line, and that they’d be willing to pay for it. That part of it seems rather mysteriously silent.

Krish (profile) says:

To me the question is: why can Apple even do this? How can they push system updates to a locked phone?

The answer, of course, is that they still feel like they own your phone as much as you do. This gets back to the issue, discussed here often, of companies remote-deleting content on your device because you don’t really “own” the content or even the device.

Maybe stuff like this will make these companies give some control back to their customers. Of course, Stallman would say that until everything is free software, you’ll never own your device.

TechDescartes (profile) says:

Circular Investigation

Maybe the FBI should be alerted to this attempt to gain access to a wireless device via hacks and bits of malicious code. According to their website, this is something about which they are very concerned:

Every day, criminals are invading countless homes and offices across the nation—not by breaking down windows and doors, but by breaking into laptops, personal computers, and wireless devices via hacks and bits of malicious code.

https://www.fbi.gov/about-us/investigate/cyber/computer-intrusions

GMacGuffin (profile) says:

Procedurally Bizarre Fallacious Preemptive Motion to Compel

The entire motion is disingenuously based on the false premise that Apple “repudiated” the court’s Order. Meaning, that Apple has made it clear that it is refusing to comply with the Order.

But Apple only objected to the order in Tim Cook’s post. Apple has not said, “We will refuse to obey this Order.”

And more importantly, Apple doesn’t have to comply with anything yet. It has until next week to file its opposition.

So the entire motion is procedurally suspect at the least, and based on DOJ saying Apple said something it didn’t say, and therefore … (butthurt or something)

Anonymous Coward says:

Re: Procedurally Bizarre Fallacious Preemptive Motion to Compel

You sound like you are saying Tim Cook is being a Tricky Dick and his lawyers run the company. And we shouldn’t believe Tim’s rantings on his blog. Okay. But I like Mike’s latest pass-the-blame game better. Mike implies it is San Bernardino County’s fault for remotely resetting a password they didn’t have access to. SBC’s mistake only disabled the backup feature, but the feature hadn’t been used since October.

Either way, this is an extraordinary situation where the All Writs Act applies. A number of people died because of these county workers and everyone should be helping solve the crime.

Anonymous Anonymous Coward says:

Re: Re: Procedurally Bizarre Fallacious Preemptive Motion to Compel

What the hell are you talking about? The crime has been solved. The perpetrators are dead. Anything they don’t already know they can get from the other end of any communication, not that there is anything left to know. THERE IS NO COURT CASE, except where the FBI is trying to force a backdoor. Make no mistake, no matter how the DOJ and FBI twist their words, they are seeking a backdoor.

GMacGuffin (profile) says:

Re: Procedurally Bizarre Fallacious Preemptive Motion to Compel

… Point is there appears to have been no rational or legal reason for the DOJ to bring this motion to compel. It’s premature.

So the real reason must be that DOJ wanted to get their version of the story into the press/public to counter Apple’s statements (and it reads like it; and the press did eat it up). And if so, that’s not a legit use of our tax dollars in my book.

Anonymous Anonymous Coward says:

Re: Question!

I have read several lawyers commenting on that. It appears that on the face of it, this judge has the right to sign the order. Whether that order would stand up in a higher court is unknown.

On balance though, anecdotally (I haven’t really counted) somewhere around 95% of all the comments I have read about it think it is wrong, legal or not.

Whatever (profile) says:

“And here’s where the misleading stuff really starts flowing. It absolutely is a backdoor. Anything that makes it easier for a third party to decrypt data without knowing the key is a backdoor. That’s the definition of a backdoor. That it comes in the form of making it substantially easier to brute force the passcode doesn’t change the fact that it’s still a backdoor. “

Actually, it’s not a backdoor. Apple would not be unlocking the device or providing a method by which they could access the device without knowing the pincode. They would only be removing artificial blocks which stop a brute force effort from obtaining the pincode in less than a month. This exists because Apple chose the way their system works, and allowed these features to be set in re-writable firmware or on the OS level, rather than being attached in a manner that would kill the key if changed.

So, no, there is no backdoor. Apple is being asked to remove the bubble wrap from their (relatively) easily picked lock. Nobody is asking them for a master key to this lock. The feds will force the lock once the bubble wrap is out of the way.

Anonymous Coward says:

Re: Re:

Actually, it’s not a backdoor.

Sure it is.

They would only be removing artificial blocks which stop a brute force effort from obtaining the pincode in less than a month.

Computer programmers and security analysts would describe that as a backdoor. A backdoor simply means the ability to access secured communications in a reasonable timeframe, where such access would not be possible in the absence of the backdoor.

Wyrm (profile) says:

Re: Re:

I love that kind of mincing of words.
Trying to pretend two opposite things at the same time, while downplaying the most important of them.

On the one hand, it’s “only” removing a minor, nearly insignificant part of the security. Something “every expert” agrees should not even be part of the security.

On the other hand, the FBI is just waiting for this insignificant security feature to be removed so it can get the data… because this negligible feature is completely blocking them out.

That’s quite a contradiction. It seems like this unimportant feature is actually quite useful, and a way to bypass it clearly matches the definition of a backdoor. The fact that you despise this feature doesn’t make it irrelevant.

Ninja (profile) says:

Re: Re:

So would you buy a product if you found out that it does not protect your data and that it had false advertising about its protection capabilities? I wouldn’t. What would happen if Apple lost all trust like that? Collateral damage?

If what the FBI is asking is actually feasible then that’s a major security flaw. One I’d think Apple will be quick to fix in the very near future. But even then two questions will remain: first, the trust in them will be completely broken, so what happens if all criminals move to an uncrackable solution? Second, if Apple can actually comply with the court order now (costs be damned) and closes this gap in the near future in an upgrade to all iPhones, what will the FBI do? Will the courts order the impossible?

Oh and you are spewing a lot of bullshit. If it were just bubble wrap then the feds could get through it for sure. It’s not that simple. Besides, if the owner set a decently long password with special characters and all, it doesn’t matter; it will be close to uncrackable even using brute force (as a reminder, the hardware has a minimum physically limited interval of a few dozen milliseconds). What then? If they were careful enough to destroy two phones to erase data you’d imagine they were careful enough to do their homework.

In the end, everybody loses, feasible or not.

John Fenderson (profile) says:

Re: Re: Re:

“So would you buy a product if you found out it does not protect your data and that had false advertisement on its protection capabilities? I wouldn’t.”

I already do. I literally cannot think of a commercial network-connected product that I trust to protect my data, even though most claim they do. Not iThings, not Android, not my car, nothing. So pretty much every connected device I buy fits your category.

This does mean that I buy less things, because I only buy things that I am confident that I can adequately secure or operate without a net connection.

That One Guy (profile) says:

Two can play at that game

As much as the government and police like to paint anyone who defends working encryption as ‘helping criminals and terrorists’, given how much potential crime encryption prevents, they, not the tech companies and others defending encryption, are the ones doing everything they can to help criminals and terrorists.

Anyone calling for crippling encryption is calling for something that would be a massive boon to criminals, and put everyone at risk, and they deserve to be called out on it any time they do so.

Anonymous Coward says:

Automatic updates are a huge security hole.

Even with code signing certs, imo this is a flaw in Apple’s approach and I’m glad the FBI brought it to our attention.

I don’t really care how this case turns out as long as there’s no new legislation. Apple messed up and needs to build their phone so it’s impossible for them to help to unlock it or decrypt it.

Any device that can be updated without an active authenticated user logged in has a huge security flaw. A simple fix would be to only allow software updates to be triggered if a user is active. If the phone is on the lock screen, all updates should be pended until the user enters their pass code and starts actively using the phone. The update could be automatically triggered by the unlock so it doesn’t require user intervention.

IMO this should be the standard on all devices, including Windows and Android. Automatic updates are fine as long as they’re only triggered if a user is active. Although personally, I’d take it one step further and give the user an option to approve all updates.
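A minimal sketch of that suggestion (hypothetical class and method names; a real mobile OS would enforce this much deeper in the stack, below anything reachable from a locked device):

```python
class UpdateManager:
    """Hold incoming updates while the device is locked; apply them only
    after the user has authenticated, per the policy suggested above."""

    def __init__(self):
        self.pending = []
        self.unlocked = False

    def receive_update(self, package):
        if self.unlocked:
            self._install(package)
        else:
            self.pending.append(package)  # parked until the owner unlocks

    def on_unlock(self):
        self.unlocked = True
        for package in self.pending:
            self._install(package)
        self.pending.clear()

    def on_lock(self):
        self.unlocked = False

    def _install(self, package):
        print(f"installing {package}")  # placeholder for the real install path
```

The trade-off, of course, is losing the ability to push fixes to lost, stolen or seized devices, which is exactly the kind of design decision a vendor would have to weigh.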

Anonymous Coward says:

That was compelling someone to give up their own encryption key — something that not all courts agree with by the way, as some view it as a 5th Amendment or 1st Amendment violation.

Has anyone actually checked that this case is different? I wouldn’t know where to look in the licensing agreement for iOS, but somewhere in there would be the ownership rights retained by Apple, and it’s completely possible that the rights retained are sufficient to declare the software to be Apple’s property for this purpose.

nasch (profile) says:

Not that I’m happy about the warrant but…

It absolutely is a backdoor… And, yes, this impacts “every” iPhone.

The quote is, “The Order does not, as Apple’s public statement alleges, require Apple to create or provide a “back door” to every iPhone”. I don’t see anything in that that’s not true. You may believe that this means there will be orders in the future that will apply to every iPhone. But the DOJ is talking about this order. Does it apply to every iPhone?

If the court can order it for this phone, then yes, it can order it for any iPhone, and that’s the key concern.

Is it appropriate for the court to consider that? Isn’t the judge supposed to rule on the facts of this case? It seems to me that if this warrant is supportable under the Constitution and the All Writs Act (I don’t know if it is or not), then it should be issued. I can’t see how the judge could legitimately decide that it would be an appropriate warrant to issue but she’s going to deny it because of fears of what future warrants might be requested. And inversely if it’s not an appropriate warrant based on the law and the facts of the case, then one need not consider the possible future effects to deny it. So either way, what law enforcement might do with this in the future is certainly worth considering, but I don’t see how it’s relevant to the question of the validity of the warrant.

That One Guy (profile) says:

Re: Re:

The quote is, “The Order does not, as Apple’s public statement alleges, require Apple to create or provide a “back door” to every iPhone”. I don’t see anything in that that’s not true. You may believe that this means there will be orders in the future that will apply to every iPhone. But the DOJ is talking about this order. Does it apply to every iPhone?

Whether the order explicitly says so or not doesn’t terribly matter actually, as effectively it does via precedent and both sides know it (and in fact at this point I’m almost completely convinced that the DOJ/FBI’s entire interest in the case is solely in setting that precedent). Once they’ve got precedent that companies can be forced to bypass their own encryption, it will be used for other devices; to pretend otherwise is just absurd.

Is it appropriate for the court to consider that? Isn’t the judge supposed to rule on the facts of this case? It seems to me that if this warrant is supportable under the Constitution and the All Writs Act (I don’t know if it is or not), then it should be issued.

And that’s where the details of the case become important. If the DOJ was just asking Apple to hand over data from the device as they’ve done before, this wouldn’t be an issue. The order however goes significantly farther than that in ordering Apple to undermine their own security in order to provide that data, and that is where the real problem comes into play, and where the order steps over the line of what a warrant should be able to force compliance to.

This is not just a demand for data as a normal warrant would cover, it’s forcing someone to go out of their way, working on their own dime to undermine their own security, in order to provide that data.

Secular Absolutist (user link) says:

Tactical error

The FBI will often outsource DNA diagnostics and then the vendor will report back in evidentiary proceedings. If the Feds had similarly, through the court, asked Apple to extract the data and provide it, then we would be having a very different discussion. Instead the FBI is demanding a tool they would then possess. Bologna.
