Apple Might Be Forced To Reveal & Share iPhone Unlocking Code Widely

from the not-so-easy dept

Among the many questions swirling around the challenge to U.S. Magistrate Judge Sheri Pym’s Order that Apple create software to bypass the iPhone passcode screen, a matter of paramount public interest may have been overlooked. Even if the government prevails in compelling Apple to bypass these iPhone security features: (A) evidence obtained this way for use in a criminal trial will be challenged under the Daubert standard (described below) and may be held inadmissible at trial; and (B) the Daubert challenge may require disclosure of Apple’s iPhone unlocking software to a number of third parties who would need access to it in order to bring the challenge, and who may not secure the new software adequately. To state that neither consequence would be in the public interest would be an understatement in the extreme.

The Daubert challenge would arise because any proffered evidence from the subject iPhone would have been obtained by a methodology utilizing software that had never before been used to obtain evidence in a criminal trial. The Supreme Court, in Daubert v. Merrell Dow Pharmaceuticals, Inc., held that new methodologies from which proffered evidence is derived must, when challenged, be substantiated by expert scientific testimony in order to be admissible. In Daubert, the Court stated that the criteria to be applied when faced with a defense challenge to scientific testimony and evidence are:

  1. Can the methodology used to reach the expert’s conclusion (the new software here) be tested and verified?
  2. Have the methodology and software been peer-reviewed and has the review been published in a peer-reviewed journal?
  3. Do the techniques used to reach the conclusion (here, to obtain the evidence) have an ascertainable error rate?
  4. Has the methodology used to generate the conclusion (the evidence) been generally accepted by the relevant scientific community?

Under the Daubert standards, introduction of evidence from the iPhone (the electronic communications and data stored on the phone) would require the testimony of an expert witness to, among other things:

  • establish the integrity of the data (and its reliability) throughout the chain of custody;
  • explain whether any person or software could modify the data coming off of the phone;
  • verify that the data that came off the phone as delivered by Apple and held by law enforcement was the data that had originally been on the phone;
  • explain the technical measures, such as the digital signatures attached to the data, used to ensure that no tampering has occurred, and their likely error rates (a sketch of such an integrity check follows this list).
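
To make the first and third bullets concrete: a minimal sketch, assuming nothing about any real forensic workflow, of the kind of integrity check such an expert might run, comparing a digest of the extraction as delivered against the copy later produced in discovery. The file names here are hypothetical.

import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    # Stream the file so even a large forensic image hashes in constant memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical file names: the extraction as delivered, and the copy
# later produced to the defense.
delivered = sha256_of_file("extraction_delivered.img")
produced = sha256_of_file("copy_produced_in_discovery.img")

# A mismatch means the two copies cannot both be the data that came off
# the phone -- exactly the kind of question a Daubert expert would probe.
print("match" if delivered == produced else "MISMATCH")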

Such an expert would, in preparation for his or her testimony, require access to and examination of the software, as it is inconceivable that defense counsel would simply accept the testimony of Apple personnel without also demanding that their own third-party experts have access to the code.

In addition, defense counsel would undoubtedly demand not only that their own third-party experts have access to the source code, but also the right to simulate the testing environment and run the code on their own systems in order to confirm the veracity of the evidence. This could easily compromise the security of the new unlocking code, as argued in the amicus brief filed with Judge Pym by Jennifer Granick and Riana Pfefferkorn of Stanford’s Center for Internet and Society (also covered previously by Techdirt):

There is also a danger that the Custom Code will be lost or stolen. The more often Apple must use the forensic capability this Court is ordering it to create, the more people have to have access to it. The more people who have access to the Custom Code, the more likely it will leak. The software will be valuable to anyone eager to bypass security measures on one of the most secure smartphones on the market. The incentive to steal the Custom Code is huge. The Custom Code would be invaluable to identity thieves, blackmailers, and those engaged in corporate espionage and intellectual property theft, to name a few.

Ms. Granick and Ms. Pfefferkorn may not have contemplated demands by defense counsel to examine the software on their own systems and according to their own terms, but their logic applies with equal force to evidentiary challenges to the new code: the risk of the software becoming public increases when it is examined by multiple defense counsel and their experts, on their own systems, with varying levels of technical competency. Fundamentally, then, basic criminal trial processes, such as challenges to expert testimony and to the evidence derived from this new software, stand in direct tension with the public interest in the secrecy and security of the source code of the new iPhone unlocking software.

At best, none of these issues can be resolved definitively at this time because the software to unlock the phone has not been written. But the government’s demand that the court force Apple to write software that circumvents its own security protocols may be shortsighted as a matter of trial strategy, in that any evidence obtained with that software may be precluded following a Daubert inquiry. Further, the public interest may be severely compromised by a court order directing Apple to write the subject software, because the due process requirements for defense counsel and their experts to access the software and Apple’s security protocols may compromise the secrecy necessary to prevent the proposed workaround from becoming available to hackers, foreign governments and others. No matter what safeguards are ordered by a court, the security of the new software may be at considerable risk, because it is well known that no security safeguards are impregnable.

The government may be well advised to heed the adage, “Be careful what you ask for. You may just get it.” Its victory in the San Bernardino proceedings may be worse than Pyrrhic. It could be dangerous.

Kenneth N. Rashbaum is a Partner at Barton, LLP in New York, where he heads the Privacy and Cybersecurity Practice. He is an Adjunct Professor of Law at Fordham University School of Law, Chair of the Disputes Division of the American Bar Association Section of International Law, Co-Chair of the ABA Section of International Law Privacy, E-Commerce and Data Security Committee and a member of the Section Council. You can follow Ken @KenRashbaum

Liberty McAteer is an Associate at Barton LLP. A former front-end web developer, he advises software developers and e-commerce organizations on data protection, cybersecurity and privacy, including preparation of security and privacy protocols and information security terms in licensing agreements, service level agreements and website terms of service. You can follow Liberty @LibertyMcAteer



Comments on “Apple Might Be Forced To Reveal & Share iPhone Unlocking Code Widely”

Aaron Walkhouse (profile) says:

Re: Daubert and defence counsel access are both irrelevant here.

Daubert requires a new methodology [such as DNA analysis] to trigger a scientific review. That’s why the four prongs cited are an obviously poor fit for this case.

1. This software is tested/verified by the sole defendant.
2. Peer review is not, and never will be, relevant here.
3. A backdoor works or it doesn’t. There can be no error rate.
4. There is no “relevant scientific community” for Apple’s proprietary trade secrets, particularly when the original court order specified the software be destroyed upon use.

The software proposed may be new, but there is no new technological or scientific research needed to make or use it, and development is limited to altering or disabling established code already in use.

Likewise, the prospect of defence counsel demanding access falls moot before the fact that Apple is the sole defendant for this and future All-Writs cases on iPhones.

Other defendants may wish to suppress evidence obtained by a backdoor, but they won’t be able to get their hands on the backdoor itself, because the presence of the evidence proves that the backdoor worked. Nobody would be able to argue that a mere backdoor conjured up evidence that was not already there to be found, and they couldn’t use Daubert because it’s just a backdoor and not a new scientific method.

Whatever (profile) says:

Re: Re: Daubert and defence counsel access are both irrelevant here.

I think you nailed it here. There is no new technology, and no new methodology in play. Encryption (and decryption) of digital information is well-known and reliable technology.

Moreover, let’s be clear here: when you decrypt something, you either get the data in the clear or you get garbage (you failed!). It’s not like it suddenly adds “I buried the body in the backyard next to the orange tree” to every document.
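
A minimal sketch of that “clear data or failure” point, assuming a modern authenticated cipher (AES-GCM via Python’s third-party cryptography package): with the wrong key, decryption doesn’t even yield garbage; it simply fails.

# Requires the third-party "cryptography" package (pip install cryptography).
import os
from cryptography.exceptions import InvalidTag
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

right_key = AESGCM.generate_key(bit_length=256)
wrong_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)

ciphertext = AESGCM(right_key).encrypt(nonce, b"the actual message", None)

# Correct key: the plaintext comes back exactly as stored.
assert AESGCM(right_key).decrypt(nonce, ciphertext, None) == b"the actual message"

# Wrong key: decryption fails outright rather than inventing new content.
try:
    AESGCM(wrong_key).decrypt(nonce, ciphertext, None)
except InvalidTag:
    print("decryption failed; no data, fabricated or otherwise")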

It seems like this argument is more of a defense lawyer trying to delay the inevitable rather than a strong legal argument. I’m not a lawyer, but even I can see this one as insanely weak and likely to be tossed.

nasch (profile) says:

Re: Re: Daubert and defence counsel access are both irrelevant here.

Other defendants may wish to suppress evidence obtained by a backdoor, but they won’t be able to get their hands on the backdoor itself, because the presence of the evidence proves that the backdoor worked. Nobody would be able to argue that a mere backdoor conjured up evidence that was not already there to be found, and they couldn’t use Daubert because it’s just a backdoor and not a new scientific method.

At what point is it demonstrated that it was in fact just a back door and not a malicious software package that planted evidence in the device?

Aaron Walkhouse (profile) says:

Re: Re: Re:

At the point where it was used, in the presence of FBI and Apple experts, to retrieve the data and put it directly into the chain of custody. Everyone present would be required to swear to every step of the procedure they carried out. It’s standard practice which courts respect.

As all it did was unlock the phone, there was no opportunity for the backdoor to directly access the data, because that was done by the iPhone’s normal operating system using its normal, built-in functions, which nobody can suspect of being new or unique to that one phone.

It’s like getting a landlord to unlock a door for you. He never went inside after opening the door, because that’s the cops’ job. Later on, all he can testify to is that he opened the door and let the cops in at that date and time; after that the scene is in police custody and only they can keep track of what they do and when; thus, a chain of custody.

nasch (profile) says:

Re: Re: Re:2 Re:

It’s standard practice which courts respect.

Of course, they also assume police never plant evidence or lie on the stand, and we see how well that plays out.

because that was done by the iPhone’s normal operating system using its normal, built-in functions, which nobody can suspect of being new or unique to that one phone

I thought this was about replacing the normal software with new software that disabled security features.

Aaron Walkhouse (profile) says:

Re: Re: Re:3

The only function involved in this All-Writs order is the login screen itself. That login screen has very little functionality because it is designed for only one task, so any elaborate spy code to meddle with data would cause an obvious case of bloat that any one of the people involved could detect. You can be sure that Apple techs would not let such shenanigans go unreported.

One tech unlocks the phone, signs off on what he did and removes the tool; then the investigators go get the data, every step of which is logged and signed for. The process is rigorous.

After that it’s all unchanged iOS code being used by investigators directly, because the phone and its revealed unlock code are now in their sole custody. The custom login screen, no longer needed, would be replaced with the original so the iPhone could be unquestionably certified as absolutely in its original state, thus placing its contents securely in a properly managed chain of custody.

If the folks operate as usual, dot all of their i’s and cross all of their t’s, a court will have no reason to question the process unless something unexpected happens, like evidence showing up early or late in the process and conflicting with the logs already in hand. That is rare.

nasch (profile) says:

Re: Re: Re:4 Re:

The custom login screen, no longer needed, would be replaced with the original so the iPhone could be unquestionably certified as absolutely in its original state

The only way the phone is in its original state is if it’s factory reset. The fact that the original software was put back on the phone doesn’t prove anything about what else might have happened to the data. Everything you’re saying makes sense, but it seems to me (not a lawyer) that it falls short of proving the data hasn’t been tampered with. But maybe the defense just basically has to take the investigators’ word that they didn’t screw with it. That wouldn’t surprise me.

Aaron Walkhouse (profile) says:

Re: Re: Re:5

The original state of the evidence here is when the suspect last had it in hand, not before the factory shipped a new phone. A blank phone is evidence only that a phone exists! ;]

If all that’s changed is the login screen, then after the access code is obtained with it and the original login screen is replaced, the logs of that process and the sworn testimony of all involved prove the OS and data have not been altered by the process at the time the data was finally accessed and copied to FBI assets. Even then, originals are preserved and locked away from subsequent investigators while backups are also locked away to preserve the chain of access.

Until the phone is unlocked it is impossible to alter the encrypted data, and after the phone is unlocked it is impossible for anyone to have unsupervised and unlogged access to the data.
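
One hedged way to picture why such logs resist after-the-fact editing is a hash-chained custody log, where each entry’s hash covers the one before it; an illustrative sketch, not a claim about any agency’s actual tooling.

import hashlib
import json
import time

def append_entry(log, custodian, action, evidence_sha256):
    # Each entry's hash covers the previous entry, so quietly rewriting
    # an old entry invalidates every hash that follows it.
    entry = {
        "time": time.time(),
        "custodian": custodian,
        "action": action,
        "evidence_sha256": evidence_sha256,  # digest of the evidence image
        "prev_hash": log[-1]["entry_hash"] if log else "0" * 64,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)

# Placeholder digests; real entries would carry the actual image hashes.
log = []
append_entry(log, "Apple tech", "unlocked phone, restored original login module", "placeholder-digest")
append_entry(log, "FBI examiner", "imaged device, sealed the original away", "placeholder-digest")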

Because chain of custody procedures are followed, only one person at a time has custody, usually supervising and/or assisted by one or more people as he/she works.

These procedures, familiar to all officers, agents and courts, are trusted for good reason: it is practically impossible to tamper with evidence without leaving “fingerprints,” and the logs show who had custody when such “fingerprints” showed up.

The legal system has had decades of practice with these procedures and it is very rare that someone finds new loopholes to exploit. You may theorize that one may exist, that someone has the means, motive and opportunity to exploit it, and that someone also coincidentally has custom-tailored false evidence to plant, but the odds of [loophole] + [means] + [motive] + [opportunity] + [fake data that fools everybody] all coming together at the same time are so low that it tends to be impossible, especially in cases as complex as this, with so many investigators and lawyers involved.

It is those decades of history and case law which create trust in chain of custody. Defense attorneys in other cases [not this one, because the criminals are dead] will often poke and prod at the chain of custody because that is what they are expected to do. Most of the time they only prove that the evidence is solid.

AJ says:

Re: Re: Re:6 Re:

I think I read that Manhattan alone has upwards of 150 iPhones in evidence that are “locked.” So if they create the tool, then destroy it per the court order, are they going to have to do that for every phone law enforcement wants to get into? There could potentially be thousands of phones across the country. Then what? Russia? China? This is a horrible idea.

If I were Apple, this would compel me to release an immediate patch to all phones removing this as a possibility. It is, after all, a security hole. They would be right to make it impossible to do, because if it’s possible, the government’s attitude is “we will force them to do it when it suits us,” regardless of the long-term broader implications.

Aaron Walkhouse (profile) says:

Re: Re: Re:7

I think the Apple plan is to prove that this proposal is both an unreasonable burden and an unlawful expansion of All-Writs.

Forcing highly valuable resources to this unprecedented task costs a lot of money each time, especially as they will not be compelled to keep a copy for future uses. The FBI might claim they are willing to pay the bill, but they have no idea how big it would actually be, and once they do find out they will typically resort to asking the courts to force Apple to pay the full price because “civic duty.”

Worse, forcing Apple to be State Safe-cracker for more cases in an uncertain future will immediately devalue the entire corporation and all of its products in the public view, and thus on Wall Street. The immediate loss will be billions even before the first phone is breached. If it continues on to other phones in other cases [using this as precedent] then those losses will become permanent and may even deepen to the point where thousands of American and Asian jobs will be lost forever. That is definitively an unreasonable burden which a court ignores at Apple’s and its own peril.

All-Writs was crafted for access to available documents only! Redefining it to force “landlords” of any kind to become safecrackers for the state is clearly beyond the text and its intention, no matter how the FBI want to portray it as an attempt to “keep up with the times.” The courts will have no choice but to tell the FBI to ask for new legislation, because the courts don’t have the authority to expand law past constitutional protections or the actual text of the legislation.

Bending a law against the constitution to fit needs is not unusual, but actually breaking it or changing a law to create new authorities or powers is legally impossible, inviting sanctions against an offending judge.

nasch (profile) says:

Re: Re: Re:8 Re:

All-Writs was crafted for access to available documents only! Redefining it to force “landlords” of any kind to become safecrackers for the state is clearly beyond the text and its intention, no matter how the FBI want to portray it as an attempt to “keep up with the times.”

Not a bad analogy. Has anyone ever attempted to use the All Writs Act to compel a locksmith or safe manufacturer to crack a safe that they didn’t own? Not provide a key, but use their time and expertise to do it.

The Wanderer (profile) says:

Re: Re: Re:6 Re:

Either I’m misunderstanding what would happen here, or you are.

According to my understanding, in order to get this phone to install the modified code, it would have to be provided as an updated iOS image.

According to my understanding, when you replace or upgrade a smartphone OS version, you do so essentially wholesale; you drop in the entire OS image, replacing everything which was there before, not just the pieces which were changed.

If that’s correct, then there would be no way to replace just the login-screen code; you would have to replace everything. It’s possible (even likely) that the replacements for everything else would not be (significantly) different from what was there before, but there would be no way to verify that without looking at the source code.
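
As a hedged aside: hashing every file in both unpacked images would at least narrow down which components differ, though it still wouldn’t show what the changed code does without the source. A rough sketch, with hypothetical directory names:

import hashlib
from pathlib import Path

def hash_tree(root):
    # Map each file under root (by relative path) to its SHA-256 digest.
    base = Path(root)
    return {
        str(p.relative_to(base)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in base.rglob("*")
        if p.is_file()
    }

# Hypothetical unpacked images: the shipping build and the modified build.
before = hash_tree("ios_shipping_unpacked")
after = hash_tree("ios_modified_unpacked")

modified = sorted(f for f in before.keys() & after.keys() if before[f] != after[f])
added = sorted(after.keys() - before.keys())
print("modified files:", modified)
print("added files:", added)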

Even if that’s not true, I’m not certain your apparent assumption is accurate: that there would be logs of the OS-update process detailed enough to determine whether anything other than the login-screen code had been modified. Certainly I’ve seen no sign of such logs on the Android side of the fence.

Beyond that, even if we assume that it can be proved that only the program(s) involved with handling the login screen were modified, there’s no reason why the login-screen program(s) could not (be modified to) include code capable of modifying other parts of the system – and I would be extremely surprised if there were enough logging to be able to catch it if they did.

Really, if you’re paranoid about every possible angle of attack and you don’t trust the people who are in charge of the operation to do the right thing and be honest about their actions and motives, there is no way to be certain that the modified code has not tampered with the data on the phone other than to see – and possibly to experiment with – the code itself.

Aaron Walkhouse (profile) says:

Re: Re: Re:7

Apple never said they couldn’t do it as a targeted patch.

In this case the court specifically ordered that just the one module be replaced. That module, designed for just one function, and that on a cell phone, is very small.

That makes it impossible to add sophisticated search-and-replace code which alters time stamps on files, as well as internal iOS logs and filesystem structures, well enough to fool forensics investigators on both sides of the case.

Don’t forget that this part of the job is for Apple alone.

The other logs I mentioned are not those of iOS, but of the technical staff and investigators involved in all stages of the procedure, and most of those will be written by Apple staff who have sole and uninterrupted custody of both their logs and the source code of the proposed tool. Apple can’t be elbowed out of the way, and not all FBI agents, techs and officials can be compromised at the same time; so yes, the combined logs and testimony of all involved on both sides of the case do effectively make shenanigans a no-go.

SteveMB (profile) says:

Even somebody who learned everything they know about the law from CSI: Wherever and Ace Attorney can see through the obvious absurdity of the “Apple can keep the backdoor code secure so it won’t get out” argument. If the Feds actually allowed that (or pretended to allow that) it would open the door to scenarios like:

Feds: We’ve got a search warrant for this phone, but we can’t get in. Can you help?
Me: (Sees owner’s name engraved on phone back, and recognizes it as that of the asshole who stole my girlfriend, ran over my cat, and keyed my car) Sure! There’s just one caveat, though — I can’t let you look at the code I’ll be using because it might release a dangerous cyber pathogen.
Feds: Well… OK.
(later)
Feds: Geezus Q. Christ! We thought this guy was just stealing credit card numbers, and it turns out that he’s the world’s biggest kiddie-porn meth-lab jihadist ringleader!

cjstg (profile) says:

Isn't that the point?

Isn’t this what the whole fight is about? I’m not even a lawyer, and I realized the first time I read about this case how important it was that Apple simply refuse to do this (under any circumstances, including jail time for contempt of court). Once the phone evidence enters the courtroom, the entire process of unlocking the phone is subject to scrutiny. The precedent issue is secondary.

Jeremy Lyman (profile) says:

Re: Isn't that the point?

I think the precedent is more important than that. An exploit for this version of the OS in the wild is bad, but setting the standard that companies are legally obligated to pour whatever resources the govt says into breaking their own tech is worse. It becomes an issue for corporate bean counters who will cost- and risk-analyze encryption with the knowledge that they may need to break it at some point. It’s much more efficient to pre-engineer exploits. We’ll never get a secure piece of software again.

That One Guy (profile) says:

Re: Re: Isn't that the point?

It becomes an issue for corporate bean counters who will cost- and risk-analyze encryption with the knowledge that they may need to break it at some point. It’s much more efficient to pre-engineer exploits. We’ll never get a secure piece of software again.

I would argue (and have) that it’s worse than that.

Once the precedent is set that companies can be compelled to break their own encryption you can be sure that any move towards encryption that they cannot break will be painted as companies attempting to ‘avoid their lawful obligations by making their products immune to legally issued warrants’. At that point it goes beyond a cost/risk analysis of how much it costs to develop encryption versus how much it would cost to break it, and moves into the realm where it becomes effectively impossible for them to ever implement truly secure encryption, as they’d face a PR and potentially legal nightmare if they ever tried.

Anonymous Coward says:

The government’s narrow tunnel vision to get what it wants at any cost suggests nobody thought beyond “this is a great way to open up access to encryption.” The ramifications of what it would really mean in the courts and the legal world were of no consequence at the time this theory of forcing Apple to open its encryption to law enforcement was conceived.

As time goes on and it is discussed, more and more this begins to look like a terrible idea.

aldestrawk (profile) says:

clarification

This article ought to have mentioned that any code used to update an Apple iPhone has to be digitally signed. Only Apple has the key necessary to sign such code. The FBI has not asked for that key, and Apple will not be required to release it. This is the whole reason the FBI wants to compel Apple to write code that defeats its own security: the FBI may be capable of writing such code, but it can’t update an iPhone with its version. The FBI also asked Apple to make the update work on only the one iPhone in question. The way to do this is to have the update check for one or more of the unique IDs used only on that particular phone (e.g. UUID, serial number, cell IMEI, Bluetooth and Wi-Fi MAC addresses). The presence of a digital signature also means that the FBI, or anyone besides Apple, cannot alter the code even if they had a copy of the un-compiled source code.
So, what’s all the worry about then? I don’t know the particulars of where, and how, these unique IDs are stored on the iPhone. What may be possible, though, is to spoof these IDs to make another iPhone appear to be the one used by the San Bernardino terrorists. Another possible weakness is that every time a small change is made in the digitally signed code, it becomes easier to crack the key. A multitude of law enforcement agencies getting a new version for each case may allow the signing key to be discovered. I don’t know if that is realistic in this instance, but it is something that should be looked at.
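
By way of illustration only, the device-gating idea might look something like the sketch below. Every identifier, value and helper here is hypothetical; Apple’s actual update code is not public.

# All identifiers, values and helpers below are hypothetical placeholders.
TARGET_IDS = {
    "serial_number": "PLACEHOLDER-SERIAL",
    "imei": "PLACEHOLDER-IMEI",
    "wifi_mac": "PLACEHOLDER-MAC",
}

def read_device_ids():
    # Stand-in for whatever firmware call returns this device's identifiers.
    raise NotImplementedError("hypothetical hardware query")

def bypass_permitted():
    ids = read_device_ids()
    # Every identifier must match; a single mismatch leaves the passcode
    # protections fully intact on any other phone.
    return all(ids.get(key) == value for key, value in TARGET_IDS.items())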

ThatDevilTech (profile) says:

Re: clarification

I think that was discussed when all this initially started, about the “one phone” BS. It may be about one phone THIS time, but who is to say someone doesn’t get the rogue code and reverse engineer it to work on ANY phone? Or to trick the signing functionality? I don’t want Apple to do this for ANY phone. It just sets too big a precedent.

sigalrm (profile) says:

Re: Re: clarification

Everyone’s obsessed with the fact that the computer in question is a phone.

It’s a computer, with an OS/Firmware.

Functionality aside, it’s fundamentally no different than any other Internet of Things device.

“Dear Amazon: We think Individual X may be up to something illegal. Please provide a custom firmware for their Alexa….”

“Dear Samsung: We think Individual X may be up to something illegal. Please provide a custom firmware for their smart TV…”

art guerrilla (profile) says:

Re: clarification

“The presence of a digital signature also means that the FBI, or anyone besides Apple, cannot alter the code even if they had a copy of the un-compiled source code.”

wha ? ? ?
NOT a programmer, but this set my BS meter pegging; not sure how in hell you can make the “un-compiled source code” un-editable/copyable/etc…

Anonymous Coward says:

Expert examination?

“Such an expert would, in preparation for his or her testimony, require access to and examination of the software,”

Next proposal on that from the FBI: “We,” in the form of the US govt, already have in place a standard procedure which should be used here: the TPP access procedures. The expert may be allowed to enter a room with no pencils, paper, cameras or other recording devices, and then may look at a printout of the code (in 4-point type) and perhaps even at a hologram of the phone’s internals. Director Comey will insist that is a sufficient examination; his experts told him so, even though he didn’t understand anything they told him (fully in the vein of the earlier TD story detailing his responses to Congress).

Surely a procedure which is deemed adequate for treaty examinations must be good enough for a mere phone.

Anonymous Coward says:

It’s time for a serious crowdmock of the cyber-pathogen idiocy. We want T-shirts, bumper stickers, signature lines. Herewith some modest proposals:

DORMANT CYBER PATHOGEN
(Do not wake | Do not turn me on)

I AM CYBER PATHOGEN
AND I VOTE!

CYBER PATHOGENS UNITE
Today San Bernardino, Tomorrow the (World|Underworld)!

UNLOCK THE iPHONE:
FREE THE SAN BERNARDINO CYBER PATHOGEN!

REAL CYBER PATHOGENS RUN ON ANDROID

CYBER PATHOGEN EXTERMINATOR
Your phone is pathogen-free.
You owe me $1,000,000

REAL PATHOGENS RUN ON DNA

(A picture of a unicorn in some appropriate but improbable pose–sleeping, rampant, penned up, dead, etc.–is optional but strongly recommended)

Other media shouldn’t be overlooked: say, tinfoil caps labelled “cyber pathogen protector” (with, of course, a unicorn head in the traditional red slashed circle); “cyber-pathogen-free” stickers to post on pay phones and power outlets–“let a thousand snickers bloom, let a hundred online shops contend.”

Dismembered3po (profile) says:

this certainly changes the calculus, doesn’t it?

THE KEY! THE KEY! THE KEY! THE KEY! THE KEY! THE KEY!

I’m not sure, but I don’t think this code itself is a big problem.

The big problem seems to me that in order to validate the code, and confirm that it works as advertised against a real device, the expert would have to have access to Apple’s signing key.

Apple’s signing key.

APPLE’S SIGNING KEY.

jim says:

Re:

So in their defense, you say it’s okay for a foreign government to have that ability to unlock the phone anytime it wants, but not one trying a supposed drug dealer, and two dead people who shot at the cops? Interesting argument.
I believe with the source code, telling the machines how to operate, it can be reverse engineered to show what is needed to make it operate, and what functions are needed to minimumly operate the machine.
Believe they already have all the information off the device,but it is legally unusable. In both cases. So what else are they after? Or who? Unusable, no warrants at the time. But a court would let them get away with that, but to use as evidence, that would be a very odd court. But it could be presented to a grand jury as hearsay, for further action. But it still didn’t get to the issue, why did Apple not fulfill the original request? It wasn’t the privacy issue then. They did it on other occasions, why stop then? There is some other motive, but what. Fired the wrong guy? Didn’t say pretty please? Wanting paid for the last time?

sigalrm (profile) says:

Maybe part of the problem here...

is that people don’t understand what a secret key looks like.

This is a 2048-bit RSA key I just generated:

—–BEGIN RSA PRIVATE KEY—–
MIIEpAIBAAKCAQEAzSOE0cwXfpZdYP9NI1j7kqNth/oLho2k5gnlXMMrq6m1Ba/s
HbvcPwU7tdovxUYg9+LVsN2YB/jsi4jJG/njvO9O330IvQ8fKvbxezgvWdOGI+sP
fm22WTZqRTdQ6NfUjL8DlJWsJZxihXhNP9SHLsQ4aa9j4iTRzYl+H6oa0msr4sfs
hoHuOQpkszDGy0vJ2Gxr/N0VnxGrmsaVmgDuj514pNVgWr24L+SbhZb3fUfRztAP
ky+q5N1AtE/INUAdPuEz+oO/OBymLOW6LKB7RbOljWJzNev5RtfxiWdwiDfH2SH0
TsslEQDk6/Ea1Ckz5EvH6pi93+su6zc8vbmAgQIDAQABAoIBACvMvqow4n9TyaJR
QH4gnK5lmJhk6hsTmTbIvCE/Rs7DUHRjaI28s7z8+A/PA04iuB1VYH0AA1sIajEs
xovjoh2QFw4e20PKu8PnsA24JFwQjt6SbN94u2t289/NfMgKdUaL7k7GWlg5eMu4
sP3E+gwhN05RdYkuhWFWTwihwFJWz8ygoJHfvxxRMstD20uAntNMI7gmWAV1seDB
BGnmzdhk1Ge9qVHvkjxbQYDlhjKCpWJQNM9ivPjNb57/2KYiHOmh0RyKS7QIQYtl
3TppOoUwOrg9Ld55xkubRAuj13oHIXJewcT8DxOHjJp4zkNMqwbcpMRApQRhxk3l
x9MLvKUCgYEA6D7XaNfMTKoihk2yHYR9MMyazGJ49gAdSB3VdeT2qXJqJfBN7FkS
X7kkFhAreW/QI7zSfo88i2eJY/hKF38ok50BB7mVQR5hcIvhpYPa7O6F4C2WJOkv
GhOIMTrlpX+jo68VThEhhH3TlIICa0ou3Ga/8UiHhV2NyjDK1vf+8i8CgYEA4h7o
5m3P1GFT3Hw93m9U6aejBrB4yyg55yXg6VrJnt1y5sFMNpkZDoRyJhEEZi1bujNU
y0rCUvYfACnkgoRjoAenqiuvD1GyLfhBtGL8m0RzDikwk/kQSEd2UrjgGdmkKKyG
TsJzKY5aoMhhmb90fZbDOUfnFS5uip90izmifE8CgYEAllQe8MFGf5Vc9ZwTH+Ij
etPlm0heTbWzPnv5MO+87d+eb+JFPihFqWpYvmNHELrcelV91uf2Y7HoD6qmouDv
LeVhxlNNFjKJFeWlcJKRwe1/AKXhWxEJKRLdhChAf8jH7mqlGrwh+vXLX4Rr9nC1
NnrX4WF2P1BYODkvAsjR4IcCgYEAvV8xojn5Ql64gwEyN2V58a1JZULKByqLQ8B/
Wi+Eh53iqsrb7yXMzFGz35mE26XFGm3+57qWgDBLyjFLhNsnLFD85BFtrSC4XrN5
I397GvX6fbOVUXfXYREoUSMv27ZgOwgx+yfylqz3zYvD4aVsA/oNSZ2kNCMMxN/C
FQ+RuxUCgYA6yDOODkNRoYGsKrEcV3rtwk+tT1Avt+M9KiDpI9PAlnrna9DUoJ1W
cHmHpyeGAiVk7vBtwgPypi4jEjtksXKvJZ07P9qgAlNbnbjaI2Ubdi56GnuJskEg
bLVa9iFrZvyKhsGCPmsxMnxFLs58HwLveuxjICQ0pqGPC72byUZHiA==
—–END RSA PRIVATE KEY—–

That’s it. This is a textual representation of a 2048-bit RSA key. Generate a CSR and a public key, and you can plug it into any Apache web server. Or use it to sign email. Or sign applications. And those signatures will be valid on any system with the public key installed as a certificate authority.

If you were to see Apple’s private key exported like this one is, it would look very similar, although (hopefully) 4096 bits instead of 2048 (twice as long). And it might be DSA, instead of RSA. I’m certain it’s stored in a _very_ tightly controlled environment.

This key fits trivially into a paste buffer. So would Apple’s. You could print it and type it in by hand if you were so inclined. Or take a picture and OCR it. And if that happens – just once – it potentially puts the security of every Apple device on the planet at risk.

Now, this is a simplistic example. I’m sure Apple’s implementation utilizes a hierarchy of similar keys, with limited uses, etc., all signed by a single master key which is stored in tamper-proof hardware, requires multiple people to get to it, and so on. But that master key only has to get exported once to the wrong individual to compromise the entire system.
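
For anyone curious how little ceremony is involved, a minimal sketch using Python’s third-party cryptography package: it generates a key of the same shape as the block above and produces a valid signature with it. (Illustrative only; this is obviously not Apple’s actual signing pipeline.)

# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Export the key in the same "BEGIN RSA PRIVATE KEY" form quoted above:
# just a block of text that fits in a paste buffer.
pem = key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.TraditionalOpenSSL,
    encryption_algorithm=serialization.NoEncryption(),
)
print(pem.decode())

# Whoever holds those bytes can mint signatures that verify against the
# matching public key, which is the entire risk of a one-time leak.
signature = key.sign(b"any blob, e.g. a firmware image", padding.PKCS1v15(), hashes.SHA256())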

Anonymous Coward says:

All phones have a “backdoor”: it’s called auto updates. Apple is the only one that has a key to iDevices. Apple does not have to “write a code,” as they already have the code written.
Apple is playing “I want to protect our users from the govt” to “up” their status among customers, when all I hear is lies from them about why they don’t want to assist the govt in accessing a known terrorist’s phone data. Apple is becoming a tool for the terrorists. Apple must want more terrorists to use their products. Apple has become a terrorist.

nasch (profile) says:

Re: Re:

Apple does not have to “write a code” as they already have the code written.

They have code already written to bypass the security lockouts on a phone? How do you know this? Neither Apple nor the FBI nor anyone else I’ve heard of is making this claim.

Apple is becoming a tool for the terrorists. Apple must want more terrorists to use their products. Apple has become a terrorist.

Toyota is becoming a tool for the terrorists. Toyota must want more terrorists to use their products. Toyota has become a terrorist.

Truthistruth (profile) says:

Why can’t anyone secure my devices? Including the great Apple?

For almost 2 years I’ve been dealing with identity theft, breaches, hacked emails, and cloned Apple ID and Microsoft accounts that are more harm than good. I spend hours every single day on the phone or in person with all kinds of support, fraud and criminal investigators, and get nowhere. I have been going to the local county library to check accounts, because my information on all my devices at home is not reliable given that the Internet and router have been compromised repeatedly. Today when I went to the local library to check on my Google accounts that were hacked into, I found out that the library’s Mac computer is not accessible and can’t be used by the library staff because MY old Apple ID, which had been compromised and removed from my phone, had taken over their Mac!! Seriously, not kidding, and no, I had not been there in a long time, and I make sure anytime I use any ID I sign out so my hackers don’t have it easier. No, it was not easy to remove; it took most of the day and 3 levels of senior support with Apple (yes, I know how “senior” they are, but the third one Apple connected, after the others couldn’t fix it, really was someone who knew tech stuff). Still, no one can secure my devices; 2 Apple IDs still say “signed into this phone.” Still have every account insecure. Did I tell you about the screenshots of my sign-in account info in Photos on the library computer!!

Truthistruth (profile) says:

It only happens in the movies

I love it when people say nobody really gets hacked, or that if you check your credit you are giving them access to your information. Like anyone needs your approval to access your information? All this Apple talk is just another “bailout” to help Apple sales. Now don’t you feel more secure with your non-hackable iPhone?
