Recovering an iPhone 5c Passcode

Remember the San Bernardino killer’s iPhone, and how the FBI maintained that they couldn’t get the encryption key without Apple providing them with a universal backdoor? Many of us computer-security experts said that they were wrong, and there were several possible techniques they could use. One of them was manually removing the flash chip from the phone, extracting the memory, and then running a brute-force attack without worrying about the phone deleting the key.

The FBI said it was impossible. We all said they were wrong. Now, Sergei Skorobogatov has proved them wrong. Here’s his paper:

Abstract: This paper is a short summary of a real world mirroring attack on the Apple iPhone 5c passcode retry counter under iOS 9. This was achieved by desoldering the NAND Flash chip of a sample phone in order to physically access its connection to the SoC and partially reverse engineering its proprietary bus protocol. The process does not require any expensive and sophisticated equipment. All needed parts are low cost and were obtained from local electronics distributors. By using the described and successful hardware mirroring process it was possible to bypass the limit on passcode retry attempts. This is the first public demonstration of the working prototype and the real hardware mirroring process for iPhone 5c. Although the process can be improved, it is still a successful proof-of-concept project. Knowledge of the possibility of mirroring will definitely help in designing systems with better protection. Also some reliability issues related to the NAND memory allocation in iPhone 5c are revealed. Some future research directions are outlined in this paper and several possible countermeasures are suggested. We show that claims that iPhone 5c NAND mirroring was infeasible were ill-advised.
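The mirroring loop the abstract describes can be sketched in miniature. The following Python is a toy simulation, not the hardware procedure: the `Phone` class and its retry counter stand in for real NAND state, the passcode "4557" is an arbitrary example, and the backup/restore steps model physically re-flashing the cloned chip.

```python
# Toy simulation of NAND mirroring: the passcode retry counter lives in
# flash, so restoring a pristine flash image resets it, defeating the
# 10-strike wipe. All objects here are stand-ins for hardware steps.

class Phone:
    LIMIT = 10  # iOS destroys the key after 10 wrong passcodes

    def __init__(self, passcode):
        self.nand = {"retry_counter": 0, "wiped": False}
        self._passcode = passcode

    def try_passcode(self, guess):
        if self.nand["wiped"]:
            raise RuntimeError("key destroyed")
        if guess == self._passcode:
            return True
        self.nand["retry_counter"] += 1
        if self.nand["retry_counter"] >= self.LIMIT:
            self.nand["wiped"] = True
        return False


def mirrored_bruteforce(phone, candidates, tries_per_cycle=6):
    backup = dict(phone.nand)              # clone the flash image once
    for i, guess in enumerate(candidates):
        if i > 0 and i % tries_per_cycle == 0:
            phone.nand = dict(backup)      # re-flash: counter back to 0
        if phone.try_passcode(guess):
            return guess
    return None


phone = Phone(passcode="4557")
found = mirrored_bruteforce(phone, (f"{n:04d}" for n in range(10000)))
```

The retry counter never reaches the wipe limit because the image is restored after every batch of guesses, which is exactly why the counter cannot be trusted once an attacker can rewrite the flash it lives in.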

Susan Landau explains why this is important:

The moral of the story? It’s not, as the FBI has been requesting, a bill to make it easier to access encrypted communications, as in the proposed revised Burr-Feinstein bill. Such “solutions” would make us less secure, not more so. Instead we need to increase law enforcement’s capabilities to handle encrypted communications and devices. This will also take more funding as well as redirection of efforts. Increased security of our devices and simultaneous increased capabilities of law enforcement are the only sensible approach to a world where securing the bits, whether of health data, financial information, or private emails, has become of paramount importance.

Or: The FBI needs computer-security expertise, not backdoors.

Patrick Ball writes about the dangers of backdoors.

EDITED TO ADD (9/23): Good article from the Economist.

Posted on September 15, 2016 at 8:54 AM • 45 Comments

Comments

Tom Hunt September 15, 2016 10:31 AM

Wasn’t there some talk of the phone’s actual encryption key being derived from a PBKDF of the PIN plus a securely-stored per-device random vector, or something? Thus, you could unsolder the flash and attempt brute-forcing that, but you’d be stuck brute-forcing AES256 or whatever, rather than a 4-8 digit PIN.
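Tom Hunt's recollection matches the published outline of iOS data protection: the passcode is "tangled" with a per-device hardware key. Here is an illustrative sketch of that idea (not Apple's actual key hierarchy; the iteration count and key sizes are example figures):

```python
import hashlib
import os

# Illustrative only -- NOT Apple's real key derivation. The point is that
# the data key comes from BOTH the short PIN and a per-device secret (the
# UID key, fused into the SoC and never stored in flash), so a copied
# NAND image alone leaves a ~256-bit search space, not a 4-digit one.
device_uid = os.urandom(32)          # stands in for the SoC-internal UID key


def derive_key(pin: str) -> bytes:
    # PBKDF2 tangles the PIN with the device secret; 100k iterations is
    # an arbitrary example figure.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), device_uid, 100_000)


k1 = derive_key("1234")
k2 = derive_key("1235")              # one digit off -> unrelated key
```

This is why the mirroring attack still has to run its guesses through the phone's own SoC rather than brute-forcing the extracted flash offline: without the UID secret, the attacker faces the full AES keyspace.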

Bill September 15, 2016 10:37 AM

“The FBI needs computer-security expertise, not backdoors.”

Well, maybe. Out of a few hundred thousand computer security experts came that one paper.

And it only makes sense that TAO's expertise comes not from that small group; rather, TAO is fed expertise from a wide range of .gov and contract researchers and grants.

So, what they need is to be funneled expertise from a wide set of resources into a team of skilled operators. Likewise, why would you ask Seal Team 6 to manufacture the high tech tools they use?

uh, Mike September 15, 2016 10:50 AM

In addition, I don’t believe anything the FBI said or says about breaking into iPhones.

When they said they got in anyway, but didn’t get any usable evidence, I believe they were lying. I believe they couldn’t get in, and made up a story to save face.

phessler September 15, 2016 10:52 AM

@Bill: Yes, one expert out of thousands proved it can easily be done. Probably only one bothered to a) attempt it, and b) publicly display the proof. I'm guessing lots did a), and sold the results to the highest bidder.

henry September 15, 2016 10:56 AM

@Bill

what they need is to be funneled expertise

That's a very passive way of saying that the FBI themselves failed to gather the required expertise, or even to listen to the industry expert opinions that were provided but unwelcome.

It’s almost like the FBI needs some kind of … Federal … mandate to gather the expertise they need to … Investigate … the issues they face. Bureau.

Raelion September 15, 2016 11:18 AM

“The FBI needs computer-security expertise, not backdoors.”

How about they need enforceable punishments against corporations choosing not to comply with subpoenas?

There was no new concept being contended with in this case. There was no new technological barrier. This company just chose not to comply, as any bank with a safety deposit box could. Except this one thought it would represent some new change in something-or-other, because they had achieved some meaningful status they hadn't.

Since that time we’ve been discussing this on the basis that there is no crypto issue here, it’s just a wipe feature with a counter that needed circumventing.

FBI isn’t the org here that has a “we fkd up and need a fix” issue, Apple does.

Daniel September 15, 2016 11:54 AM

“The FBI needs computer-security expertise, not backdoors”

I am sympathetic to the "lawful hacking" argument, yet this comment by Bruce is a particularly disingenuous way to frame the debate. Lawful hacking is a backdoor. Lawful hacking draws a distinction between a series of individually targeted backdoors and a massive one-size-fits-all backdoor. It is not useful to pretend that the difference between backdoors and targeted attacks is anything more than an impediment to scaling.

I worry that the way Bruce frames this issue leads the casual observer to believe that she is safer than she actually is.

Anon September 15, 2016 12:13 PM

“Out of a few hundred thousand computer security experts came that one paper.”

@Bill, do you really believe there are a few hundred thousand security experts in the world capable of such a feat? You are likely off by a factor of 100 in that assessment. Try a few thousand who are both capable and willing to share, as @phessler pointed out.

Wael September 15, 2016 12:24 PM

One of them was manually removing the flash chip from the phone, extracting the memory, and then running a brute-force attack without worrying about the phone deleting the key.

The proper way is to not intrusively tamper with the device. Preserving device state is important. This was covered a while back here and here. If the NAND is encrypted with a hardware key on a chip, such as a TPM or "something similar", then desoldering the memory and brute-forcing it offline would be an intractable task. RobertT spoke about PUFs a while back (a TPM qualifies, to some extent, depending on the implementation) here.

There were numerous other threads that talked about offline attacks to bypass anti dictionary attack protection mechanisms, and the proper steps that need to be taken by an “attacker” to extract the needed information. The first of these steps is to air-gap the device to inhibit any remote wipe commands (there are counter-counter defense mechanisms to that too :))

Marty September 15, 2016 12:37 PM

@hawk

The bad news is that someone broke into your house and stole your stuff.
The good news is that you can now tell your neighbors “See, I told you.”

Bob September 15, 2016 1:01 PM

@Marty

The bad news is someone broke into EVERYONE'S houses worldwide and rifled through EVERYONE'S stuff… The good news is that since EVERYONE commits 3 felonies per day, now EVERYONE can be put in prison at any dictator's whim, anywhere in the world, even in so-called "free" countries… Wait, where was the good news again?

ab praeceptis September 15, 2016 1:46 PM

Some observations:

  • The "good old" triple I have seen so often in real life. There is a) us (apple), b) our customers, and c) the rest of the world, where each one is considered the enemy of the former: r.o.w. is assumed to want to attack customers, and customers are assumed to attack apple. Note that "customers" beyond John and Mary also includes competitors. More broadly, a customer is anyone who gets his hands on the product, for whatever reason.
  • The vectors of interests, as well as those of supposed attacks, are different. While creating nice "security" powerpoint slides, apple's by far highest true priority is their own security, i.e. their desire to protect their technology, patents, market position, unique selling points, etc.

This can be clearly seen in Sergei Skorobogatov's paper. Examples are the switched protocol, the private secret data area, and most stunningly the bit 7 "glitch".

  • The steps to protect us, apple, are quite different from those to protect customers. The latter are protected by a construction of rather standard mechanisms. Funnily, the marketing department had their say in the technical design in more than one way. Obviously some mechanisms that are seen as high-end (e.g. 256-bit sym. crypto) were demanded, as well as some construction that clearly demonstrates apple being serious and very capable regarding their customers' security; no surprise there. But marketing also demanded that the security be constructed in such a way as to allow carrying customer data over to another (apple) device without much pain.

I don't say that with moralistic undertones; I merely spell out an observation. But it's an interesting one, because unlike what the customer might think (driven by marketing indoctrination), his security is not designed under the singular priority of maximum customer security. It is actually a blend of diverse factors, apple's sales interests being of no less importance than the customers' need for security.

  • SbO (security by obscurity) is obviously alive and well. Interestingly, SbO seems to be preferred even when it's about apple's own interests, the highest of which is selling phones. There are clear hints that apple wants it to be easy to brick a phone when trying to pry their mechanism open. While that's more of a nuisance to a security researcher with a reasonably fitted lab, it's a killer for the curious geek user or for the low-level backyard "nand broken? No problem, will fix that" service.

It's not that there are no ways. apple could, for instance, hardware-hash their major chips. That would be a classical crypto approach: mumble-jumble some chip PUFs, then pk-encrypt a differentiator, and you're done. Nobody but apple can muck around and replace chips (or, more likely, force a new phone upon you plus a fee to "save" and transfer your valuable data). But that's not the way they go. The way they go is the way of protocol switching and tiny "glitches", i.e. SbO.
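The component-binding idea sketched above can be shown in a few lines. This is a toy model under loud assumptions: real PUF responses are noisy analog measurements, and `os.urandom` merely stands in for them; the pairing scheme is illustrative, not anything apple ships.

```python
import hashlib
import hmac
import os

# Toy sketch of hardware-hashing the major chips: each chip contributes
# a (simulated) PUF response; the SoC stores a MAC over all of them at
# pairing time. Swap any chip and the MAC no longer verifies.

def pair(soc_key: bytes, chip_responses: list) -> bytes:
    digest = hashlib.sha256(b"".join(chip_responses)).digest()
    return hmac.new(soc_key, digest, hashlib.sha256).digest()


def verify(soc_key: bytes, chip_responses: list, tag: bytes) -> bool:
    digest = hashlib.sha256(b"".join(chip_responses)).digest()
    expected = hmac.new(soc_key, digest, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)


soc_key = os.urandom(32)                    # secret held by the SoC
chips = [os.urandom(16) for _ in range(3)]  # NAND, baseband, etc.
tag = pair(soc_key, chips)
swapped = [os.urandom(16)] + chips[1:]      # NAND chip replaced
```

With such a scheme a desoldered-and-replaced NAND chip would fail verification cryptographically, rather than relying on undocumented protocols to frustrate the swap.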

  • We should note that there is a (not so) new animal in the SbO zoo -> loads of money and the power that comes with it. More concretely: one of apple's tools is the plain fact that they have the size, the position, and the money to get customized chips, undocumented (except for apple), with some flipped connectors/pins, some added "secret" data store, etc. The desirable consequence? Among others, they severely cut down the number of potential attackers by very considerably raising the entry barrier.
    For crypto you need brains and a $500 computer. To play against what Sergei Skorobogatov describes, you need a PhD, lots of hands-on experience, and a rather well-equipped lab. To do it efficiently (as in "within reasonable time and with high success rates and reliability") one needs an assortment of people with PhDs, lots of experience, and a high-end lab.

In other words: apple strongly biases the game in favour of fbi, nsa, and the likes.

In yet other words: They are already way ahead on their way to backdooring everything but the backdoor won’t be in crypto.

  • I posit that that was their real problem, namely how to backdoor everything in a way that still looks good to customers and, even more importantly for apple, that opens the backdoor for government agencies (supposedly us-american ones only) but not for competitors.

  • Finally, I'm amazed to still find quite a few analogies between IT-sec and the military. The way I read it (and saw it quite often in companies), they tend to apply a rather classical military mindset when it's about the protection of their interests. Not the worst of all mindsets, and one that after all has survived and proved itself valuable for millennia.

Deontic Doi September 15, 2016 2:32 PM

Spend lots of time and effort and money strengthening encryption. Then spend lots of time and effort and money helping FBI goons negate strengthened encryption. Right, Einstein, run a giant arms race with yourself. Where do they find these idiot savants to write at Lawfare?

Landau is a classic example of the kind of technical robot who needs adult supervision to think with words. She appears to have no inkling that the right to encryption CALEA affirms is an integral part of your freedom from self-incrimination. And that in a civilized country, sometimes the FBI might have to put on their big-boy pants and investigate things instead of dredging the minutiae of citizens' lives like the Stasi do.

That crap like that seeps into this blog makes one wonder if there's anyone on the Tor project board who is clear on the human rights concept they espouse. Maybe Alison Macrina will have to make a block diagram of it for everybody, because it's really not that hard once you Read The Frickin Manual.

Wael September 15, 2016 4:34 PM

@Emma,

Have you taken a look at this article?

Yes. Doesn’t seem to support 5C “yet”.

I wonder if security firms use utilities like this for these devices.

They use more sophisticated tools that aren’t normally available to the public.

Jon Do September 15, 2016 6:42 PM

It still comes down to a brute-force attack, even with the 10-strike feature broken. Which means that if the passcode is extremely complex and long, this method still doesn't "break" the encryption, just a small safety feature.

Wael September 15, 2016 7:49 PM

@Bruce Schneier,

So many links… From: Patrick Ball writes about the dangers of backdoors.

In February, a study by the Berkman Klein Center for Internet & Society at Harvard University focused on some 865 encryption products from 55 countries, two-thirds of which were built outside of the United States.

Aaaaand, from your online Bio…

a fellow at Harvard’s Berkman Center

Seems a no-brainer they utilized this survey result.

Anon10 September 15, 2016 11:27 PM

The article is slightly dishonest, in that most people on the Apple side of the debate believe the FBI should immediately disclose any vulnerabilities it finds to Apple, so that Apple can eliminate them. They don't want vulnerabilities to remain undisclosed or unpatched, as seen in the calls for the FBI to disclose the vulnerability it used in the San Bernardino case.

unbob September 16, 2016 5:38 AM

@Raelion:

“How about they need enforceable punishments against corporations choosing not to comply with subpoenas?

There was no new concept being contended with in this case. There was no new technological barrier. This company just chose to not comply like any bank with a safety deposit box could. Except this one thought it would represent some new change in something something, because they had achieved some meaningful status they hadn’t hadn’t.”

You’re trolling right?

  • Equating a telecommunications / computing device with a safety deposit box.
  • Ignoring the questionable application of the All Writs act
  • Disregarding the implications for other user if it is mandated that encryption be breakable by the company

I’m sorry. Haven’t had my coffee yet. Not sure why I even began to reply. Carry on.

Eire Old Boy September 16, 2016 7:14 AM

What happens if Apple or its Chinese subcontractor start to make a FIPS 140-2 phone for the government or associated agencies (wink, wink) that self-destructs if the case is opened? I wonder how far that bird would fly before being shot down. What’s good for the goose is good for the gander! But, of course, turkeys do not vote for Thanksgiving (or Xmas, for the Euros, including Britain for at least next Xmas.)

Local September 16, 2016 8:04 AM

“Then spend lots of time and effort and money helping FBI goons negate strengthened encryption. Right, Einstein, run a giant arms race with yourself. Where do they find these idiot savants to write at Lawfare?”

Encryption wasn’t a problem needing circumvention. Where’d they find you?

Raelion September 16, 2016 8:18 AM

@unbob

“You’re trolling right?”

No, let’s dance.

“Equating a telecommunications / computing device with a safety deposit box”

No, equating one company served with a subpoena for its customer's information, stored in a service provided by the company, whose security protections it could absolutely circumvent, on a Tuesday.

If you want to claim that Apple is special because they bevel the edges of their products in an aesthetically pleasing way, state it clearly.

“Ignoring the questionable application of the All Writs act”

To no greater degree than Apple did with the last warrant for the same device, same challenges, same everything they complied with before they decided to stop doing that for PR purposes.
Want to call me deplorable for doing it for frivolous reasons?

“- Disregarding the implications for other user if it is mandated that encryption be breakable by the company”

Literally not an issue involved here. Did you spot anyone mentioning the super-difficult-in-theory algorithm at all? No? There's a reason for that.

Raelion September 16, 2016 8:24 AM

“What happens if Apple or its Chinese subcontractor start to make a FIPS 140-2 phone for the government or associated agencies (wink, wink) that self-destructs if the case is opened?”

I believe Snowden was the name of the foreign-government subcontractor with the proposal for a phone that self-destructs if opened.

Seriously, did we ever locate a user in need of a phone cover that you must solder onto your phone to give you an LED notification of something, while not actually protecting you from anything?

Clive Robinson September 16, 2016 8:28 AM

@ Raelion,

Since that time we’ve been discussing this on the basis that there is no crypto issue here

That's actually not true.

There is a very fundamental crypto issue involved, which was the attempt to force Apple, against their will, to:

A, Write backdoor code.
B, Sign it with their Private Key.

Whilst arguably many people with access to the source code could do A, it is only people with access to the Private Key who could do B. Importantly, without B, step A is a fairly pointless exercise.

But let's look at step A again: the source code is a "trade secret" belonging to Apple, with a very high degree of Intellectual Property involved and an arguably very, very high value. Its disclosure would, with little doubt, cause irreparable harm to Apple and its shareholders. Under US law, Apple are required to protect such assets to the best of their ability, or not just get sued by the shareholders but face criminal sanctions, loss of livelihood, and imprisonment. You could argue that the FBI could pay another person to reverse-engineer the source code from another phone of the same model, etc. But again, that would involve "criminal" activity under US law, so the FBI, by paying somebody to do that, would be party to a criminal act as well as conspiracy and incitement, something that even the FBI are not legally allowed to do.

But the FBI tried very old (probably repealed) legislation to compel Apple to carry out steps A and B. Whilst Apple is not a living entity, its employees are; thus by simple extension the FBI were trying to put them in the position of a slave, that is, without free will. You might want to check the various fundamental parts of US legislation, as well as the international treaties they have not just been involved with drafting but signed up to since the latter part of the first half of the last century. Again, it is not something the FBI should be doing.

I could go on but as others have noted your general presentation is indicative of bias beyond reasonable disagreement with what is known, which calls into question either your understanding of the events or your motivation.

Clive Robinson September 16, 2016 8:54 AM

@ Bruce,

The problem with the method is that technically it involves "tampering with evidence" and had a more-than-insignificant chance of "destroying evidence".

I suspect that the FBI were at least sufficiently aware of the likely outcome of either case in court, irrespective of the fact that the alleged terrorist was deceased.

It needs to be said that if they had removed chips (tampering) they would have come up against an issue that has dogged computer forensics for years: the right of the defendant not only to contest evidence but to employ their own forensic examiners to check for tampering, falsification, and a whole host of other wrongs that the FBI, their employees, or agents could have inadvertently or deliberately carried out.

At best it would have been a high-risk approach; at worst it could have formed case law with very significant negative outcomes for them. Given that they had drummed this up into a very high-profile case, I'm doubtful that they would have wanted to take on that risk. However, by "playing dumb" and trying to force others, they were in effect "externalising the risk". Unfortunately they miscalculated the likely response and outcome; even trying to blackmail Apple into compliance had the opposite of the intended consequences. Their only option left at the end of the day was to "back down", and this they again did badly, but at least other case law with negative consequences was avoided.

Unfortunately in the US you have "both sides pay their own costs" in court actions. If Apple or other defendants could recover their costs, the result for the FBI would likely have been "heads rolling" at the very least. The fact that the FBI do not have to pick up a defendant's costs means that they can take very high-risk court action without personal liability. Perhaps it is time that this changed, so that it stops what amounts to malicious prosecution.

Clive Robinson September 16, 2016 9:20 AM

@ Hawk,

Is this good news or bad news?

It rather depends on your perspective.

For the FBI and DoJ it’s actually bad news. As I explained above there are risks involved with the process and it’s questionable as to if they would be able to use the method and then present the results in court.

Further, there is now another "hidden rod" for lawyers to hit the FBI/DoJ over the back with. If the FBI come to court and present evidence from this model of phone in the future, then they can and will be challenged as to how the evidence was obtained. People are not going to believe that other methods, rather than this "evidence tampering", were used; thus the FBI would be forced into either revealing their supposed "other method" (which they currently maintain is secret) or having the judge throw the case out.

This may well force the FBI/DoJ not to prosecute cases due to this risk.

Which brings up the question as to if the money they supposedly spent on this “other method” was a wise decision.

However, as has been noted above by others, this "secret" method "miraculously appeared at the last minute", enabling the FBI to avoid what would almost certainly have been a significant loss in court; it may not actually exist…

If that is the case, then the FBI/DoJ could be in for a very rough ride indeed. Even though it is known that the FBI lie to defendants, and likewise assumed that they commit perjury all the time to get guilty verdicts, they get away with this because proof is usually either unavailable or difficult to get at, as judges are reluctant to act upon defendants' claims (though this is starting to change, and the release of the Stingray documentation will give rise to further reasonable suspicion).

So for some types of criminal and even those seeking the truth then there is a germ of good news about it.

Marcos Malo September 16, 2016 1:07 PM

@Clive

I did a bit of research (an ongoing hobby of mine) wrt to government attempts to circumvent and weaken encryption, so let me throw in some stuff I’ve found.

The All Writs Act (AWA) is still law—in and of itself it is not obsolete, despite its age. It’s the basis for a lot of case law. What the FBI and the Department of Justice tried to do here was use a completely novel interpretation of the AWA—meaning that there was NO precedent and that it would have vastly expanded government power.

Additionally, this government interpretation would overturn a law that Congress enacted: CALEA. Lower courts can't just overturn laws passed by Congress; only the Supreme Court can do that, and only on the basis of a law violating the U.S. Constitution. The AWA is not part of the constitution. The government's interpretation of the AWA does violate the constitution.


The rationale given by James Comey, director of the FBI, was that evidentiary considerations were secondary. The prime motivation for hacking the phone was to prevent other terror attacks. He made a number of emotional appeals that were variations of the "ticking time bomb" scenario (despite overwhelming evidence that the handset contained nothing of value). No doubt this was a misrepresentation of the FBI's true motives (as seen in another contemporary case in NY*).


A lot of hay is being made (correctly) about the government’s interpretation of the AWA violating the 5th amendment, but in the San Bernardino case, that was moot: the suspects were dead and the dead are not afforded posthumous constitutional rights.

More broadly, the attack on encryption does affect everyone else’s 4th amendment rights. If the government got its way, it would have impinged on the constitutional rights of people in the U.S. The 4th amendment affirms the rights to secure one’s person, property, papers, and effects, with a very narrow government exception on a case by case basis. For example, the government could demand that a lockbox maker hand over a master key in its possession. However, it could not demand that the lockbox maker design a master key when none existed. That would be a violation of other constitutional rights against involuntary servitude.


* Memorandum and Order by James Orenstein, magistrate judge, Eastern District of New York https://img.nyed.uscourts.gov/files/opinions/Order%2015mc1902.pdf if you haven’t seen this document it is worth reading. Judge Orenstein dismantles the government’s arguments—the same arguments it makes in the San Bernardino case.

Anon September 16, 2016 3:56 PM

@Marcos
However, it could not demand that the lockbox maker design a master key when none existed. That would be a violation of other constitutional rights against involuntary servitude.

This is the type of overly creative argument that makes for fun law school debates, but would get you laughed out of the courtroom if you tried to argue it before a judge. If the government couldn’t force you to work for them, people couldn’t be drafted into the military, forced to serve on juries, or required to perform community service.

Clive Robinson September 16, 2016 4:31 PM

@ Bill Eccles,

Everybody knows there isn’t such thing as “local electronics distributors” anymore.

Not sure where you live but I’ve got three “local electronics distributors” stores within a half hours walk of where I live. And maybe another six within a 20Km radius two of which are outlets for some of the biggest distributors in Europe.

I guess it's the old problem of "Location, location, location"…

65535 September 16, 2016 10:19 PM

@ Marcos Malo

“What the FBI and the Department of Justice tried to do here was use a completely novel interpretation of the AWA—meaning that there was NO precedent and that it would have vastly expanded government power.”

Ding ding ding! You are the winner!

There was no current precedent for violating the Fourth amendment and re-interpreting the AWA.

The DOJ/FBI gambit to set such a precedent would have drastically expanded their fishing power to dig through personal data and to enlist major phone providers to help them dig through personal data. They failed.

Luckily, Apple saw what was at stake and fought the FBI tooth and nail. The DOJ/FBI backed-down.

I applaud Apple for making a public stand [although we will not know what is behind the curtain, because of the overused/abused phrase "National Security"].

Anon10 September 16, 2016 11:33 PM

Let’s deconstruct some of the inconsistencies and illogical statements in Patrick Ball’s argument.

"the lesser-known hazard is that it could jeopardize the safety of human rights activists, primarily those based abroad who rely on U.S. encryption tools to do their work" and "In February, a study by the Berkman Klein Center for Internet & Society at Harvard University focused on some 865 encryption products from 55 countries, two-thirds of which were built outside of the United States."

If the FBI did get a backdoor for one product, then why couldn't human rights activists switch to one of the other 864 cryptography products? Maybe the loss of business could hurt Apple or even the US economy, but it seems human rights activists could easily adjust, if you take his 865 number at face value.

The Yezidi have one of the strongest cases of genocide that I have seen in recent years, but to bring justice to their people, they will have to prove it in a fair trial of the perpetrators.

Here, Ball is full-on delusional. He seems to live in some fantasy where Iraq isn't in the middle of a civil war and where it would be possible for law enforcement to arrest ISIS members and then try them in court, as if Iraq were comparable to Germany.

Jim N September 16, 2016 11:54 PM

@ Anon10,

“If the FBI did get a backdoor, for one product, then why couldn’t human rights activist switch to one of the other 864 cryptography products?”

They could if they knew which ones out of 865 are backdoored. Perhaps, you can point us to a list.

soothsayer September 22, 2016 8:32 PM

The moral of the story is what?

If FBI could extract the chip and do what the prof. did .. then it’s not a security issue so FBI’s claim was bogus .. but so is grandstanding by Apple.

All it means is that if you can do something for $1.00 we will spend $1000000 to do the same and feel we have saved the planet.

Where I come from there is a saying — you can touch your ears by moving your hand up .. or you can run your arm though your legs — bend over and do the same.

Is this science or security or stupidity .. I vote for the last.

Clive Robinson September 23, 2016 1:47 AM

@ soothsayer,

Is this science or security or stupidity .. I vote for the last.

Undoubtedly but by whom and why?

This method will work against any password, and if you get lucky, relatively quickly. But… the odds of that happening depend on the strength of the password. So: quickly only if the "owner is stupid".

However, the method has a probability of damaging the phone beyond use and destroying it as evidence. The odds of damaging the phone are related to the number of tries carried out, so at some strength of password the "agency" will have wasted not just a singular amount of effort and resources, it will also have shot itself in the foot… Thus the "who's stupid" roles will have reversed…

Which leaves the question: just what strength of password is needed to make fools of the agency?
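Clive's wear-out point can be roughed out numerically. This is a toy model under loud assumptions: 3000 program/erase cycles is an assumed MLC-class endurance figure, it ignores wear-levelling, and it ignores the option of cloning onto fresh chips as the paper's test board allows.

```python
# Each counter reset rewrites NAND blocks, and flash endures only a
# limited number of program/erase (P/E) cycles. 3000 is an assumed
# MLC-class figure; restoring after every 6 guesses follows the
# mirroring procedure's batch size.

ASSUMED_PE_CYCLES = 3000
TRIES_PER_RESTORE = 6


def restores_needed(keyspace: int) -> int:
    # worst case: exhaust the whole passcode space
    return keyspace // TRIES_PER_RESTORE


survives_4digit = restores_needed(10**4) <= ASSUMED_PE_CYCLES   # 1,666 resets
survives_6digit = restores_needed(10**6) <= ASSUMED_PE_CYCLES   # 166,666 resets
```

Under these assumptions a 4-digit search fits comfortably within the flash's endurance, while a 6-digit search exceeds it by two orders of magnitude, which is one concrete answer to the question of what password strength turns the tables on the agency.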

soothsayer September 23, 2016 7:42 PM

@Clive

I think you missed the point totally.
The method DESTROYS the phone .. by removing the chip first ..

Recovery is trivial technically, but at a fair bit of cost (not prohibitive, but still expensive; nothing law enforcement doesn't spend on a single lawyer during any investigation).

It is stupidity on everyone’s part to support APPLE .. they were wrong on this issue and no amount of hyperventilating will deflect that.

The FBI, on the other hand, wanted a precedent; their goal, nefarious as it is, will prevail in the end. For good or for bad, society will not let this "dark parallel world" exist forever.

All APPLE and their legionaries have done is burnt a few 100M’s in the name of privacy – that’s what I meant by voting for stupidity.

Clive Robinson September 24, 2016 7:15 AM

@ Soothsayer,

The method DESTROYS the phone .. by removing the chip first ..

That is not destroying the phone, any more than taking the engine out of a Mustang would destroy it. You can put it back a number of times before it becomes "damaged beyond repair". The only question then is, "excluding accidents, how many times?".

Aside from the mechanical issues, each chip has a limited lifetime due to being powered on and off. Further, Flash and other PROMs have limited reprogramming lifetimes.

These are the more obvious “engineering” issues. Which also have a bearing on “resource” issues and thus the “monetary” issues.

With regards,

The FBI, on the other hand, wanted a precedent; their goal, nefarious as it is, will prevail in the end. For good or for bad, society will not let this "dark parallel world" exist forever.

Your meaning is far from clear. Yes, the FBI will push as hard as possible for whatever makes their life easier. But to say they will prevail in the end is far from certain; for that, society would have to meekly accept the situation for eternity. The one thing history tells us is that empires, good or bad, fail from within, thus it's more reasonable to assume the FBI and its works will fail. Further, laws get repealed, updated, or changed, thus case-law precedent can be wiped out at the stroke of a legislator's pen.

Which brings us back to what society will and will not accept. You suggest that society will not accept a “dark parallel world” but you are ambiguous thus it is unclear if you mean the FBI or those the FBI seek.

Which brings us to your final point,

All APPLE and their legionaries have done is burnt a few 100M’s in the name of privacy – that’s what I meant by voting for stupidity.

To which the obvious comment is “What price freedom?”.
