Decrypting an iPhone for the FBI

Earlier this week, a federal magistrate ordered Apple to assist the FBI in hacking into the iPhone used by one of the San Bernardino shooters. Apple will fight this order in court.

The policy implications are complicated. The FBI wants to set a precedent that tech companies will assist law enforcement in breaking their users’ security, and the technology community is afraid that the precedent will limit what sorts of security features it can offer customers. The FBI sees this as a privacy vs. security debate, while the tech community sees it as a security vs. surveillance debate.

The technology considerations are more straightforward, and shine a light on the policy questions.

The iPhone 5c in question is encrypted. This means that someone without the key cannot get at the data. This is a good security feature. Your phone is a very intimate device. It is likely that you use it for private text conversations, and that it’s connected to your bank accounts. Location data reveals where you’ve been, and correlating multiple phones reveals who you associate with. Encryption protects your phone if it’s stolen by criminals. Encryption protects the phones of dissidents around the world if they’re taken by local police. It protects all the data on your phone, and the apps that increasingly control the world around you.

This encryption depends on the user choosing a secure password, of course. If you had an older iPhone, you probably just used the default four-digit password. That’s only 10,000 possible passwords, making it pretty easy to guess. If the user enabled the more secure alphanumeric option, the password becomes far harder to guess.

Apple added two more security features to the iPhone. First, a phone could be configured to erase its data after too many incorrect password guesses. Second, it enforced a delay between password guesses. That delay isn’t really noticeable to a user who mistypes a password once and has to retype it, but it’s a large barrier for anyone trying password after password in a brute-force attempt to break into the phone.
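The arithmetic behind those two paragraphs can be sketched quickly. The guess rates below are illustrative assumptions for the sake of the example, not Apple’s actual parameters:

```python
# Illustrative arithmetic only; the guess rates are assumptions for
# this example, not Apple's actual figures.

def keyspace(alphabet: int, length: int) -> int:
    """Number of possible passwords of a given length."""
    return alphabet ** length

four_digit = keyspace(10, 4)   # 10,000 possible 4-digit passcodes
alnum6 = keyspace(62, 6)       # ~5.7e10 six-character alphanumerics

def exhaust_hours(space: int, guesses_per_sec: float) -> float:
    """Worst-case hours to try the whole space at a fixed guess rate."""
    return space / guesses_per_sec / 3600

# An enforced one-second delay keeps even the tiny 4-digit space slow;
# without it (say 1,000 guesses/sec), the same space falls in seconds.
print(exhaust_hours(four_digit, 1.0))     # ~2.8 hours with the delay
print(exhaust_hours(four_digit, 1000.0))  # ~10 seconds without it
```

The point is not the exact numbers but the ratio: the enforced delay, not the passcode length, is what makes a four-digit passcode survivable at all.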

But that iPhone has a security flaw. While the data is encrypted, the software controlling the phone is not. This means that someone can create a hacked version of the software and install it on the phone without the consent of the phone’s owner and without knowing the encryption key. This is what the FBI, and now the court, is demanding Apple do: It wants Apple to rewrite the phone’s software to make it possible to guess passwords quickly and automatically.

The FBI’s demands are specific to one phone, which might make its request seem reasonable if you don’t consider the technological implications: Authorities have the phone in their lawful possession, and they only need help seeing what’s on it, in case it can tell them something about how the San Bernardino shooters operated. But the hacked software the court and the FBI want Apple to provide would be general. It would work on any phone of the same model. It has to.

Make no mistake; this is what a backdoor looks like. This is an existing vulnerability in iPhone security that could be exploited by anyone.

There’s nothing preventing the FBI from writing that hacked software itself, aside from budget and manpower issues. There’s every reason to believe, in fact, that such hacked software has been written by intelligence organizations around the world. Have the Chinese, for instance, written a hacked Apple operating system that records conversations and automatically forwards them to police? They would need to have stolen Apple’s code-signing key so that the phone would recognize the hacked software as valid, but governments have done that in the past with other keys and other companies. We simply have no idea who already has this capability.

And while this sort of attack might be limited to state actors today, remember that attacks always get easier. Technology broadly spreads capabilities, and what was hard yesterday becomes easy tomorrow. Today’s top-secret NSA programs become tomorrow’s PhD theses and the next day’s hacker tools. Soon this flaw will be exploitable by cybercriminals to steal your financial data. Everyone with an iPhone is at risk, regardless of what the FBI demands Apple do.

What the FBI wants to do would make us less secure, even though it’s in the name of keeping us safe from harm. Powerful governments, democratic and totalitarian alike, want access to user data for both law enforcement and social control. We cannot build a backdoor that only works for a particular type of government, or only in the presence of a particular court order.

Either everyone gets security or no one does. Either everyone gets access or no one does. The current case is about a single iPhone 5c, but the precedent it sets will apply to all smartphones, computers, cars and everything the Internet of Things promises. The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smart phones and computers, as well as the security of cars, medical devices, homes, and everything else that will soon be computerized. The FBI may be targeting the iPhone of the San Bernardino shooter, but its actions imperil us all.

This essay previously appeared in the Washington Post.

The original essay contained a major error.

I wrote: “This is why Apple fixed this security flaw in 2014. Apple’s iOS 8.0 and its phones with an A7 or later processor protect the phone’s software as well as the data. If you have a newer iPhone, you are not vulnerable to this attack. You are more secure – from the government of whatever country you’re living in, from cybercriminals and from hackers.” Also: “We are all more secure now that Apple has closed that vulnerability.”

That was based on a misunderstanding of the security changes Apple made in what is known as the “Secure Enclave.” It turns out that all iPhones have this security vulnerability: all can have their software updated without knowing the password. The updated code has to be signed with Apple’s key, of course, which adds a major difficulty to the attack.

Dan Guido writes:

If the device lacks a Secure Enclave, then a single firmware update to iOS will be sufficient to disable passcode delays and auto erase. If the device does contain a Secure Enclave, then two firmware updates, one to iOS and one to the Secure Enclave, are required to disable these security features. The end result in either case is the same. After modification, the device is able to guess passcodes at the fastest speed the hardware supports.

The recovered iPhone is a model 5C. The iPhone 5C lacks TouchID and, therefore, lacks a Secure Enclave. The Secure Enclave is not a concern. Nearly all of the passcode protections are implemented in software by the iOS operating system and are replaceable by a single firmware update.
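A back-of-the-envelope sketch of what “the fastest speed the hardware supports” means in practice. Apple’s iOS security documentation has put the hardware-entangled key-derivation cost at roughly 80 ms per passcode attempt; treat that figure, and the results below, as approximations:

```python
# Rough arithmetic for brute-forcing passcodes once the software delays
# and auto-erase are removed. The ~80 ms per attempt figure comes from
# Apple's published iOS security documentation; these are estimates,
# not measurements.

MS_PER_GUESS = 80  # approximate cost of one passcode try in hardware

def exhaust_seconds(space: int) -> float:
    """Worst-case seconds to try every passcode at 80 ms per attempt."""
    return space * MS_PER_GUESS / 1000

print(exhaust_seconds(10**4) / 60)    # 4-digit passcode: ~13 minutes
print(exhaust_seconds(10**6) / 3600)  # 6-digit passcode: ~22 hours
```

This is why the firmware-enforced delays and the erase-after-ten-failures setting matter so much: without them, the only remaining cost per guess is the key derivation itself.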

EDITED TO ADD (2/22): Lots more on my previous blog post on the topic.

How to set a longer iPhone password and thwart this kind of attack. Comey on the issue. And a secret memo describes the FBI’s broader strategy to weaken security.

Orin Kerr’s thoughts: Part 1, Part 2, and Part 3.

EDITED TO ADD (2/22): Tim Cook’s letter to his employees, and an FAQ. How CALEA relates to all this. Here’s what’s not available in the iCloud backup. The FBI told the county to change the password on the phone—that’s why they can’t get in. What the FBI needs is technical expertise, not back doors. And it’s not just this iPhone; the FBI wants Apple to break into lots of them. What China asks of tech companies—not that this is a country we should particularly want to model. Former NSA Director Michael Hayden on the case. There is quite a bit of detail about Apple’s efforts to assist the FBI in the legal motion the Department of Justice filed. Two good essays. Jennifer Granick’s comments.

In my essay, I talk about other countries developing this capability without Apple’s knowledge or consent. Making it work requires stealing a copy of Apple’s code-signing key, something that has been done by the authors of Stuxnet (probably the US) and Flame (probably Russia) in the past.

Posted on February 22, 2016 at 6:58 AM • 218 Comments

Comments

Alan February 22, 2016 7:21 AM

Is that the best argument you can make? News Flash: you lost. As long as manufacturers like Apple insist on controlling the keys to the phone, they will be required to open it when subject to a warrant. The “everyone is backdoored” argument won’t fly, because the law and courts believe everything can be controlled with more laws.

Vesselin Bontchev February 22, 2016 7:27 AM

It doesn’t matter that Apple’s position is “right”, “correct” and “moral”. They will still lose.

tz February 22, 2016 7:34 AM

Here is where Apple erred. Even back with PGP, the passcode was hashed thousands of times (now much faster on new hardware), but the Secure Enclave could use a hardware hasher/scrambler that takes a while (100 ms should be enough) to create the key from the passcode and other key material, so it couldn’t be accelerated, and maybe use other hardware (charge a capacitor for each attempt; can’t try again until it discharges) to slow things down. An attacker would have to remove the casing from the chip and edit the silicon to speed things up.

Nick Johnson February 22, 2016 7:43 AM

tz, Apple did not err in that regard; the passphrase is properly key-stretched before being applied, and your figure of 100 ms is more or less correct. That’s still much weaker protection than the seconds, minutes, or hours of delay that the Secure Enclave can normally enforce on each wrong password guess.
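The key stretching being discussed can be sketched with PBKDF2 from Python’s standard library. Note that Apple’s actual derivation is entangled with a per-device hardware key and is not plain PBKDF2; this only illustrates the general technique of tuning iterations so that every guess costs real time:

```python
# A minimal sketch of key stretching using PBKDF2 from the standard
# library. Apple's real derivation is bound to a per-device hardware
# key; this only demonstrates the idea of making each passcode guess
# computationally expensive.
import hashlib
import os
import time

def derive_key(passcode: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Stretch a passcode into a 32-byte key via PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

salt = os.urandom(16)
start = time.perf_counter()
key = derive_key("1234", salt)
elapsed = time.perf_counter() - start

# Each brute-force guess now pays this cost; raise `iterations` until
# a single derivation takes on the order of 100 ms on target hardware.
print(len(key), f"{elapsed * 1000:.0f} ms per guess")
```

The 200,000-iteration default here is an arbitrary illustrative choice; the right number is whatever makes one derivation take your chosen delay on the slowest hardware you care about.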

Chris February 22, 2016 7:46 AM

Please correct me if I am wrong, but isn’t encryption regulated by US weapons-export law? If Apple does not comply, could the authorities simply prohibit iPhone sales overseas? This might not affect encryption within the US, but it may put enough pressure on Apple, or any other company, to cave in and remove secure encryption altogether.

Steve Holmes February 22, 2016 7:52 AM

I think where Apple erred is in allowing a signed software update of a locked phone without the firmware forcing the data on that phone to be wiped. This is essentially the back door, and it exists today in all of Apple’s phones.

I can see that Apple doesn’t want to permanently brick the phones of customers who have forgotten their passwords, but if they want to have this functionality and secure user data at the same time, any update to a locked phone needs to ensure that all user data is wiped before any code update takes place.

The question now is: can Apple close this back door? Assuming the court forces Apple to develop the hack and unlock this phone, can they, in the next software update or in future phones, lock down the “keyless upgrade” back door?

John S February 22, 2016 7:52 AM

I do wonder when we will reach a point where the tech industry and governments sit together and work out a way to strike a balance between conflicting objectives. We all know that extremes of anything invariably do not work and alienate another extreme. Most of us do not operate our daily lives at extremes yet these conversations keep pushing out to the edge, rather than coming into the middle.

Governments around the world have a plethora of laws that enable them to seize goods and search people, properties, and other assets, and it is reasonable for them to wish to do the same with digital assets. Governments have the right to change the laws; companies and individuals have a right to challenge those laws if the law is untested or unclear.

Citizens have a right to expect privacy, but I know of no country where a citizen’s right to privacy in all scenarios is sacrosanct in law. Citizens know they can be stopped and searched on the street. They know their houses and other assets, such as cars, can be searched; they know any paperwork can be taken away and examined, as can telephone calls, bank records, et al. We accept that in many countries data-privacy laws expect paper to be protected as much as digital data; it seems reasonable that, at the other end, citizens must accept that law enforcement has access to digital as well as paper records.

I am not advocating or supporting any side, I am advocating that we stop behaving like gangs in the kindergarten and collectively work out how to strike a balance – and I accept there probably isn’t a perfect balance – but life is not perfect.

However, it cannot be beyond the intelligence, creativity, and will of mankind to work out how we can live much of our lives with a balance between privacy and security, and between security and surveillance.

A good starting point would be for us to be crisper on our language – no one is talking about banning all encryption; no one is talking about unlimited, unfettered access to all data. OK occasionally you get a crazy voice, but let the voice of the reasonable outweigh the voice of the crazy.

What is being talked about is how law enforcement legitimately gets access to data that may be encrypted. If we do not want governments to do stupid things and change the laws in ways that will really harm the tech industry, then now is the time to work together.

If I was organised crime, I’d be laughing all the way to the bank at our collective stupidity.

I know of no better forum than this to solve this problem!

Lisa February 22, 2016 7:53 AM

Law enforcement is always among the biggest proponents of a police state. It makes their job that much easier.

Basically we are in a situation where we need to decide whether it is worthwhile to kill the tech industry or be willing to accept that a few bad people might get away.

Make no mistake: once it becomes common knowledge that all of our tech devices can be turned against us, effectively making them mass spy equipment aimed at the public, it will kill the tech industry. With TVs that record all living-room conversations, laptop and tablet webcams recording video, watches recording all physical activity, and phones recording communications, who would want to buy these devices and trust them? Better to live as a Luddite, stuck with 1980s-era technology.

Computer technologies are tools for the mind, and like the mind they need to be afforded the same privacy and protection against self-incrimination if they are to be trusted. Otherwise we end up with a dystopian police-state future in which everything thought and done by the public is accessible to the state, far worse than what was depicted in 1984.

Yes, terrorists and child pornographers are horrible and need to be wiped off the planet. But there are limits to the amount of harm we should do to get rid of them. Just because these horrible people breathe oxygen, are we willing to detonate all of the world’s nuclear weapons to burn it up? Of course not.

But I would also state that we should also not be willing to give up our fundamental rights and liberties as well.

If law enforcement in the USA really cared about stopping horrible people, it would focus on the low-hanging fruit and advocate for sensible gun control, which would be far more effective at stopping future San Bernardino-style mass shootings than any information on any iPhone, with less harm to the public’s fundamental rights and liberties.

cervin February 22, 2016 7:55 AM

What strikes me is the fact that no one until now has suggested a big-picture summary of this case.

In clear words: Apple (like the other GAFAs, Facebook leading), a huge corporation that has infiltrated itself so deeply into the privacy of many users of the Internet, could potentially exert more surveillance and social control over its customer base than government agencies, and begins to feel it has enough influence and public backing to literally refuse legal orders from the country it originated from and allegedly “loves” (sic).

This, not surprisingly, while repeating the well-rounded and oh-so-arrogant refrain that “we protect your privacy and the freedom of the Internet; what you do with your phone is none of our business,” etc.

The problem is that Apple doesn’t clearly take a position. On the one hand, they want to shape the digital world and replace the old pre-digital order; on the other hand, they will not take responsibility if their policies prevent mass killings from being investigated or even prevented, because allegedly humankind is a big boy.

So this is nothing more than yet another fight for power, the main stake being who owns and controls the information. But please, Apple, stop telling us you do this for our own good. And please, government agencies, stop pretending this would be just a one-off jailbreak.

65535 February 22, 2016 8:00 AM

“The FBI wants to set a precedent that tech companies will assist law enforcement in breaking their users’ security, and the technology community is afraid that the precedent will limit what sorts of security features it can offer customers… the hacked software the court and the FBI wants Apple to provide would be general. It would work on any phone of the same model. It has to…this is what a backdoor looks like. This is an existing vulnerability in iPhone security that could be exploited by anyone. ” –Bruce S.

The “precedent” is the key word. Once the camel gets its nose in the tent, the rest of it is soon to follow. This is exactly what the FBI/DEA/local police want: unlimited, low-cost surveillance of everyone. And so will every other country, no matter how horrid its human-rights record. This case should be fought tooth and nail.

Maria February 22, 2016 8:19 AM

This debate is as much about economic security as it is about privacy or surveillance. Mandating weaker security is like creating a regulation that increases the chance that airplanes could crash or bridges could fall down.

A phone that can be compromised to reveal criminal activity can also be compromised to reveal trade secrets, information about critical industrial infrastructure, or sensitive business financial information. This threat to our economic security comes from both criminal enterprises and from outside governments like China.

Attempts to weaken encryption compromise the integrity of our nation’s industrial infrastructure and are a serious threat to our economic security.

Cherimoya February 22, 2016 8:20 AM

The end result in either case is the same … the device is able to guess passcodes at the fastest speed the hardware supports.

Two conclusions so far:

  1. Weak passcodes must be phased out on all devices, regardless of software security features.

  2. Given that the FBI botched the data recovery by hastily resetting the iCloud password, and failed to mention this crucial detail to the judge in the All Writs motion, a compromise might have been possible earlier had the Feds not bet the bank on All Writs. This strongly suggests that the motion is all about getting the precedent, rather than about this specific case.

Richard Karash February 22, 2016 8:37 AM

Michael Hayden’s take on #AppleVsFBI in 7-min video. Very interesting. I think Hayden is a pretty insightful voice in all this.

http://theweek.com/speedreads/606641/exnsa-cia-chief-michael-hayden-sides-apple-fbi-iphone-encryption-fight

Top lines: FBI is wrong; looking too narrowly. On security alone, America is more secure with unbreakable end-to-end encryption. “Last similar battle was over backdoor in Clipper Chip. NSA lost that battle. That marked the beginning of the greatest 15 years of electronic surveillance; we figured out ways to get around lack of back door with bulk collection and metadata.”

More: End-to-end encryption will make our industry stronger, and that makes America’s security even stronger. Jim Comey thinks he’s the “main body” and we all should move/adjust based on his position; he’s wrong, American industry is the “main body.”

And, finally, “If I were in Comey’s position, I’d be saying what he’s saying.”

(Approx quotes)

Charlie Gordon February 22, 2016 8:58 AM

@John S keptical is here to parrot the statist party line at an intellectual level appropriate for people who passed the Wonderlic dumb-cop test with flying colors

  1. Stuff a strawman full of absolute privacy

  2. Whip out a non sequitur proving to morons that citizens must accept law enforcement access

  3. say the magic word balance one mo’ time

  4. evoke the intelligence, creativity, and will of all mankind [no bitchez needed] to do what the UNHCHR already did

  5. call the language crispness police on yourself and flush your strawman down the toilet

  6. try lamely to flatter people who already know they are way smarter than you

  7. Cue-wheedee!

Mike Amling February 22, 2016 9:02 AM

I wonder how difficult it is to change the public key that the phone uses to verify the digital signature of an update. Is there only one such public key or is there also an NSAKEY?

Is the Unique ID subject to a side-channel attack? Is it ever used for anything other than entering a passcode? IIRC, side channel attacks typically need a lot more than 10 passes.

When I connect my locked iPhone to my Macintosh’s USB port, iTunes starts up and does a backup of the phone. One wonders what data on the phone would have to be altered for the phone to trust the FBI’s Macintosh.

OldFish February 22, 2016 9:04 AM

@JohnS

Balance between conflicting objectives?

OK, I’ll bite.

Where do you think the proper balance lies between an individual’s right to as high a degree of privacy as can be purchased or created, and the government’s desire, unsupported by the Constitution, for everything an individual views, writes, or speaks, everywhere they travel, everyone they know, and everything they buy to be recorded and held in escrow?

Jeroen-bart Engelen February 22, 2016 9:15 AM

I really don’t get all the commotion. The government isn’t asking for a backdoor, at all. In the motion they filed on the 19th they even say so explicitly (See: http://documents.latimes.com/doj-motion-apple-comply-fbi/).
They want Apple’s help to use a security flaw in one of their devices to unlock it. The software that Apple needs to create for that is owned completely by Apple, does not have to leave the Apple office and can be destroyed after the phone has been unlocked. The FBI does not request a copy of that software.

Now you could argue that they are setting a precedent. But is it really a precedent? Didn’t companies already cooperate to provide user data, and if that data was encrypted with a key held by the company, also provide that key? Isn’t that why companies are trying to make this encryption technically impossible for them to decrypt, rendering the requested (or court-ordered) aid useless?

Brad Koehn February 22, 2016 9:27 AM

I don’t think allowing a firmware update without the user password makes it any easier to access data on the phone, provided the phone has a secure enclave. Can the enclave firmware be upgraded without the user password?

PGPer February 22, 2016 9:28 AM

John O. Brennan is correct: “people freely give up their privacy for technology.” I don’t believe it will make a difference to the average “neck-bent, lost in iPhone” user. They will continue to use weak passwords, refuse to manage encryption keys, and whine to the authorities when their bank accounts have been drained. Privacy and encryption will always be available to those who have a need for them. Apple will lose this case, and other tech companies will follow.

Steve Holmes February 22, 2016 9:35 AM

Brad, “Can the enclave firmware be upgraded without the user password?”

Apparently it can (see Bruce’s addendum):

“If the device does contain a Secure Enclave, then two firmware updates, one to iOS and one to the Secure Enclave, are required to disable these security features”

de La Boetie February 22, 2016 9:40 AM

Sometimes it’s worth following the money. Of course, this requires us to put our highfalutin principles aside and “haggle about the price.”

Societies tend to be very hypocritical about how much a human life is worth. A life cut off by an act of terrorism? The cost of preventing a death by terrorism? A life in another country? Effectively, societies make these calculations all the time, e.g. speed limits, road-safety expenditure, foreign aid, etc. But there’s a strange presumption that terrorism and some other crimes somehow out-trump everything.

Apple’s current market cap is $534bn, and much of that is tied to the brand. So, fairly simply, anything that could harm Apple’s brand costs a fortune. If they were compelled to do as the FBI requested and that damaged their brand (it would), say to the measly tune of 0.1%, that would be around $500m lost to Apple’s shareholders. Burdensome.

I don’t suppose the FBI is going to cover this amount, but who would decide how much money it’s worth to unlock this phone? How many future lives would that save, and is it worth it on that basis (even if you asked that question within another budget)? Am I asking US society, whatever that is, or Apple, or the FBI – or indeed, other countries and their citizens who are affected?

Right now, the protagonists’ positions are clearly aligned with the money (Apple) or with avoiding the costs (the FBI, who get some CYA), whereas those bearing the consequential losses (or possible benefits) have no say at all, partly because they are unrepresented by a supine executive arm, and partly because the decision is being made by a court of a domineering superpower that appears prepared to ignore the rule of law when it comes to the marvellously vague national-security bit.

Of course, that superpower may find its making of reality leads to more iatrogenic consequences.

flowbee February 22, 2016 9:42 AM

Jeroen-bart Engelen,

The government will then be permitted to ask for all kinds of software under the guise of law enforcement. This is a valid point. Maybe they are asking for 10 programmer-hours of work this time. Next time, 20 hours. Next time… What happens when one of these custom packages gets into the wild?

Way up in the comments, someone asked whether encryption is regulated for export. American export officials basically want to know what encryption methods are being exported. If one uses known protocols (RSA, PGP, etc.), there’s no problem: they return a rubber-stamped document in a couple of days. If you are doing something truly unique, that’s when things bog down.

Lots of talk about the OS in question, but it’s cheap talk. It’s a proprietary OS. Chances are excellent it’s riddled with exploits.

Jordan February 22, 2016 9:49 AM

Very interesting development. I think more people need to be paying attention to this case, as how it goes could determine a lot about what is legal to seize from someone’s data. Thanks for sharing this.

Thomas February 22, 2016 9:49 AM

Can’t the FBI just extract the data from the internal flash storage and run the brute-force decryption on a general-purpose computer (or a cluster of them)?

Jeroen-bart Engelen February 22, 2016 9:51 AM

flowbee,

According to the motion they filed, they can only ask for things that are not an “undue burden.” So apparently they can already ask for help, but there is a limit to what they can ask, although this seems like a really vague specification to me. And how can these custom packages get into the wild? They can be fully controlled by Apple, and since no iOS source has ever leaked, I see no reason to think such a package would leak.

This discussion of encryption export controls is beside the point. It’s an American phone, used by an American citizen, within American borders. So even if Base64 fell under restricted ‘cryptography’ for export, it would not matter for this case.

CallMeLateForSupper February 22, 2016 9:58 AM

@Jeroen-bart Engelen

Yes, the filing speaks at some length to the claim that a backdoor is not being requested. It does this by framing the issue in terms of backdooring encryption (which the FBI lobbied for last fall and got roundly scolded for), but then immediately explaining that what it wants in this case is not a backdoor because it does not break encryption.

OK, FBI, you’re not seeking a backdoor that breaks encryption. It is a fact, however, that you are asking for a backdoor: one that breaks into a device. You do, therefore, seek a backdoor.

b February 22, 2016 10:11 AM

I really don’t like everyone calling this “a backdoor.” To me, a backdoor has several properties missing here:
1) Secret
2) Intentionally weakening an otherwise secure product
In this case NEITHER of those is true: its existence is NOT secret, and the product was NOT secure. There is absolutely NOTHING preventing Apple from implementing a secure product. Do you really think Apple was just too stupid to do this? This isn’t the NSA suggesting a secret routine that “looks” correct but in fact dramatically weakens what should be secure. This is a product that a whole bunch of nation states have presumably already hacked, so this is just making official what we are almost certain already exists! So “OMFG China” is really “well, the Chinese already have Apple’s private keys, or they wouldn’t allow iPhone sales in the first place.”
I think the solution is two-fold: Apple does this, and Apple makes the iPhone secure against nation states.

Mark February 22, 2016 10:26 AM

It is definitely a backdoor.

Governments/intelligence agencies can dress it up all they like with their doublespeak and PR rubbish. They are asking for a change that will render the encryption more or less worthless.

What are the chances that a decent passphrase was chosen?

Alphaman February 22, 2016 10:32 AM

@b
I think you missed a couple points in your assertions:

1) Secret
The FBI needs Apple to supply a code signing cert in order to insert this new code in the OS. Only Apple holds that cert, in secret.

2) Intentionally weakening an otherwise secure product
Removing the ability to wipe the phone after 10 bad password attempts, PLUS adding the ability to guess the PIN electronically at connection speeds, is intentionally weakening an otherwise secure product.

flowbee February 22, 2016 10:50 AM

@Jeroen-bart Engelen,

Today’s “undue burden” is 10 programmer hours. The next “undue burden” is 40 hours. The next 100 hours. Each is an undue burden in a billion dollar corp.

Separately,
Nobody outside a few in their coding farm knows how secure Apple’s system is. Nobody. Claims about the security of Apple’s product need to be treated with a great deal of skepticism.

End of innocence February 22, 2016 10:50 AM

Just from the legal perspective…

FBI request states, quote:

The government acknowledges that CALEA exists, but it says: “Put simply, CALEA is entirely inapplicable to the present dispute [because] Apple is not acting as a telecommunications carrier, and the Order concerns access to stored data rather than real time interceptions and call-identifying information.”

The referenced CALEA also states, quote:

(1) Design of features and systems configurations. This subchapter does not authorize any law enforcement agency or office

(a) to require any specific design of equipment, facilities, services, features, or system configurations to be adopted by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services;

(b) to prohibit the adoption of any equipment, facility, service, or feature by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services.

Referenced article:

https://cyberlaw.stanford.edu/blog/2016/02/calea-limits-all-writs-act-and-protects-security-apples-phones

There you have it… It’s an attempt by the government to get Apple in line with the rest of the Silicon Valley companies, such as Google (Alphabet if you prefer), Facebook, Microsoft, et al.

Patrick Holmes February 22, 2016 10:57 AM

It is a mistake to say that quantum computers do not weaken symmetric key encryption, except by one bit.

This seems true when viewed only as a brute force problem (attacks against the key), since you need the output from step n as the input (feedback) to step n+1. In other words, it can’t be solved as a singular problem like factoring. But watch out.

Certain block ciphers have long since been reduced to sets of algebraic statements. Now, Chinese researchers are using groups of entangled particles to weaken this exact kind of problem. They’re not the only ones. This is going very fast. Follow the money.

Nick Johnson February 22, 2016 11:09 AM

The problem I have with everyone calling it a backdoor is that it simply doesn’t meet the definition. A backdoor is, by definition, a way of accessing a device or service that bypasses the usual authentication mechanism. What the FBI is asking for doesn’t bypass the authentication mechanism, it weakens it to make it easier to attack.

Call it what it is: a vulnerability. Stop trying to use the word “backdoor” where it doesn’t belong.

David Leppik February 22, 2016 11:16 AM

The FBI’s demands are specific to one phone, which might make its request seem reasonable if you don’t consider the technological implications.

Not just the technological, but the legal considerations. People see this as particular to this case and this phone. But the law doesn’t support “just this once” or “just in an emergency.”

The request is either legal or it isn’t. Every case is equally an emergency under the law. If it’s deemed legal, any police department can demand this treatment. If not, none can.

And that’s just US law. As others have mentioned, the only thing keeping oppressive governments from requesting the same access is the fact that no government has this level of access.

herman February 22, 2016 11:17 AM

So what we have learned so far is that the FBI doesn’t have the ability to subvert Apple’s phones, and that the NSA probably also cannot, or doesn’t want to admit that it can by sharing a tool with the FBI.

Clive Robinson February 22, 2016 11:19 AM

@ Bruce,

The original essay contained a major error.

Not really surprising, given the “vacuum of information” that arises from keeping the “Trade Secrets” Apple’s shareholders expect.

You however get significant “Brownie Points” for being honest about it, which few others appear to be doing currently.

@ ALL,

Lets be clear on one thing, the supposed “security flaw” in Apple’s phone is not a “technical flaw” but a “sociological flaw”.

If we don’t get to understand that then we are abdicating our responsibilities as humans.

The big issue is “human failings/limitations” and it is this that is responsible for the supposed flaw in the design of the phone.

If humans could,

1, Remember random long strings,
2, Enter them flawlessly and quickly,
3, Not blame others for their failings,

Then Apple would not have had to engineer a way to conceal a 256-bit AES key behind a very weak passphrase, and somehow stop the passphrase weakness from being an issue.
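The standard engineering answer to this is key stretching: derive the strong key from the weak passphrase through a deliberately slow function. A minimal sketch in Python, where the salt and iteration count are illustrative assumptions, not Apple’s actual parameters:

```python
import hashlib

# Sketch: stretching a weak passcode into a 256-bit AES key with PBKDF2.
# Salt and iteration count are illustrative, not Apple's real parameters.
passcode = b"1234"                     # weak user secret
salt = b"per-device-random-salt"       # stored on the device
iterations = 100_000                   # makes each guess deliberately slow

key = hashlib.pbkdf2_hmac("sha256", passcode, salt, iterations, dklen=32)
assert len(key) == 32                  # a full 256-bit key either way
```

The derived key is full length, but its real strength is still only that of the passcode, which is exactly why the guess-rate limits matter so much.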

With hindsight it is easy to say “Apple should have done…”, but all such protections have costs; many would make the average lifetime cost of each phone too high for the phones to be profitably marketed.

As I’ve repeatedly mentioned on the blog “Return Costs” are very significant and can kill any profit in the blink of an eye. Thus Apple needed to design the phone to minimise return costs as much as possible.

For quite some time now the use of Flash ROM has been the leading way of reducing return costs across the industry. This is because software is the least reliable part of products, and products are usually put on the market long before they have been sufficiently tested. We as consumers have come to accept “Patch Tuesday” as “normal”, and “upgrading an OS” and “reflashing a BIOS” etc as grudgingly acceptable.

We also want “no cost no hassle” security, if we trip a security mechanism and “brick the device” we want to get it back in full working order as quickly as possible, and still have all those cute “cat pictures” / selfies / etc / etc be there. We don’t want to accept that our stupidity has cost implications, if we want security.

Trying to walk down this line has been near impossible for designers due to the business drivers, so instead we have “mitigation built in”…

However, even though the mitigation could be made more secure, most users don’t want it to be. It’s not just the cost or the recovery convenience; most importantly it’s the “pretty whistles, baubles and bells” of having the latest “upgrade” that users don’t want to lose.

The one good thing about the visibility of the whole FBI-v-Apple issue is people are waking up. And with it some are beginning to think that maybe a little extra cost and the small loss of upgradability in some security features of the OS might now be worth it.

If that view gains traction then the FBI / DOJ / Executive may well come to regret this little spat with Apple.

Because phone manufacturers will put in more secure hardware enclaves with sufficient mechanisms in place to stop the FBI et al from trying this again. And in the process the FBI will find that other areas where the manufacturers have previously been able to help the FBI will also close down. Yes, some of the more idiotic users will lose some recovery convenience, but they will get much stronger security protecting their privacy.

But one thing phone manufacturers could do right now that would render the whole argument moot is to make a standard, simple interface which would allow the phone to be used not as the data-terminating device but as one further up the communication chain.

Thus encryption of communications and other security aspects could be moved off of the phone into another device of the users choice… One that is simpler and a whole lot more secure in design. Such devices have been looked at as part of making eBanking more secure so it’s not exactly a new idea (in fact the design of such devices has been discussed on this blog in the past).

In the meantime, those that need the best secrecy they can get, and are prepared to accept a whole load of usage limitations, could use pencil-and-paper One Time Pads.

Even though the NSA can see the ciphertext in transit, and the FBI could get at the ciphertext on the phone and in its backup etc, it would be of no use to them without the OTP, or some side channel path to the plaintext.

However, although this moves the problem off of the phone, it has a host of other KeyMat and side channel issues which would need to be solved securely. Thus outside of certain limited scenarios OTPs are not really practical for most users.
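For the curious, the pencil-and-paper scheme reduces to XOR (or modular addition) of message against pad. A toy sketch in Python, purely to illustrate why the ciphertext alone is useless without the pad:

```python
import secrets

def otp(message: bytes, pad: bytes) -> bytes:
    # XOR each message byte with the corresponding pad byte.
    # The pad must be truly random, at least as long as the
    # message, and used exactly once.
    assert len(pad) >= len(message)
    return bytes(m ^ p for m, p in zip(message, pad))

msg = b"meet at noon"
pad = secrets.token_bytes(len(msg))
ct = otp(msg, pad)
assert otp(ct, pad) == msg   # XOR with the same pad inverts itself
```

Without the pad, every plaintext of the same length is equally consistent with the ciphertext. The KeyMat problem is also visible here: the pad is as large as all the traffic it will ever protect.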

ianf February 22, 2016 11:24 AM

@ Steve Holmes thinks “where Apple err’ed is allowing a signed software update of a locked phone without firmware forcing the data on that phone to be wiped out. This is essentially the back door in all Apple’s phones.”

We thank you for your thinking, and invite you to do some rethinking preferably in some other fora. Because IF a software update wiped out user data, THEN no phones would ever be updated (auto iOS updates are OPT-IN and, besides a locked phone, require that it be connected to mains. For added security Apple could have created another threshold: update only while in the presence of the “designated home” geo-location and/or IP#/ router/ WiFi—but elected not to). This is not THE backdoor that you envision, but a well-documented, user-friendly update mechanism. That the FBI wants to abuse it does not a backdoor make.

[…] update to a locked phone needs to assure that all user data is wiped before any code update takes place.

Because of WHAT HEAVENLY PRISTINE PHONE PRINCIPLE? (supply the numbered quote from which gospel).

@ John S wonders “when we will reach a point where the tech industry and governments sit together and work out a way to strike a balance between conflicting objectives.”

Wonder no more: that’s the essence of what is called corporativism, a state of affairs where business interests are of greater importance to the government, than the wellbeing of its people, and where any internal opposition is suppressed with brute force. Where the electorate is viewed as a necessary evil to provide a semblance of a mandate every X years, rather than as the principal. A policy mantle most often associated with fascism.

Besides, the “tech industry” is neither homogeneous, unified, nor unifiable, but composed of conflicting spheres of patronage and interest. Only in times of great crises (like WWII) is it able to cooperate at large, and then only for a short while.

[…] I am advocating that we stop behaving like gangs in the kindergarten and collectively work out how to strike a balance

You should join Jason Richardson-White and Lucifer’s Lubricant in their 100-year Kampf to create a better world, if not The Thousand Year Brave New World. Honest. If short on ideas how-to, use “The Fountainhead” and “Atlas Shrugged” as blueprints (first try these shortcuts).

@ Lisa: “Law enforcement is always the biggest proponents for having police states. It makes their job that much easier.”

Not always, though I presume it fairly common in some countries, no less due to the penalizing mentality of the USA, the world leader in incarceration per capita. Apparently, the U.S. taxpayers who foot the bill are A-OK with that.

we need to decide if it is worthwhile to kill the tech industry or be willing to accept that a few bad people might get away.

That’s a very simplified, indeed simplistic juxtaposition. Killing the tech industry (besides: how?) has never been on the agendum; please redefine the dichotomy.

@ cervin, who complains that “no-one has suggested a “big-picture” summary of this case.”

Which ALLEGEDLY is, that [confused line of thought deleted for brevity], but concluding with this allegedly deep insight “so this is nothing more than yet another fight for power [by both Apple and the FBI], the main stake being who owns and controls the information;” and followed by pleading and wishful thoughts of “please, Apple, stop telling us you do this for our own good. And please, government agencies, stop pretending this would just be a one-off jailbreak.

    Fingers crossed and hope to die, pray they’ll listen.

@ Maria states that this debate is as much about safeguarding the economy [rather than your “economic security“] as it is about privacy or surveillance. […] Attempts to weaken encryption compromise the integrity of our nation’s industrial infrastructure and would be a serious threat to our economy.

By and large correct, except that the FBI, with the White House etc behind it, think they can have this “NOBUS cake,” eat it, and safeguard it—at least as well as they did with the OPM data, if not better. All the while the enemies to our FREEDOMS alternatively suck their thumbs, and sit on them.

medgeek February 22, 2016 11:37 AM

Lisa wrote: “If law enforcement in the USA really cared about stopping horrible people, then they should be focused on the low hanging fruit, and avocate for sensible gun control, which would be far more effective in stopping future San Bernardino mass shootings than any information on any iPhone, with less harm to the public’s fundamental rights and liberties.”

A hearty amen to that. Day in and day out, 90 people are killed by guns in the US. This swamps any danger from terrorists as Bruce has pointed out in the past. Rubio says he needs a gun to protect his family from ISIS. Is that so, Marco? ISIS operatives running around the streets of Miami? Really?

parabarbarian February 22, 2016 12:09 PM

I was talking with some friends of mine about this. One pointed out that the iPhone in question does not use a separate hardware keystore, so any dump of the firmware plus storage will contain a hash of the passcode. The FBI can dump the image, locate that hash and attack it. If it is a six-digit number it might take a whole second or two to crack on a desktop; if it is ten digits then maybe one or two minutes. Alternately, they can load the image onto an emulator and hack away. If a hacker can figure out the changes required, he can patch the OS in memory on the emulator to add the no-erase “feature” the FBI is demanding.

Giuseppe1956 February 22, 2016 12:30 PM

IMO this is not a backdoor; backdoors are hidden code that leave a system fully functional. What the judge is asking is for Apple to design a tool to overwrite, not bypass, the security subsystem, making it useless. More a sledgehammer than a lockpick.

That piece of software would fit the definition of malware that exists in a number of legislations, so creating and digitally signing it would be a crime and expose Apple to endless litigation. As Apple is an international company, that could happen in jurisdictions that are not likely to care much about the wishes of US law enforcement or to consider it “above the law”.

Technically, the key element here is the digital signature, where the private keys are under Apple’s control. While many could probably reverse engineer and create the attack code, signing it would either be impossible for anyone but Apple, or would reveal capabilities that are “just rumors” today.

I would actually like to see Apple go to court and lose; the precedent would probably cover all sorts of malware, including DRM-breaking software, unless a judge is willing to affirm that people’s personal lives are less worthy of protection than copyrighted works.

Dirk Praet February 22, 2016 12:45 PM

@ cervin

… and begins to feel it has enough influence and opinion back-up to literally refuse legal orders from the country it originated from and allegedly “loves” (sic).

Unless you, like Donald Trump, believe the US can be made great again by turning it into some kind of police state or banana republic, it is reasonably common in democratic societies under the rule of law that private individuals and companies alike can legally appeal government warrants and subpoenas.

@ John S

I do wonder when we will reach a point where the tech industry and governments sit together and work out a way to strike a balance between conflicting objectives.

The simple answer to that is that the government in essence has never been interested in such a debate as long as legally and technically it had the upper hand and was able to (secretly) access pretty much anything it wanted. Snowden changed all of that, causing the tech sector, civil liberties organisations and privacy-conscious users alike to up the ante. Judging from the FBI’s recent machinations, the government is still not interested in a debate and instead continues to try to force the hand of anyone coming between it and its (assumed) divine right to have unfettered access to all details of both our physical and digital lives.

@ Jeroen-Bart Engelen

They want Apple’s help to use a security flaw in one of their devices to unlock it.

No, they don’t. Read page 13 of the motion: the government wants Apple to “provide or employ modified software, modifying an operating system – which is essentially writing software code in discrete and limited manner”.

This is probably the most important sentence in the entire document. In Bernstein v. United States, the Ninth Circuit Court of Appeals ruled that software source code was speech protected by the First Amendment to the Constitution of the US, and that the government’s regulations preventing its publication were unconstitutional. It can be argued that any order forcing Apple, against its will, to re-write code that in itself and under no other statute is illegal boils down to “compelled speech” and is thus unconstitutional.

@ Chris

Please correct me if I am wrong, but is encryption not regulated by US weapon export law? If Apple does not comply, could the authorities simply prohibit iPhone sales overseas?

I don’t think so. This is exactly what Bernstein v. US was all about.

@ Nick Johnson

Call it what it is: a vulnerability.

More like demanding the subversion of an existing security control.

@ Cherimoya, @ 65535

This strongly suggests that the motion is all about getting the precedent, rather than this specific case.

That’s exactly what it is. As pointed out in the @Grugq’s analysis that was previously referenced, it is highly unlikely that much relevant additional data can be retrieved from the phone.

@ Thoth

When the software issuer is coerced into signing a backdoor image, the image would be rejected without the device owner signing it as well.

From a legal vantage, I’m not sure this would offer much protection. In the US, a defendant in refusing to sign might invoke the 5th Amendment, but this is unlikely to hold up in court if he has just perpetrated an act of terrorism or the prosecution can make a hard case that they already have a pretty good idea what they will find on that phone. Same outcome in the UK under current RIPA legislation.

Jeff February 22, 2016 12:56 PM

@ianf You’re not understood @Steve Holmes correctly. He said “locked phone” — so he’s saying don’t allow an update to a locked phone. If the phone is unlocked, then the update is okay because the user has entered his security code.

Eric February 22, 2016 12:56 PM

The problem for Apple is not that they can keep things secure by refusing to comply. The problem for them is that it is now out there that, in principle, it is possible to bypass these controls with modified firmware. It merely becomes a question of how one gets the modified firmware. The FBI is merely asking Apple to provide it so they don’t have to do it themselves, which I presume would be harder.

This is going to be tricky going forward, in that Apple will now feel obligated to make it impossible to upgrade the software to bypass these controls, but it isn’t clear how they might do this without sacrificing usability.

And for that matter, Apple might choose to fix things so that only code signed by Apple can be installed on the phone (if that’s not the case already – I actually kind of assumed this to already be the case). But then whenever a case like this comes up, everyone will be knocking on Apple’s door with a court order. Unless of course nation-states have already stolen or cracked the signing certificate…

Fritz February 22, 2016 1:33 PM

Jonathan Zdziarski, a skilled iPhone forensics expert, has a blog entry on the ramifications of the court order; once you read it, you realize that whatever Apple writes will have to be made available to a multitude of people and third-party testers to validate. This is a guy who knows what he’s talking about, because this is his business. Definitely worth reading: http://www.zdziarski.com/blog/?p=5645

Lawrence February 22, 2016 1:36 PM

I heard that Apple is going to be releasing a new app to complement this security feature. It’s called “PedoFile”. It can keep wives, cops, the FBI, and judges from looking at all your pictures and files. 😀 …and they hired Jared Scott Fogle from Subway to be their spokesman too. Thanks, Tim Cook.

Seriously, it is no different than a court order forcing a bank to drill out the lock of a safe deposit box when there is sufficient evidence to think there is a murder weapon inside. As far as intimacy goes, I am sure some pictures are very intimate to some sickos somewhere. There isn’t a judge, a politician, or even a member of the public who would stand by and let a perv walk for the sake of privacy.

mishehu February 22, 2016 1:54 PM

@Lawrence :

I take issue with comparing this to the bank drilling out a lock under order of the FBI. There’s no murder weapon to be found on the iPhone, and the former possessor of the phone is dead. Any data that they collect off of it is of very limited use at this time. On top of that, if they [the gov’t] have been remotely competent about all that bulk metadata collection that they’ve been doing to all of us, they, in all likelihood, already can connect the dots to any other potential threats that the perps were in contact with.

No, this is just the equivalent of the Rosa Parks case: she wasn’t the first black lady to refuse to comply on the bus. She just happened to be cherry-picked because her reputation was untarnished, unlike the others before her. This too is a cherry-picked case, and the FBI has glommed onto it to further their own goals of weakening encryption for the rest of us. If you don’t let the FBI snoop around, then the terrorists have won, or something like that, right?

Alan February 22, 2016 2:00 PM

@Dirk Praet: Nice try, but the court is not going to force Apple to publish the lock bypass code, so no matter what you might consider it to be, the court is not going to consider it to be speech.

kronos February 22, 2016 2:14 PM

@ Lawrence: There isn’t a judge, a politician, or even the public would stand to let a perv walk for the sake of privacy.

Um, how do you know for sure they are a perv unless you decrypt or break into their personal electronic device?

Apple Fan February 22, 2016 2:17 PM

http://arstechnica.com/tech-policy/2016/02/apple-ceo-tim-cook-complying-with-court-order-is-too-dangerous-to-do/

https://assets.documentcloud.org/documents/2716997/Tim-Cook-Emails-Apple-Employees.pdf

“Some advocates of the government’s order want us to roll back data protections to iOS 7, which we released in September 2013. Starting with iOS 8, we began encrypting data in a way that not even the iPhone itself can read without the user’s passcode, so if it is lost or stolen, our personal data, conversations, financial and health information are far more secure. We all know that turning back the clock on that progress would be a terrible idea.”

“Our fellow citizens know it, too. Over the past week I’ve received messages from thousands of people in all 50 states, and the overwhelming majority are writing to voice their strong support. One email was from a 13-year-old app developer who thanked us for standing up for “all future generations.” And a 30-year Army veteran told me, “Like my freedom, I will always consider my privacy as a treasure.”

“Could Apple build this operating system just once, for this iPhone, and never use it again?”

“The digital world is very different from the physical world. In the physical world you can destroy something and it’s gone. But in the digital world, the technique, once created, could be used over and over again, on any number of devices.”

“Law enforcement agents around the country have already said they have hundreds of iPhones they want Apple to unlock if the FBI wins this case. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks. Of course, Apple would do our best to protect that key, but in a world where all of our data is under constant threat, it would be relentlessly attacked by hackers and cybercriminals. As recent attacks on the IRS systems and countless other data breaches have shown, no one is immune to cyberattacks.”

“Again, we strongly believe the only way to guarantee that such a powerful tool isn’t abused and doesn’t fall into the wrong hands is to never create it.”

Richard February 22, 2016 2:32 PM

Wonder if someone could answer my question. I don’t have an iPhone (still using a flip phone). But I read on the Apple support site that if an Apple iPhone user loses the password, Apple support, after identity verification, will reset the person’s password. I don’t know if that is the password for their iPhone or the password for their Apple Account ID. But even if the latter, I understand that from the Apple Account, a person can reset their phone password.

So isn’t this a “backdoor” of some sort? And did the FBI do something to mess this up anyway?

ianf February 22, 2016 2:38 PM

@ Curious […] “Apple basically already has a backdoor to the iPhone

Fortunately, we’re not yet compelled to rely on the Counterpunch magazine’s technical parroting expertise.

@ Peter: “Robert X. Cringely suggests that the Obama admin has orchestrated this episode with the deliberate intent to lose, in order to establish a precedent that the government cannot order what it is asking. I wish I could believe it….

That’s one of the scenarios that has to be considered, as I already did it here twice—if in passing.

@ herman […] NSA probably also doesn’t have the ability to subvert Apple’s phones…

UNCALLED-FOR JUMP TO CONCLUSIONS ALERT. All that can be opined with some degree of certainty is that whether the NSA has that capability or not, it would not share it with the FBI. Because if you acquired crown jewels in secret, you want to keep them secret.

@ medgeek […] “Day in and day out, 90 people are killed by guns in the US. This swamps any [body count] from terrorists […]”

Ah, but don’t you see, they died mainly because they weren’t adequately armed to defend themselves (if at all), were not a proactive deterrent to street crime. If the [13 Nov 2015 Paris Bataclan concert] “Eagles of Death Metal” frontman still thinks that every concertgoer there should have been armed, then WTF has he “learned” from that ISIS “near-death experience?”

@ parabarbarian […] The FBI can dump the image, locate that hash an attack it. If it is a six digit number it might take a whole second or two to crack on a desktop. If it is ten digits then maybe one or two minutes.

    Piece ‘a cake! So why didn’t they?

Alternately, they can load the image onto an emulator and hack away. If a hacker can figure out the changes required he can patch the OS in memory on the emulator to add the no-erase “feature” the FBI is demanding.

    IF. On an FDE image. Noted. Next Bright Idea please.

@ Jeff You’re not understood @Steve Holmes correctly.

What’s there to not understand (unless you meant misunderestimated in which case I rest me case).

He said “locked phone” — so he’s saying don’t allow an update to a locked phone.

That’s not what he wrote. He wrote “… update of a locked phone without firmware forcing the data on that phone to be wiped out” (i.e. after first erasing it). Perhaps you shouldn’t speak of things you know nothing about.

    On a recent wake-up, on still-locked screen, my iPad prompted me to update to iOS 9.2.1, offered an option right within the dialog to do it later, had me enter the PIN (“1234” ;-)), sign “Agree” to some TL;DR changes in EULA, then exited quietly to the Springboard (=iOS desktop). The update happened some time at night, I presume after the incremental iCloud backup, WHEN the iPad was locked, and connected to the mains. That’s how they do it (iPad is one generation older than iPhone 5c, May 2014).

Apple Fan February 22, 2016 2:43 PM

@Richard

Telephone resets via support are for your iCloud account. Ideally you should also use 2FA in conjunction with your iCloud password.

Apple openly admit that they will comply with law enforcement requests to provide iCloud data; in fact they did just that in this case. However, here the suspect had disabled backups approximately two months prior to the shooting.

Nevertheless, Apple do encrypt iCloud data; however, they maintain the master key. The best thing for anybody security-conscious to do is disable iCloud on their device.

Because the suspect’s phone was owned by San Bernardino County, they reset his iCloud password themselves, which caused the suspect’s recovered phone to stop syncing with iCloud. How this would have made a difference I’m not sure, because we are told in other news that he had disabled iCloud.

We wouldn’t even be having this discussion if he’d used a strong password, as brute-forcing it would have taken hundreds or even thousands of years.

Moral of the story: use a strong alphanumeric password.
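The arithmetic behind that moral is simple. Assuming roughly 80 ms per guess (the widely reported on-device key-derivation delay — an assumption here), worst-case search time scales as alphabet size to the power of length:

```python
GUESS_SECONDS = 0.08          # assumed per-guess key-derivation delay
SECONDS_PER_YEAR = 3600 * 24 * 365

def worst_case_years(alphabet: int, length: int) -> float:
    # Time to try the entire passcode space at one guess per 80 ms.
    return alphabet ** length * GUESS_SECONDS / SECONDS_PER_YEAR

print(worst_case_years(10, 4))    # 4-digit PIN: well under an hour
print(worst_case_years(62, 10))   # 10-char alphanumeric: billions of years
```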

Daniel February 22, 2016 2:50 PM

Bruce writes in the OP, “This is an existing vulnerability in iPhone security that could be exploited by anyone.”

That’s not true and you know it’s not true because you qualify that statement in your correction text with “The updated code has to be signed with Apple’s key, of course, which adds a major difficulty to the attack.”

So the correct statement is that “This is an existing vulnerability in iPhone security that can be exploited by anyone with Apple’s private key.”

The question then becomes just how secure is that key?

wumpus February 22, 2016 2:51 PM

@ianf “UNCALLED-FOR JUMP TO CONCLUSIONS ALERT. All that can be opined with some degree of certainty is that whether the NSA has that capability or not, it would not share it with the FBI. Because if you acquired crown jewels in secret, you want to keep them secret.”

Cracking a device that you have physical access to can hardly be considered a “crown jewel”. Best guess is that they want precedent.

ianf February 22, 2016 3:30 PM

@ wumpus

if [the NSA] acquired crown jewels in secret, they want to keep them secret.

    Cracking a device that you have physical access to can hardly be considered a “crown jewel”. Best guess is that they want precedent.

The NSA had physical access to that iDevice? And ?WHO? is it that wants precedent… the NSA? the FBI? Which part of the metaphor “crown jewels” standing for NSA potentially reverse-engineering of the iOS, in effect having 100% access at all times, is it that you don’t consider being crown jewels?

PS. If you are going to question such minor statements of mine as those above (you’re welcome), at least try to think logically (THINK LOGIC! in Appleparlance)

tbroberg February 22, 2016 4:23 PM

I don’t get why they need Apple. They have all the ciphertext on the flash, right? Do they need to know anything besides how a password becomes a key? Is it hard to reverse engineer that?
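One widely reported answer, offered here as an assumption rather than anything the court filings spell out: the derivation mixes in a device-unique key fused into the hardware, so even with the ciphertext in hand and the algorithm perfectly reverse-engineered, guesses must run on the device itself. A toy sketch of the idea:

```python
import hashlib
import hmac

# Hypothetical device-unique key, fused into silicon and never exported.
DEVICE_UID = bytes.fromhex("13" * 32)

def derive_key(passcode: bytes) -> bytes:
    # The passcode alone is not enough: the derived key also depends
    # on a secret only the physical device can supply.
    return hmac.new(DEVICE_UID, passcode, hashlib.sha256).digest()

k = derive_key(b"1234")
assert len(k) == 32
assert derive_key(b"1234") == k   # deterministic on this device
assert derive_key(b"1235") != k   # any other passcode gives another key
```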

Mr Bungle February 22, 2016 4:36 PM

Taking a “hardline” security perspective, none of this matters. It could easily be cover for a deeper issue, such as an NSL-driven backdoor. If you want to be truly paranoid, the Error 53 patch could have been the real backdoor, and this is the cover to prove that backdoors “don’t exist.” The timing certainly lines up.

If you have truly sensitive material, keep it offline, and always encrypt before bringing it to a networked device. Full stop. Nothing else is guaranteed.

Of course, most people are probably fine without this level of protection. But lawyers, journalists, the politically active, and other keepers of secrets must be more cautious.

jones February 22, 2016 4:39 PM

If Apple promises to protect customer security, and then deliberately does something that may undermine customer security, that could create a liability issue for them, should this exploit turn up in the wrong hands one day.

Major Variola February 22, 2016 4:41 PM

Apparently all the Apple users always relied on the Goodness of Apple, Inc. to secure their data.

That is, they had no security.

The login-throttling and key-zeroing need to be part of the hardware security IC, so that the OS does not matter.

So the FBI will need a FIB (focussed ion beam).

Mr Bungle February 22, 2016 4:45 PM

Wanted to clarify my post: if you think that a backdoor in iOS9 isn’t possible because of Secure Enclave, then you’re just not being creative enough. This specific type of backdoor would not be possible, true, but there are plenty of other ways to weaken the system’s security.

Don’t use unverifiable technology, period, but especially not when its creator must comply with the arbitrary demands of an untrustworthy regime.

Khoder bin Hakkin February 22, 2016 4:48 PM

Don’t you think that disabling OS-based login throttling and key-zeroing is as easy as Apple commenting out some pound-defines in an iOS make config?

Swing and a miss February 22, 2016 4:59 PM

@ianf

The argument is that an iDevice exploit that requires physical access to the device is not so valuable that the NSA would covet it like Smaug’s hoard. NOT some nitpicky argument about whether “crown jewels” is an appropriate metaphor.

The exploit is of no particular use to the NSA’s policy of vast, generalized data collection. It can only be applied on a case-by-case basis, as the NSA expends resources on covertly gaining access to the target device and deploying the exploit. However, the FBI would have great use for the exploit, as the effort of gaining physical access is inherent in the processes of obtaining a search warrant or establishing probable cause. Therefore, if the NSA did in fact possess such an exploit, the utility of being the only organization in possession of it would not prevent the NSA from sharing it with the FBI.

Dirk Praet February 22, 2016 6:09 PM

@ Alan

… but the court is not going to force Apple to publish the lock bypass code, so no matter what you might consider it to be, the court is not going to consider it to be speech.

Whether or not the “FBiOS” code is published is irrelevant. If code is upheld to be free speech, then the government cannot force the speaker to change it for the sole purpose of suiting the government’s needs. That would be compelled speech, which is unconstitutional. Full stop.

@ End of innocence

The government acknowledges that CALEA exists, but it says: “Put simply, CALEA is entirely inapplicable to the present dispute [because] Apple is not acting as a telecommunications carrier.”

On top of the arguments cited in the Cyberlaw blog rebutting the government’s position, it is fair to say that Apple as a company is in fact in the telecommunications carrier business as it is currently trialling an MVNO service in the US. An MVNO or Mobile Virtual Network Operator is a wireless communications services provider that does not own the wireless network infrastructure over which the MVNO provides services to its customers. Google’s Project Fi is doing the same in using Sprint and T-Mobile’s infrastructure and combining them to become a “super-carrier”.

If CALEA were to override the All Writs Act in this particular case, one could still argue that Section 1002(b)(3) contains a provision requiring the telecommunications carrier to decrypt data if it provides the encryption service AND holds the keys to decrypt it. But that is not applicable here, since the government is not asking for decryption of data but for a modification of iOS bypassing a number of security controls that would otherwise brick the phone when trying to brute-force the password.

@ Anjin

Good news is that, according to Will Strafach, the requested update for the phone could be tailored to only apply to this specific phone so that it wouldn’t be possible to run the update on any old 5C out of the box and without Apple resigning.

I would probably have been more surprised by the opposite. But even Strafach in point 4) argues that it’s less trivial than the FBI would like the general public to believe, in that some parts of this hack could represent a major security concern to Apple, thus making the request “unreasonable” or “burdensome”. The main issue, however, remains that accommodating this one request will create a precedent for not only the US but governments worldwide to make countless similar and even more complicated and burdensome requests.

@ Khoder bin Hakkin

Don’t you think that disabling OS-based login throttling and key-zeroing are as easy as Apple commenting out some pound-defines in an iOS make config?

Please read Will Strafach’s opinion referenced by @Anjin.

@ jones

If Apple promises to protect customer security, and then deliberately does something that may undermine customer security, that could create a liability issue for them, should this exploit turn up in the wrong hands one day.

Which is just a matter of time. Given Apple’s position, it’s quite reasonable to assume the company has been infiltrated by numerous foreign IC agents, and on more than one level. Once such a hack is known to exist, everyone is going to go after it, creating yet another serious security risk and liability for both Apple and its users, which in court could undoubtedly be argued to represent an unreasonable or burdensome request.

Niko February 22, 2016 6:14 PM

@Maria

The simple answer is don’t store sensitive trade secrets on your iPhone or any device that you expect to be lost or stolen. Use a vpn to access the corporate network, which contains all the sensitive data “at rest”.

jetole February 22, 2016 8:28 PM

I may not have fully comprehended this story. I have not had time to read all of the details; I have read many comments, but not all, and may have skimmed past some important points in the ones I did read. There have just been too many for me to go over, as it’s been a busy couple of days outside of this topic, so if I am parroting questions or comments already made, then I apologize.

Having said that, here is what I understand which leads me to what I don’t understand. From what I have read, the FBI wants Apple to make it so that they can A) try passwords more quickly, B) do so without risk of having data deleted and C) only apply it to this phone that they already have in their possession. They want these options so that they can brute force the password so that they can gain access. Correct me if I am wrong.

Here is what I do not understand: why does the FBI require the assistance of Apple on this matter at all? If I am not mistaken, this can be achieved most efficiently by removing the block storage (SSD, flash, etc.), imaging the data, and then attempting to brute-force a password-generated key/hash to decrypt the data. By doing it outside the scope of the parent OS, there is no means by which the data could accidentally be destroyed by too many tries, and if it is somehow damaged, it can be restored from a known-good copy of the image.

Additionally, by doing it outside the scope of the parent OS, there is no means by which the parent OS can slow down how quickly this can be attempted. As far as I know, the only way attempts can be slowed outside the parent OS is via key-stretching methods such as PBKDF2, scrypt, bcrypt, and similar methods of forced loop iteration to create a key from a password. If the latter method is used to create the key from the password, then how can Apple provide a way around it by updating the firmware on the phone that the FBI already has in its possession?
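The key-stretching idea jetole mentions can be sketched in a few lines. This is a hedged illustration, not Apple’s actual derivation (the salt and iteration counts here are invented): PBKDF2 makes every password guess cost a configurable number of HMAC iterations, so the defender can dial up the per-guess cost.

```python
import hashlib

def derive_key(password: str, salt: bytes, iterations: int) -> bytes:
    # PBKDF2-HMAC-SHA256: every guess costs `iterations` HMAC evaluations,
    # so raising the count slows a brute-force attempt proportionally.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)

# Same inputs always yield the same 32-byte key; a different iteration
# count yields a different key and multiplies the per-guess cost.
key_fast = derive_key("1234", b"per-device-salt", 1_000)
key_slow = derive_key("1234", b"per-device-salt", 100_000)
assert key_fast != key_slow and len(key_fast) == 32
```

The catch jetole identifies is real: iteration counts only multiply the attacker’s cost by a constant, which is why tying the derivation to hardware matters far more than the loop count.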

I have to say again that I may have missed some vital points, I could have completely misunderstood something and someone may have already answered this question but I am keen to find out why what I have suggested could not have been done and how Apple will even have the ability to aid in this when the FBI is requesting assistance for the phone that they already have in their possession.

Niko February 22, 2016 10:00 PM

@David Leppik

This is pure speculation:

As others have mentioned, the only thing keeping oppressive governments from requesting the same access is the fact that no government has this level of access.

We simply have no way of knowing whether or not Apple has already built “back-doors” (if you want to call an alternative firmware that) into their Chinese iPhones.

Spooky February 23, 2016 1:13 AM

@jetole

Agreed. As you and a few others above have already pointed out, this seems utterly trivial from a technical standpoint when you have the actual device in your possession. My thoughts mirrored yours exactly: lift the hash signature, reverse engineer the transformation that generates it from user input, and offload the actual brute-force work to a suitably sized server pool (if we’re talking about a numeric input of fewer than 10K possibilities, my ancient laptop would suffice, or 100 idle FBI agents). You could patch the login routine to NOP the eventual data deletion, after side-stepping Apple’s anti-RE mechanisms. You could work with the actual hardware by remounting all components in a test rig and triggering a breakpoint every time a write operation is about to be sent to the flash microcontroller, etc. There are so many obvious ways to approach this that their objections cannot possibly be on technical grounds. So, I’ll add my voice to the others concluding that U.S. LEAs are basically trying to set a legal precedent to force Silicon Valley companies to roll over whenever they ask, “because terrorism.” Latent fascism masquerading as a security concern…

re: an earlier comment on OTPs…

Physical OTPs are quite manageable, btw; sensitive conversations that might require them tend towards brevity. Also, people’s textual communications rarely exceed a few thousand pages (a few MBs) and the smallest, cheapest flash media available usually gives you several GBs. Enough for a lifetime of chatter between two people. Naturally, OTPs on flash are preferred but paper works just fine. For conversations of a more limited nature, small vocabularies can be encoded as randomized trigraphs, etc. If current trends prevail, we may all end up using (automated) OTPs far more often.
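For readers unfamiliar with how simple the OTP mechanism itself is, here is a sketch. All of its security rests on the pad being truly random, at least as long as the message, kept secret, and never reused; the code is the easy part.

```python
import secrets

def otp_xor(data: bytes, pad: bytes) -> bytes:
    # XOR with the pad; the same function both encrypts and decrypts.
    assert len(pad) >= len(data), "pad must cover the whole message"
    return bytes(d ^ p for d, p in zip(data, pad))

message = b"meet at dawn"
pad = secrets.token_bytes(len(message))   # one-time, truly random key material
ciphertext = otp_xor(message, pad)
assert otp_xor(ciphertext, pad) == message  # XOR is its own inverse
```

As the comment notes, a few GB of pad material covers a lifetime of text between two people; the hard part is not the math but generating, distributing, and synchronizing the pads.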

RonK February 23, 2016 3:39 AM

@ Anjin

Thanks for the link, this was an issue I was just thinking about. Having a publicly known ID number for each device, which can be checked by firmware generated by legal demands, would seem to be a prudent way to attempt to minimize the downside of this attack on your device security.

As for the linked article about setting a longer passphrase, I can only wonder about exactly how much of the simple entropy, which was assumed by the article in its calculations, could be eroded using high-end forensic techniques, like AFM or high-resolution spectroscopic scanning of the display.

On a final note, I wonder when they will release the movie where the terrorist has modified the hardware and software of his phone so that after the FBI wins the long legal battle enabling them to unlock it, the phone explodes anyway, having noticed that it wasn’t unlocked within some kind of watchdog period…

Mark February 23, 2016 3:43 AM

This article from Ars Technica suggests the custom firmware could/should be tied to this one specific iPhone via embedded serials:

The FBI’s request is that the special firmware be tied to the specific device. Every iPhone contains a multitude of unique identifiers that are baked into its hardware (the serial number, the cellular radio IMEI, and the Wi-Fi and Bluetooth MAC), and the court order explicitly states that the custom firmware must be tied to the San Bernardino phone’s unique identifier, such that it can only run on that specific phone.

Assuming that this can be done (and done robustly), it means that even if the custom firmware were given to nation-states or even published on the Internet, it would not serve as a general-purpose way of performing brute-force PIN attacks. It would be useless on any device other than the San Bernardino device. To make such leakage less likely, the court order does allow for the possibility that the custom firmware might be used only at an Apple location, with the FBI having remote access to the passcode recovery system.

http://arstechnica.com/apple/2016/02/encryption-isnt-at-stake-the-fbi-knows-apple-already-has-the-desired-key/
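The device-binding scheme the Ars article describes can be sketched as follows. This is a toy model with invented names, and an HMAC stands in for Apple’s actual code-signing (which uses asymmetric signatures): the idea is only that the signed image embeds one phone’s identifier, so a leaked copy fails verification on every other device.

```python
import hashlib
import hmac

def sign_firmware(image: bytes, bound_id: str, key: bytes) -> bytes:
    # Hypothetical signing step: bind the image to a single device identifier
    # by covering both the identifier and the image bytes with the MAC.
    return hmac.new(key, bound_id.encode() + image, hashlib.sha256).digest()

def boot_accepts(device_id: str, bound_id: str, image: bytes,
                 signature: bytes, key: bytes) -> bool:
    # The boot ROM checks the signature AND that the embedded identifier
    # matches this phone's own hardware identifier.
    expected = hmac.new(key, bound_id.encode() + image, hashlib.sha256).digest()
    return device_id == bound_id and hmac.compare_digest(expected, signature)

key = b"signing-key-stand-in"
sig = sign_firmware(b"fbios-image", "SB-PHONE", key)
print(boot_accepts("SB-PHONE", "SB-PHONE", b"fbios-image", sig, key))  # True
print(boot_accepts("OTHER-5C", "SB-PHONE", b"fbios-image", sig, key))  # False
```

Note that the scheme’s robustness depends entirely on the boot chain refusing unsigned or re-signed images, which is exactly the signing-key question the surrounding comments debate.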

Clive Robinson February 23, 2016 4:17 AM

@ Spooky,

Enough for a lifetime of chatter between two people.

And that is one of the major problems with OTPs: if you need to chat to ten unrelated people securely, then you need ten pairs of pads. However, if it’s a group of ten people who want to chat, then each person needs nine pads, or ninety in total, which is a nightmare to keep properly in sync and distribute; or you need to designate one “in station” that everybody communicates through, which has other issues.

So whilst OTPs are great for just one private link, they really do not scale well. You can look up the history of how things were implemented by the “Diplomatic” services up until the semiconductor age. The procedure was to use symmetric encryption for person-to-person encryption (such as the British Typex), which would then be super-enciphered by a comms-link-based One Time Tape (such as the British Rockex) in a formal routed net. Likewise, you can look up the details of how the Russian spy/diplomatic network communicated (see VENONA): they used only OTPs, and out stations (spies) communicated with regional in stations (control). Due to production issues, between 30 and 40 thousand OTP pages were duplicated, and thus the traffic under them became readable, giving valuable high-level intelligence to the US and UK, which allowed various Russian spies to be identified.

For those seriously considering the use of OTPs even in paper and pencil form, I would still recommend you use compression and the double encipherment of super-encryption and keep your traffic short and sparse.

The only real use OTPs have in larger nets these days is for “emergency use” such as key transfer where a “bug out” or similar has caused the loss of crypto kit or keymat, where the simplicity of pencil and paper and battery comms kit or POTS may be all thats left.

Clive Robinson February 23, 2016 5:51 AM

@ jetole,

Here is what I do not understand, why does the FBI require the assistance of Apple on this matter at all.

It’s a question that many are asking, and one that is far from clear.

The FBI’s actions so far have been the opposite of what Apple recommended. As a result, the FBI have shut themselves out of other avenues of enquiry. Whether this was incompetence or malice is not clear… However, by their own admission the FBI have approached Apple on at least seventy other occasions, and on those occasions they followed Apple’s recommendations successfully and obtained the data they wanted. Which is why people are beginning to think it was malicious intent by the FBI. That suspicion gained further momentum when the FBI tried to pass the blame onto the phone’s owner, who then issued a public refutation. But worst of all, and potentially fatally for their case, the FBI did not declare this as part of the evidence submitted to the court or to Apple, which is a major no-no.

But it’s also becoming clear to many that this is a politically motivated attack on Apple by the FBI / DOJ / Executive, who want 100% data access 100% of the time, without going through the legal procedures you would expect for papers in a locked drawer in your home.

But whilst this gives motive and insight to the court action it does not answer the technical question.

In theory, yes, there are techniques that might be used by the FBI, but they carry significant expense and risk. The FBI therefore argue that as this data is of the utmost importance –which is extremely doubtful– Apple, being the designers of the system, are best placed to facilitate the extraction of the data from the phone.

What is unknown is, if what the FBI claims is true or not. This is because the FBI are basing all their arguments on assumptions because there is a “vacuum of information” about the internals of the phone which are quite rightly Apple’s “Trade Secrets” and they have a right to keep them that way.

Apple claim that there is no method in place that will allow the passphrase system to be bypassed, and this is probably true.

The FBI counter that whilst there may not currently be a method in place, Apple could put one in place, because they have the private key of the signing process used to prevent unauthorised software updates.

However, this is where it gets messy due to the evidentiary rules on “not tampering”; otherwise the “fruit of the poisonous tree” doctrine comes into play and any data extracted would be inadmissible as evidence at a future trial. Thus the FBI are asking not for the firmware to be changed but for some utility to run from within RAM.

For various technical reasons this may not be possible. But the FBI have shot themselves in the foot, because they are trying to say that what they want will be for that phone and that phone only, which is actually quite difficult to do technically, and moreover that once done, Apple can destroy it. Well, actually, that raises a problem: it becomes part of the evidence chain, so if Apple destroy it then the evidence is no longer evidence. This casts further suspicion on what the FBI are up to, because they should know this…

The technical reason why you cannot just lift the data off the phone is what is actually involved.

The data is protected by a 256-bit AES master key, which is not stored on the phone. It is rebuilt each time the phone is unlocked, from the user passphrase and other “hidden”, effectively one-way variables. This makes the process not just one-way, it also makes it hardware dependent, so it has to be done on that phone and that phone only because of those variables.
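The hardware entanglement Clive describes can be sketched as follows. The function name, salt usage, and iteration count are illustrative stand-ins rather than Apple’s actual scheme; the point is that mixing a per-device secret into the derivation forces every passcode guess to run on that one phone.

```python
import hashlib

def unlock_key(passcode: str, hardware_uid: bytes) -> bytes:
    # The per-device UID (fused into silicon, never exported) is mixed in
    # as the salt, so the derivation cannot be replayed on other hardware
    # or against an extracted disk image.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), hardware_uid, 1_000)

# The same passcode yields different master keys on different devices:
assert unlock_key("1234", b"uid-device-A") != unlock_key("1234", b"uid-device-B")
# ...but is deterministic on any one device:
assert unlock_key("1234", b"uid-device-A") == unlock_key("1234", b"uid-device-A")
```

This is why imaging the flash and brute-forcing offline, as suggested above, doesn’t work: without the UID, the keyspace an attacker must search is the full 256-bit key, not the 10,000 possible PINs.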

As I said, in theory the FBI could extract those variables, but it would be very high risk and it would probably still not give them the AES master key to get at the data.

But there is another question that is yet to be answered: has Apple put other “anti-tamper” protection mechanisms in place? The knowledge of how to make software anti-tamper/debug/trace mechanisms has been around for over a quarter of a century, and it’s a reasonable assumption that Apple know about them in some depth (they started with anti-copy techniques for games and business software on the Apple ][).

What is not currently known publicly is whether Apple has used such mechanisms, and they are quite within their rights not to say: such mechanisms are “Trade Secrets” and thus have protected status, and as their loss would be an “undue burden” on Apple, the AWA request would fail. Hence the FBI going the route they are. But the question then arises about “meta information”: can a trade secret be rendered non-secret, and thus its disclosure an undue burden, by knowledge of its existence but not its method? There is case law around to suggest this is the case, so if the court agrees with that argument then the FBI will fail…

It’s all very messy, but then it tends to be when you bring politics into a court to try to make case law, to get around the fact that the legislation-making arms of government have said no on repeated occasions.

It worries me that an entire nations privacy and security falls to which bunch of lawyers can better bamboozle a judge…

Clive Robinson February 23, 2016 6:06 AM

@ Mark,

The ARS article is working on assumptions as this comment clearly indicates,

    Assuming that this can be done (and done robustly),

Arguably it cannot, without further secret knowledge known only to Apple. And as history has shown, secret knowledge of high value does not remain secret for very long.

But there is another issue: Apple could have put anti-tamper “copy protection” code into the original iPhone software; it’s not overly difficult, and something that has been done for over a quarter of a century in various ways.

If Apple have, then it’s game over even if Apple gave a copy of their software-update signing key to the FBI. The FBI would have to go another route, via the hardware itself, and try to get at those hidden variables.

Alan February 23, 2016 6:32 AM

@Dirk Praet

Whether or not the “FBiOS” code is published is irrelevant. If code is
upheld to be free speech, then the government cannot force the speeker
to change it for the sole purpose of suiting the governments needs. That
would be compelled speech, and which is unconstitutional. Full stop.

It seems you know very little about how the law actually works. You can be jailed indefinitely for refusing to testify in both criminal and civil cases. That is compelled speech and it is very much constitutional. But also irrelevant in this case, because unpublished unlock code will not be considered speech.

Dirk Praet February 23, 2016 7:18 AM

@ Alan

You can be jailed indefinitely for refusing to testify in both criminal and civil cases.

Although not applicable in this case, in the US one cannot be compelled to testify against oneself. Look up something called the 5th Amendment to the US Constitution. Equivalents exist in quite a few other democratic countries too.

because unpublished unlock code will not be considered speech.

Please be so kind as to back up your claim with pointers or references. It makes for a more intelligent and informed discussion than just spouting unsubstantiated opinions, wouldn’t you say?

Thoth February 23, 2016 7:37 AM

@Clive Robinson
I think it’s more about political posturing than the security of the software patch. Although, if the entire iPhone architecture were truly designed to respect the user’s privacy and personal security, it would, as many of us have mentioned, only accept and execute updates once the user (who is already dead) keys in the iPhone PIN or password to authenticate and authorize an update of any sort.

It’s about time 4-digit PINs were made obsolete, moving up two steps to 6-digit PINs or, for high security, at least a 12-character password, combined with tamper-resistant hardware (not just ordinary hardware, which is what most ARM/Qualcomm/Samsung chips are) that makes it difficult to update the firmware of the chip’s security features enforcing PIN-try limits and other security functions.

Maria February 23, 2016 7:44 AM

@niko – This isn’t just about encryption or the integrity of at-rest data. This case would set a precedent that governments can compel device manufacturers to develop and install malware on their devices. VPN software can’t protect information that’s accessed using a compromised device.

BrianD February 23, 2016 8:07 AM

iCloud sync won’t work now because the iCloud password was changed. If Apple has backups of their servers, why couldn’t they restore the password to its previous value (I realize that’s less trivial than just doing a simple restore)? Then the FBI could plug the phone in and let it sync to iCloud. Seems like everyone wins – FBI gets the data, and Apple doesn’t have to make a custom OS.

c1ue February 23, 2016 8:23 AM

@lisa @medgeek
I agree – since guns were used to kill these people, they should be banned.
So should jelly donuts – they kill nearly 2000 people every day due to heart disease.
We should also ban all knives – they are used to kill 1/4 as many people as guns, and 1/4 of bad is also bad.
And cars! Cars kill almost as many people as guns.
We should all live in a world with rounded edges and padded walls…

SchneieronSecurityFan February 23, 2016 9:41 AM

It’s only a matter of time before a government agency of some type requests that PINs be stored “in the cloud” – with either the manufacturer or cellphone provider.

CR February 23, 2016 9:53 AM

And today the WSJ reports that the DOJ now wants Apple to unlock another DOZEN phones.

Wait until Russia, China, Iran, Cuba, North Korea, France, UK, Aus, Canada, Mexico etc all want “only 1 phone” unlocked.

Privacy? Who needs it. /sarc

DanSteele February 23, 2016 10:57 AM

Maybe Apple should just add an “erase after x days” feature? Similar to erasing after a number of failed PIN attempts, the user could have the option to wipe the phone if it hasn’t been unlocked in a user-specified number of days. Odds are most people with a smartphone, at least the ones who would enable this feature, don’t go more than a couple of days without unlocking their phone. This would make the issue moot if the phone sat in custody for more than a week. Of course it would have to be a low-level function, so as not to be bypassed, but I’m sure Apple could work that out.
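The proposed feature reduces to a one-line policy check; sketched here with invented names. A real implementation would need the clock and the last-unlock record to live inside tamper-resistant hardware, or the check could simply be bypassed by an attacker controlling the OS.

```python
def should_wipe(last_unlock_ts: float, now_ts: float, max_idle_days: int) -> bool:
    # Wipe if the phone has sat locked longer than the user-chosen window.
    return (now_ts - last_unlock_ts) > max_idle_days * 86_400

WEEK = 7 * 86_400
assert should_wipe(0.0, WEEK + 1, 7)       # idle past the window: wipe
assert not should_wipe(0.0, WEEK - 1, 7)   # unlocked recently enough: keep
```

The subtlety is the trust anchor: if the wipe timer runs in ordinary firmware, a forensic lab can halt the clock or pull power, which is why such dead-man features belong in the secure element.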

National Security Patrice upskirt February 23, 2016 11:13 AM

@news, Thanks for reminding us that Comey’s attacks on legal privacy rights go beyond the domestic bell jar. Outside the US hermit kingdom there are grownup countries with independent courts that don’t hide from the law of privacy and diplomatic communications.

https://wikileaks.org/nsa-201602/

ianf February 23, 2016 11:42 AM

    This summary is largely ADMINISTRIVIAL, posted FTR, and not so little reminiscent of one previous thread’s meandering exchange. MAY BE IGNORED IN ITS ENTIRETY.

Wrote @ Swing and a miss: “The argument is that an iDevice exploit that requires physical access to the device is not so valuable that the NSA would covet it like [adolescent fantasy simile for riches].”

Nope, that wasn’t the argument. Let me walk you through the motions:

1. Steve Holmes asserts that any “automatic update to a locked phone needs to assure that all user data is wiped before any code update takes place.”

2. I counter with that “IF a software update wiped out user data, THEN no phones would ever be updated.” Stands to reason.

3. Jeff chirps in with that “I had not understood Steve Holmes CORRECTLY; he said “locked phone” — so don’t allow an update to a locked phone.”

4. in a multi-addressee posting I then have to explain to Jeff that Steve Holmes’ key point was erasing contents of the phone prior to the update, and why this was a bad idea. In the same post, in response to herman, I also claim that, whether the NSA is able to crack iPhones or not, it would not share that secret with the FBI.

5. next a wumpus jumps into the fray, and disparages such a THEORETICAL crackability as being “the NSA’s crown jewels,” because “they want a precedent” [undef(“they”;) though presumably the FBI?]. Very logic[k]al.

6. I then ask wumpus which part of the metaphor standing in for NSA potentially reverse-engineering of the iOS, in effect having 100% access at all times, is it that he doesn’t consider “crown jewels?”

7. wumpus disappears, but then you, Swing and a miss don’t miss a mo to declare the above “some nitpicky metaphor argument,” while abusing typography to bold the unsubstantiated claim of “being the only organization in possession of [an iOS] exploit would not prevent [the NSA] from sharing it with the FBI.” (muddy justification follows).

    Because, presumably, the NSA and the FBI are like=this, like thieves on fire (=no doubt soon declared to be another nitpicky metaphor).
    Because the NSA just lurves the FBI, and vice-versa.
    Because there’s this true American can-do spirit of cooperation between these two branches of the same government.
    Because there is touchy-feely goodwill and benefits in it to be had by the NSA.

That said, do you see the p.a.t.t.e.r.n.? I question one unsound assertion, it could have ended there. Instead, 3 more one-post-wonders rise to the challenge, each one arguing something new and peripheral to my refutation, only to disappear soundlessly into the background. Yes, I know that you all can type, but how about sucking on some fish heads first before straddling the keyboard… apparently they’re rich in [P]hosphorus compounds said to abet thinking.

Skeptical February 23, 2016 11:51 AM

Apple’s argument is rotten at the core. If you’ve already bitten into that bright, shiny, polished, deliciously ripe press release, think twice before swallowing.

It’s agreed that the vulnerability in question already exists. It’s there. No one forced Apple to create the vulnerability.

The post here reprises the vulnerability-exploit-rapid-proliferation postulate. That is, a vulnerability once known will lead to the creation of at least one exploit and that exploit (exploits) will rapidly proliferate.

If you truly believe that once a vulnerability exists it will not be long (“tomorrow” if I take the above post literally) before it is exploited, then it makes little difference to the security of anyone who owns this phone – or ANY phone with the same vulnerability – that Apple utilize it to unlock this phone.

And in that case, the balance of equities here is clear. On the one hand, there is no significant gain in security by Apple not doing as the court has ordered; on the other hand, a device belonging to a known terrorist, used by that terrorist, and possibly containing intelligence or evidence of value, will remain inaccessible if Apple does comply with the court order.

In other words Bruce, your own premise compels the conclusion that from an ethical vantage Apple ought comply with the order. The vulnerability is known, the device is insecure, and as it will in short order be exploited anyway if it hasn’t already, we may as well exploit this particular phone sooner in case there exists perishable intelligence on the device.

I can see the counterpoint: “yes, but if firms and individuals can be required by a court to aid the government in exploiting a particular device, then we will all be less secure, because this effectively means that the government can enlist their aid in exploiting any class of devices they have created.”

However, the counterpoint elides the fact that the assistance the US Government can request has sharp limits. The government cannot ask for the unreasonable. The government cannot ask Apple, in this case, to spend a year figuring out how to unlock the phone while neglecting its business. Or rather it could ask, but the court would reject the request.

Nor is the court ordering Apple to alter products it sells to its customers. Apple is free to design devices to whatever security specifications it pleases.

Nor is the court ordering Apple to just “find some way” to unlock the phone. That would also be unreasonable.

Let me put this another way. The court order is not a mechanism by which the government can achieve mandated backdoors or enlist companies on broad fishing expeditions for vulnerabilities and exploits to be used at the government’s discretion in the future.

Instead the court order is a mechanism by which the court may require a company to open an identified door that the company installed itself on its own initiative. And it is a door to a room that the government, acting on behalf of the public, has a compelling interest in being able to enter.

This framing of the issue as requiring Apple to “create something new” is beside the point. Essentially the “new”-ness of the software would be one factor considered under a “reasonableness” analysis. The question is how difficult or burdensome is it for Apple to do this. That the software is new may make it more difficult; but not all new software is hard to write. Some new software is quite easy to write. Indeed, some software is designed quite deliberately to ease the process of altering it as needed; and some entities that regularly alter certain software develop tools and processes to render modifications easier to design and deliver.

The other arguments raised on Apple’s behalf are not persuasive either. This is not a free speech issue. CALEA does not apply the way that some here seem to believe (nor quite frankly am I sure that those people have thought through the implications of their argument). And it is irrelevant as to whether county or federal personnel made a mistake in resetting the password (this does not affect either the public interest in access or the court’s power to compel reasonable assistance – but I suppose the issue has muddied the water for some).

Apple adorned that press release with every right-to-privacy buzzword it could find. And remarkably it has managed to persuade a large number of people to accept, perhaps unwittingly, the implicit premise that Apple’s proprietary code is itself a security feature too important and too precious to allow even the possibility of it being compromised in a government investigation into a lethally violent act of terrorism. Apple’s message is: you can only trust us.

I’m not sure anyone has yet realized how much damage Apple has done to the very causes in which it has wrapped itself. Leaving aside the specious nature of their arguments, an unlikely victory for Apple in court would certainly motivate the passing of legislation that many here would find problematic.

When a company, or a very wealthy subset of businesspersons in an industry dominated by an oligarchic structure, begins to pound the table and speak sonorously of the importance of ethics and of its concern for the rights of those everywhere, cast a wary eye. Apple did not outsource out of a desire to raise the living standards of poorer countries, and Facebook is not offering internet access out of the kindness of its heart.

This is an industry dominated by strong networking effects, and they are racing for marketshare among international consumers. Public displays like these burnish their brands in that effort, and the cooperation they will be forced to render to governments like the PRC or Russia, or worse, will of course not be as open to public view or disclosure as the US legal system. I have no doubt that even while Apple is rousing its army of attorneys to fight in the United States, some of their executives are speaking soothingly to foreign government officials elsewhere, assuring them of their cooperation.

Skeptical February 23, 2016 12:02 PM

Slight error in my comment:

“will remain inaccessible if Apple does comply with the court order” should actually read “will become accessible if Apple does comply with the court order.”

Nick P February 23, 2016 12:47 PM

@ Skeptical

Well-spoken argument. Thanks to both sides’ obfuscations, I don’t know enough details to discuss it more thoroughly. If it’s an existing vulnerability, your argument probably stands. If it’s not, theirs should be the default, as government only builds on precedents like that. I 100% agree on Apple being a wolf in sheep’s clothing regardless of the outcome. Probably a publicity stunt to sell iPhones more than anything. 😉

One thing that did stand out was your claim that the CALEA argument about telecom manufacturers didn’t apply. Would you care to elaborate on that one?

65535 February 23, 2016 1:28 PM

Emptywheel notes that FBI chief Comey has been untruthful and that he is expanding the use of the infamous All Writs Act.

‘First One All Writs Act, Then Another, Then Fourteen’

“a list of those requests has now been unsealed in Orenstein’s docket. The list lays out the 11 orders Apple has received to help unlock phones since the latest update in Orenstein’s drug case, and mentions 3 before that; between all known AWA orders, they cover 17 devices (including an iPad)… Of course, he [Comey] wasn’t honest about wanting an open debate, or about this being just one request. But that’s just one in a long line of things Jim Comey hasn’t been honest about.” –emptywheel

https://www.emptywheel.net/2016/02/23/first-one-all-writs-act-then-another-then-fourteen/

Since the FBI screwed up the unlocking of the San Bernardino shooter’s iPhone, I hope some judge will toss out the warrant [AWA] against Apple. This is a power grab by the FBI. The FBI should be stopped.

[Bill Gates backtracks on siding with the government]

UPDATE — 10:25 AM EST — Poor Bill, so misunderstood, now backpedaling on his position about Apple’s compliance. This, from a Fortune 100 technology adviser…~shaking my head~ Rayne

‘Update: In an interview with Bloomberg TV this morning, Gates says he was “disappointed” that reports placed him in the FBI’s corner. Here’s what he said in response to reports that he supported the Feds:

‘“I was disappointed because that doesn’t state my view on this. I do believe that with the right safeguards, there are cases that the government on our behalf, like stopping terrorism that could get worse in the future, that that is valuable. But striking that balance—clearly the government has taken information historically and used it in ways we didn’t expect, going all the way back to the FBI under J. Edgar Hoover. So, I’m hoping now we can have the discussion.’ Bloomberg via gizmodo

http://gizmodo.com/bill-gates-sides-with-fbi-over-iphone-unlocking-1760750683

‘Gates Disputes Report That He Backs FBI in Apple Case’

http://www.bloomberg.com/news/videos/2016-02-23/gates-disputes-report-that-he-backs-fbi-in-apple-dispute

@ Dirk Praet

‘This strongly suggests that the motion is all about getting the precedent, rather than this specific case’.- Cherimoya

“That’s exactly what it is. As pointed out in the @Grugq’s analysis that was previously referenced, it is highly unlikely that much relevant additional data can be retrieved from the phone”.- Dirk Praet

That is exactly what it is – an attempt to set an irreversible precedent.

Swing and a miss February 23, 2016 1:41 PM

@ianf
You are laboring under two fundamental misunderstandings here, and your constant attacks on fellow posters’ intelligence are doing nothing to help clear them up.

Regarding the Steve Holmes argument.
I think where Apple erred was in allowing a signed software update of a locked phone without the firmware forcing the data on that phone to be wiped.
Your reply summarized in point 2 is certainly correct but misses Steve’s point completely. Steve is implying that locked phones should be wiped (and Jeff tries to make this easier for you to understand). You are arguing against wiping phones regardless of the lock state, which was never a part of Steve’s proposed solution.

Regarding the Wumpus argument.
Cracking a device that you have physical access to can hardly be considered a “crown jewel”.
You argue that the perceived value of an Apple iOS exploit would preclude the NSA from sharing it with the FBI. Wumpus was saying that the exploit in the context of this discussion could hardly be a metaphorical “crown jewel”, as it requires physical access. I try to help you understand by explaining why a physical-access exploit is not as valuable as the various remote-access exploits the NSA possesses (certainly not a “crown jewel”). And in your administrative summary, you iterate over organizational rivalries four times, even though the argument was never about whether rivalries would prevent interdepartmental sharing. Why?

To succinctly answer your question, “which part of the metaphor standing in for NSA potentially reverse-engineering of the iOS, in effect having 100% access at all times, is it that he doesn’t consider ‘crown jewels’?”:

  1. Deploying the NSA’s custom iOS would require physical access.
  2. A physical access exploit is necessarily targeted.
  3. It costs significant resources to gain physical access to just one device.
  4. Employing a physical-access exploit doesn’t scale to populace-level exploitation.

What is the source of these constant misunderstandings? Are you perhaps ESL? And why do you direct such pointed hostility at the posters who are trying to help you understand?

AvidReaderAppleVsFBI February 23, 2016 3:40 PM

At 5:30 pm, local time, events today may include one at an Apple store or another location near you or at FBI headquarters:

… “similar gatherings are planned in cities across the country, and at the FBI headquarters in Washington, DC, organized by our friends at Fight For The Future with transpartisan support from numerous organizations including CREDO, Demand Progress, the Bill of Rights Defense Committee / Defending Dissent Foundation, Downsize DC, and others.”

Links (the quote is from the second-to-last paragraph of the eff.org post):
https://www.eff.org/deeplinks/2016/02/apple-americans-and-security-vs-fbi
https://www.facebook.com/events/1645036165762086/ (similar gatherings; via https://www.fightforthefuture.org/ )
https://www.dontbreakourphones.org/ (cities across the country)
https://www.facebook.com/events/458361131032530/ (FBI headquarters event)

anon February 23, 2016 4:56 PM

I thought they had already hacked 3 other fones from the couple.

They actually expect the “work” fone to have more/better info?

slight brain queef February 23, 2016 6:37 PM

Listen up! Skeptical’s here with the official propaganda line for heel-clicking asskissers currying favor with government tax parasites.

Equities! That settles it. Always the wannabe spook, skeptical parrots spook cant: ‘Equities.’ Does that mean legality? No. Does that mean constitutionality? No. Does that mean compliance with US obligations? No. CIA lifted a vague old notion from the philosophy of natural law. CIA latched onto that nonstandard term so they can flout the law with secret red tape in the name of some alternative concept they pulled out of their ass. ‘Equities’ is CIA’s way of saying fuck law, we’re going to correct it – just like they corrected the prohibitions of torture, murder, armed attacks on civilian populations, and coercive interference. Equities is totalitarian newspeak for the brainwashed. It doesn’t work on thinking human beings.

Then lots of maundering legaloid nonsense which skeptical wisely tarts up as ‘ethical vantage.’ Wisely, because if you ever tried to pass it off as law it would flunk you out of Whittier. It would flunk you out of Florida Coastal. It would flunk you out of Ave Maria School of Law and Pizza Science. It features a mercifully unsubstantiated wave of the hand at CALEA from our lex faex expert.

Then we cut to the chase – skeptical’s favorite part, the emotional manipulation. Thrill to the smarmy sanctimony as skeptical tries to hijack your disgust for corporations! With shit for arguments, skep’s trying to make you take sides. And why not? Both litigants are perfidious scumbags. If you’re a cognitive mediocrity like skeptical, you just pick a side and make shit up for ’em. Skeptical assumes you share his statist enemy neuroses, so he throws in some official enemies to make you see red, Grrr! Like you’re some broke low-normal cold war redneck freshly washed out of Phase I RMT. Like his paraprofessional peers.

Niko February 23, 2016 8:03 PM

@Maria

If LEOs can’t get at data from the telecom providers(because everyone switches to end to end encryption) and can’t get data with the assistance of the device manufacturers, then the FBI is not going to give up and go away. Instead, they’ll invest in their own hacking capabilities. If the slippery slope leads us to malware, so does a court ruling in favor of Apple.

Nick P February 23, 2016 10:46 PM

@ Bruce and others

Justice Dept Wants Apple to Unlock Nine More iPhones

What was predicted on Hacker News, here, and elsewhere came to pass faster than I expected. The prediction was that this was all about setting a precedent for access to be used at will by the Feds. This article says they already have a bunch of others waiting for whatever happens in this case. Further, the Feds reject the idea that this access should be limited to important cases like terrorism: “What we discover is that investigation into one crime often leads into criminal activity in another, sometimes much more serious than what we were originally looking at.” So, they plan to be using it so much that crooks might as well stop using iPhones if they win this case.

I mean, I’d have recommended that anyway. It’s just that their secret collection provisions might kick in once they win. Apple would know they’ll lose in court and cooperate with orders that tell them to collect in secret, then lie about security. Same M.O. the FBI asked the judge for in the Lavabit case. I won’t forget that.

Barbara Glassman February 23, 2016 10:52 PM

You write, “In my essay, I talk about other countries developing this capability with Apple’s knowledge or consent.” Did you not mean “without”?

T. Harrell February 23, 2016 11:48 PM

To me this seems like some public misdirection. The FBI wants the masses to believe current security is overwhelming even to the resources of the world’s top superpower. The “secure enclave” can be updated the same way iOS can; in fact it’s loaded on each boot and can be updated over DFU.

A bootrom vulnerability would allow a ramdisk or unsigned image AND SE/TrustZone key-service spoofing. The FBI actually had this around the 3GS period by using jailbreakers’ work (ramdisk loaded; TEE isolation wasn’t there yet). There aren’t bootrom/“SecureROM” vulnerabilities for recent-generation Apple devices (you can’t even buy one for millions), and even same-generation, same-model devices could have different bootroms.

THE REAL REASON THE FBI IS BEGGING: Even the JB devs haven’t bypassed the TrustZone kernel protection, and you need kernel execution to even try to break the SE, which sits in separate TEE isolation. RE and exploit dev on modern Apple devices is very expensive; hence the $1,000,000 bounties.

They could use untethered code execution, because there are still remote attack surfaces via baseband or battery (even in lock mode), but then they are back at the kernel protection and, later, the SE interface, which nobody has defeated yet. They need the data short-term, so all this matters.

jetole February 24, 2016 3:09 AM

@Clive Robinson

If what you say about a new key being created each time a PIN is entered is correct, that raises some red flags for me. If this really is the case, then either the iPhone is re-encrypting the data each time a PIN is entered or it is only re-encrypting a static key. The latter seems more likely, since re-encrypting the data each time just isn’t a good idea, but then the static key becomes an attack vector. There’s also a third option: Apple may be using some other method that I’m not aware of.

As for the software-defined tamper resistance you mentioned: if the data storage device (the onboard flash) is imaged, then the FBI and law enforcement can attempt to break in without having to worry about triggering any software-defined tamper resistance, because the attempt happens outside the scope of the OS or any software on the phone. And if the data is damaged in the process, they can just restore it from a clean image.

As for the encryption type being used, it’s probably irrelevant. The FBI would most likely need to focus their effort on brute-forcing the password used to create the key. They could try to brute-force the encryption key itself, but that would probably be more computationally expensive (take longer with more servers in the cluster) than attacking the password used for key generation. One exception to this rule would be if the key is created with a key-derivation function such as PBKDF2, scrypt, or bcrypt, in which case it may be less computationally expensive to brute-force the actual encryption key.

Either way, what I don’t understand is A) why the FBI is not using this method and B) what Apple can provide for the phone the FBI has in evidence that can in any way expedite the process of gaining access. I was hoping someone might be able to explain that to me, but so far some people seem to agree that the FBI’s actions aren’t logical. I don’t know if that really is the case. I really am expecting that I have somehow overlooked or misunderstood some aspect.

Skeptical February 24, 2016 5:40 AM

@Nick P: One thing that did stand out was your claim that the CALEA argument about telecom manufacturers didn’t apply. Would you care to elaborate on that one?

The All Writs Act grants the courts power to compel the reasonable assistance of third parties where Congress has not otherwise granted specific power to do so. The idea that some seem to have is that if CALEA is such a specific instance of Congress doing so, then the AWA is not applicable here.

The idea is extremely weak. Apple is not a telecommunications carrier, and the data sought here is not that which is the subject of CALEA: real-time intercepts of communications along with associated metadata.

That’s really all that need be said on the subject – it doesn’t really pass the smell test. But, so far as the provision I’ve seen cited in support of the idea, that provision merely allows telecommunication carriers to purchase and design whatever equipment they choose – subject of course to the requirement that they be able to provide law enforcement agencies with the assistance specified. It’s the difference between government setting minimum safety standards for motor vehicles, for example, and government specifying which motor vehicles can be produced.

Now, a few people have been mentioning precedent. They should understand that a decision such as this carries little weight as precedent. It is not at all binding on any other court, and it carries very little persuasive value with other courts.

We might see a spike in the number of applications for orders compelling assistance by Apple – but that’s due to what appears to be a change in policy by Apple (to resist requests for assistance no matter how flimsy the legal case for doing so), not any change in law or practice by the US Government.

Thoth February 24, 2016 8:24 AM

@Ars
It’s possible, and I have already suggested such probing into iPhone and other mobile-device hardware, but the risk of “damaging the evidence” is very high, and there is no “clean way” to mitigate the risk of accidental (and very frequent) destruction of the chip and its data contents when attempting physical attacks on the chip.

john February 24, 2016 8:39 AM

Geez, I hope Apple doesn’t shoot this Jim Comey dead for 500,000. He’s a good man trying to take everyone’s privacy across the world for a few dead souls. Nice script, FBI. 1 soul lost …hehe, for 7 billion people’s privacy. That’s a Win Win x 7 billion – 1.

slight brain queef February 24, 2016 8:54 AM

In which Skeptical continues his headlong retreat from legal reasoning, ethics or any vestige of decency. Now he backs off another remove and applies his ineffable ‘Smell Test.’ That’s the key. To see it his way you just gotta think like you sniff rancid meat.

The government does in fact have a smell test, PPD-28. PPD-28 requires the government to consider, Can we get away with it? And if we get caught, Is it worth it? After all the disgrace and national dishonor, international countermeasures, criminal sanctions, wrecked alliances, impaired comity, ruined commercial, economic, and financial interests, will you still be glad you went and did it?

You will of course recognize the process. This is how criminals think. Let’s run through a few easy examples.

  • Wilful killing of guys who are incriminatingly tall? Hell yeah!
  • Wilful killing of guys who look incriminatingly tall because they’re surrounded by children? Hell yeah!
  • Wilful killing of guys who are incriminatingly shy one leg? Hell yeah!
  • Wilful killing of journalists? Hell yeah!
  • Wilful killing of 55,000 civilian noncombatants by baked-in bias of Bayesian priors? Hell yeah!
  • Rape? Hell yeah!
  • Object rape! Hell yeah!
  • Child rape? Hell yeah!
  • Penis-slitting? Hell yeah!

OK. Now that you’ve got the idea, let’s apply it to the case at hand.

Some CIA plant in our task force hid the evidence so we wouldn’t find the spook who was running the terrorist as an agent. Again, darn it! Just like they did in OKC, WTC 1, WTC 2, Amerithrax, and the Boston Marathon bombing. What do we do? Well… we could always force a half-trillion dollar company to destroy their products, their reputation, and their market by fucking up thousands of staff years of intricate, conscientious work. But… Is it worth it?

HELL YEAH!!!

Nick P February 24, 2016 11:31 AM

@ Skeptical

I see the dispute immediately:

“Apple is not a telecommunications carrier”

The article that promoted the CALEA argument pointed out that many have a misconception that CALEA applies to just carriers. The government does that intentionally in its arguments because it makes overreach easier. 😉 The actual law says this:

“This subchapter does not authorize any law enforcement agency or office

(a) to require any specific design of equipment, facilities, services, features, or system configurations to be adopted by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services; ”

Apple definitely manufactures [via third parties], “provides,” and “supports” telecommunication services with their iPhone offering. So, the law clearly says they don’t have to change the design to enable L.E. access. No backdoors per U.S. law.

Then, we get this section on encryption:

“(3) Encryption. A telecommunications carrier shall not be responsible for decrypting, or ensuring the government’s ability to decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication.”

Now, this one talks about only a carrier but Apple carries iMessage traffic, security config, and phone updates. So, it might be applied to them. That last line is critical. It can be read a few ways. The most intuitive is that cooperation happens if the company themselves have the keys to the encryption because they’re doing the encryption themselves. End-to-end or on-device stuff might dodge that. The second point, “possess information necessary,” is another way to dodge it. Yet, I’m pretty sure one or both of these apply to this specific phone case as Apple has some way to update and debug these phones. They could provide access in that case which would be possibly legal under a CALEA claim.

Yet, if they don’t have these, then per CALEA they don’t have to provide assistance as a carrier, manufacturer, provider, or support organization. That’s a powerful legal argument that I’ve rarely seen. It also guides how to set up one’s equipment to avoid such orders. That’s already been happening all over as a response to their overreach. As in, government not giving a shit supplemented by Snowden leaks led to the largest adoption of non-backdoored crypto in a long time. My side appreciates at least that result of their scheming actions. (slow clap) 😉

“They should understand a decision such as this carries little weight as precedent. It is not at all binding on any other court, and it carries very little persuasive value with other courts.”

The lawyers are arguing otherwise. I’m out of my depth here, so I can’t say. Except that Apple’s lawyers, EFF’s lawyers, and all kinds of others who fight these cases regularly think this could affect future decisions. So, lacking expert counter-opinion, I default to believing it can.

“We might see a spike in the number of applications for orders compelling assistance by Apple – but that’s due to what appears to be a change in policy by Apple (to resist requests for assistance no matter how flimsy the legal case for doing so)”

Maybe. There’s an old maxim that when the resources available increase, activity increases to use them all. Passage of the Patriot Act, with its weak standards of suspicion, led to tens of thousands of requests, and the requests kept going up every year. So a spike might be the government taking advantage of extra legal muscle because it can, due to Apple’s resistance as you stated, or both. We can’t say from our vantage point, due to the secrecy involved on the Feds’ side.

Curious February 24, 2016 11:42 AM

Christopher Soghoian makes an interesting point: the NSA obviously doesn’t trust over-the-air firmware updates. Presumably for smartphones; I’m not sure.

Clive Robinson February 24, 2016 12:04 PM

@ jetol,

I said,

The data is protected by a 256Bit AES master key, which is not stored on the phone. It is built each time from the user passphrase and other “hidden” effectivly One Way variables when the phone is unlocked.

That is, “the key” – singular – is built from the secret information. It is the same key if the secret information has not changed.

Parallel February 24, 2016 12:16 PM

Bruce: “There’s every reason to believe, in fact, that such hacked software has been written by intelligence organizations around the world.”

I bet that CIA already has that hacking software.

The restriction is that its results cannot be used to prosecute people.

Its results cannot be used to incriminate whistleblowers, for example.

This Apple vs. FBI case might lift that restriction for good. As a result, law enforcement organizations will use less parallel construction and prosecute more people.

Dirk Praet February 24, 2016 4:05 PM

@ Nick P, @ Skeptical

I see the dispute immediately: “Apple is not a telecommunications carrier”

I refer to my reply to @End of Innocence on this topic here.

joensuu February 24, 2016 9:40 PM

Well, there is always this Bittium “Tough Mobile”, an Android-based alternative to the iPhone…

http://www.bittium.com/BittiumToughMobile

Although Bittium is kind of “interesting”… (besides that, it is not exactly cheap), it has functionality that allows the manufacturer to check remotely whether the applications are those installed out of the box, or whether they have been modified (whether the manufacturer could then, e.g., remove applications without the end user’s consent I do not know).

Also, the data in the phone is automatically cleared if the screws for the back panel are removed, though this may not stop someone from drilling directly through the back panel (while leaving the screws intact).

Clive Robinson February 24, 2016 11:58 PM

@ joensuu,

Also the data in the phone is automatically cleared if the screws for the back panel are removed.

Such physical defences, usually called “anti-tamper”, have for many years been part of an interesting “arms race”.

I came to the conclusion quite some time ago that you need both hardware and software anti-tamper systems if you want sufficient security for the likes of eBanking.

The fact that the same technology allows you some privacy, and thus liberty against oppression, is what sticks in the craw of amoral authoritarians like the FBI’s Comey, who believe in “might is right” and “crush all who you see as beneath you” attitudes. Thus they hate the people and technology that stand against their irresponsibly antisocial world view, irrespective of how harmful that view really is to society at all levels. They likewise refuse to acknowledge that with authority over others comes a responsibility not to cause others harm. Such attitudes, behaviour and lack of morals and empathy make them congenitally unfit leaders in a true democracy. If left unchecked they would burn the haystack of society to find their imagined needle, which is either not there or doing very little or no harm.

Thoth February 25, 2016 4:21 AM

@joensuu
The kind of cover removal trip switch is the weakest and most commonly deployed anti-tamper mechanism. There are many other mechanisms taken from the literature of smartcard and hardware security module security.

A list of anti-tamper hardware and software approach known to the public:
– Cover removal trip switch
– Serpentine anti-tamper mesh. The first 2 metal layers of the IC chip can be integrated with the serpentine mesh as its circuitry.
– Radiation and light-spectrum sensors to detect irradiation of the chip and attempts to use light to erase the EPROM data.
– Electrical sensor circuits to detect power-glitching attacks.
– Self-checking CPU. Infineon has a product that uses dual 16-bit CPUs that check and encrypt each other. Not sure how it works in detail, though. The dual 16-bit CPUs scored Infineon a CC EAL 6+ certification.
– Tamper-detection epoxy (doesn’t provide tamper response and reaction).
– Tamper-detection epoxy entangled with a serpentine mesh. Better if the mesh material reacts to the acid used during de-capping. (Provides tamper response if de-capping attempts are detected.) Said to be used in IBM-made PCI-based HSMs.
– Obfuscated code with deliberate inclusion of dummy functions baiting attackers to attack them.
– Whitebox crypto to make DPA and SPA hard when the attacker does not know how the whitebox crypto is arranged (security via obscurity, protected as a “trade secret”).

You would notice that most protections are concentrated on a very small surface – the critical CPU chip itself – where you find lots of anti-tamper measures.

Read up on HSM and smartcard literature and attack vectors to learn more about this stuff.

CallMeLateForSupper February 25, 2016 6:37 AM

Very interesting interview of ACLU attorney Ben Wizner by Cyrus Farivar. They talk about 1) FBI vs Apple and 2) Edward Snowden.

“I’m not surprised when I see polls that say that Apple should turn over the information to the FBI. What our community has failed to do effectively is to change the framing, so that people understand that security isn’t on one side and rights on the other in this kind of dispute. But that actually security is on both sides—different kinds of security.

“So long as the focus is on a terrorism investigation in the US, I think it’s going to be hard to get high levels of support for what Apple is doing.

“This really is the government recognizing that, over time, encryption is going to be an obstacle to certain kinds of investigations—it will. There’s no question about that. And then finding the most emotionally resonant battlefield on which to have this skirmish.”

http://arstechnica.com/tech-policy/2016/02/snowden-lawyer-bill-of-rights-was-meant-to-make-governments-job-more-difficult/

Dom February 25, 2016 9:48 AM

@ Clive Robinson

Hi Clive

I really do not understand your point:

============================

The data is protected by a 256Bit AES master key, which is not stored on the phone. It is built each time from the user passphrase and other “hidden” effectivly One Way variables when the phone is unlocked. This makes the process not just One Way, it also makes it hardware dependent, so it has to be done on that phone and that phone only because of the One Way variables.

As I said in theory the FBI could extract those variables and but it would be very high risk and it would probably still not give them the AES master key to get at the data.

As jetole righly said:

============================

As per the encryption type being used, it’s probably irrelevant. Where the FBI would, most likely, need to focus their effort would be on brute forcing the password used to create the key. They can try and brute force the encryption key itself but it would probably be more computationally expensive (take longer with more servers in the cluster) than the password used for key generation.

(actually, they could do both in parallel and see which one succeeds first).

Would you be able to explain in what respect, assuming the encrypted data can be physically copied off the device, jetole’s suggestion could not be implemented?

Regards,

Dom

jetole February 25, 2016 11:42 AM

@Clive Robinson and @Dom

Clive, OK. I misunderstood what you meant about the key, and that makes more sense. FYI, you misspelled my handle/nick.

As for what Dom said about why it might not be possible to copy the data: I am basing this entirely on theory, as I have never disassembled an iPhone, but theoretically there is no way to stop someone from accessing the raw medium. There can be plenty of physical tamper-resistant features to prevent someone from physically removing the storage medium, but no physical means can eliminate the ability entirely. The world’s best safe can still be broken into given enough time and resources, and I am betting that Apple has probably not implemented anything nearly advanced enough to keep properly trained law enforcement from removing an intact storage medium. I even feel confident enough to say that, given the right tools, the storage medium could probably be removed in an unaltered state by a common geek who has done similar tasks with similar devices.

Encryption would be the key to security in this case, I presume, and I imagine that physical tamper resistance would be a low priority if it was considered at all.

There’s this old concept that the only secure system would be one where no one has physical or virtual access to the device, i.e. buried in concrete at the bottom of the ocean. The concept implies that no machine can be secure when someone has physical access to it. The point I am making is that I imagine if I sat down with the phone and the proper tools, I would likely be able to remove the storage medium in an intact and unaltered/uncorrupted manner, and if I worked for the Feds, I could probably order a box of iPhones for trial and error.

This is all hypothesis, since I have not done anything like this with an iPhone. I may be mistaken, and I am eager to hear from anyone who knows whether this can or cannot be done in such a simple process.

Mats J February 25, 2016 5:44 PM

When I was taught cryptanalysis in Sweden, the last method mentioned was “cryptanalysis with a baton”: you get hold of the guy with the encryption keys and beat the …. out of him until he hands over the keys. Maybe not the preferred method for the honorable FBI, but some other hostile government agencies, foreign or domestic, could be tempted to use it. It is often not as costly as a high-tech attack. Who is the key holder at Apple? How do they guard against that attack? Can Apple guarantee that there are no spies inside the organization? Are all the cleaners security-vetted (TI and smartcards)? By the way, the guy who services the printer – who is he? (NSA, Soviet embassy, Washington) Ad infinitum.

Spooky February 25, 2016 7:14 PM

In other news, federal magistrates declare P = NP, warning that all violators risk being held in contempt…

“Complexity Theorists Soberly Concerned by Recent Court Ruling; Opt to Wear Ubiquitous U.S. Lapel Pins, While Continuing Forbidden Research.”

Not yet, anyway.

🙂

Dirk Praet February 25, 2016 7:57 PM

@ Mats J

When I was taught cryptanalysis in Sweden the last mentioned method was “cryptanalysis with a baton”. You get hold of the guy with the encryption keys and beat the …. out of him until he hands over the encryption keys.

Commonly known as <a href="https://xkcd.com/538/" target="_blank">$5 wrench</a> or rubber-hose cryptanalysis these days. In the SB case unfortunately not applicable because both the phone’s user and his wife were killed by death.

@ joensuu

… well there is always this Bittium “Tough Mobile”, an Android-based alternative to iPhone…

(giggle) These guys need to change their name. They’re not going to sell a single phone in the UK.

shady February 25, 2016 9:25 PM

How any competent tech org could do this without Apple’s help:

  1. Discharge phone battery
  2. Unsolder flash memory chipset from phone
  3. Read flash memory contents into a set of files, one for each chip. These are an image of the phone software and encrypted payload.
  4. Copy original images into new chips, solder onto phone PCB
  5. Turn on phone, try up to 10 codes. Exit if you unlock phone. Else phone clears.
  6. Repeat 4,5 using the original image from 3 until success achieved.
  7. For extra credit: build a flash emulator that connects to phone processor and allows automated reloading of the image from 3.

Easily tried on other phones to perfect process before attempting on target phone.

What am I missing? Is there some other persistent storage present?
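Steps 4–6 above amount to a restore-and-retry loop. Here is a toy simulation of the idea, with the flash image modeled as a dict snapshot; real NAND mirroring would need hardware tooling, and it assumes the key material lives entirely in the flash image, which later replies dispute:

```python
# Toy simulation of the restore-and-retry brute force (steps 4-6).
# The "phone" and its flash image are stand-ins, not a real interface.
import copy

def make_phone(correct_pin):
    return {"pin": correct_pin, "failed": 0, "wiped": False}

def try_pin(phone, guess):
    if phone["wiped"]:
        return False
    if guess == phone["pin"]:
        return True
    phone["failed"] += 1
    if phone["failed"] >= 10:
        phone["wiped"] = True   # auto-erase after too many guesses
    return False

def brute_force_with_mirroring(snapshot):
    pins = [f"{i:04d}" for i in range(10000)]
    for start in range(0, len(pins), 10):
        phone = copy.deepcopy(snapshot)       # step 4: restore pristine image
        for guess in pins[start:start + 10]:  # step 5: up to 10 guesses
            if try_pin(phone, guess):
                return guess
        # step 6: image wiped or exhausted, restore and continue
    return None

assert brute_force_with_mirroring(make_phone("4521")) == "4521"
```

In this toy world the auto-erase limit is defeated trivially, which is exactly why the question of where the key material actually lives matters.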

ianf February 26, 2016 2:07 AM

What am I missing?

You’re missing the point, or rather the lot.
[Authoritative Answer from the Authoritative Answerserver].

@ Dirk Praet […] “Commonly known as $5 wrench or rubber-hose cryptanalysis… unfortunately not applicable because both the phone’s user and his wife were killed by death.

Wael’s asleep, so I’m availing myself of the ready opportunity to ask in earnest what he’d simply make unsophisticated ha-ha funny point of: how so, unfortunately?; and, of course, about the scientific underpinnings for your concluding tautology regarding the SB shooters’ demise. Please elaborate or else.

@ Tõnis in The Importance of Strong Encryption to Security […] “Loved Apple’s brief … all except for the Conclusion where Apple states that it has “great respect” for the “professionals” at the DOJ and FBI. There’s no reason Apple should glorify tyrants even if it believes (erroneously) that their intentions are good.

Quite. Only Apple is an American company, a country the size of a continent (even if only half of that) where certain heady amounts of chest-thumping, and repeated pledging of allegiance to something or other, are expected and viewed as necessary attributes of Being American And Proud of It. Call that concluding declaration a servitude of sorts, a PR-mandated expression of Loyalty (always spelled with the capital “l”) to the Ideals (ditto) of the American Freedoms (as well). In practical terms, it was a bone thrown to the unenlightened electorate that’s ready to chew on Apple’s leg should it deny The Saintly Law Enforcement sought access to the (we both know) worthless piece of kit.

Wael February 26, 2016 2:28 AM

@ianf,

Wael’s asleep

Nope![1] I was placing an order right here. I am not sure if this tiny SBC was referenced here, but I believe @Figureitout would be interested. I couldn’t find a Raspberry Pi Zero in stock anywhere, so this one seemed an attractive alternative for $9.00

[1] I’m glad you write well, except when you f##k up grammar and still think you’re right. At any rate, I suspect your Ouija board got subverted by NSA. Quick! Wrap in aluminum foil (and don’t forget the ground), lest it gives you the wrong information again 😉

ianf February 26, 2016 3:44 AM

@ Wael “was placing an order for a C.H.I.P.… not sure if this tiny SBC was referenced here, but I believe @Figureitout would be interested. I couldn’t find a Raspberry Pi Zero in stock anywhere, so this one seemed an attractive alternative for $9.00

BIG SPENDER NIGHT, eh? Poker, cigars and buxom barmaids aplenty—saw this in a Hollywood movie. Yes, the C.H.I.P. was referenced here 3 months ago, together with #pizero, or rather #fuggedaboutpizero – as it has priced itself out of the market. I’d so love to play with either one of these, but, unfortunately, my long-time hacker pals have all grown up, died or moved elsewhere, and I don’t feel confident enough to hack them tutti solo.

And never you mind me grammar[*], part of my id-obfuscation effort. Besides, I got my punctuation wright, and that’s what matters.

[*] unsubstantiated ill-willed accusations of improper deployment, of (spooky Latin etc).

Clive Robinson February 26, 2016 3:59 AM

@ Shady,

How any competent tech org could do this without Apple’s help.

And it won’t work…

First of all, the data in the flash memory file system is stored encrypted –as is the file system itself– with a 256-bit master AES key from which other keys are derived. So what you get is a “random bag of bits”.

Secondly, none of the keys survives a power down etc; they are rebuilt each time you unlock with your passphrase.

But to stop the type of attack you describe, Apple put a wrinkle in the system: a hidden, write-only variable that is –supposedly– randomly written in the factory. This is mixed in with the passphrase to generate the master key.

Thus the hidden variable, which is tucked away inside an SoC-type chip with built-in flash memory, is not available outside the chip through normal processes. This ties everything to that SoC. Muck up on the passphrase and the hidden variable gets overwritten.

To get at the hidden variable without the passphrase is a risky venture: in the ordinary case it involves the use of dangerous acids to get the chip out of its packaging. You then need to work out not just where the hidden variable is stored but how; then you need to get it out of the storage in some way. Depending on who you listen to, the chance of damaging the hidden variable is between 25% and 98%, even for an organisation proficient at such things. The process is unlikely to be a core competence of the FBI.
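The entanglement of passphrase and hidden variable can be illustrated with a toy key derivation. This is a sketch of the general idea only, not Apple’s actual scheme; the `device_uid` here is a stand-in for the factory-written secret:

```python
# Toy illustration: a master key derived from BOTH the passcode and a
# per-device secret, so the passcode alone is useless without the SoC.
import hashlib
import os

def derive_master_key(passcode: bytes, device_uid: bytes,
                      iterations: int = 100_000) -> bytes:
    # PBKDF2 stands in for whatever mixing Apple actually uses; the
    # device UID acts as the salt, tying the key to the hardware.
    return hashlib.pbkdf2_hmac("sha256", passcode, device_uid, iterations)

uid = os.urandom(32)                         # stand-in for the factory UID
key_a = derive_master_key(b"1234", uid)
key_b = derive_master_key(b"1234", os.urandom(32))  # same passcode, other device
assert key_a != key_b   # key depends on the hardware, not just the passcode
```

Which is why copying the flash off the phone gets you nothing: the “random bag of bits” can only be decrypted with a key that cannot be reconstructed away from that particular chip.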

Outsourcing such a task also has “legal issues” to do with “the chain of evidence” and potential “fruit of the poisonous tree” challenges etc.

But… it also assumes that Apple has not taken other anti-tamper measures. Currently that is Apple’s “trade secret” and they are not saying, nor are they currently required to (though that might change).

One defense against the AWA is “not possible”; another is “undue burden”. The DOJ/FBI paperwork has tried to block the “not possible” argument by “suggesting” a method. In all probability they are taking a guess, to make the judge think Apple is being unreasonable, and thus the judge has pronounced. Apple now has three options: comply in some way, appeal upwards, or at a pinch reveal some of its trade secrets to disprove the FBI’s assumptions. Which means the “undue burden” defense comes into play.

But as I’ve already pointed out, “officially” nobody knows how to do what the FBI are asking. The assumption is that Apple’s burden is developer time. But what of the burden that falls on the developers themselves? They will easily become “known” through fairly simple intelligence gathering, as will the details of their families and loved ones. The knowledge of how to bypass Apple’s protection systems would be of immense value to many who won’t let little things like ethics or criminal sanction get in their way. Thus the judge/DOJ/FBI are painting targets on these people’s backs. How would you describe that level of burden? I’m sure the people concerned think it wildly unacceptable; the DOJ won’t let it get in the way of their careers; and as for the FBI, I get the feeling Comey would welcome a few of those people getting killed or tortured, as it would serve as a warning to others not to stand against “the good guys”…

HO February 26, 2016 4:48 AM

I have 2 (probably simple-minded) questions concerning the San Bernardino shooter case:

  1. Why can the original phone’s contents not simply be cloned to allow a brute
    force attack to proceed on the clones without fear of destroying the original?

  2. If the FBI hands the phone over to Apple, and (with the search warrant) asks
    them to produce its clear contents, how does this create an additional back door
    beyond Apple’s current back door of its signing key ownership? Apple isn’t obligated to disclose the means they used to obtain the clear data. It seems to me that they are targeted by the search warrant since they OWN iOS and simply assign customers an EULA. Nor do Apple and the FBI need to disclose whether Apple successfully met the FBI request, assuming that the info isn’t used as evidence.

ianf February 26, 2016 5:04 AM

Very simple-minded. Read the topic-oriented threads of the past 2–3 weeks (use the search box at the top of the page to search for FBI and Apple); the answers are there. And how come you expect to be served when you don’t even care to do some prior research?

65535 February 26, 2016 5:34 AM

Apple’s legal response in court [summary via Techdirt]:

‘We Read Apple’s 65 Page Filing Calling Bullshit On The Justice Department, So You Don’t Have To’- techdirt

It’s not too surprising to see the crux of Apple’s argument. In summary it’s:

• “The 1789 All Writs Act doesn’t apply at all to this situation for a whole long list of reasons that most of this filing will explain.

• “Even if it does, the order is an unconstitutional violation of the First Amendment (freedom of expression) and the Fifth Amendment (due process).

[and]

“This is not a case about one isolated iPhone. Rather, this case is about the Department of Justice and the FBI seeking through the courts a dangerous power that Congress and the American people have withheld: the ability to force companies like Apple to undermine the basic security and privacy interests of hundreds of millions of individuals around the globe. The government demands that Apple create a back door to defeat the encryption on the iPhone, making its users’ most confidential and personal information vulnerable to hackers, identity thieves, hostile foreign agents, and unwarranted government surveillance. The All Writs Act, first enacted in 1789 and on which the government bases its entire case, “does not give the district court a roving commission” to conscript and commandeer Apple in this manner. Plum Creek Lumber Co. v. Hutton, 608 F.2d 1283, 1289 (9th Cir. 1979). In fact, no court has ever authorized what the government now seeks, no law supports such unlimited and sweeping use of the judicial process, and the Constitution forbids it….Indeed, the government itself falls victim to hackers, cyber-criminals, and foreign agents on a regular basis, most famously when foreign hackers breached Office of Personnel Management databases and gained access to personnel records, affecting over 22 million current and former federal workers and family members. 
In the face of this daily siege, Apple is dedicated to enhancing the security of its devices, so that when customers use an iPhone, they can feel confident that their most private personal information—financial records and credit card information, health information, location data, calendars, personal and political beliefs, family photographs, information about their children—will be safe and secure… The government says: “Just this once” and “Just this phone.” But the government knows those statements are not true; indeed the government has filed multiple other applications for similar orders, some of which are pending in other courts.2 And as news of this Court’s order broke last week, state and local officials publicly declared their intent to use the proposed operating system to open hundreds of other seized devices—in cases having nothing to do with terrorism. If this order is permitted to stand, it will only be a matter of days before some other prosecutor, in some other important case, before some other judge, seeks a similar order using this case as precedent. Once the floodgates open, they cannot be closed, and the device security that Apple has worked so tirelessly to achieve will be unwound without so much as a congressional vote. As Tim Cook, Apple’s CEO, recently noted: “Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks…

“…if Apple can be forced to write code in this case to bypass security features and create new accessibility, what is to stop the government from demanding that Apple write code to turn on the microphone in aid of government surveillance, activate the video camera, surreptitiously record conversations, or turn on location services to track the phone’s user? Nothing.

“…Members of the team would include engineers from Apple’s core operating system group, a quality assurance engineer, a project manager, and either a document writer or a tool writer…. No operating system currently exists that can accomplish what the government wants, and any effort to create one will require that Apple write new code, not just disable existing code functionality…. Rather, Apple will need to design and implement untested functionality in order to allow the capability to enter passcodes into the device electronically in the manner that the government describes…. In addition, Apple would need to either develop and prepare detailed documentation for the above protocol to enable the FBI to build a brute-force tool that is able to interface with the device to input passcode attempts, or design, develop and prepare documentation for such a tool itself…. Further, if the tool is utilized remotely (rather than at a secure Apple facility), Apple will also have to develop procedures to encrypt, validate, and input into the device communications from the FBI…. This entire development process would need to be logged and recorded in case Apple’s methodology is ever questioned, for example in court by a defense lawyer for anyone charged in relation to the crime…

..”.the government’s flawed suggestion to delete the program and erase every trace of the activity would not lessen the burden, it would actually increase it since there are hundreds of demands to create and utilize the software waiting in the wings….. If Apple creates new software to open a back door, other federal and state prosecutors—and other governments and agencies—will repeatedly seek orders compelling Apple to use the software to open the back door for tens of thousands of iPhones. Indeed, Manhattan District Attorney Cyrus Vance, Jr., has made clear that the federal and state governments want access to every phone in a criminal investigation…. [Charlie Rose, Television Interview of Cyrus Vance (Feb. 18, 2016)] (Vance stating “absolutely” that he “want[s] access to all those phones that [he thinks] are crucial in a criminal proceeding”). This enormously intrusive burden—building everything up and tearing it down for each demand by law enforcement—lacks any support in the cases relied on by the government, nor do such cases exist.

“Under well-settled law, computer code is treated as speech within the meaning of the First Amendment…. The Supreme Court has made clear that where, as here, the government seeks to compel speech, such action triggers First Amendment protections…

“In addition to violating the First Amendment, the government’s requested order, by conscripting a private party with an extraordinarily attenuated connection to the crime to do the government’s bidding in a way that is statutorily unauthorized, highly burdensome, and contrary to the party’s core principles, violates Apple’s substantive due process right to be free from “‘arbitrary deprivation of [its] liberty by government.’”- Apple

See:
https://assets.documentcloud.org/documents/2722434/Motion-to-Vacate-Brief-and-Supporting-Declarations.pdf

Summation by Techdirt:
https://www.techdirt.com/articles/20160225/15240333713/we-read-apples-65-page-filing-calling-bullshit-justice-department-so-you-dont-have-to.shtml

Thoth February 26, 2016 7:03 AM

@Wael, Markus Ottela
I was thinking of purchasing the $9 CHIP and the accompanying PocketChip “calculator”. I wonder if you can get a modified seL4 microkernel working on it (it is supposed to support ARM chips, but I need to double-check) and use it as an off-line encryptor of sorts, or maybe as the TX/RX modules in the TFC setup.

If the TFC’s endpoint laptops and RPis can be replaced with TX/RX PocketChips that do both the display and the instant-message engine, it would be more compact.

I can imagine the spare GPIO pins hooking up to a smartcard reader per TX/RX device, with the PCSCD smartcard driver installed to interface with a smartcard for secure key storage.

Thoth February 26, 2016 7:12 AM

@HO
You can clone the encrypted Flash data, but you CANNOT clone the hardware-protected security parameters stored inside the ARM chip. Furthermore, the user’s password is entangled with security parameters stored within the ARM chip (NOT the Flash chip), and trying to physically decap the ARM chip, or any chip for that matter, is highly risky and may result in the chip (and the evidence) being destroyed in the process.

If you like, you can try to guess the 256-bit AES Data Encryption Key used to encrypt the Flash data. But without the security parameters or the user’s password/PIN to re-form the AES key, brute force is pointless: you would effectively be brute-forcing the AES-256 key itself, which is computationally infeasible on any human timescale.
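For a sense of scale, a back-of-the-envelope comparison; the guess rate is a deliberately generous hypothetical:

```python
# Why guessing the AES-256 key directly is hopeless, versus guessing a PIN.
guesses_per_second = 10**12            # a trillion keys/sec (very generous)
seconds_per_year = 60 * 60 * 24 * 365

pin_space = 10**4                      # every 4-digit PIN
aes_expected_tries = 2**255            # half of the 2^256 keyspace on average

pin_seconds = pin_space / guesses_per_second
aes_years = aes_expected_tries / guesses_per_second / seconds_per_year

# pin_seconds is a tiny fraction of a second;
# aes_years works out to roughly 1.8e57 years.
print(f"PIN: {pin_seconds:.0e} s, AES-256: {aes_years:.1e} years")
```

Hence the whole fight is over the passcode path, not the cipher.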

The second question is about precedent. If Apple ever creates the backdoor, even for a single phone and only for that phone, the problem is that China, Russia, Germany, India, Pakistan, Iraq, Iran, etc. can use the FBI case to demand more backdoors. We have to consider that Apple phones are carried by diplomats and sensitive government personnel as well; what if these countries request an unconditional backdoor to tap people in such sensitive positions? Security is usually a boolean: either secure or not. The complexity of making “secure backdoors”, when we can’t even get security proper, is already problematic in and of itself.

Skeptical February 26, 2016 7:25 AM

@Nick P: re CALEA & Apple

Before I quote your comments and respond directly, I’d like to lay out some of the context here.

The All Writs Act is intended to be, and is written to be, a catch-all to enable courts to ensure that their orders have effect where Congress has not otherwise provided a specific means for a particular type of order to have effect.

In the most important case concerning the AWA, the FBI had obtained an order allowing them to install a pen register on telecommunications from a certain room for the purpose of furthering an investigation into illegal gambling activities. However, the FBI determined that they were unable to run lines themselves from the equipment in the room without being detected. The phone company could “lease” a line to the FBI, which would involve the company showing the FBI which lines to connect to in a location out of sight of the room.

The company, however, believed that this would violate legal prohibitions on wiretapping – that while the FBI might by itself legally install a pen register, the company would be acting illegally if it aided the FBI in the manner requested.

So the court issued an order under the AWA compelling the phone company to comply. The company appealed, and eventually the case rose to the US Supreme Court.

There, the Court rejected the company’s argument that Congress addressed the particular issue at hand via the Wiretap Act, and so the court could not act here under the AWA. Instead, the Court noted that while the Wiretap Act was intended to facilitate the interception of the content of communications, here the issue concerned a pen register to capture certain metadata.

Okay – so let’s turn to CALEA. The purpose of CALEA is to ensure that telecommunications providers are able to facilitate real-time intercepts of communications upon lawful requests by government agencies. CALEA has nothing to do with enabling the government to acquire data stored on devices, even if that device might also be used to conduct telecommunication.

The subsection you quote states:

“This subchapter does not authorize any law enforcement agency or office

(a) to require any specific design of equipment, facilities, services, features, or system configurations to be adopted by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services; “

You interpret this to mean:

Apple definitely manufactures [via third parties], “provides,” and “supports” telecommunication services with their iPhone offering. So, the law clearly says they don’t have to change the design to enable L.E. access. No backdoors per U.S. law.

The section states that CALEA itself does not provide the power for law enforcement to require any specific equipment design to be adopted by any of the listed types of entities. CALEA instead requires that equipment with certain capabilities be adopted, but here it also clarifies for us that the law does not provide the power for law enforcement to select the particular design.

In other words, it tells us what the law does not do. It says, “look, don’t interpret CALEA to mean that law enforcement can tell AT&T that they need to buy this particular switch and not that one – this law doesn’t give that power. This law simply requires that AT&T buys switches that have particular capabilities.”

If the court had issued an order requiring Apple to adopt a particular design to enable real-time intercepts of telecommunications and cited CALEA as authority, then the quoted section could pose a real problem.

But CALEA isn’t being used as an authority.

And the issue here is not the real-time interception of communications, but rather an effective search of encrypted data at rest, which is something NOT addressed by CALEA (even less so than the Wiretap Act addressed the collection of metadata). The court issued a writ under the AWA in order to render effective a warrant authorizing the government to conduct a search of that device – something not addressed by CALEA, and not something prohibited by CALEA (remember: a law X that states, ‘law X does not authorize p’ is not the same as law X stating ‘law X forbids p.’).

So a section of CALEA – an inapplicable law and not used by the court as authority – that states merely what CALEA does not do – is completely irrelevant.

Regarding precedent:

The lawyers are arguing otherwise. I’m out of my depth here so I can’t say. Except, Apple’s lawyers, EFF’s lawyers, and all kinds of others that fight these cases regularly think this could affect future decisions. So, I default on believing it can lacking expert counteropinion.

As legal precedent, the decision of this particular court has no authority – other than as merely persuasive example – with respect to other courts.

That’s simply a fact. Any lawyer stating otherwise is either lying, impaired, relying on an ambiguity of the meaning of the word precedent, or desperately needs a refresher on how the US federal court system works.

Dirk Praet February 26, 2016 7:46 AM

@ ianf

how so, unfortunately?; and, of course, about the scientific underpinnings for your concluding tautology regarding the SB shooters’ demise. Please elaborate or else.

Err, “unfortunately” as in Edmund Blackadder‘s highly sarcastic replies to his servant Baldrick coming up with one of his cunning plans. “Killed by death” in tribute to Motörhead’s late Lemmy Kilmister, whose recent demise has sparked an unprecedented number of metaphysical and eschatological debates at the local pub. It never even remotely occurred to us that he was in fact mortal, assuming that the expected outcome of any encounter with the Grim Reaper would be Lemmy drinking him under the table and enlisting him as a junior guitar roadie.

Skeptical February 26, 2016 7:49 AM

Apple’s response is a travesty of legal reasoning, and more closely approaches a press release than a brief.

If it is not an unreasonable burden for Apple to turn off the guess-counter-and-conditional-delay function here, then obviously it will not be an unreasonable burden in relevantly similar circumstances.

That it relies on an absurd mischaracterization of the government’s arguments is another indicator of how little Apple has to stand on here. I do not see where the government ever said it would never ask for identical assistance in enabling the decryption of other devices for which it has a search warrant. What the government did say is that Apple need not simply hand over the capability to enable that decryption to the government.

Nor will the judge’s decision here determine whether such assistance will be sought in other courts – it certainly will be. It would be irresponsible for it not to be.

Apple, the vulnerability in your product already exists. It is there. If your security is sufficient to prevent others from exploiting that vulnerability without your assistance, then your security is also sufficient to protect the substance of that assistance. And if you cannot protect the substance of the assistance, then you cannot prevent others from exploiting the vulnerability anyway.

In either case, Apple’s refusal to aid in the decryption of a dead terrorist’s phone provides no additional security while hindering a terrorism investigation.

That is the bottom line.

65535 February 26, 2016 8:10 AM

Emptywheel notes a 50-day delay in seeking Farook’s iCloud data. The FBI must not care much about the San Bernardino shooting case – except to set a legal precedent for mass surveillance. It’s a sensational case. Why let a crisis go to waste? /s

‘FBI Waited 50 Days before Asking for Syed Rezwan Farook’s iCloud Data’

“In other words, Apple was fully engaged in this case, and yet FBI still didn’t ask their advice before taking action that eliminated the easiest solution to get this information. And then they waited, and waited, and waited.” –Emptywheel

[and EFF statement of the Apple Warrant]

“One repeated question has been: will other countries, like China, demand the same powers? You don’t need to look to Beijing—or even the future—to find the answer to that question. The newly proposed British spying law, the Investigatory Powers Bill (IPB), already includes methods that would permit the British government to order companies like Apple to re-engineer their own technology, just as the FBI is demanding. Worse, if the law passes, each of these methods would be accompanied by a gag order.”- EFF

https://www.eff.org/deeplinks/2016/02/investigatory-powers-bill-and-apple

Gerard van Vooren February 26, 2016 8:14 AM

@ Skeptical,

I think the nickname “Devil’s advocate” makes more sense since most of your posts are not skeptical but highly opinionated.

ianf February 26, 2016 8:30 AM

@ 65535 Re: ‘We Read Apple’s 65 Page Filing Calling Bullshit On The Justice Department, So You Don’t Have To’- techdirt

This is very much administrivial… you perform a valuable service by alerting us to, and quoting swatches from, a TechDirt summary of a long Apple court filing – so we won’t have to. Perfect. Unfortunately, lacking the glue of a comment or any selective embellishment, your chosen—pretty long—quotes lose their context and become only so-so informative. Easy to get lost, within and out of. Hence your post never rises above a hard-to-swallow appetizer for that filing, when it ought to have been filling on its own.

    In short and for the future: to get there, narrow down and/or emphasize quotes to just their skinny, and don’t bogart those gluey-comments of yours (if you understand what I mean and of course you do).

Dirk Praet February 26, 2016 9:12 AM

@ Skeptical

In either case, Apple’s refusal to aid in the decryption of a dead terrorist’s phone provides no additional security while hindering a terrorism investigation.

I beg to differ and once again refer to @thegrugq’s spot-on analysis of the SB case in that there is most probably little to no valuable information to be retrieved from that phone that the FBI doesn’t already know. This entire case is about one thing only: seizing the opportunity of a terrorist event to set a precedent for LE to force the hand of Apple and other tech companies in systematically assisting them in breaking encryption and subverting their technologies to serve government purposes. That’s the bottom line.

In your last comment, which – with respect – is vaguely reminiscent of Colonel Nathan R. Jessep’s closing statement in “A few good men”, you’re not even hiding it any more.

Wael February 26, 2016 9:33 AM

@Thoth,

I was thinking of purchasing the $9 CHIP

I have no specific plans in mind what I’ll do with it. I have to wait till June before it arrives.

Nick P February 26, 2016 11:07 AM

@ shady

The system they use allegedly has tamper-resistance features. There are new ways to hit hardware all the time, but they probably thought of the basic ones. I’ve recommended trying attacks on a bunch of other iPhones of the same model to figure out what works. Then, try the one that works on the target. They can reuse that attack on other phones.

ianf February 26, 2016 12:49 PM

@ Dirk Praet […] “a precedent for LE to force the hand of Apple and other companies […] to subvert their technologies to serve government purposes. That’s the bottom line.

Correct. The truly inexplicable dimension of it is that LE has chosen such a weak, postal rage event, called it “terror,” and then somehow managed to sell to the public that these 18 missing iPhone-traffic minutes HOLD THE KEY TO THE TERRORIST THREAT AGAINST AMERICA. Perhaps the dearth of Jihadi terrorists in the USA forced their hand?

@ Wael “has no specific plans in mind what he’ll do with C.H.I.P.

    Be honest, admit it… when it finally arrives, you’ll be sleeping with/on it.

Wael February 26, 2016 1:24 PM

@ianf,

Be honest, admit it… when it finally arrives, you’ll be sleeping with/on it.

Oh no! I only do that with TPMs. They’re more “trustworthy”. But seriously, I’ll probably do some crypto projects with it. Something perhaps like this. There are some wrinkles I’ll have to think about, though: how to interface it to an iPhone! Using the headphone jack is a possibility, but then I’ll have to mess with an Arduino miniature controller and deal with things like FSK or Manchester encoding, which luckily are available in open-source libraries for both iPhone and Raspberry Pi. I haven’t made up my mind yet. A computer this small, with that much memory, that runs mainline Linux. Perhaps I’d want to port FreeBSD to it… I don’t know what I’ll do…

Nick P February 26, 2016 1:52 PM

@ Wael

Play with MINIX 3 or GenodeOS. Little projects doing better at robustness need love, too. That people are already using them for desktops suggests they’re pretty capable as well. That you do lots of embedded stuff means you might find GenodeOS interesting as its components are diverse and pluggable with a strict, hierarchical scheme for resource management. Might be worthwhile to port MINIX’s healing abilities for drivers or whatever to it.

Note: MINIX 3 team ported NetBSD userland to it for pragmatism. I think GenodeOS does a Linux VM. GenodeOS definitely supports isolating security-critical components on microkernel. MINIX 3 may support something like that.

Spooky February 26, 2016 1:57 PM

@Clive (and others)

Thank you very much for the lengthy hardware explanation of the iPhone, I think I (finally) understand.

So, to reiterate:

User Input Key –> BB1, BB2, BB3, etc –> AES-256

Where the BBs are effectively black box mixing functions that accept a fixed input, hash it with a unique internal value (set when the BB is created at the factory) and then output the result. Because the hardware for the BBs is on the die of the SOC (CPU + other chips), you must perform risky surgery on the SOC itself to get at those internally stored values, since they are never once loaded into memory or flash (as that would destroy their security). The output from one or more BBs is used to create a 256-bit AES key from the user’s key, and that AES key is then used to encrypt the flash. AES-256 cannot be brute forced, period. And because the BBs only exist in hardware, there is nothing to reverse engineer from a software standpoint (in other words, you probably have access to a special opcode, register or port to manipulate the BB, but that is all). The best you could hope for would be hacked firmware or iOS code to speed up the process of guessing the passcode (and eliminate the hard limit on failed attempts) but even then, you’d still have to have the hardware from the original phone in the loop due to the BBs. Damn Apple, well-played…
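The key-entanglement idea described above can be sketched roughly as follows. This is not Apple’s actual construction (their mixing happens in hardware, and the real algorithm and parameters are not public); PBKDF2, the iteration count, and the UID value here are illustrative stand-ins to show why a guess must run on the original device:

```python
import hashlib

# Hypothetical per-device secret fused into the SoC at the factory.
# In the sketch it's just a constant; on a real device it never leaves the silicon.
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_key(passcode: str, iterations: int = 100_000) -> bytes:
    """Entangle the user's passcode with the device-unique secret.

    Because DEVICE_UID acts as the salt / hidden mixing value, an attacker
    who copies the flash but lacks the SoC cannot brute-force off-device.
    """
    return hashlib.pbkdf2_hmac("sha256",
                               passcode.encode(),
                               DEVICE_UID,
                               iterations,
                               dklen=32)  # 256-bit AES key
```

The security claim in the comment falls out directly: the derived key is deterministic per device, so guessing must be done through the hardware holding the UID, at whatever rate the hardware enforces.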

Skeptical February 26, 2016 3:21 PM

@All:

What goes unquoted in Apple’s brief in the above is the actual estimate they give of the time and labor needed.

Despite the grandiose phrasing of “they want us to build a new operating system,” which is well calculated, and the lengthy summary of all the steps involved (quality assurance, documentation, testing), the bottom line estimate is:

Around 6 people, who by putting a “substantial” portion of their time towards this task, should finish in about 2 weeks though “possibly” not for 4 weeks.

That’s an all inclusive estimate – coding, testing, documentation, etc. 2 weeks, maybe as long as 4.

Unreasonable burden? No, sorry. The government will compensate them for the costs of doing so.

That’s the case. The rest – the posturing about CALEA, the 1st Amendment, substantive due process – is just haze.

The policy argument they raise is far too vague and expansive to be a decisive factor here. The court is making a decision about the applicability of the AWA to this particular case; it is not deciding whether the AWA is a wise instrument for the courts to have in light of current technology. Apple’s rhetoric about the policy implications is a bomb; one defuses it by making it irrelevant to this particular decision.

Oh, and one more thing.

If it takes Apple 2 weeks and 6 people to build this, dotting every i and crossing every t for evidentiary purposes, that does not exactly suggest a vast moat of security around this tool.

In fact Apple’s estimate completely destroys its argument that by not building this tool themselves they are somehow providing customers with more security from hostile nation-states or well-funded criminal enterprises (or some mixture of the two).

@Dirk: There probably isn’t much evidence or useful information, but one really doesn’t know. No investigation of an act of terrorism is going to just shrug off an encrypted iPhone with “eh, probably nothing in there.” That’s not the way it works. I cannot imagine anyone tasked with such an investigation doing so.

The fact is that they’re bound by institutional pressure and ethical obligations to investigate the crime in full. That means that they can’t walk away from the iPhone. There isn’t any grand strategy here. Just institutions, individuals, and incentives.

Let’s take this all a step further. Suppose the court declines to compel Apple to build the tool under the generous conditions specified. You’re the FBI, and the number of iPhones that are encrypted and implicated in major crimes across the US is continuing to pile up, and you know roughly how they can be made susceptible to being unlocked under certain controlled conditions, but Apple won’t task 6 people for a couple weeks to build the tool. You’re also responsible for handling a major terrorism case in which one of the mobile devices used by a terrorist remains encrypted. What do you do next? Shrug and move on? You were planning early retirement, were you?

No, you go to your own forensics people and ask them to do what is needed here. They tell you that they’ll need x people with certain qualifications and experience. Also, they’ll need certain equipment. Oh, by the way, they might need certain information from Apple too.

And now if you’re the FBI, and you’ve listened, you say okay: you talk to the people you need to get more funding, and you hire the people you need to hire, and you buy the equipment you need to buy, and then you have the court compel Apple to hand over the information you need.

When Apple constructs a system even more secure, you continue on as well.

Eventually the end result is a FBI brimming with folks expert in Apple’s systems, with a variety of tools for breaking them, and with all the information they want from Apple to figure out the next puzzle.

So, at that predictable steady state, do you think it likely that we’ve had a net gain or a net loss in security relative to our current state?

If you think it’s a net loss, then what’s the smart move for Apple? Build the damn tool and avoid the path I just sketched. They didn’t do that.

Now as to precedent, the judge’s decision in this case does not control outside of that particular court. It is no more binding than a decision to the contrary by a federal magistrate in a court on the other side of the US.

ianf February 26, 2016 3:46 PM

@ Wael – if you’re going to interface the C.H.I.P. to the iPhone, have a crack at the Lightning I/O; that should be enough of a protocol reverse-engineering challenge… maybe you can get it to be a general bridge between iOS and 3rd-party hardware (there already are wired iPad Lightning keyboards…). Sell it by the container load, become globally known as Wael The Appletamer on WoW (or something).

    BTW. For a while there you had me spooked… then I realized you talked of sleeping with TPMs, not the TPS reports. Soooo relieved.

Wael February 26, 2016 5:08 PM

@Nick P,

Play with MINIX 3 or GenodeOS…

Time is the problem, but I’ll give’m a try when the boards arrive.

@ianf,

have a crack at the Lightning I/O

That was the first thought that came to my mind, especially if the rumors about Apple removing the headphone jack in iPhone 7 to make it slimmer are true.

BTW. For a while there you had me spooked

I’ll try harder next time 🙂

Clive Robinson February 26, 2016 5:38 PM

@ Ben West,

Has anyone considered the burden Apple might take on by rotating their signing keys in order to constrain this effort to “that iPhone?”

Actually I have considered what Apple could do to “kill it dead via update”. Consider that the update process only works because there is software loaded in the various parts of the phone to do the update.

Ask yourself what happens if Apple sends out an update that removes the update process from that part of the phone?..

Yup, “game over”, because once done it cannot be undone. This means a “factory install” by direct connection to the chip where the passphrase is parsed.

If Apple changed the way the code was stored such that it was encrypted by the “hidden variable” and locked to a specific memory block, then it becomes possible to enforce certain things, such as stopping execution in RAM and changing that hidden variable if a “factory install” is performed. This would be considerably easier with the “Enclave Chip”, with minor hardware changes.
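A toy illustration of why rotating the hidden variable on a factory install kills the old image: if the stored code is encrypted under a key derived from that hidden value, replacing the value makes the old ciphertext unrecoverable. The keystream construction below is a deliberately simple counter-mode hash, purely to demonstrate the rotation property; it is not a real cipher and not anything Apple ships:

```python
import hashlib
import itertools
import secrets

def keystream(key: bytes):
    """Toy keystream: counter-mode SHA-256. Illustration only, not a real cipher."""
    for ctr in itertools.count():
        yield from hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()

def xcrypt(key: bytes, blob: bytes) -> bytes:
    """XOR-encrypt/decrypt (same operation both ways)."""
    return bytes(b ^ k for b, k in zip(blob, keystream(key)))

# Firmware is stored encrypted under the current hidden value.
hidden = secrets.token_bytes(32)
stored = xcrypt(hidden, b"bootloader code")

# A "factory install" rotates the hidden value; the old image no longer decrypts,
# so whatever update path existed before is gone for good.
hidden = secrets.token_bytes(32)
assert xcrypt(hidden, stored) != b"bootloader code"
```

The point of the sketch is only the last two lines: once the key rotates, the previous state cannot be brought back, which is what makes the “remove the update path” move irreversible.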

Arguably it’s something Apple should have done sometime in the past to head off this “might is right” stupidity of Comey and Co.

It’s also something I think all phone manufacturers should do as soon as possible.

Because it would force nation states to make “clear legislation”, everybody would know exactly what security of their privacy, and thus degree of liberty, they had before they purchased a phone.

The interesting game then starts, in that, as with “off shore tax havens”, it becomes a way for small island states and principalities to raise income and gain a financial advantage for their economies. The US, UK, etc. may say “OEMs must provide a backdoor…”, but that only applies in their jurisdictions; if another nation or principality decides its law says that phones must have such security of privacy, then the OEM will have to make phones that do have those features. As the old “as sure as eggs are eggs” saw has it, such secure phones will make their way around the globe and thus into the US, UK, etc. one way or another. The only way to stop it would be to add the equivalent of “region codes” into the phone network protocols, and the US has already shot itself in the foot on this score due to the “free” trade treaties it’s been shoving down other people’s throats. (It’s one of the reasons I’ve been highlighting these Obama-inspired trade treaties: they effectively hand a big chunk of sovereignty, thus national security, over to corporates via binding international tribunals.)

Wael February 26, 2016 6:33 PM

@Clive Robinson, @ Ben West, @shady,

The proper steps are to shield the phone from communications — air-gap it, if you will. The next step is to clone the phone and continue with cryptanalysis of the data. It may take a SEM or a FIB attack to extract the keys out of the original phone. The idea is to preserve the state of the original phone by preventing it from receiving commands and not messing with its data unless absolutely necessary. If the iPhone uses PUFs (Physically Unclonable Functions) and read-proof hardware, the task could be expensive in time and money.

So the phone, if handled properly by the “attacker”, would not receive any “updates” to rotate keys or erase data. This technique may be easier.

Figureitout February 26, 2016 8:49 PM

Wael
–Looks good but my plate’s full for now and well into the future lol. SiLabs board (got the new IDE and flashed to board etc., but all the code is in “menu.c” lol, might as well call it main.c; it’s terrible and needs a lot of work), get beaglebone configured and running (after getting a microHDMI adapter lol..), get RasPi going as VPN mostly (Kali image on RasPi), FRDM board I don’t know what to do w/ yet, and some more chips on top of other PCs I’m testing secure configurations on. Looking into a monitored USB drive next.

Steady diet of circuits and code. Few ideas for more homebrew projects (the best kind lol) but require diligence on the user’s part (if you rely on someone else too much for security, it’s compromised). There was a nice HF rig in QST lately using Arduino nano (I love the layout of that board), basically full project logs. Whatever you do, post it.

On to the topic of the thread: I haven’t studied iPhone security at all and am shocked that no side channels have been used to bypass it… so the main protection seems to be the combination of a user PIN that never gets logged anywhere (keyloggers are somehow defended against here), combined w/ a hidden key on a SoC (insane designs, all peripherals on one chip…), then encrypting memory. I recall “one-wire comms” by Atmel and some of their “authentication” chips that don’t permit dumping whatever’s been written to the chip; I was very curious how that worked and whether it’s utter bull. And commercial anti-tampering measures seem to be doing pretty well. Anyone can try to take apart, say, one of the RSA-ID dongles themselves; it’s non-trivial, and you’ll burn through a few if you have no knowledge of what’s inside. Apple doesn’t use screws to make it easy for the hobbyist to get inside.

Figureitout February 26, 2016 9:20 PM

Forgot the key derivation function (KDF) on the pin/password, so you have to run your brute-forced pins thru that and combine w/ another brute-force attack…sounds like you need a side channel and this should be good protection commercially.
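The KDF point can be made concrete: each PIN guess has to pay the KDF’s full iteration cost, so the iteration count is the only brake on a 10,000-guess sweep. In this sketch PBKDF2 stands in for whatever KDF the phone actually uses; the salt and iteration counts are illustrative:

```python
import hashlib

def slow_check(pin: str, salt: bytes, target: bytes, iterations: int) -> bool:
    """Each guess costs `iterations` underlying HMAC-SHA256 computations."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, iterations) == target

def brute_force_4digit(salt: bytes, target: bytes, iterations: int):
    """Walk the entire 4-digit PIN space; total work = 10,000 x iterations."""
    for n in range(10_000):
        pin = f"{n:04d}"
        if slow_check(pin, salt, target, iterations):
            return pin
    return None
```

At, say, 100,000 iterations per guess, the whole space costs a billion hash computations even before any hardware-enforced delay or the device-bound key mixing comes into play.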

Buck February 28, 2016 12:55 AM

@Skeptical

Eventually the end result is a FBI brimming with folks expert in Apple’s systems, with a variety of tools for breaking them, and with all the information they want from Apple to figure out the next puzzle.

So, at that predictable steady state, do you think it likely that we’ve had a net gain or a net loss in security relative to our current state?

Net gain or net loss for ‘us’ as a whole? It’s hard to say…

However, in the scenario that you proffer here, I could actually imagine both Apple and the FBI mutually benefiting from this debate! Apple, because they gain a worthy opponent who would legally have to make their successful attacks public at some time or another… FBI, because these newfound forensic skills could come in handy for a large number of other cold-cases…

Yes, perhaps the only ‘net loss’ is inflicted upon everyone else — those who afford too much misplaced trust (thanks to a widespread misunderstanding about these technological matters) towards the very institutions that will turn on them the moment it becomes economically rational. For certain values of economics, of course…

Wael February 28, 2016 1:56 AM

Why would anyone with such capabilities and reach want to decrypt an iPhone anyway? Data-at-rest on an iPhone was data-in-transit at some point. SMS texts, web browsing history, downloaded media, phone calls, voice mail, etc. were at one point in transit and can be readily obtained from ISPs, MNOs, etc… “The truth is coming, and it cannot be stopped. IC Off The Record”. Perhaps there are a few cases where data is harder to obtain, for example files that were transferred over from a computer, emails encrypted using GPG or such, or photos that were taken… It’s not made clear whether this particular phone contains the sort of data that requires breaking the encryption on the phone and isn’t obtainable “offline”…

Clive Robinson February 28, 2016 3:15 AM

@ Wael,

So the phone, if handled properly by the “attacker”, would not receive any “updates” to rotate keys or erase data.

I guess we are talking at cross purposes.

My comment was about what Apple could do to close the update issue on current iPhones “not yet” in Comey and cronies’ grasping maw. Not those that Comey and cronies currently have under their control in their forensics labs and evidence lockers, drooling over them like aging perverts.

Clive Robinson February 28, 2016 6:57 AM

@ Wael,

Data-at-rest on an iPhone was data-in-transit at some point.

Mostly so, but without wishing to give scum like Comey and Co excuses, there may be typed-in text, photos or audio recordings, made in a way that does not get put out over the “air interface”.

However, at the end of the day we are talking about 18 minutes after the event; it’s therefore very doubtful there is actually actionable evidence of any kind. Or for that matter anything at all, let alone anything of even minor historical interest.

Thus for Comey to “grandstand” on this is very, very wrong, for a whole variety of reasons. However, one in particular –which the press should pick up on, as they are likewise complicit– is the further psychological hurt and harm Comey is bringing to the victims of the shooting, their families, loved ones and friends.

It is absolutely unforgivable, but more importantly it should tell people a great deal about the FBI’s method of working. That is, the FBI will shove any innocent under the bus for even the tiniest piece of publicity. It makes them worse than the worst of lobbyists and conmen. And for those that work for, or are thinking of working for, the FBI, voluntarily or not, in any capacity, it should serve as a very, very large red flag as to what may well be in their not too distant future.

After all, does it matter if you are dying under the bus, one of the ones busily throwing people under it, the driver, or the conductor madly ringing the bell and screaming maniacally “onward, onward”? No: you are dead or as good as; you’ve lost that essence that makes people vital. I think it would be safe to say that within the FBI, Hoover is not a ghost but a living essence of venal behaviour that knows no bounds and will suffer no reason or constraint; with the self-belief that “might is right” it will “crush all that oppose”, with the soulless automaton’s monosyllabic chant of “for the greater good”. Words that should strike real fear into any human heart and mind, because as 20th-century history shows, it will not end well.

Curious February 28, 2016 11:18 AM

According to an EFF article:

“Apple is holding its annual shareholders meeting this morning at its headquarters in Cupertino, California.”

Sancho_P February 28, 2016 5:38 PM

@Nick P
Good arguments regarding Apple and CALEA but in this rare case I’m with @Skeptical.
My “sniff test” at CALEA produces two relevant ingredients:

First, it’s all about communication, not about information at rest.
Second, the gov can’t dictate how a business must do its business
(this is a very wise and basic statement).

While the first may render CALEA useless in this case, the second is important for the future and will, for the benefit of security, fire back at the LE forever:

Lock the device until no one but the owner has access.

If anyone developing devices hasn’t realized that before, all will now.
Thanks, Mr. Comey.

Sancho_P February 28, 2016 5:43 PM

Regarding Mr. Comey’s opinion at Lawfare.blog:

Most of us would agree with these ideas, yet they are not focused on the point.

Therefore the whole article is meandering (not as long as @Skeptical) in misleading terms, obfuscating the obvious:

The phone’s memory does not contain valuable clues.
Whatever it contains is the very private content of an ill minded person.
You probably could analyze the brain of a fool, but what to do with it?
The content of an idiot’s brain?

Say I think about a plot to murder the POTUS. Or my kids. Or …
Until I tell someone about it, it is MY plot, NOT Comey’s business.
I could write it down in any type of memory, encrypted or not, it’s mine.
Not Comey’s business.
When I’m going to share it – to anyone – this might be a crime [1].
—-> This might be Comey’s business.

I don’t like it, but this is what they already have: The metadata.
All contacts, time and location data.
Probably more (see CALEA).
Yes, that’s your business, Mr. Comey.
Follow the clues you have, and best before something happens. Think about it.

In fact, what I smell from Comey’s words is thirst for revenge, not justice.
It reminds me of the days after 9/11.
Not one single word about cause and prevention of such horrible crime.

[1]
I may share it @Bruce’s fiction stories. A crime?

Sancho_P February 28, 2016 5:45 PM

@Skeptical, re Apple’s vulnerability and the protection of it.

Thanks for being short in this part.
Right, the vulnerability is there already, but exploiting it needs deep knowledge of secrets only Apple has. Chances are high that it would take several developers and Apple’s equipment / keys / documents / servers to produce that piece of software.
The burden is the knowledge.

This burden is too high for simple criminals.

But if the software was produced and tested …
Who knows where the spies are?

Skeptical February 29, 2016 12:46 AM

@Clive: Thus for Comey to “grandstand” on this is very, very wrong, for a whole variety of reasons. However, one in particular –which the press should pick up on, as they are likewise complicit– is the further psychological hurt and harm Comey is bringing to the victims of the shooting, their families, loved ones and friends.

It is absolutely unforgivable, but more importantly it should tell people a great deal about the FBI method of working. That is the FBI will shove any innocent under the bus for even the tiniest piece of publicity.

Clive… in no investigation will a federal agent decide to simply ignore the encrypted phone of a terrorist because “it’s unlikely” that there’s anything of value on it. It is the purest form of nonsense to pretend otherwise.

Apple’s estimate is that it will take them around 2 weeks to produce what the FBI wants, including all the documentation and logging necessary for the integrity of the phone as evidence to be preserved.

It’s quite clear that the Department of Justice knew with a significant amount of insight what they were asking Apple to do, and also knew that it would not take Apple long to accomplish the task.

If you think the FBI – in a terrorism investigation – is going to hesitate to ask a company to have a few employees spend a couple weeks of work in order to read a terrorist’s mobile phone, you’re kidding yourself.

As for calling it “absolutely unforgivable” and “the FBI will shove any innocent under the bus for…publicity” – presumably you wrote those words hastily, because they’re utterly ridiculous.

Despite Sancho’s endorsement of what essentially amounts to an Apple claim of NOBUS, I don’t think Apple’s refusal to put together a tool that requires two weeks to make, and that multiple parties could envision with some precision, is a significant source of security – so I don’t think we lost any appreciable degree of security here.

What will be lost is the ability of anyone to count on their iPhone – or their victim’s iPhone – thwarting a lawful search.

We have due process of law to protect rights in this country while still allowing the government to enforce the law. “Binding humanity down with chains of encryption” is not something the framers would ever desire, nor quite frankly something that anyone who values the role of government would desire. We do not live in a libertarian fantasy-land where the government is the primary enemy. Sorry. That’s the kind of worldview that motivates some to accumulate assault rifles and block common-sense gun control legislation.

Sancho_P February 29, 2016 3:07 AM

@Skeptical,

I understand that you don’t think about that significant source of security, but probably you should.
Secrecy is an important part of security when it comes to the keys.
In fact, it is the key.

Gerard van Vooren February 29, 2016 3:34 AM

@ Skeptical / Devil’s advocate,

How come you are so well informed yet so naive? Of course the FBI isn’t gonna hack into the iDevice. They want a clear, documented way of doing this. Whether Apple or the FBI breaks the device doesn’t matter, but a clear, documented way is a backdoor. I think we can all agree on that.

The problem with Yanks is that they think the US is the only country in the world. What if in The Netherlands the Department of “Security” and Justice wants access of the phone? Or what if in a rogue country the Department of Injustice wants access of the phone of a dissident?

Clive Robinson February 29, 2016 5:05 AM

@ Skeptical,

Where to start…

Reverse order might be best.

It is unwise at the best of times to conflate other issues with “cold dead hands” arguments, as you should by now be aware. As for “libertarian fantasy-land”, tut tut: many countries’ citizens outside of the US regard even your supposedly profligate Democrats as way too right-wing for comfort. Perhaps you should get out a little more and see what others see looking in, not the scared “OMG we’re gonna die cos everyone’s a terrorist” viewpoint that is prevalent in the US, which certain hopefuls with little or no economic or world-affairs knowledge are pandering to for their own self-aggrandizement.

As for “due process”: how many people have the police shot this year? It sounds more like the Wild West than civilian law enforcement. Likewise, on the actual judicial process, international human rights organisations have a great deal to say about what is lacking in the US. The FBI have, after all, been accused on a number of occasions of entrapment in their attempts to show a “terrorist menace” as a “clear and present danger”. I would argue that the number of people killed by supposed terrorists in the US in the last decade is tiny compared to easily avoidable road deaths from dangerous vehicles and, since you brought up guns, deaths from avoidable gun accidents and from guns stolen because they were not kept in appropriate locked cabinets.

Which brings us to “lawful searches”. Those founding fathers who you say would not want us chained by encryption certainly tried to chain the behaviour of government, as can be seen in the prohibition on searching people’s property, papers, etc. I think they would find the current behaviour of all parts of the US Government completely abhorrent and far from lawful. And the fact is that they knew of encryption and simply said “papers”, quite deliberately making it inclusive of encrypted and plaintext documents in any form (if you bother to check the meaning of “papers”, it then covered all of what we call documents these days).

As for what Apple can and cannot do in two weeks, that is very much an open question. What is not open is that the FBI seek to compel, and have withheld information from both the magistrate and Apple. Further, many know that Apple is far from the only avenue open to the FBI, so they have far from exhausted their options with regard to the phone, and for people to pretend that the FBI’s “last hope for the victims” is Apple is at best an ill-informed position to argue from. Much of what has been said is based on unfounded assumptions and false promises by DOJ staff. The question is: are they being untruthful through ignorance or malice? Let us first assume it’s ignorance; does this hold? No: the DOJ have promised Apple what they should know beyond all doubt they cannot deliver legally. The tool, if Apple are compelled to make it, becomes part of the chain of evidence, and thus can from the point of its use be called into question and brought forth for third-party if not public examination. For this disclosure of Apple’s “trade secrets” not to happen, whatever is recovered from that phone could never be used in court, which very definitely changes the “undue burden” equation. Thus it’s much more likely to be malice by the FBI, which is added to by the FBI informing journalists long before they informed Apple, or for that matter the magistrate, what they were up to.

This brings us to the point you make of “… in no investigation will a federal agent decide to simply ignore the encrypted phone …”. Sorry, they ignore encrypted phones all the time, in arguably –harm-wise– more serious cases. They simply leave them in the lab or evidence locker and carry on with investigations and prosecutions, something the law actually compels them to do… It’s why they have so many encrypted phones queued up waiting on this bit of grandstanding. I suggest you look up what has happened with Kevin Mitnick and his DES-encrypted disks (which we know can be broken in a few days if anyone wished to).

Thus if the FBI don’t ignore them, why have they waited until a case where they can get a great deal of favourable press? Further, why did they choose to make it public and in the process actually reduce their chances of success: are they really that unworldly, or are they being malicious? I would say the whole point is that, as those in the US have repeatedly turned Comey’s plans down, he’s trying it via the backdoor of case law, and yes, I would very much say he would be very happy to throw Apple, their staff and anyone else under the bus. The FBI do, after all, have a history of it…

Anyway, enough of pointing out where your viewpoint clashes with that of many others. As I said, “go out and see for yourself”; you might find the process instructive. It’s also something way more Americans should do; after all, passports are not that hard to get for most currently.

ianf February 29, 2016 7:15 AM

@ Gerard van Vooren […] “The problem with Yanks is that they think the US is the only country in the world. What if… a rogue country’s Department of Injustice wants access of iPhone of a dissident?

Don’t be naïve. Being exceptional, of course they are the only ones that matter… the NOBUS principle applies. Should anyone else demand similar access of Apple, the USGov has its Export Restrictions of Munitions and other preventive legislation in place. So the American corporation Apple can then choose whether to follow the Holy Laws of the U.S.A. or defy them by making another “marketing decision.”

ianf February 29, 2016 8:27 AM

@ Clive,

I can’t speak for Wael, but you seem to be overlooking the slight base price difference between the #pizero and the #RaspPi3 models: slight as in $5 to $35, or by a factor of seven [*].

It matters if you’re not quite sure what you will do with it, and the project ends up in the forever-unfinished-hacks drawer – I’m sure we’ve all “been there, done that.” (The £4 #pizero SBC + PiMag issue combo seems to have been a one-off proof-of-concept undertaking that now doesn’t even rate a mention among earlier models on the Raspberry’s website.)

[*] It’s like with my cable broadband charges: the company gives me (unasked for) more and more speed—now 100Mbps—while at the same time hiking up monthly charges. I was happy with 10Mbps, but that “option” is no longer on the menu, as 100 is the new black for a while. I suppose I could cancel cable but then what… go back to ADSL over telephone wire? wireless Internet at twice the cost and capped at 5/20/40Gb/ month?

Wael February 29, 2016 9:52 AM

@Clive Robinson, @ianf,

Do you think you should have waited?

Price wasn’t a factor – I’ll only get one or two. But now that you alerted me that Raspberry Pi 3 is out, yes! I should have waited — it’s a lot more capable than C.H.I.P., and marginally bigger than Pi Zero. We can’t foresee the future. I’ll order one today… in for a penny, in for a pound!

Wael February 29, 2016 11:24 AM

Correction!

in for a penny, in for a pound!

In for £35.00, in for £224.00 — accessories and all. I’d better do something with it and not leave it with its more expensive predecessors…

ianf February 29, 2016 1:04 PM

@ Wael,
              the price is always a factor, because the deeper you get into a project, the larger your investment in dedicated I/O, peripherals etc. will be. It’s one thing to toss a ~$10 item into the abandoned hacks box, another to write off the $150-250 it usually takes (half a year before I gave up on one project, I had invested £200 in books and other reference matter… in theory still of use, but I haven’t touched them for >20 years). So I do not pooh-pooh it anymore.

ADDENDUM: I see you wised up… what happened, did you follow my advice re: sucking on fish heads?

BTW. When I awoke a day too late for ordering the #pizero (all 20,000 units were gone the first day), I checked what it’d cost to play with it, and came up with ~£45 for half-year PiMag subscription incl. the initial combo issue #40. That is no longer on offer, but there is an American site that claims to have them @ $5 available for in-store pickup only (store location unknown, but they seem to have outlets in several places incl. TX).

The other “resellers” are all

SOLD OUT £22.50 PiZero Essentials Kit

SOLD OUT £24 Complete starter kit [lists several variants of what now can only be described as mirageware (opposite to vaporware), SOLD OUT.]

That said, when thinking “Raspberry,” think “π,” NOT “Pie.” (You know how to think, don’t you? Just press your brain halves together and glute squeeze’em.) But perhaps you should wait a bit longer… I don’t know the different Pi form factors, but what if the promised π3 model A+ is smaller than the just-announced A?

    What about Model A+?

    Model A+ continues to be the $20 entry-level Raspberry Pi for the time being. We do expect to produce a Raspberry Pi 3 Model A, with the Model A+ form factor, during 2016.

PS. If you persist in calling this SBC model πe, I’ll subject you to online harangues in the style of Ernest Vincent Wright’s novel Gadsby. I know people who have been institutionalized for less than trying to find out why.

Skeptical February 29, 2016 8:49 PM

@Gerard: Of course the FBI isn’t gonna hack into the iDevice. They want a clear documented way of doing this.

There is no necessary contradiction between the FBI disabling certain features of the device without Apple’s assistance, and the FBI having a clear, documented way of doing so.

The problem with Yanks is that they think the US is the only country in the world. What if in The Netherlands the Department of “Security” and Justice wants access to the phone? Or what if in a rogue country the Department of Injustice wants access to the phone of a dissident?

The security services of the countries one would be concerned about do not care what the laws of the US are. And what stops them from extending their forms of intimidation and repression abroad to a greater extent than they have already is not any respect for the law either.

As to the US thinking itself the only country in the world, I suggest that a few minutes contemplation of US foreign policy might raise some doubts.

@Clive: It’s difficult to put together your first paragraphs into a coherent argument. As best I can tell, you claim that the US does not have due process of law because (a) there are some number of police shootings, (b) the FBI has “been accused” of entrapment in sting operations (it’s always very shocking when a defense attorney claims his client was the victim of entrapment), and (c) more people die from motor vehicle accidents than terrorism.

As to the paragraph on searches, the framers bound the powers of government by setting parts of itself against one another – not by empowering individuals or companies to be secure against lawful searches so that they could also be secure against unlawful searches.

As to Apple, the estimate of the work required was provided by Apple in their response, along with an affidavit from the individual who would likely be selected to lead the project.

I’m not sure where you’re getting “last hope of the victims” from. I’ve yet to see or hear anyone representing the Justice Department use that language.

As to Apple’s trade secrets or the forensic tool they build, the specifics of neither would be subject to public view. You can examine Apple’s response to the court’s order if you want a sense of what would actually be a burden in Apple’s view. You’ll note the absence of your concern that either would be exposed in court or to the public.

As to law enforcement shrugging at encrypted cell phones and walking away, you’re forgetting a rather important condition to my claim that they can’t: the existence of a feasible means of decrypting the phone. A local law enforcement officer may indeed be forced to have the phone stored away if there exists no feasible means known to the officer for decrypting the phone. But that’s not so in this case. There is a feasible means of decrypting the phone, and the priority of the investigation is particularly high. So no – ignoring that means is not an option for anyone involved in that investigation.

You also repeatedly claimed that Americans have a somewhat hysterical attitude towards terrorism in comparison with other countries. I actually disagree, but it doesn’t have much relevance here. There is no hysteria in the court’s order, in the government’s motion to compel, or in the comments of those representing the views of various parts of the government in this case. Perhaps unexpectedly, the only hysteria to be found is in Apple’s response, where it practically claims a cyber apocalypse will follow if the court’s order is allowed. But I suppose that sort of rhetoric is all that’s left when neither the facts nor the law is on your side.

Brian T. February 29, 2016 9:42 PM

What about a hardware-only hack? Then there would be no chance of a potential ‘backdoor’ escaping into the wild with a simple ‘copy’ command.

Is there any volatile (re-writable/erasable) data stored on chips other than the flash chip?

If not then why can’t Apple simply disable the WE (write-enable line) connected to the flash chip and/or keep the WP (write-protect line) enabled at all times? Just slice the trace(s) on the board (or wire to a switch, possibly Hi/Lo). Make 10 attempts and then reset the phone for another 10 attempts. 10000 possible (simple) passcodes? How long does it take to reset an iPhone 1000 times?
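
Brian T.’s arithmetic can be sanity-checked with a back-of-envelope sketch. The reset and guess times below are pure assumptions, so only the order of magnitude matters:

```python
# Back-of-envelope estimate for the reset-cycle attack sketched above.
# All timings are assumptions for illustration, not measured values.
ATTEMPTS_PER_CYCLE = 10   # guesses before the wipe counter would trigger
RESET_SECONDS = 120       # assumed time to power-cycle/restore the phone
GUESS_SECONDS = 5         # assumed time to enter one passcode
PASSCODES = 10_000        # 4-digit PIN space

cycles = PASSCODES // ATTEMPTS_PER_CYCLE              # 1,000 resets worst case
total_s = PASSCODES * GUESS_SECONDS + cycles * RESET_SECONDS
print(f"{cycles} resets, worst case about {total_s / 3600:.0f} hours")
```

Even with generous assumptions the worst case stays in the range of days, not years, which is exactly why the auto-wipe counter matters so much.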

Also, similar to shady’s idea: copy the raw (possibly encrypted) flash data as a backup beforehand; perhaps with an ICE debugger or one of these. Then hotwire a duplicate flash chip as a sandbox/slave to accept any destructive writes while preserving the original chip.

JEDEC description:

WE_x_n: WRITE ENABLE The WE_x_n input controls writes to the I/O port. For Asynchronous SDR Data, commands, addresses are latched on the rising edge of the WE_x_n pulse. For Toggle DDR commands, addresses are latched on the rising edge of the WE_x_n pulse

WP_x_n: WRITE PROTECT The WP_x_n disables the Flash array program and erase operations.

Samsung’s FLASH datasheet:

WE
WRITE ENABLE
The WE input controls writes to the I/O port. Commands, addresses are latched on the rising edge of the WE pulse.

WP
WRITE PROTECT
The WP pin provides inadvertent program/erase protection during power transitions. The internal high voltage generator is reset when the WP pin is active low.

Dirk Praet March 1, 2016 8:14 AM

@ Skeptical

You also repeatedly claimed that Americans have a somewhat hysterical attitude towards terrorism in comparison with other countries. I actually disagree.

I think we all disagree. s/somewhat/completely .

As to the US thinking itself the only country in the world, I suggest that a few minutes contemplation of US foreign policy might raise some doubts.

Anyone doing so after just a few minutes of contemplation comes to the inevitable conclusion that the US actually is very much convinced it can impose its rules globally.

… the framers bound the powers of government by setting parts of itself against one another – not by empowering individuals or companies to be secure against lawful searches so that they could also be secure against unlawful searches.

Jefferson and Adams, even after the Revolutionary War, regularly used encryption in their correspondence for fear that then Postmaster General Gideon Granger – the J. Edgar Hoover of his day – would use his position to opportunistically seize and learn about their communications. I very much doubt they would have agreed with any proposed legislation granting the government the authority to compel decryption.

As to Apple’s trade secrets or the forensic tool they build, the specifics of neither would be subject to public view.

Fast forward 2 years. The Intercept publishes an interview with a rogue Apple engineer who has just uploaded the fbiOS to the Pirate Bay and has applied for asylum in Ecuador. Forensic investigation by Mandiant/FireEye subsequently reveals he gained access by accidently stumbling over an implant leading back to an ip address in the Beijing area. President Trump preemptively bombs Quito and goes to DEFCON-1 “until it is clear what the f*ck is going on”.

But I suppose that sort of rhetoric is all that’s left when neither the facts nor the law is on your side.

Judge Orenstein seems to disagree. He just totally and completely struck down all of the government’s arguments in the strongest of words: “The implications of the government’s position are so far-reaching — both in terms of what it would allow today and what it implies about congressional intent in 1789 — as to produce impermissibly absurd results.”

@ Brian T.

What about a hardware-only hack?

I think there’s actually no doubt whatsoever that both Apple and a sufficiently well-resourced entity – with some minor assistance of Apple – are perfectly capable of doing what the FBI is asking. The real issue here is whether or not the government can legally compel Apple to do so under the AWA or any other statute.

Curious March 1, 2016 10:42 AM

According to ABC News, Apple wants to address the US Congress, as I understand it:

“Apple Calls on Congress to Step In Over Showdown With FBI”
http://abcnews.go.com/US/apple-calls-congress-step-showdown-fbi/story?id=37305323

Apparently ABC News has insight into what Apple wants to say:

“The decisions should be made by you and your colleagues as representatives of the people, rather than through a warrant request based on a 220-year-old statute,” Apple’s top lawyer, general counsel Bruce Sewell, is expected to tell a House panel this afternoon.

The questions at the heart of this dispute could end up before the U.S. Supreme Court.

Clive Robinson March 1, 2016 1:22 PM

@ Brian T,

What about a hardware-only hack?

It’s possible but risky.

First off, it’s not the ordinary flash memory you are after; that contains the data encrypted via subkeys of the 256-bit master key. Reading the encrypted data out should be fairly trivial, and you could practice on other phones of the same hardware revision to reduce the risk.

What you are after is the secret one-way or hidden variable that is generated in the phone at the factory and which, via an unspecified algorithm, builds the 256-bit AES key in RAM when the phone is unlocked.

Besides the key-generation algorithm, it is not known how the user-entered passphrase is used in generating the 256-bit AES master key. Nor, for that matter, is it known how the hidden variable is stored, which might be problematic if the hidden variable can only be partly recovered.

I can think of a number of ways to store the hidden variable such that partial knowledge of its stored value would be of little use, because of the way you would mix the passphrase in with it.

Thus the question arises as to how and where the hidden variable is stored within the SoC. Without knowledge of this, any risk from decapping etc. is multiplied.
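
One hypothetical way to do the mixing Clive describes, so that a partially recovered device secret is of little use, might look like the following. This is pure conjecture, not Apple’s actual scheme; the function name, parameters, and iteration count are invented for illustration:

```python
# Conjectural sketch: entangle the passphrase with a device-unique
# secret so that neither alone, nor a partially recovered secret,
# yields the master key. NOT Apple's actual derivation.
import hashlib
import hmac

def derive_master_key(passphrase: bytes, device_uid: bytes) -> bytes:
    # Use the hidden device secret as an HMAC key over the passphrase;
    # every bit of device_uid diffuses into the result, so recovering
    # only part of it leaves an attacker guessing the remainder.
    entangled = hmac.new(device_uid, passphrase, hashlib.sha256).digest()
    # Stretch the entangled value to slow down brute-force guessing.
    return hashlib.pbkdf2_hmac("sha256", entangled, device_uid, 100_000)
```

The design point is that the passphrase never meets the key-derivation step except already mixed with the hidden variable, so offline guessing requires the whole device secret.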

Aare Tali March 1, 2016 2:28 PM

Most news reports keep saying “the phone belonged to the terrorist”; very few actually say the phone belongs to the county and that Farook just used it in his line of work. So, technically, this should be a case of a “lost PIN”, and the real owner (the county, or some higher level of government) has a legitimate request to unlock the phone. The request came from the FBI and not from the San Bernardino County IT department, but that should be irrelevant: they are both part of the same government, and the request came from one part of the government to help unlock a phone that belongs to another part of the same government.

Skeptical March 1, 2016 4:33 PM

@Dirk: Current polling indicates that roughly 7% of US voters think that terrorism is the most important problem facing the country. That’s hardly consistent with any notion of hysteria.

As with any government, I think the US is aware of the size of the world and the limits of its power. Save the cartoonish view of the United States for someone less aware of the continuous debates within that country about the limits of what can be done in the Middle East and elsewhere. Save it for someone less aware of the costs that the U.S. incurs when it intervenes anywhere, and for someone less aware of the extent to which other nations rely on U.S. power for their ability to govern themselves and to defend their own interests.

Speaking of cartoonish, let me address the scenario with an appearance by Trump in which Apple’s secret leaks (to be clear, I’m calling Trump cartoonish). Here’s the thing Dirk: that leak is something that can already happen. Assisting the US Government in executing lawful search warrants isn’t going to make it appreciably more likely. And if you live in a nation like the PRC, or Russia, it doesn’t matter at all.

As to Orenstein… I’ve said enough in another thread. Let me just add here that I am impressed with the flexibility and the stretch of his reasoning.

Clive Robinson March 1, 2016 5:19 PM

@ Aare Tali,

… very few news actually say the phone belongs to the county and Farook just used it in his line of work.

Whilst SB County purchased the phone, it has not been made clear if it was a work phone, a personal phone, or both.

As far as copyright law is concerned, unless he very specifically signed away his rights, anything he typed on the phone, photos he took or other “creative work” he made legally belonged to him and now belongs to his estate, and would be covered by any will or deed of assignment he had made (including one made by voice or video recording, if verified as such).

The passphrase would in fact be a “creative work” in its own right, and neither SB County nor the FBI has any “right of ownership” to it or the data it protects.

The only rights the FBI has are “evidentiary rights”, which, unless it applied for a court order, would mean that the data, passphrase and AES-256 master key should be handed over to his estate, as they can no longer be used to prosecute him.

The only way the “evidentiary rights” could be extended would be if the data held evidentiary value in another case.

Which is where the fun starts with the “chain of custody” and potential “fruit of the poisonous tree” challenges.

A legal representative of the DOJ has made representations in court on behalf of the FBI. She has made a statement to the court that Apple’s “trade secrets” will not be disclosed, to remove any IP argument for “undue burden” from Apple.

Unless the FBI knows there is no evidence on the phone, or has no intention of using data on the phone as evidence, that non-disclosure statement is an outright lie, as both the DOJ representative and the magistrate should be well aware.

If the statement is true, then the logical question is “Why do the FBI want access to the phone data?”, because either they know there is nothing on the phone, or they have no intention of using it as evidence in any case…

This in turn raises all sorts of other questions about the FBI’s motivation. The most charitable is “The FBI are grandstanding to get case-law precedent that democratically elected US lawmakers have refused to give the FBI by legislation”. There are less charitable reasons that could be ascribed to the FBI’s actions, such as “Comey has decided to use the case to actively harm Apple in the US market as revenge for opposing him” or “Comey is hell-bent on destroying the US telecommunications industry by making its products unsaleable, not just to those in US jurisdiction but to the rest of the world as well”.

At the end of the day, all SB County actually has, without a signed waiver of rights from Farook, is an expectation of the return of what is physically left of the phone and any accessories they supplied with it, not any data on them. They might have a claim to some of the data if, and only if, they can prove they have rights to the “creative works” outside of Farook’s direct or derived “creative work” rights. Without a waiver of rights they would not even be allowed to erase the phone, and thus might find it less expensive to either give it, or sell it for a nominal sum, to his estate.

Interestingly, the FBI is not the only one who could apply for a court order. The victims of Farook could seek “property rights” as part of a civil suit against the estate, and in turn against SB County, based on the fact that his “creative works” had realisable value for damages etc. It’s reasonably probable that there would be lawyers who would take it on for a whole variety of reasons, not least of which would be the publicity…

As was once observed, “It’s not only the dog that returns to its vomit…”.

Sancho_P March 1, 2016 6:03 PM

@Aare Tali

No matter who owns the device, the point is:
To protect user content, Apple intentionally designed the device to accept tactile key input, to delay after false tries, and eventually to delete the key material after 10 unsuccessful attempts.
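
The two defences Sancho_P describes can be sketched as a small state machine. The delay schedule and wipe threshold below are assumptions modelled on public descriptions of iOS, not Apple’s actual implementation:

```python
# Minimal sketch: escalating delay between guesses, key-material
# erasure at the tenth failure. Schedule values are assumptions.
DELAYS = {5: 60, 6: 300, 7: 900, 8: 1800, 9: 3600}  # seconds after Nth failure
WIPE_AT = 10

class PasscodeGuard:
    def __init__(self, passcode):
        self._passcode = passcode
        self.failures = 0
        self.wiped = False

    def try_unlock(self, guess):
        """Return (unlocked, enforced_delay_seconds_or_None)."""
        if self.wiped:
            return False, None                 # key material gone for good
        if guess == self._passcode:
            self.failures = 0
            return True, 0
        self.failures += 1
        if self.failures >= WIPE_AT:
            self._passcode = None              # stand-in for erasing the key
            self.wiped = True
            return False, None
        return False, DELAYS.get(self.failures, 0)
```

Because the wipe erases the key material rather than the (still-encrypted) data, there is nothing left to brute-force afterwards, which is the property the FBI is asking Apple to disable.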

If the county doesn’t want secure phones, they’d better buy insecure ones.
Hurry up; soon such devices will be second-hand only.

Dirk Praet March 1, 2016 7:48 PM

@ Skeptical

Current polling indicates that roughly 7% of US voters think that terrorism is the most important problem facing the country

It’s heartwarming to see that at least the American people are less hysterical about it than their government, elected representatives and military-industrial complex.

Save it for someone less aware of the costs that the U.S. incurs when it intervenes anywhere

Judging from the gargantuan costs and catastrophic consequences of the Iraq wars, it would rather appear that there was exactly zero awareness on behalf of the criminal still-at-large half-wits who thought of it as a really good plan. Same for Afghanistan.

… and for someone less aware of the extent to which other nations rely on U.S. power for their ability to govern themselves and to defend their own interests.

Isn’t that what neo-colonialism is really all about? Not that it’s a strictly American thing, though.

… (to be clear, I’m calling Trump cartoonish).

There is no denying that my somewhat premature idea for this year’s movie plot contest is somewhat cartoonish. I do however hope there is a bit more awareness in the US of the possible consequences of electing this lunatic cartoon character for president than, well, when you prepared to invade Iraq.

…that leak is something that can already happen. Assisting the US Government in executing lawful search warrants isn’t going to make it appreciably more likely.

Err, no. The mere existence of the requested fbiOS would trigger any and all sufficiently resourced IC agency or cybercriminal gang to go after such a crown jewel with even more zeal than before.

Anyway, do pardon my today’s sarcasm. I had a really good day and I’m watching a pretty hilarious episode of Californication while I’m typing this.

TechCritic March 2, 2016 12:17 AM

I find it hard to believe that the CTO of a cyber security incident response company would write this without understanding the basics of iOS security. I am not the CTO of a company, and I was easily able to obtain this information. The information was available even before this incident. I mean it seems like you just made broad assumptions and didn’t fact check.

“The FBI’s demands are specific to one phone, which might make its request seem reasonable if you don’t consider the technological implications: Authorities have the phone in their lawful possession, and they only need help seeing what’s on it in case it can tell them something about how the San Bernardino shooters operated. But the hacked software the court and the FBI wants Apple to provide would be general. It would work on any phone of the same model. It has to.”

This is completely false. The software would in fact be specific to that one single iPhone unless Apple actively cooperates in making it work on other phones. Even if the modified OS was leaked to the public, no one could get it to run on any other iPhone. The software would be useless on any other iPhone for the same reason you cannot downgrade to older versions of iOS even if you have a backup of the entire OS.

In order for iOS to run on an iPhone, the code must be signed by Apple’s servers for that one specific phone. Before you update iOS, the phone takes a hash of the new OS files and sends it to Apple’s servers along with the phone’s unique hardware ID. If the hash matches Apple’s hash for the official version of iOS, Apple’s servers use the secret key to sign a token containing BOTH the hash of iOS and that particular iPhone’s hardware ID. On every boot, the phone hashes its OS files and then uses Apple’s public signing key, which is embedded in the hardware, to verify the token from Apple’s servers. In order for the phone to boot, BOTH the hash and the hardware ID from the token must match those of the actual phone. If EITHER does not match, the phone won’t boot or the update won’t install.
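
The per-device binding described above can be modelled in a few lines. Real iOS uses asymmetric signatures (Apple signs, the boot ROM verifies with an embedded public key); the HMAC, key value, and function names here are stand-ins so the sketch stays self-contained:

```python
# Toy model of per-device update signing: the token binds BOTH the
# OS-image hash and the hardware ID, so a token issued for one phone
# is useless on another. HMAC is a stand-in for Apple's real
# asymmetric signature scheme.
import hashlib
import hmac

APPLE_KEY = b"apple-private-signing-key"   # illustrative secret only

def issue_token(os_image: bytes, hardware_id: bytes) -> bytes:
    """What the signing server would return for one phone's request."""
    payload = hashlib.sha256(os_image).digest() + hardware_id
    return hmac.new(APPLE_KEY, payload, hashlib.sha256).digest()

def boot_allowed(os_image: bytes, hardware_id: bytes, token: bytes) -> bool:
    """What the phone would check on every boot/install."""
    expected = issue_token(os_image, hardware_id)
    return hmac.compare_digest(expected, token)
```

A token issued for one hardware ID fails verification on any other device, even with a bit-identical OS image, which is the core of TechCritic’s leak argument.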

If the FBI had Apple’s private signing key, this never would have made it to the media. If the FBI could compel Apple to hand over its signing key, this never would have made it to the media. Writing the software to disable the security measures is definitely not beyond the FBI’s abilities, but they could never get that software to actually run on the iPhone in question, or on any other iPhone, unless it’s signed with Apple’s private key for that one specific device’s unique hardware ID. Two identical iPhones cannot run the exact same firmware unless Apple individually signs that firmware for each phone.

The FBI is NOT even asking for Apple’s key. And even if they were asking for the special OS code, which they are NOT, they would be no more able to figure out Apple’s private key using that than using any other iOS version that’s ever been released. They can’t extract the private key from an iOS image or signed token. They would have already done so if they could. The signed token allowing the special OS to run on that one phone will be no more crackable than any other token Apple has issued.

The only risk whatsoever is the PRECEDENT. If Apple does it once, next time the NSA can issue a court order with a gag order.

“The FBI told the county to change the password on the phone — that’s why they can’t get in. What the FBI needs is technical expertise, not back doors.”

The phone hadn’t been backed up automatically in six weeks. Unless the guy completely changed his habits, this very likely indicates that he turned off auto-backups, so unless MDM software could remotely initiate a backup with that setting off, changing the iTunes password is irrelevant. Unless MDM software can force a backup, even Apple knew there was almost no chance of the phone backing up when it connected to a known Wi-Fi hotspot.

And one might also ask why Apple doesn’t encrypt iCloud data locally, and why Apple so willingly turns iCloud data over to law enforcement, yet doesn’t warn its customers of this possibility. What’s the point of FDE if it is defeated by daily unencrypted backups? It is very dishonest and misleading for Apple to tell customers that iCloud data is encrypted without mentioning that it holds the keys and decrypts the data when a court order is granted. I guarantee 90% of the people touting the iPhone’s security are unaware of this, since Apple users are not a technical bunch.

The fact is, the ability to install an Apple-signed iOS update and retain user data without requiring the user’s passcode is a backdoor that Apple created only for itself. They could have had the phone wipe itself before an update if the passcode is forgotten. They, in effect, created a backdoor for only themselves, and now other parties want access to it. Apple should have taken its own arguments against backdoors more seriously. As Apple itself argues, it’s nearly impossible to reserve a backdoor for just the “good guys,” so they shouldn’t have created one in the first place. Why did they? Likely because “privacy” really is only a marketing tactic for Apple, and they know that if a bad update were to prevent users from entering their passcode, forcing a device wipe, the users would turn on Apple. The reality is that Apple users are unwilling to deal with the consequences of true security, and Apple knows that.

As for the author, I think it’s very irresponsible for you to write for the Washington Post without even understanding iOS security first. You are spreading misinformation while presenting yourself as an expert.

Clive Robinson March 2, 2016 6:25 AM

@ TechCritic,

I find it hard to believe that the CTO of a cyber security incident response company would write this without understanding the basics of iOS security.

A bold statement if ever there was one…

You are claiming that your understanding of iOS security is better?

Based on what? The partial information Apple supply, your assumptions on that or do you have information not available to others?

I mean it seems like you just made broad assumptions and didn’t fact check.

Hmm, are you any different, if not worse? That is, have you actually thought further than the limited information Apple supplies?

The software would in fact be specific to that one single iPhone unless Apple actively cooperates in making it work on other phones.

It rather looks like you have not.

What the FBI wants is code that runs from the iPhone’s RAM to bypass certain security features.

There are two important points to note,

1, There are other ways executable code can get into the iPhone’s RAM.

2, The actual bypass code is very likely to be generic not specific to any one iPhone.

Talking about the code-signing process as the “be all and end all” of the issue shows a limited knowledge of the actuality of general-purpose computing, and a lack of capacity to think beyond the limited information Apple has supplied.

Code signing only becomes relevant if you can positively rule out all other methods of getting generic code into the phone’s RAM and having it execute.

That is the first of your thinking limitations to get over; there are several more. One of them is what further information that generic bypass code would tell people about getting around Apple’s security.

As currently indicated by Apple, both the user passphrase and a hidden internal variable are used to build the 256-bit AES master key in RAM. Code to bypass the passphrase time-out and attempt limit would give further insights into Apple’s “trade secrets”, which currently are unknown; so we know we will end up with more information, but we don’t know how harmful it will be to other aspects of the iPhone’s security.

What computer security history tells us is that the layered security process is usually less like the onion metaphor and a lot more like an egg: crack the outer shell and you have a real mess on your hands. The latest example of this is the SSL DROWN attack, which looked too resource-intensive to be practical. But further thinking on the information fairly quickly gave rise to the Special DROWN variant, which puts it well within the “back room script-kiddy” attack resource range (see Matthew Green’s comments).

Anyway, go have a real think; think hinky and act like a resourceful and insightful attacker, not a code-cutting developer.

Gerardine March 2, 2016 12:36 PM

Apple might as well advertise that terrorists don’t need burner phones; they can just use an iPhone. Apple protects the security of all activities, even criminal ones, and stops detection of criminal activity. To Apple, all criminals have more rights than law-abiding citizens and victims. This is disgraceful. Why can’t Apple just dump that phone, or any specific phones of terrorists, and give the information to the FBI?

TechCritic March 2, 2016 2:04 PM

Was I a bit of an asshole? Yes. However, the author is writing for a major publication and implicitly declared as an expert on security. He could have disclosed that he has little experience with iOS security in particular. He could have disclosed that he was uncertain of exactly how Apple’s iOS security works, so while what he said applies generally, it may be inaccurate in this case. He did not.

I am not presenting myself as an authority. I did not even write a personal blog article on this as an anonymous source with no credibility. Being that I am an internet commenter and not a journalist here, I am not under the same ethical standards for disclosing the extent of my knowledge. It is not unethical for me to present conjecture without explicitly labeling it as such. That being said, I did actually consult sources to learn about this. Essentially what you are saying is that, since Apple releases limited information, anything is possible, and the author need not even mention the security systems we do know about and their potential implications. It’s one thing to state that there may be an exploit to get around the security systems; it’s another to act as if they don’t exist at all and that it’s certain unsigned code can be run, no problem.

“But that iPhone has a security flaw. While the data is encrypted, the software controlling the phone is not. This means that someone can create a hacked version of the software and install it on the phone without the consent of the phone’s owner and without knowing the encryption key.”

No mention of the fact that the software needs to be signed by Apple with a secret key that has never once been leaked. No mention that it may be possible to bypass the check.

“But the hacked software the court and the FBI wants Apple to provide would be general. It would work on any phone of the same model. It has to.”

“It has to.” He has stated that it is certain this software would work on any iPhone. He still has not mentioned the need for Apple to sign the code, or the fact that the code on all modern iPhones is signed for that one particular device’s hardware ID. We know this is the case, even without the information Apple has disclosed, from the jailbreaking community. When Apple first started to forcibly prevent iOS downgrades (later 3GS models and the iPhone 4), the jailbreak community immediately noticed this and investigated. There’s some info from this time in the archives of Saurik’s blog (Saurik, creator of Cydia). In its first incarnation, Apple’s signing token did not contain an anti-replay value, so people would intercept and save Apple’s signature token for that version of iOS for their individual phone’s hardware ID, allowing them to downgrade later with “permission” from that token. If the user did not proactively save the token, they would never be able to downgrade. With the 4S and above, Apple added a random anti-replay value to the token, preventing its reuse and putting us in the situation we are in today, where downgrades are not possible after Apple’s servers stop signing that version of iOS for a particular model of phone. AFAIK even the jailbreak community has no way around this to this day.
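
The downgrade history above amounts to a nonce exchange: once the phone contributes a fresh random value to every signing request, a saved token stops verifying. All names and the HMAC stand-in below are illustrative, not Apple’s actual protocol:

```python
# Sketch of the anti-replay change: the phone's fresh per-request
# nonce is bound into the signed token, so captured tokens cannot be
# replayed for later installs/downgrades. HMAC stands in for Apple's
# real asymmetric signature.
import hashlib
import hmac
import os

APPLE_KEY = b"apple-private-signing-key"   # illustrative secret only

def sign_request(os_hash: bytes, hw_id: bytes, nonce: bytes) -> bytes:
    """Server side: sign (image hash, hardware ID, phone's nonce)."""
    return hmac.new(APPLE_KEY, os_hash + hw_id + nonce, hashlib.sha256).digest()

class Phone:
    def __init__(self, hw_id: bytes):
        self.hw_id = hw_id
        self.nonce = None

    def new_request(self) -> bytes:
        self.nonce = os.urandom(16)        # fresh value per install attempt
        return self.nonce

    def accept(self, os_hash: bytes, token: bytes) -> bool:
        expected = sign_request(os_hash, self.hw_id, self.nonce)
        return hmac.compare_digest(expected, token)
```

A token captured during one exchange verifies only against that exchange’s nonce; as soon as the phone issues a new request, the saved token is dead, which is why the pre-4S token-saving trick stopped working.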

iOS jailbreaks exploit a flaw in an Apple-signed version of iOS specific to that one individual device in order to run unsigned code. The jailbreak only comes into play after the OS has started booting. If the OS was not signed by Apple for that hardware ID, the phone would never reach the point in the boot procedure where the exploit can be applied. If the FBI could install such a jailbreak exploit on the phone without entering the passcode to enable disk decryption, they would have already done so, and we wouldn’t be hearing about this case. They haven’t been able to do that yet, and neither has the jailbreak community.

“Have the Chinese, for instance, written a hacked Apple operating system that records conversations and automatically forwards them to police? They would need to have stolen Apple’s code-signing key so that the phone would recognize the hacked as valid, but governments have done that in the past with other keys and other companies.”

Finally he mentions signing, but still not that each phone needs its own specific signature for its hardware ID, and that such signatures are not transferable even between two identical iPhones. He doesn’t even mention this in his multiple corrections. This is the most important and most relevant security feature in this discussion, and he doesn’t even mention its existence. We don’t know of any way to bypass it; acting as if it doesn’t exist at all is spreading misinformation. Suggesting that it could potentially be bypassed is another story.

And I will address the rest of your response later when I have time, but the FBI wants the iPhone to boot from a RAM disk, as you mentioned. If the phone is to boot from the RAM disk, it will still require software signed as I described above. The phone is still booting; it’s only getting the OS files from a different place. As far as we know, they are still subject to the same security checks, and those checks are built into the hardware. It’s possible that using a RAM disk could somehow bypass the hardware-ID check, but we have no reason to believe that as of now, and even if we did, the check must still be mentioned as a roadblock.

TechCritic March 3, 2016 1:18 AM

@Clive

Follow-up

What the FBI wants is code that runs from the iPhone’s RAM to bypass certain security features.

There are two important points to note:

1. There are other ways executable code can get into the iPhone’s RAM.

The FBI is not asking Apple to use or devise one of these other ways. Apple does not need a hack that bypasses the signature check, because it made the conscious design decision to preserve user data through a software update even without the passcode being entered. It’s possible that there is some other way, as you said, but Apple will certainly not choose that method when it could just sign an update that disables a few features. Using a signed update is the safe way of doing this, because it would make it impossible to use the software directly on any other iPhone if it leaked.

2. The actual bypass code is very likely to be generic, not specific to any one iPhone.

Apple is being asked to use a signed update to accomplish this. Apple has every reason to do that, because it is the safest approach. The actual bypass code itself may be generic, but it is useless if it isn’t signed, which makes it not generically usable. On top of that, Apple is in no way being required to share this code. Even in the contingency where the code leaked, it would still be low risk.

Talking about the code signing process as the “be all and end all of the issue” shows a limited knowledge of the actuality of general purpose computing and the lack of capacity to think beyond the limited information Apple have supplied.

Code signing only becomes relevant if you can positively rule out all other methods of getting generic code into the phone’s RAM and having it execute.

Again, Apple would never consider another method. Apple does not need to consider another method. Apple is not being asked to consider another method. If there is a vulnerability that would allow unsigned code to run from RAM, Apple will not be using it and thereby calling attention to it.

That is the first of your thinking limitations to get over, there are several more. One of which is what further information that generic bypass code would tell people about getting around Apple’s security.

This is your only legitimate point. It is possible that even though the software cannot be executed directly on another iPhone, IF it were leaked, it could allow hackers to learn more about how the security features work which would aid them in finding potential exploits. Apple has no obligation to turn the software over to anyone though, so this is not a direct threat.

If your point is that IF Apple doesn’t prevent the software from being leaked, it might remove security through obscurity and potentially give hackers more avenues for exploits down the road, then I agree with you. However, for that matter, the iOS source code is nearly as much of a risk, and Apple has no problem keeping that from leaking.

I look forward to your response @Clive

Clive Robinson March 3, 2016 3:02 AM

@ Techcritic,

The FBI is not asking Apple to use or devise one of these other ways.

That is irrelevant, as you should well know. It’s what all the other attackers do that is relevant.

Using a signed update is the safe way of doing this because that would make it impossible to use the software directly on any other iPhone if it leaked.

Wrong: the signed update does not stop the generic part of the code being used on another phone; it only stops the use of the update process on another phone. Thus if an attacker pulls the generic code out of the update and uses another method to put it on another phone, it will run.

From your first paragraph you are obviously not listening to what you have been told, and likewise appear not to want to listen. Worse, looking at your previous post, you come across as dishonest:

I am not under the same ethical standards for disclosing the extent of my knowledge. It is not unethical for me to present conjecture without explicitly labeling it as such.

Therefore it is simpler for both myself and others to stop reading your comments and ascribe your behaviour to other reasons. However, whatever the actual reason, you are still not thinking like an attacker, and therefore would not be much of a defender. To make the change you first have to listen to what you’ve been told…

As has been pointed out by many in the past “You can take a horse to water…”

Therefore there is no point in entering into a process that will be a waste of my time and the resources of this blog. And to answer your first question, yes, you do come across that way, and thus, combined with your other traits, I’ll leave it to others to judge and/or respond as they see fit.

ianf March 3, 2016 4:09 AM

~25k text later…

@ Clive […] Therefore it is simpler for both myself and others to stop reading your comments and ascribe your behaviour to other reasons

WHAT TOOK YOU SO LONG?

Clive Robinson March 3, 2016 6:59 AM

@ ianf,

WHAT TOOK YOU SO LONG?

A good question, I would like to think it’s my genial good nature, desire to help others, good humour, politeness and being “an all round good egg”…

Now if those rolling around with laughter would let me keep me delusions we can all enjoy the moment 😉

Dirk Praet March 3, 2016 7:54 AM

@ Clive Robinson, @ ianf,

WHAT TOOK YOU SO LONG?

Just wanted to extend a small word of appreciation for your efforts explaining the technical background.

@ TechCritic

I am not under the same ethical standards for disclosing the extent of my knowledge.

Which makes perfect sense as it would appear to be either limited or misguided.

ianf March 3, 2016 10:42 AM

Those not yet bored stiff with the ongoing FBI-vs-Apple soap opera, now conducted in the Capitol, Washington, D.C., may care to read a fairly short résumé of Tuesday’s proceedings there [long story short: the FBI isn’t winning, but then neither YET is Apple]: http://gu.com/p/4h7n3

        OR

follow The Guardian’s LIFO-order liveblog

Chronologically – scroll UP to continue.

2:2 [from the beginning]

1:2 [post intermission] scroll up

        and/or

read some earlier Guardian dispatches

http://gu.com/p/4h54v

http://gu.com/p/4h3ca

http://gu.com/p/4gqzf

Alternatively, read just this single exchange summary to get the feeling of the hearings (embellishments mine):

1 March 2016 7:48pm Congressman Chaffetz: the FBI already seriously overreaches.

We’re back, and the committee has resumed queuing up around Room 2141 of the Rayburn House Office Building to spank the director of the FBI.

Jason Chaffetz of Utah asks Comey when historically the government has been able to compel a company to create a product to help it. Comey says he doesn’t know but cites one of the cases in the DoJ’s brief, New York Telephone. Comey is fairly short with him.

Chaffetz then tells Comey he thinks the FBI already overreaches to a serious extent. “We can’t even see the degree to which you’re using Stingrays, or the requirements [to use them].” Chaffetz points out that the devices are apparently being supplied to the IRS without any congressional oversight. What is the occasion of their use, he asks. “Is it articulable suspicion? Is it probable cause?”

“I don’t have a great answer,” Comey responds. “I like the idea of giving as much transparency as possible.”

The Guardian has repeatedly attempted to use the Freedom of Information Act to acquire records on the use of Stingrays from law enforcement and the documents received have without exception been redacted to remove all mention of the device.

        OR

simply check this New Yorker cartoon: Apple’s IMPOSSIBLE TO UNLOCK new phone

RoundSparrow March 9, 2016 2:50 PM

The iPhone uses LPDDR3 RAM, 1GB of DDR3. The chip is right there on the board.

Why can’t an attack be devised where you desolder that DDR3 chip and replace it with your own emulating chip, one that allows you to dump and modify the running operating system? You are no longer bound by the size and power requirements that Apple has when mass-producing these.

DDR3 has no ECC or any way to know its memory is being modified – including the operating system kernel. You could hack the code that is designed to walk all the storage and erase it, and modify that code to dump it out a USB port or something… the code’s already there for you to walk all the storage 🙂

Clive Robinson March 10, 2016 6:20 AM

@ RoundSparrow,

Why can’t an attack be devised that you desolder that DDR3 chip, replace it with your own emulating chip

Because of the “hidden variable” that is kept –or should be– inside the CPU chip, which combines it with the passphrase to build the 256-bit AES master key that eventually ends up in RAM –in some form[1]– only when the phone is in the unlocked state. The secrecy of the master key, and how it comes into being, is what all iPhone security rests on[2].

[1] Whilst the AES master key may end up in RAM, it is far from clear how or in what form. That is, it could somehow be encrypted against a temporary random value held inside the CPU chip. I’ve talked about how to do this several times in the past on this blog. It’s therefore quite reasonable to expect that the security engineers who designed Apple’s iPhone security are more than conversant with such techniques. And if Apple have done this, then it’s more than reasonable to expect them to regard it as a “Trade Secret” with very high IP / market value, up in the tens if not hundreds of millions of USD (see the going prices of even very limited iPhone exploits[2]).

[2] Which brings up yet again “the undue burden” the FBI really don’t want you thinking about. Look at it this way: if it were not an “undue burden”, then either the FBI could work around it themselves, or pay a third party to do it. The fact that they provably have not (witness the queue of other phones they have) shows it is a considerable burden in one way or another, which destroys the FBI/DOJ argument for use of the AWA.
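The key construction described above can be sketched as follows. The UID value, KDF, and iteration count here are illustrative stand-ins, not Apple’s actual parameters; the point is only that a per-device secret which never leaves the CPU is mixed into the derivation, so every passcode guess must run on the device itself, where the delay and erase policies apply.

```python
import hashlib

# Hypothetical per-device secret fused into the CPU, never readable off-chip.
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_master_key(passcode: str) -> bytes:
    """Derive a 256-bit master key from the passcode tangled with the in-CPU UID.

    Illustrative stand-in KDF: PBKDF2-HMAC-SHA256 with the UID as salt.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

# An attacker who images the flash storage still cannot brute-force the
# passcode on a fast external rig, because the derivation needs DEVICE_UID.
key = derive_master_key("1234")
print(len(key))  # 32 bytes = 256 bits
```

Even a four-digit passcode is then only brute-forceable through the phone’s own hardware, which is exactly why the FBI needs Apple to disable the guess counter and delays rather than attacking the ciphertext directly.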

DavidPun March 29, 2016 1:59 PM

Most of this stuff is getting archaic. I now use a new form of encryption where the encrypted stream can be decrypted into multiple different plain-texts. Different passwords generate different plain-texts from the same encrypted data stream. Hence even if someone thinks they have found a password, they still don’t know, in principle, whether they have obtained the correct plain-text, even if that password appears to decrypt the message successfully. It’s the wrong message. The most recent version I have been playing with can decrypt into 64 different plain-texts. This means that the end user still holds the key to determine which plain-text is the correct version. This encryption scheme was actually developed in the UK as a means of feeding misinformation to snoopers trying to break an encryption code. They think they have broken the code, but are recovering wrong information.
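One well-known construction with this property is one-time-pad deniability: fix a random ciphertext, then compute a different key for each plaintext. The toy sketch below is an illustration of that classic idea, not the UK scheme described above.

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def make_keys(ciphertext: bytes, plaintexts: list[bytes]) -> list[bytes]:
    """One key per plaintext: K_i = C XOR pad(P_i), so C 'decrypts' to P_i under K_i."""
    return [xor(ciphertext, p.ljust(len(ciphertext), b"\x00")) for p in plaintexts]

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR with the key and strip the zero-byte padding."""
    return xor(ciphertext, key).rstrip(b"\x00")

messages = [b"meet at dawn", b"grocery list", b"deny everything"]
C = os.urandom(max(len(m) for m in messages))  # ciphertext is pure random noise
keys = make_keys(C, messages)

for k, m in zip(keys, messages):
    assert decrypt(C, k) == m  # each key reveals a different, plausible message
```

Note the cost that ianf raises further down: the information has to live somewhere. In this toy version it is pushed entirely into the per-message keys, each as long as the ciphertext itself, which is why practical schemes have to hide the extra plaintexts in padding or similar slack space.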

Dirk Praet March 29, 2016 7:52 PM

@ DavidPun

I now use a new form of encryption where the encrypted stream can be decrypted into multiple different plain-texts. Different passwords generate different plain-texts from the same encrypted data stream.

That is exactly the feature I have always wanted. Would also come in quite handy for obliterating encrypted disks/volumes at mount time in case you’re dealing with bad folks who forget to clone your disk(s) before asking you to grant them access to your machine.

Got any links?

ianf March 30, 2016 8:27 AM

@ Dirk Praet, DavidPun […] “Different passwords generate different plain-texts from the same encrypted data stream.

While I can envision this on pure philosophic[k]al level, I have difficulties grasping it in terms of algorithms.

Because for that to work, the compound ciphertext would need to contain all the different original plaintexts, hence any subsequently decoded output plaintext would be just a fragment of that ciphertext. In effect, the different passwords would be not so much pass-keys, as obfuscated (and traditionally encrypted) commands to the decrypting engine as to which path through the ciphertext to take. As such, and being of limited length, they’d probably be pretty easy to break down…

    Simple calculus tells me that if the ciphertext is, say, 10kB long, yet a password decrypts only 1.2kB seemingly complete plaintext AND THEN EXITS WITHOUT ERROR, then something IS rotten in the state of Denmark. Do enlighten me.

@ dud – You DON’T SAY!!!! Can’t speak for Bruce, but I’m sure he’ll be sooooo relieved to know that now, and after all that time during which we all wandered in circles wondering “could Stuxnet be Israeli or could it not.” Glad to have that resolved at last, and in so direct a fashion. Keep up the good work.

Dirk Praet March 30, 2016 9:38 AM

@ ianf

Simple calculus tells me that if the ciphertext is, say, 10kB long, yet a password decrypts only 1.2kB seemingly complete plaintext AND THEN EXITS WITHOUT ERROR, then something IS rotten in the state of Denmark.

I don’t know. A severe case of random length padding?

Clive Robinson April 5, 2016 10:32 AM

Hmm,

It was obviously an inside job…

Actually the evidence is very far from conclusive in any direction.

Including that the FBI are lying through their teeth to pull themselves out of a hole that they and the DOJ dug for themselves, when the FBI went public even before the magistrate had inked the original request, and the DOJ went off-planet with their submissions to the court.

ab praeceptis April 13, 2017 2:09 AM

@John Galt

Maybe it’s just me and I’m too picky, but I feel that Clive Robinson deserves some more respect when addressing him. He is always polite, always knowledgeable, always constructive.
