Lawful Hacking and Continuing Vulnerabilities

The FBI’s legal battle with Apple is over, but the way it ended may not be good news for anyone.

Federal agents had been seeking to compel Apple to break the security of an iPhone 5c that had been used by one of the San Bernardino, Calif., terrorists. Apple had been fighting a court order to cooperate with the FBI, arguing that the authorities’ request was illegal and that creating a tool to break into the phone was itself harmful to the security of every iPhone user worldwide.

Last week, the FBI told the court it had learned of a possible way to break into the phone using a third party’s solution, without Apple’s help. On Monday, the agency dropped the case because the method worked. We don’t know who that third party is. We don’t know what the method is, or which iPhone models it applies to. Now it seems like we never will.

The FBI plans to classify this access method and to use it to break into other phones in other criminal investigations.

Compare this iPhone vulnerability with another, one that was made public on the same day the FBI said it might have found its own way into the San Bernardino phone. Researchers at Johns Hopkins University announced last week that they had found a significant vulnerability in the iMessage protocol. They disclosed the vulnerability to Apple in the fall, and last Monday, Apple released an updated version of its operating system that fixed the vulnerability. (That’s iOS 9.3; you should download and install it right now.) The Hopkins team didn’t publish its findings until Apple’s patch was available, so devices could be updated to protect them from attacks using the researchers’ discovery.

This is how vulnerability research is supposed to work.

Vulnerabilities are found, fixed, then published. The entire security community is able to learn from the research, and—more important—everyone is more secure as a result of the work.

The FBI is doing the exact opposite. It has been given whatever vulnerability it used to get into the San Bernardino phone in secret, and it is keeping it secret. All of our iPhones remain vulnerable to this exploit. This includes the iPhones used by elected officials and federal workers and the phones used by people who protect our nation’s critical infrastructure and carry out other law enforcement duties, including lots of FBI agents.

This is the trade-off we have to consider: do we prioritize security over surveillance, or do we sacrifice security for surveillance?

The problem with computer vulnerabilities is that they’re general. There’s no such thing as a vulnerability that affects only one device. If it affects one copy of an application, operating system or piece of hardware, then it affects all identical copies. A vulnerability in Windows 10, for example, affects all of us who use Windows 10. And it can be used by anyone who knows it, be they the FBI, a gang of cyber criminals, the intelligence agency of another country—anyone.

And once a vulnerability is found, it can be used for attack—as the FBI is doing—or for defense, as in the Johns Hopkins example.

Over years of battling attackers and intruders, we’ve learned a lot about computer vulnerabilities. They’re plentiful: vulnerabilities are found and fixed in major systems all the time. They’re regularly discovered independently, by outsiders rather than by the original manufacturers or programmers. And once they’re discovered, word gets out. Today’s top-secret National Security Agency attack techniques become tomorrow’s PhD theses and the next day’s hacker tools.

The attack/defense trade-off is not new to the US government. It even has a process for deciding what to do when a vulnerability is discovered: whether it should be disclosed to improve everyone’s security, or kept secret to be used for offense. The White House claims that it prioritizes defense, and that general vulnerabilities in widely used computer systems are patched.

Whatever method the FBI used to get into the San Bernardino shooter’s iPhone is one such vulnerability. The FBI did the right thing by using an existing vulnerability rather than forcing Apple to create a new one, but the vulnerability should be disclosed to Apple and patched immediately.

This case has always been more about the PR battle and potential legal precedent than about the particular phone. And while the legal dispute is over, there are other cases involving other encrypted devices in other courts across the country. But while there will always be a few computers—corporate servers, individual laptops, or personal smartphones—that the FBI would like to break into, there are far more such devices that we need to be secure.

One of the most surprising things about this debate is the number of former national security officials who came out on Apple’s side. They understand that we are singularly vulnerable to cyberattack, and that our cyberdefense needs to be as strong as possible.

The FBI’s myopic focus on this one investigation is understandable, but in the long run, it’s damaging to our national security.

This essay previously appeared in the Washington Post, with a far too click-baity headline.

EDITED TO ADD: To be fair, the FBI probably doesn’t know what the vulnerability is. And I wonder how easy it would be for Apple to figure it out. Given that the FBI has to exhaust all avenues of access before demanding help from Apple, we can learn which models are vulnerable by watching which lawsuits are abandoned now that the FBI knows about this method.

Matt Blaze makes excellent points about how the FBI should disclose the vulnerabilities it uses, in order to improve computer security. That was part of a New York Times “Room for Debate” on hackers helping the FBI.

Susan Landau’s excellent Congressional testimony on the topic.

Posted on March 30, 2016 at 4:54 PM • 91 Comments

Comments

Douglas Knight March 30, 2016 5:25 PM

Why do you believe that there actually is a vulnerability and that the FBI actually has accessed the phone? Isn’t a simpler explanation that the FBI just perjured itself to get out of the court case?

ianf March 30, 2016 6:04 PM

Look, Douglas, there always are vulnerabilities, esp. in such a complex piece of OSware as iOS. So there is no gain in pooh-poohing them. That doesn’t necessarily mean that the FBI did manage to exploit one to its own advantage, nor that they gained any vital intel from that iBone… we’ll never know that (“we’re NOT WORTHY” hint, hint).

It was obvious from the start that that wasn’t the objective of the FBI. So whether they later “perjured themselves” to vacate that stupid suit OR NOT is beside the point. And because they are the Feds, there to guard the letter of the constitution, who’s to say that their words might’ve been perjury?

What I wrote a week ago still applies:
https://www.schneier.com/blog/archives/2016/03/friday_squid_bl_518.html#c6720033

G March 30, 2016 6:08 PM

We could “help” the FBI by not allowing them to buy exploits unless they gain full details of the vulnerability. Of course the cost will rise accordingly, but on the upside we would then know that they are able to disclose the vulnerability to the vendors for the benefit of all…

Hubert March 30, 2016 6:14 PM

When you say “The FBI plans to classify this access method”, what does that mean for the company that discovered it? Would they be breaking the law if they sell their software to someone else? Would they have to prevent their own employees from seeing the source unless they have clearance?

Evan March 30, 2016 6:14 PM

One is tempted to think that the FBI is aware of this line of reasoning and doesn’t care. Yes, leaving unpatched vulnerabilities exposes US economic, intelligence, and military assets to attack, and those attacks are more frequent and sophisticated than ever before, but so what? The FBI’s job is to conduct counter-intelligence and counter-terrorist activities, so every time something gets hacked they get even more money and power. If we somehow rolled out secure systems for everything in the government tomorrow, a lot of people in the intelligence/law enforcement community would find themselves struggling to justify their existence.

Thoth March 30, 2016 6:22 PM

The Feds and many other intel related agencies rely on the business model of perpetual and continued existence of vulnerabilities to gain a NOBUS advantage. It is just how it is for their business model of spying and being outright dishonest to the Congress and people whom they are supposed to serve and be answerable to.

There is a saying that a tiger’s stripes are very difficult to change. Similarly, the mental attitude they possess has been ingrained for decades, and the system is too weak and reluctant to force them to change and be answerable to the public, instead allowing them to get away with the nasty things they do in the line of duty.

All boils down to their business model… to keep the civilians as insecure and as obedient and predictable as possible to make their jobs easier.

Tõnis March 30, 2016 6:53 PM

“We don’t know what the method is, or which iPhone models it applies to. Now it seems like we never will.”

In one of its briefs, the government argues something to the effect that the public “needs” or “deserves” to know what is on that phone. That was probably just another government lie … within a court brief. Government liars.

Gweihir March 30, 2016 6:58 PM

The interesting question is whether they will eventually learn that what they do is far more destructive than terrorism could ever be or whether they will keep this stance to the bitter end (the end of that part of the FBI, hopefully, not something larger).

I think they mistake having access and keeping access by not disclosing their methods for having power. That it cannot work long-term is pretty clear: They will need to eventually use evidence gained this way in court and then they will have to disclose what was done in order to prove that the evidence was not tampered with. Sure, for a while they can get by with plea-bargains (in their current form usually legalized coercion using extreme threats), but eventually, they will run into somebody that is willing to fight and unwilling to bow to power and where they cannot just drop the case or remove that aspect. What then could happen is publication of a severe vulnerability as part of the court proceedings, about the worst way to do it.

Currently, the threat from criminal hackers is heating up again, with the crypto-extortion malware wave we are seeing. This is “extortion-kit” malware that does not need a lot of understanding on the side of the criminal, and it is getting pretty sophisticated and effective in the face of pathetic-level defenses. The number and nature of the victims we are hearing about is impressive. This can only get worse, and if the FBI thinks it is a good idea to contribute to it getting worse to keep some power, then they are part of the problem.

ianf March 30, 2016 7:10 PM

Actually, Evan, the FBI’s primary objective is TO BE SEEN conducting counter-intelligence and counter-terrorist activities in order to secure ever bigger budgets for conducting counter-intelligence and counter-terrorist activities.

If that sounds like circular logick, that’s because it is. Also, it’s far easier to achieve the goal of being seen as such than to actually combat homegrown terrorism, esp. the under-the-radar type.

Praelatura Sanctae Crucis et Operis Bozo March 30, 2016 7:26 PM

So FBI plans to classify this access method. Think it through.

…the FBI that let its designated enemies and strategic rivals take sensitive personal information on 21 million officials in positions of trust, including detailed dossiers on the perversions, addictions, mental disorders, and crimes of the blackmail-ready weak links;

….the FBI that let the Chinese take many terabytes of F-35 designs and schematics, fix the worthless half-trillion dollar white elephant, and build one of their own that can actually fly;

…the FBI that EFF-stickered tor-relay admin Ed Snowden made fools of?

Come on, we’ll know what they did in a couple weeks.

Z March 30, 2016 8:19 PM

The issues I have with the narrative of “all vulnerabilities should be disclosed and patched” are:

1) It completely removes the usefulness of hacking from a state point of view. To be effective, state-level hacking requires zero-days. Asking the US government – or any government – to simply abstain from hacking is asking them to abandon a tremendously useful tool in a period when attack is dirt cheap compared to defense. That’s a tough pill to swallow for them, especially since everybody else is doing it.

2) It assumes that hunting for vulnerabilities and correcting them is actually a sensible strategy for making devices and software secure, which is highly debatable. There are simply too many of them for such a strategy to be useful. Worse, most vulnerabilities are actually exploited after they have been found and disclosed, and before they are patched (when patching the entire ecosystem is actually practical – sometimes it is not). Put another way, if the FBI vulnerability was super easy to exploit by everyone and difficult to patch, then disclosing it could lead to a ton of iPhones being hacked very quickly.

I understand the idea of evaluating the tradeoff from the point of view of the defensive-minded security professional, but the trick is, not everyone has this point of view nor benefits from it.

WhiskersInMenlo March 30, 2016 8:26 PM

If I look at this from Bruce’s perspective he is 100% correct.

If I look at this from other perspectives a “secret” method is
as close to an ideal outcome as I would expect or desire.

The final document filed to tell the court “nevermind” does a number of
things. The big thing it does is remove the All Writs Act as a lever
to pry on or pummel Apple with for this specific device. There is
now at least one additional remedy known to the DOJ/FBI.

The response “It’s a secret” checks off a second All Writs Act consideration.
For whatever reasons this secret exists as per law. So check these two:
* The absence of alternative remedies—the act is only applicable when other judicial tools are not available.
* Usages and principles of law—the statute requires courts to issue writs “agreeable to the usages and principles of law.”

A future court would have to discover that the “remedy” found was not a remedy
allowing a chain contempt of court investigations. That court would also have
to strike down the classification used by the FBI to say “secret”.
So for this make and model Apple can say go away.

As a “secret” held by a federal agency there is no reach from foreign governments
that I can imagine. Had Apple provided a service, others could request or demand it.

This outcome does not diminish the swiss army knife that is the All Writs Act.

Apple cloud data dumps will still happen but more applications will end to end
encrypt data a little or a lot and bad actors will not cloud store anything except
pictures of puppies. Law enforcement and evidence preservation folk will
not be resetting iCloud passwords.

Security flaws involving products sold globally are an important issue.
With this outcome an additional big flaw has not been introduced.

Global devices are an interesting security context. The flaw you know is a
flaw available to all: the good, the bad, and the ugly. Those that look at a
flaw and think that the knowledge gives them power must be reminded that
a flaw cuts both ways. If an international agency knows of a flaw that allows
them to steal secrets, their own national secrets can be stolen in exactly the
same way.

If I had crawled under the wire of my FEBA to reconnoiter the enemy to find
a flaw on their side, and then in crawling home found a flaw in my own defenses,
I would be duty bound to report it. If both perspectives showed the same
flaw, the urgency to report would be greater;
i.e., both inside looking out and outside looking in.

The single-minded players on the US/DOJ side might, like puppies, have had their
noses rubbed in the mess they were making, been grabbed by the scruff of the neck,
and told to exit with grace. A secret method is such an exit. They can meet
public expectations and mutter that nothing was found or that secret drone strikes
were deployed…

Of interest: there are some 14 million holders of secret or higher clearances.
Any flaw that allows a foreign agent to secretly snoop on them, track them, obtain
or plant compromising data is a serious national risk. Since all of these people
have one or more portable devices in their family or personal possession, the big dogs of
national security might be the ones that told them to tidy it up. If not, they should
have.

As a secret method, a bug can still be filed with Apple, perhaps from an anonymous account, and
when fixed, the method used for this phone can still remain classified. This side-channel report need never
be linked to this case. After all, it is a secret.

Summary:
The inside looking out and outside looking in perspective matters.
This court apparently had blinders on. Because, well, I can only guess.

Ambiguous March 30, 2016 8:53 PM

This is a pertinent article:
http://arstechnica.com/tech-policy/2016/03/us-says-it-would-use-court-system-again-to-defeat-encryption/

Title: US says it would use “court system” again to defeat encryption
Byline: Feds say they can force entire tech sector, not just Apple, to disable security.

US government officials from the FBI director down have said repeatedly that the FBI-Apple legal brouhaha was just about a single phone—the seized iPhone used by Syed Farook, one of the San Bernardino shooters. And just last week, James Comey, the FBI director, said his fight with Apple wasn’t about setting precedent; rather, it was about battling terrorism.

But it seems that the storyline has changed.

It was, of course, always about precedent.

Let me start off here, on a sarcastic note:

“World, terrorists, criminals of all stripes, American hardware & software is backdoored.”

Brilliant idea.

Get that loudspeaker and keep shouting about how American products are unsafe.

I do not mind admitting the fact that a few years ago I heard from the manager of the division of the company that found security issues for the FBI that ‘this is the only vulnerability farm the FBI has’.

Put another way: they both vastly overpaid and, at the very same time, have far fewer people and resources to put towards these efforts than the professional agencies who focus on these areas, like the CIA Department of Science and Technology, the NSA, and Military Intelligence.

Is this a “leak”? No. It is certainly classified information for many people. Not for me. That manager has since left his job, and the company which had this division merged with another company.

I like the DoJ and FBI. I have more ties there than anyone who is not multi-generational.

But, their stance on this issue is extremely detrimental not only to the interests of the United States, but also to every nation committed to the principles of liberty and righteousness.

(I certainly welcome any investigations otherwise, always do, and they will always get shut down.)

I will only otherwise note that as for the FBI, or anyone else, being forced to reveal their security vulnerabilities? Not going to happen. Good rhetorical device. But, reality is the FBI has a minute .00001% of a fraction of a budget of the Big Boys.

The FBI is great at kidnapping cases, serial killer cases, some other cross state types of physical crime.

But, for computer security? They are completely and thoroughly penetrated.

Both by spies like me, and electronic surveillance they certainly can not detect.

And this certainly is for the best interests of the world.

Dirk Praet March 30, 2016 9:10 PM

@ Bruce, @ all

That’s iOS 9.3. You should download and install it right now.

WARNING! There’s something very wrong with the iOS 9.3 update that causes it to brick older devices such as iPad 2. Users of such devices should wait for a new upgrade that Apple has said will be made available in the days to come.

@ Z

The issues I have with the narrative of “all vulnerabilities should be disclosed and patched” are:
1) It completely removes the usefulness of hacking from a state point of view.

I can hardly imagine any US or other company having a problem with their IP no longer being stolen by state sponsored Russian or Chinese hackers.

Worst, most vulnerabilities are actually exploited after they have been found and disclosed, and before they are patched

Responsible disclosure means informing the product vendor first and giving them sufficient time to patch the problem. That’s exactly what the guys at Johns Hopkins did with the iMessage flaw they had discovered.

@ Tõnis

In one of its briefs, the government argues something to the effect that the public “needs” or “deserves” to know what is on that phone.

Well, me for one, I am very curious about what interesting info they have found on that phone. Especially regarding that “dormant cyber pathogen”. But I guess that now such findings must be totally classified as a matter of national security, overruling anything they previously said about it.

Ambiguous March 30, 2016 9:18 PM

@’was it Cellebrite’

I think it was.

But, circumstantial evidence.

Accuvant Labs was the vulnerability farm for the FBI a few years ago. Charles Miller & Josh Drake, both primary cell phone security bug finders, worked for them at the time. They were the headliners at last year’s Las Vegas Black Hat.

Shawn Moyer, was the manager of Accuvant Labs.

Josh Drake didn’t really exist until his 30s, when he started to find security vulnerabilities. Charlie Miller is the son of a policeman in St. Louis who, like Dave Aitel and Butler of the “B” in HBGary, was a teenage recruit in a special NSA program, gone astray. Or so the story goes.

Since then, Accuvant has merged with Fishnet Security, to become “Optiv”.

Great company. Hidden secrets.

Unfortunately, paying for such rock stars means a lot of money. So, who can you offshore to? India? Too easily penetrated by Russia and China. China?

This manner of job would easily cost upper six figures from Americans.

From Israelis, too!

But, why not charge only 10K?

If that could be publicized?

And maybe attract the agencies who spend real money and have real objectives. Revealed from surveillance and human intelligence techniques so super secret, it can well be said, “It is so scary how deep we have penetrated them”.

What I laugh about. Is that guys like Putin really believe they can trust their top advisors. That nations like China actually think – their top leaders anyway – that they have secrets.

That???

Is what gets me off.

That they think they have secrets.

Daniel March 30, 2016 9:18 PM

A vulnerability in Windows 10, for example, affects all of us who use Windows 10.

Us…us..are you really trying to imply that you use Windows 10?!!

r March 30, 2016 9:39 PM

Something I think a few of you may be overlooking: since this is essentially a request for forensic data, the primary stipulation of a non-destructive method may rule out keeping the method from Cellebrite (or whoever) undisclosed… Now I know that all they are after is the Rolodex/calendar, but I would hope that for any evidentiary hearings the method was not a black box.

Z March 30, 2016 9:45 PM

@Dirk Praet

I can hardly imagine any US or other company having a problem with their IP no longer being stolen by state sponsored Russian or Chinese hackers.

Again, this assumes that “vulnerability hunting” is a sensible security strategy. But it’s not. There are too many vulnerabilities out there to expect us to find and patch them all. Disclosing one will not change anything, as all the attacker has to do is find another zero-day, and plenty are discovered all the time.

Responsible disclosure means informing the product vendor first and giving them sufficient time to patch the problem. That’s exactly what the guys of Johns Hopkins did with the iMessage flaw they had discovered.

iMessage can be patched using a centralized update system, but plenty of software does not benefit from such functionality and must be patched manually. There are still plenty of SSL implementations vulnerable to Heartbleed out there. Not because people are dumb, but because patching is hard and slow, and sometimes you can’t even do it without breaking other things. The amount of time and resources necessary to mitigate such vulnerabilities completely dwarfs the resources necessary to find them and create an exploit. As long as this remains true, the advantage lies largely with the attacker.

Buck March 30, 2016 10:04 PM

@Evan

If we somehow rolled out secure systems for everything in the government tomorrow, a lot of people in the intelligence/law enforcement community would find themselves struggling to justify their existence.

True, unless there’s also a way to simultaneously repurpose their existing skills for operating in the new security paradigm…

@Gweihir

This is “extortion-kit” malware that does not need a lot of understanding on the side of the criminal, and it is getting pretty sophisticated and effective in the face of pathetic-level defenses.

My cynical-self tells me that the CIA (and their foreign counterparts) think the black market for drugs will be running dry within the next 20 years or so. Thus, they have to find new sources of black money to fund their dark arms sales… I find that thought a bit more comfortable than what has happened before, but still! :-\

@Z

Point 1)

It completely removes the usefulness of hacking from a state point of view.

I totally disagree. Zero-days can be used against all appropriate targets (as well as plenty of others, to hide intent) just before, during, or immediately after disclosure to install customized and obfuscated remote access methods. As long as widespread vulnerabilities are not extremely rare, disclosing used ones does not much affect their utility from a nation state’s point of view.

Point 2) I completely agree with you here.

sooth_sayer March 30, 2016 11:02 PM

There were only 3 possible outcomes ..

  1. Apple wins – and phone can’t be unlocked — that’s bad for society at large.
  2. Apple loses and Govt can FORCE it to unlock phones — public humiliation — loss of marketing battle and ZERO gain for Apple.
  3. This — i.e. Govt. unlocks it without saying how — this is bad for Apple by an order of magnitude — the rocket scientists should have known their software has as good a possibility of bugs as any other. But I think they think their s*** don’t stink.

There is still a possibility that the leak is from Apple internally — might even be official.

In the end Apple would have lost this battle – if not today then in 5 years — maybe after a bigger terrorist incident when no one will care about the privacy issue for the moment.

RonK March 30, 2016 11:45 PM

@ Gweihir

They will need to eventually use evidence gained this way in court and then
they will have to disclose what was done in order to prove that the evidence
was not tampered with.

There is a well-known procedure to get around this problem.

That, and the relatively ephemeral effectiveness of any particular hacking procedure given current market turnaround times would lead me to think that the FBI for the most part won’t be revealing anything worthwhile.

Ambiguous March 31, 2016 12:11 AM

@Buck

I’m wondering what the folks here would think about this sort of compromise:

Approach #2: Options and Notice
Here’s an approach that I haven’t seen proposed anywhere, one based on consumer options and transparency: What if Congress required that encrypted services (storage or communications services or both) have what one might call an “Emergency Access Mode” as an option available to consumers?

Look. Corney, or whatever his name is, is a complete idiot. Literally. I mean, so, the guy will retire soon, or have some horrible accident, or some horrible person disclosure. Blah blah blah.

Maybe it is cancer of the brain.

Let us all give him a twelve gun salute.

Reality is, these guys are blasting across the world a singular message.

“American hardware and software is untrustworthy, we are backdooring it.”

So, they pass on.

Period. That is it.

They did all they could to fuck up real intelligence, and they did it in ignorance. So, they die early.

When we are over this hill, it will be good.

I can not confirm nor deny…

Sorry for his accidental death…

Reality is… we cull those who destroy us. The founding fathers well pointed out, our own government is the greatest threat… so… you have us.

The Secret Service.

And the FBI is welcome to investigate me when their leader dies of natural causes….

Ambiguous March 31, 2016 12:32 AM

@ianf

Let us offshore all encryption.

Terrorists! We of America are going to backdoor all American Software and Hardware!

Foreign Spies!

We have backdoored all American software and hardware!

Reality is, while I complained about the take down of Patreaeus… I gave them the clues to do so.

Sorry.

I am the bad guy.

I liked him. But I took him down.

So, Sympathy of the Devil.

When they die of stuff like brain cancer…

Can you finger us?

Waaaa.Waaa. Waa.

You caught an american conspiracy. Same one behind jfk.

Oh, wait.

We will get away with this, too.

Goodbye, corney. I won’t make fun of your last name at your funeral. And, like, make corn dog jokes.

Really, I won’t.

😛

trsm.mckay March 31, 2016 1:00 AM

There is another option – perhaps the process required hardware hacking as well (there has been speculation about that in previous posts on this topic). Bruce over-simplified when he said:

The problem with computer vulnerabilities is that they’re general. There’s no such thing as a vulnerability that affects only one device.

There are attacks against HW that are quite expensive to perform (decapping and performing an analysis). When I design systems using HW, my goal is to make attackers pay the entire price for each system they attack. You make all private and symmetric keys in the device unique, use PFS transactions, avoid sharing secrets across devices, design mechanisms that help detect HW devices that have been compromised, and provide solid methods of revoking trust.

This can be pretty effective for protecting mass-produced devices like set-top boxes, point-of-sale terminals, and perhaps mobile phones. You spend all that money to attack a device, and all it gets you is the secrets of that particular device. By contrast, centralized HSMs (used for banking and the like) contain secrets that are worth the cost of attacking a single device. So a HW-based attack only makes sense if the device contains important enough information; a cell phone that contains valuable information about terrorism might qualify (though in this specific case, the phone being one the shooter decided not to destroy, that seems pretty unlikely).
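The per-device-key design described here can be sketched in a few lines of Python. This is purely illustrative (the function names and the key store are mine, not any real product's API): each device gets an independently generated key at provisioning, never derived from a shared master, so an expensive physical attack on one device yields only that device's secrets.

```python
import hashlib
import hmac
import os

def provision_device(serial: str, key_store: dict) -> bytes:
    """Generate and record a unique 256-bit symmetric key for one device.

    The key is fresh randomness, not derived from any shared master, so
    extracting it (e.g. by decapping the chip) reveals nothing about any
    other device's key.
    """
    key = os.urandom(32)  # independent randomness per device
    key_store[serial] = key
    return key

def device_mac(key: bytes, message: bytes) -> bytes:
    """Authenticate a message under this one device's unique key."""
    return hmac.new(key, message, hashlib.sha256).digest()

store: dict = {}
k1 = provision_device("SN-0001", store)
k2 = provision_device("SN-0002", store)
# Compromising k1 tells an attacker nothing about k2, and MACs made
# with one device's key do not verify under another's.
print(k1 != k2)
```

The design choice being illustrated is exactly the "pay the entire price per device attacked" property: the attacker's cost does not amortize across the fleet.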

Given the supposed timing (~2 weeks, but of course it may have been longer), my guess is some type of side-channel emission analysis (DPA, etc.).

Clive Robinson March 31, 2016 1:02 AM

@ Z,

There are too many vulnerabilities out there to expect us to find and patch them all. Disclosing one will not change anything, as all the attacker has to do is find another zero-day, and plenty are discovered all the time.

You are looking at vulnerabilities in the wrong way, that is that each one is unique, they are not.

The simplest way to look at it is that each vulnerability is an “instance” in a “class” of vulnerabilities.

Thus the “fix strategy” of a deffender should be to “fix the class” not “fix the instance”.

The problem then is determaning new classes of attack that are currently unknown, or not yet sufficiently determined. Without making fixes to broad and thus potentialy harmful.

For instance, an overly broad class is “network attacks”; fixing that is easy (“disconnect from the network”), but that has obvious harms. Thus you need a reduced class scope. “SSH attacks”, again, is too broad. However, what about “fallback attacks”, where a MITM downgrades the security? Stopping fallback is easy to do and in many cases is not harmful. In the cases where it is harmful, making it not harmful may be a case of simply upgrading software, something that should be done anyway.
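As one concrete instance of fixing the “fallback” class rather than a single negotiation bug: Python's `ssl` module can refuse protocol downgrade outright (a sketch; `TLSVersion` requires Python 3.7+):

```python
import ssl

# Kill the whole "fallback attack" class at once: refuse to negotiate
# anything below TLS 1.2, so a MITM cannot downgrade the connection
# to an older, breakable protocol version.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

assert ctx.minimum_version == ssl.TLSVersion.TLSv1_2
```

Clients too old to speak TLS 1.2 will fail to connect, which is exactly the "upgrade the software anyway" case the comment describes.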

Clive Robinson March 31, 2016 1:08 AM

@ Sooth Sayer,

1. Apple wins – and phone can’t be unlocked — that’s bad for society at large.

That “bad for society at large” is a very, very large assumption on your part, and I would argue a false one when viewed from the perspective of society at large, not the narrow one of law enforcement.

Godfrey Holdcroft March 31, 2016 1:29 AM

My money is on a side-channel attack. Apple may have already closed the problem, as the iPhone in question was the older 5c. As far as I’m concerned, the mobile phone network is fundamentally insecure anyway, so the takeaway is: don’t use your mobile for anything you want to keep secret.

The other takeaway was that the DOJ was going to lose using the All Writs Act anyway.

Wael March 31, 2016 2:13 AM

@Clive Robinson,

The simplest way to look at it is that each vulnerability is an “instance” in a “class” of vulnerabilities.

I would say fix the root cause by not violating relevant security principles which apply at every stage of design and implementation. This is what’s meant by “class” vs. “instance”. Won’t give links. Boy oh boy, are we in a sour mood today 😉

ianf March 31, 2016 3:03 AM

@ Wael, “would say fix the root cause by not violating relevant security principles which apply at every stage of design and implementation.”

Rrrright. Only you forgot to factor in the outcomes of the laws of Good Intentions; Wishful Thinking; Unintended Consequences; and the “Not Invented Here” principle.

Sorry, no links, haven’t had my morning tea yet.

Sam March 31, 2016 3:12 AM

I couldn’t see it mentioned here, but I thought it was worth a moment’s thought.

There was an article I read the other day that described a hardware hack (desoldering the NAND flash) that would allow multiple attempts to bypass the passcode. If this is the case, there is nothing Apple can do about it, as it is a hardware attack.
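The NAND-mirroring idea can be illustrated with a toy model (entirely hypothetical names and numbers; real flash mirroring is a delicate hardware procedure): image the flash, burn through guesses, and restore the image whenever the attempt counter nears the wipe limit.

```python
import hashlib

class Phone:
    """Toy model: the attempt counter lives in NAND flash,
    which an attacker can image and restore at will."""
    MAX_TRIES = 10

    def __init__(self, pin: str):
        self._digest = hashlib.sha256(pin.encode()).hexdigest()
        self.flash = {"tries": 0}  # wipe trigger stored in flash

    def try_pin(self, guess: str) -> bool:
        if self.flash["tries"] >= self.MAX_TRIES:
            raise RuntimeError("device wiped")
        self.flash["tries"] += 1
        return hashlib.sha256(guess.encode()).hexdigest() == self._digest

def mirror_attack(phone: Phone):
    backup = dict(phone.flash)  # desolder and image the NAND once
    for pin in (f"{i:04d}" for i in range(10000)):
        if phone.flash["tries"] >= Phone.MAX_TRIES - 1:
            phone.flash = dict(backup)  # restore the image, counter resets
        if phone.try_pin(pin):
            return pin
    return None
```

Because the retry counter can always be rolled back, the ten-attempt limit never bites and a four-digit PIN falls to plain enumeration, which is why moving the counter into tamper-resistant hardware (as later Secure Enclave designs do) closes this class of attack.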

Personally, I was hoping the FBI would give the phone to John McAfee to break; his claim that it could be cracked mainly by social engineering was interesting, if a little doubtful … but it would have done wonders for his presidential campaign.

Petter March 31, 2016 3:19 AM

I’m thinking that the FBI has had access to the phone through this “secret” way all along, but thought they had a good chance of pulling one off.

They hoped they could use a dead terrorist’s phone to get a way into any future phone via a precedent.
I mean, who would say no to peeking into a terrorist’s cellphone?

Even if not all of us saw through the evil stunt they tried to pull off, many of us reacted to the way they wanted to cripple everyone’s security by backdooring our phones through legal means.

Clive Robinson March 31, 2016 5:31 AM

@ Wael,

Now you are making me feel guilty 🙁

@ ianf,

Did not your mother tell you breakfast was the most important meal of the day? Always to be served in the breakfast room? Further, that a gentleman will always drink tea with his first repast of the day? With coffee, like alcohol, reserved for after the sun is past the third quarter?

To do otherwise would be acting continental, which a gentleman would surely not do, because it would be behaving as the French do, and that would be quite intolerable; for goodness’ sake, it’s probably why they eat horses.

And people complain about dropping standards… Well, I ask you 😉

Specied5618 March 31, 2016 5:41 AM

It could be a major poker move, for the bigger picture.
You can see the “bad guys” ditching their iPhones for something else now, moving to something the men in black can more easily break.

Clive Robinson March 31, 2016 6:01 AM

@ Bruce,

The FBI is doing the exact opposite. It has been given whatever vulnerability it used to get into the San Bernardino phone in secret, and it is keeping it secret.

There appears to be some ambiguity over the tense of the FBI’s use of secret.

The general view is that they are “going to” make it secret, not that “it was” secret at the time it came to their attention.

This would indicate that it did not come from a US government agency or another government’s agency, where it would already have been classified as secret.

As for a company asking for the method to be kept secret, the general rule is that it would be classified at the lowest level, “confidential”, for several good reasons, not least because any higher classification would prevent the company from using the method as they see fit. It would almost certainly not be used for information coming from a foreign company.

Which brings up the question of “national security”: this would be a reason for the US government classifying the work of a US national or company at secret, but the method would generally have to have strategic value.

All of which raises more questions than it really answers…

wiredog March 31, 2016 6:14 AM

FBI plans to classify this access method and to use it to break into other phones in other criminal investigations.

And as soon as they introduce the cracked phone into evidence the method becomes subject to discovery by opposing counsel and disclosure to the jury, after which it is no longer secret. Remember that the FBI is primarily an interstate law enforcement organization. Counter-terrorism/counter-intelligence are high visibility, but going after bank robbers is the bread and butter. Lots of work going after organized crime, too. And the Families have been using crypto for decades.

Clive Robinson March 31, 2016 6:29 AM

@ Specied5618,

No, it’s not likely, after the Mexican “tunnel boss’s” use of BlackBerry phones was shown to be poor OpSec.

I suspect the “old ways” will get looked at again, and the higher-end criminals will use old-school “spy craft” for non-urgent communications, and second-hand “burner phones” for urgent communications, or burner wireless modems through an anonymous network of some form.

They might try a variant of “The Moscow Park Plastic Rock”.

But if they have any sense they will take the encryption off of equipment vulnerable to communications attacks.

Mark March 31, 2016 7:22 AM

I personally think that the FBI hedged its bets. The six weeks — was it six weeks? I’m sure that I read that somewhere — during which this whole thing was made public sounds about enough time to onboard a new vendor, get the contracts and legal business sorted out, and then start to crack these phones.

Wael March 31, 2016 8:00 AM

@Clive Robinson,

Now you are making me feel guilty 🙁

Sweet! 😉 I was just kidding, I’ll continue the way I am.

CallMeLateForSupper March 31, 2016 8:10 AM

@Bruce
“…far too click-bait headline.”

Yeah, I balked at the headline when I saw it yesterday. “Misleading” is what immediately came to mind. So I was very surprised when, after I’d read the article, I scrolled up to see who’d authored it.

r March 31, 2016 8:13 AM

@trsm.McKay,

That’s kind of the perspective I’m leaning towards… Copying the flash prior to potentially destructive software exploitation, recovering the PIN, then restoring the flash to the original device.

Something else I was thinking about was the old Windows NT source code floating around publicly. Considering that companies have to provide source code for auditing to governments, Apple’s source or debugging information may be available to an Israeli company, imo.

@wiredog,
It’s already public knowledge they were seeking and/or were offered a hack; it should be a non-issue to classify the exact details as security-sensitive. It could even be that Apple aided them under a stipulation of a nom-de-plume. Parallel construction, anyone? Anyway, more to the point, they’ve demonstrated, as in the cases of Tor and Firefox, that their methods don’t require exposure to anyone other than a judge.

CallMeLateForSupper March 31, 2016 8:51 AM

@Evan
“…a lot of people in the intelligence/law enforcement community would find themselves struggling to justify their existence.”

Interesting to think on that for a moment. TLAs don’t justify their existence now. Since dang near everything is secret, there are no fine-grained metrics by which outsiders can gauge TLA performance and efficiency. Even Congress is handicapped in this regard, despite the fact that oversight is, by law, its responsibility.

Jason March 31, 2016 9:37 AM

I think this ended well. Consider:

Apple has had its nose publicly rubbed in the fact (assuming the FBI isn’t lying) that a vulnerability exists, at least in older models. They can now search for and close this vulnerability, possibly for all phones, possibly only for new models in which the hardware is designed to close the hole.

Apple has learned that the ability of a phone to accept forced updates is VERY bad for security. Closing this vulnerability also has the advantage of making the All Writs Act useless as Apple will have no way to open the phone.

Apple has learned that a large percentage of the population is concerned enough about privacy to consider purchasing a phone reasonably safe from government snoops. That’s a pretty good market, one well worth the time and effort to develop. Probably not for everyone, because if you forget your password, a secure phone will turn into an expensive brick, one even Apple can’t open. But if Apple offers a “this is secure to the best of our ability” phone, they will have a long line of people waiting to buy one.

Most people who want or need a secure phone don’t need one that can’t be cracked under laboratory conditions; such a thing may never exist. We need a phone that is very time-consuming and expensive to crack, with a high probability of destroying the stored data during the attempt. We don’t want a phone that can be plugged into a black box at a traffic stop, police station, or customs station and popped open in a few seconds. We don’t need a phone in which the firmware can be accessed externally when the phone is off or during boot-up. Any external ports should be completely disconnected (except for battery charging) until the password is entered. We don’t need a phone that can be sent a targeted, forced, and silent over-the-air update that installs a back door turning it into a tracking device.
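The “ports disconnected until the password is entered” requirement from the list above could look something like this as a state machine (a hypothetical sketch, not any shipping design; all names are invented):

```python
class SecurePhone:
    """Sketch of the policy: data lines are electrically dead while
    locked, so a black box at a checkpoint gets charging pins only."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self.unlocked = False
        self.data_port_enabled = False  # charging only while locked

    def enter_passcode(self, guess: str) -> None:
        if guess == self._passcode:
            self.unlocked = True
            self.data_port_enabled = True  # connect data lines now

    def read_over_usb(self) -> bytes:
        if not self.data_port_enabled:
            raise PermissionError("data lines disconnected while locked")
        return b"user data"
```

The design choice is that unlocking is the only transition that powers the data path, so firmware access during boot or while powered off simply has no electrical route to the outside.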

Apple has learned a lot from this episode. Let’s hope they take advantage of it.

Dirk Praet March 31, 2016 11:05 AM

@ Z

Not because people are dumb, but because patching is hard and long and sometimes you can’t even do it without breaking other things.

So what? Love is hard too, but that doesn’t mean we give up on it. Every admin with half a clue knows that zealous patch and release management is essential to maintaining a secure infrastructure, whatever way vulnerabilities have come to light. It’s part of his job description, and so is acquiring the skills for doing so in a methodically sound way, even with limited resources.

I’m still hearing this argument way too often from engineers and managers as a really poor excuse to run hopelessly outdated and insecure environments that are totally begging to get breached. Sorry, but not on my watch, and, over the years, I have had more than one of them fired after a devastating audit report. Wishing vulnerabilities to go away and folks not to hunt, exploit and publish them is not helping anyone. You can plead or legislate the good guys out of doing so, but that won’t stop the bad ones, including state actors.

The net result would be everyone getting less secure. You may just as well argue against regular health check-ups because doctors and medication are expensive, treatment inconvenient or cumbersome, but it doesn’t change the potential presence of a condition that is just waiting to hit you in the face some day. With much bigger consequences for your health than if you would have treated it preventively.

So my advice to you is very simple: get your act together and embrace vulnerability and patch management, wherever it is you’re working. Or be very, very afraid if ever a really anal security auditor like myself comes walking through the company’s door. Because we will have your *ss.

ianf March 31, 2016 11:29 AM

Hold your horses, Jason, ease up on that brick pedal: […] “if you forget your password, a secure phone will turn into an expensive brick, one even Apple can’t open.”

Not necessarily so. If no password, then the worst that might happen is that user contents are wiped out, and the phone can only be “reborn” as a virgin unit. It all depends on how the relevant OS function has been written and configured (wipe after X error entry attempts, etc). So not wholesale doom yet.

Bricking via remote control (and on last registered AppleID phone owner’s say-so) is then reserved solely for stolen units to render them inoperable=worthless.

    While smartphones are high on street robbers’/muggers’/thieves’ “acquisition list,” ever since remote iOS bricking became an option, the iPhone can hardly be as desirable an item as it once was. I even read a report of a Parisian(?) who managed to brick his stolen unit when it lit up online in Vietnam after several months. That’s quite a theft deterrent.

[^*] As for forgetting… today I realized that I no longer remember the cable # of a specific channel that I apparently watch too little to bother with. As I’m too young for Alzheimer’s (if too old to rock and roll), better I stick a Post-It note with that # at the back of the telly, and then remember that I’ve stuck it there!

Z March 31, 2016 11:31 AM

@Dirk Praet

I believe you missed the point of my comment.

I’m not saying we shouldn’t “embrace vulnerability and patch management”; I’m saying it is ridiculous to expect those to fix the problem of data insecurity. Ten years ago people could have made this claim, but not today. In a previous post you said, “I can hardly imagine any US or other company having a problem with their IP no longer being stolen by state sponsored Russian or Chinese hackers.”, which assumes that you DO believe this is a solution, that we just have to be more serious about it and somehow we’ll address our problem. But that’s just not true.

Who cares if managers and engineers should do it when they don’t? Wishes don’t fix jack. I don’t deal with “should”, I deal with what is actually being done in the industry. If best practices were to save us we would have been out of jobs a long time ago.

You compare those practices with health check-ups. But even if we all did health check-ups all the time, nobody would claim that they would protect us against illness and disease. They would certainly help, but we would still get sick in the end.

Derek March 31, 2016 11:33 AM

So, I disagreed with the stance a lot of people took and with what Apple put out. The original order was not the first time the AWA (All Writs Act) had been used, nor the first time it was used against Apple. There was nothing precedent-setting in this case. Not to mention that the actual owner of the phone (the shooter’s employer) gave permission. The order itself wasn’t asking Apple to break the crypto or backdoor it. It was asking Apple to turn off some features that make it harder to brute-force access to the iPhone; more of a handicap ramp to the front door. The FBI would still have had to brute-force the phone in order to access it. Calling anything in this case a backdoor to crypto is irresponsible and blatantly plays the fear, uncertainty, and doubt card.

Apple’s letter (Tim Cook’s) ignored the fact that they had previously assisted law enforcement over 70 times in the past. Their privacy policy still allows them to unlock phones made before encryption by default (the majority of the 70+ were this situation). Apple conveniently left out that they hold master keys to iCloud backups and still provide iCloud backups to law enforcement unencrypted (they even provided the iCloud backups for the iPhone in this case). If Apple were serious about customer privacy and the protection of encryption, they would not have the master keys to iCloud backups.

On a final note, had Apple assisted the FBI to begin with, as they have in the past, they could have built into their devices a way to prevent that assistance from working in the future. From a company perspective, if Apple did the same things as other large tech companies, they might have better results. Apple has a bad rep for working with researchers, provides no bug bounties, etc. Apple is a company a decade or more behind in security practices.

The debate on private companies buying and selling 0days instead of disclosure and remediation is a separate debate and goes far beyond government surveillance and privacy.

r March 31, 2016 11:40 AM

@ianf,

Something to be aware of, something my girlfriend is paranoid about… To those that are of an AB blood type: you are extremely vulnerable to very early onset dementia and memory issues.

David Leppik March 31, 2016 11:42 AM

@Z: State-level actors will always have tricks up their sleeves that are infeasible to others, such as bugs or human informants. But these have much different characteristics than software hacks. They are more expensive and more targeted, and some even require a degree of human loyalty that is hard to buy.

Right now, if you think of your phone as an extension of your brain– or if you focus on how much easier it is to track people these days– we’re just trying to get back to the level of personal security/privacy we had in 1990. Back then, the government actually had to follow people around to track them, and people didn’t have lie detectors strapped to their wrists all day long. And yet they managed to solve lots of crimes and gather lots of intelligence.

And that’s where we want to be in terms of the economics of surveillance. It was possible for East Germany to track an entire population, but they had to spend a significant portion of their Gross Domestic Product to do it. Mass surveillance was a deliberate choice for a dictatorship, not something a crime syndicate could afford or a democracy could accidentally slip into.

Clive Robinson March 31, 2016 11:48 AM

@ Z,

… because patching is hard and long and sometimes you can’t even do it without breaking other things.

In my limited experience, “breaking on patching” correlates with developers “pushing the envelope” or CV-polishing with “use of the latest tech”.

Whilst it’s understandable for developers to want to use what’s new, it’s actually bad practice and should be strongly discouraged, not just for “mission critical” systems but for all development that could be reached, now or in the future, by people likely to be hostile. It thus does not belong on business systems, full stop, because it’s asking for trouble.

Tom M March 31, 2016 11:52 AM

What are the chances they accessed the phone using its fingerprint reader using the exploit first reported in 2002 using gummy candy and a PCB?

David Leppik March 31, 2016 11:56 AM

@Derek

Apple provided all the assistance that they had provided on the previous 70 iPhones. Those phones were much easier to access, and didn’t require a custom OS. Also, iCloud has keys to the data, so that they can add and remove devices, and so that when users forget the device password, they don’t lose all their photos. Apple may tighten security, but at the cost of worse customer service.

@Sam

There was an article I read the other day that described a hardware hack (desoldering the NAND flash) that would allow multiple attempts to bypass the passcode. If this is the case, there is nothing Apple can do about it, as it is a hardware attack.

It’s not impossible that Apple could develop future chips that make this harder. For example, use an adhesive (not just solder) between the chips that makes them even more difficult to take apart without destroying them.

HJohn March 31, 2016 1:41 PM

@Douglas Knight • March 30, 2016 5:25 PM

Isn’t a simpler explanation that the FBI just perjured itself to get out of the court case?


It’s also possible they wanted out of the case while wanting ISIS to think they can break their communications. I mean, if they can’t break it, they may want to trick ISIS into not using it or into trying something else.

I’m not saying that’s what happened, or that I would agree with them if they did it. But I can definitely see why they wouldn’t want to tell ISIS “this tactic of yours is too good for us to crack.”

albert March 31, 2016 3:11 PM

Available for download from Cellebrite’s website:

UFED_Classic_Supported_Phone_List_4 0.zip

It’s a spreadsheet listing all supported phones and the various operations that can be performed on each model using Cellebrite’s s/w or h/w. (Also available: UFED_PA_Userguide.pdf)

I leave it to you experts to determine if the functionality required by the FBI was already included in the spreadsheet listings. If not, then it seems to me that:

  1. Cellebrite included the functionality recently (after the list was created), or
  2. They, or someone else, physically hacked the phone, or
  3. The FBI already had the s/w, and knew the capabilities.

As I pointed out earlier, Cellebrite’s product can do remarkable things.

. .. . .. — ….

Rick Starr March 31, 2016 3:21 PM

Terrible example. “This is the way vulnerability research is supposed to work.” OK, what do you suppose would have happened if Apple had refused to patch the vulnerability (the way they refused to obey the court-approved order to unlock the phone)? I suspect PUBLICITY, exactly as happened.

PS: If you don’t like the checks and balances of the Constitution, work to change it. Sure, there have been times (especially recently) when the government has over-reached, but this isn’t one of them. The guy is dead. The owner of the phone wants it unlocked. The FBI wants it unlocked. A valid search warrant was served. A court reviewed and validated the search warrant in the first trial. And for all the “it’s too expensive” and “it’s too hard”, it took about a week from the time somebody else stepped up and said “We can do it” to “We have done it”, so, er, how hard would it have been for Apple to just do it, as they have done before, and as the telecommunications companies have done before?

Eager March 31, 2016 3:48 PM

@Jason

Apple has learned that the ability of a phone to accept forced updates is VERY bad for security. Closing this vulnerability also has the advantage of making the All Writs Act useless as Apple will have no way to open the phone.

Forcing updates isn’t a problem; it’s good for security because most users wouldn’t bother otherwise. Look to Android for proof of that.

The problem is the OS allowing updates WITHOUT authentication – i.e. updates to firmware etc. should only be allowed on an unlocked device.
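Eager's rule — a vendor signature alone isn't enough, the device must also be unlocked before accepting an update — can be sketched as follows (hypothetical names; HMAC stands in for real code-signing):

```python
import os
import hmac
import hashlib

SIGNING_KEY = os.urandom(32)  # stand-in for the vendor's signing key

def sign(blob: bytes) -> bytes:
    # Vendor signs the update image (HMAC as a toy signature scheme).
    return hmac.new(SIGNING_KEY, blob, hashlib.sha256).digest()

def apply_update(blob: bytes, sig: bytes, device_unlocked: bool) -> bool:
    # Both conditions must hold. A valid vendor signature by itself is
    # rejected, which is what would block a compelled "forced update"
    # being pushed to a locked phone.
    if not device_unlocked:
        return False
    return hmac.compare_digest(sign(blob), sig)

fw = b"firmware v2"
assert apply_update(fw, sign(fw), device_unlocked=True)
assert not apply_update(fw, sign(fw), device_unlocked=False)
```

Under this policy Apple could still ship a signed image, but it could not be installed on the San Bernardino phone without the passcode, making the compelled-assistance order moot.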

@All

Apple to FBI: Please Hack Us Again
http://www.thedailybeast.com/articles/2016/03/31/apple-to-fbi-please-hack-us-again.html

The tech giant now wants to know how the FBI cracked its seemingly secure device, and Apple is using another active court case—this one in New York—to do it.

In a letter to a U.S. district court judge last week, Apple lawyer Marc Zwillinger practically invited the FBI to hack the New York phone using the same method it employed in the San Bernardino case. Apple also has refused to help unlock the New York phone, which was used by a confessed methamphetamine dealer.

“If that same method can be used to unlock the iPhone in this [New York] case, it would eliminate the need for Apple’s assistance,” Zwillinger told Judge Margo Brodie. In other words, successfully hack us again, and the case is closed.

However, if the FBI or the Justice Department says the San Bernardino method won’t work on the New York phone, it should have to say why, Zwillinger argued.

FBI to Unlock iPhone in Arkansas Case After San Bernardino Hack
http://www.nbcnews.com/storyline/san-bernardino-shooting/fbi-unlock-iphone-arkansas-case-after-san-bernardino-hack-n548366

"Yoshi2" March 31, 2016 4:18 PM

I’m in favor of law enforcement and counter-terrorism. Saving lives is more important than pampering consumer mental insecurity.

If people live dangerous lifestyles that require high encryption, they should get off of the iOS / iPhone / iPad platform and onto something else, or just live with it. Maybe they need to re-examine their lifestyle if they are reliant upon consumer-grade products for security.

It makes sense to keep some decryption techniques available to law enforcement and counter-terrorism at all times… so they can do their work. Saving and protecting lives is their work.

There are aspects to cryptology that go way beyond just the mathematics and the computer science. People feeling upset about these technical decisions could perhaps benefit from re-examining for themselves the many definitions of secrecy, illusion, security, anonymity, privacy, etc. Nothing happens in a social vacuum. People feeling persecuted perhaps need to reprioritize their ethics.

It doesn’t seem like the FBI did anything wrong at all in my humble opinion.
I’m thankful that they exist, and I’m glad they have some workable options.
I also congratulate them on the Silk Road break they made several months ago.

I’m really not interested in protecting the conveniences of terrorists and criminals.
Probably I’m not the only one with this stance.

Clive Robinson March 31, 2016 5:35 PM

@ Yoshi2,

With regards “law enforcment” and your assertion of,

Saving and protecting lives is their work

I don’t know who led you to believe that, but there is no legal requirement for them to do so in most jurisdictions.

With regards the rest of your comments I’m assuming they are your own ill considered ideas?

Sancho_P March 31, 2016 5:41 PM

@“Yoshi2”

”I’m in favor of law enforcement and counter-terrorism.”
+1

”If people live dangerous lifestyles …”
It was part of my former job to work on both sides of hostile borders.
I did it for my country, my company, customers, people on both sides of the border, and, of course, money.
Have you ever been abroad, working with foreign authorities?
Can you imagine at least some of them being corrupt and dishonest?

”It makes sense to keep some decryption techniques available to law enforcement …”
See, if only they can hide we are lost.

Sancho_P March 31, 2016 5:44 PM

@Eager

”… i.e. updates to firmware etc. should only be allowed on an unlocked device.”

Um, nope, it doesn’t help as a general remedy, just in this particular case (the owner is already dead).
Imagine they want access to a criminal’s device: they’d silently prepare the update, wait until the suspect unlocks the phone – bang –
then take him out.

Sancho_P March 31, 2016 5:53 PM

@Derek (and in part @Rick Starr, @“Yoshi2”)
”So, I disagreed with the stance a lot of people …”

If you still do I’d offer my personal point of view:

Apple, particularly Tim Cook, doesn’t want to support serious crime and terror.
Surprised?

So they did what could be done to help LE, probably not telling their customers loudly and to the full extent what was possible for the manufacturer [1].
They acted in good faith, silently, like a moderator between LE and customer demands, going as far as they could.
[Knowing some of Tim Cook’s morals, I basically support that concept of trust.]

It seems LE’s growing demands and some revelations / public discussions (ask Ed Snowden for details?) increased Apple’s sensitivity, and they started to react (encrypt).
But LE was pressing even harder, Tim Cook got (still quietly) upset.

Finally, LE, in a surge of power, tried to force Apple in public to break (here I disagree with Derek’s downplaying statement) their own security measures, without any chance to negotiate behind closed doors, as always done before.

This has been a very stupid move by the LE.

Tim Cook was forced to pull the emergency brake instead.
Surprised?

The rest of your post sounds a bit like a rant but I won’t bite, not being an Apple fanboy at all (probably because we use iMacs).

[1]
It seems even the average user (like Farook) knew that backups, if enabled, could be decrypted / transferred (restored if shit happens).

Dirk Praet March 31, 2016 7:15 PM

@ Z

Who cares if managers and engineers should do it when they don’t?

Err, shareholders and customers, for starters. I am well aware of the reality on the shop floor, and the only way you can ever change that is by holding people accountable for failing to comply with company policies, procedures, and industry-standard best practices, in which patch and release management is but a small, albeit important, element of the bigger picture. The most spectacular IT disasters and security breaches I’ve ever seen were never caused by cascade failures or really smart hackers, but by people simply not doing the job they were paid to do, SEs and managers alike.

Banning bug hunting and keeping vulnerabilities and exploits under a lid is not going to change anything about that. It’s just going to make things worse.

But even if we all did health check up all the time, nobody would claim that it would protect us against illness and diseases.

You’re telling that to a man whose father suffered a near fatal stroke that put him half-paralysed in a chair for the last 20 years of his life because he obstinately refused to take his annual check-ups. I know several others who died from various conditions that could have been perfectly treated if only they had been diagnosed in time. So excuse me for not buying your line of thinking.

Niko March 31, 2016 7:44 PM

This article misses the obvious point: the FBI has no mandate to secure commercial networks. Outside of terrorism, their mission is to get criminal convictions. If the FBI had to disclose computer vulnerabilities, they wouldn’t be looking for them to begin with.

Dirk Praet March 31, 2016 8:12 PM

@ Yoshi2

People feeling persecuted perhaps need to reprioritize their ethics.

I suppose the Kurdish Yazidis in Iraq should just convert to Da’esh’s interpretation of Islam and get over it? Probably 10 million other examples I can think of off the top of my head, so that’s a -500 for your comment.

@ Clive

I don’t know who led you to believe that but there is no legal requirement for them to do so in most jurisdictions.

It’s like this “Protect and Serve” thing. I’ve always wondered who.

@ Rick Starr

If you don’t like the check-and-balance of the Constitution, work to change it.

Actually, invoking the checks and balances of your Constitution is exactly what Apple did in its response to the FBI, accompanied by a request to have Congress decide on future cases instead of the executive just seizing such powers based on a 220-year-old statute.

Niko March 31, 2016 8:13 PM

@dirk

It’s interesting that a European would call for more medical screening. It’s very well known that the US spends more on health care than just about everyone else. Trying to compare outputs, after adjusting for accidents, genetics, lifestyle, and environmental factors, is really hard. There’s much more consensus on inputs, and the consensus among the US medical community is that US doctors order more tests and more expensive tests compared to their European colleagues.

Clive Robinson March 31, 2016 8:59 PM

@ Dirk Praet,

It’s like this “Protect and Serve” thing. I’ve always wondered who.

If somebody I used to know is correct, it depends on how close you are to the badge…

So: the “rank and file” officers first, then their family and the union / association, then those who “help them” in various ways, then maybe their bosses, who have their own badge/association (ACPO etc.).

As for the public, if you are in one of the other uniforms of service etc…

Clive Robinson March 31, 2016 9:26 PM

@ Niko,

It’s very well known that the US spends more on health care than just about everyone else.

And gets the least benefit for what they spend.

The reason for the expensive and often unnecessary tests is “cover your back” for doctors, but mainly for insurers, who pick up the litigation and compensation bill.

I was once told a truism, with some feeling, by a US doctor: “In the US you’re only really qualified after you win your first malpractice case; in the UK your first [malpractice case] is also your last…”.

Sam April 1, 2016 6:13 AM

@David Leppik

It’s not impossible that Apple could develop future chips that make this harder.

Indeed, one assumes that they have already taken more secure measures, and it means all iPhones of that generation cannot be patched.

Dirk Praet April 1, 2016 6:53 AM

@ Niko

It’s very well known that the US spends more on health care than just about everyone else.

I was comparing the benefits of tedious patch management with those of regular health check-ups, not comparing the US and EU health care systems. While such a discussion is well beyond the scope of this thread, I’d like to invite you to Google “Belgian healthcare system” some time.

One of my friends who has been living in NYC for about 15 years still has a formal place of residence in her home town in Belgium and is paying her healthcare contributions in Belgium to be fully covered. It costs only a fraction of what she’d pay in the US. Until the late eighties, an uncle and an aunt of mine living in Salem, Massachusetts came over once a year for doctor and hospital visits and to stock up on medications, because all of it was cheaper over here even without insurance than it was with insurance in the US.

Marcos El Malo April 1, 2016 7:40 AM

@Dirk @Clive

Relevant U.S. case law: https://en.wikipedia.org/wiki/Town_of_Castle_Rock_v._Gonzales
U.S. LE has no legally enforceable duty to protect anyone, according to the U.S. Supreme Court.

I’ve been reading up on this a bit because apparently there are quite a few LE apologists that don’t believe in our fundamental right to secure our persons, homes, property, and personal effects against adversaries and wrongdoers. My thinking is that if securing our shit is our own personal responsibility, then it follows that we have the right to do so. The LE apologists want to abrogate this right because it might interfere in the future with their narrow constitutional exceptions.

We often use the metaphors of doors, locks, and keys. It seems to me that LE would like to make shelter illegal if it aided their mission—and what that mission actually is, is open to interpretation at the government’s convenience. They aren’t legally bound to anything, yet can say whatever they want (as per Castle Rock v. Gonzales).

违章动物 April 1, 2016 8:58 AM

@Marcos El Malo, thanks for the interesting case. The cited IACHR opinion is the natural sequel, because it points out that the US government breaches its responsibility to protect. When Gaddafi did that, he got a bayonet up his butt. Roberts needs a bayonet up his butt. The Supreme Court’s dogged ignorance of binding conventional international law is what makes it a laughingstock among apex courts. Just let it dwindle from eight, to four, to zero, and replace it with a NHRI in compliance with the Paris Principles.

ianf April 1, 2016 10:17 AM

Superfluous comment of the day—or maybe not.

违章动物 [Wéizhāng dòngwù] (“Illegal Animal” say Google) seems oddly obsessed with perps getting bayonets up their butts… I trust only figuratively speaking, rather than literally so… which would be MESSY TO BEGIN WITH, and not something appropriate for this v. much SFW forum.

Erich Schmidt April 1, 2016 10:19 AM

Regarding “All of our iPhones remain vulnerable to this exploit.” Maybe, maybe not. The problem is we don’t know. The iPhone in this case is an older model that lacked many of the security features of later hardware and software.

ianf April 1, 2016 11:34 AM

@ Clive “Did not your mother tell you breakfast was the most important meal of the day?”

Wasn’t that kind of stately home, but thank you—it’s been a while since someone cared about meme tabolism. BTW, how did you figure out I was not a foundling… is it that obvious?

As for the morning tea… I used to imbibe it litre-wise (unadulterated) until I wised up. I’ll tell you why with this reg.exp paraphrase:

“FS: What’s with the water?

“RV: Alcohol takes the edge off, I want to stay angry.”

behaving as the French do, for goodness sake it’s probably why they eat horses

Shocking, I know. That’s why I retaliate against the Frogs by boycotting dinde française.

Niko April 1, 2016 3:52 PM

@dirk

We’re getting away from security here, but in any system where a 3rd party pays the vast majority of the cost(private insurance and Medicaid/Medicare in the US or the government in Belgium), looking at the out of pocket costs to patients doesn’t measure how much health care actually costs. The example of your Aunt and Uncle illustrates how Americans go to the doctor more often. It’s close to impossible to spend enough on over-the-counter drugs in a year in the US to cover the cost of even 1 round trip to Europe, so I assume they needed prescription drugs to treat some type of chronic condition. If you have a chronic condition that requires life long medication(ex. diabetes), any doctor in the US who only saw his patients 1x a year would be setting himself up for a malpractice lawsuit, as Clive alluded to above.

Niko April 1, 2016 6:39 PM

@dirk

Going back to the security issue. The fundamental question is whether vulnerabilities are dense or sparse. If vulnerabilities are dense, then telling the vendor about the ones you know and patching them does little to protect you against an APT holding exploits for lots of vulnerabilities you don’t know about. If vulnerabilities are sparse, then telling the vendor and patching is much more effective. So are vulnerabilities dense or sparse?

Technically, correlation of vulnerability discovery is also a factor, and I think Bruce got the conclusion backwards on this point. If there is a high correlation in the discovery of vulnerabilities, it would be because some vulnerabilities are much easier to discover than others and most hackers can only find the easy ones. The flip side is that, if correlation is high, the remaining vulnerabilities are unlikely to be discovered by any but a handful of nation-states, who can afford to hire the best people and throw the most resources at the problem. If the correlation in vulnerability discovery is high, then NOBUS actually becomes more feasible, not less.
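The dense-vs-sparse intuition can be made concrete with a toy model (my own illustration, with made-up numbers; it assumes every bug is equally likely to be found, which is exactly the uniformity that the correlation argument above rejects). If a product contains N exploitable bugs and two independent researchers each find k of them at random, the chance that they collide on at least one bug is high when bugs are sparse and tiny when they are dense:

```python
from math import comb

def rediscovery_prob(n_vulns: int, found_each: int) -> float:
    """Probability that two researchers, each independently finding
    `found_each` of `n_vulns` equally likely bugs, share at least one
    discovery.  P(no overlap) = C(n-k, k) / C(n, k)."""
    return 1 - comb(n_vulns - found_each, found_each) / comb(n_vulns, found_each)

# Sparse: only 10 bugs exist, so rediscovery is near-certain,
# and patching what you find protects almost everyone.
print(rediscovery_prob(10, 5))     # -> ~0.996

# Dense: 1000 bugs exist, so two teams almost never find the
# same one, and patching your five barely dents the attacker's pool.
print(rediscovery_prob(1000, 5))   # -> ~0.025
```

Under this toy model, a high rediscovery rate in the field would be evidence for sparsity; the comment’s point is that correlated (non-uniform) discovery breaks exactly this inference.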

Flash Master April 2, 2016 4:07 AM

@Douglas

Why do you believe that there actually is a vulnerability and that the FBI actually has accessed the phone? Isn’t a simpler explanation that the FBI just perjured itself to get out of the court case?

Umm, because it’s so utterly plausible? OMG, they found yet another in a long line of vulnerabilities in crap products that were probably backdoored in the first place (not that it really mattered).

Didn’t we already see a story about the basic technique they used, which boiled down to “make a backup so the code you aren’t in control of can’t successfully delete the last copy of the relevant data”? I’m pretty sure that thought occurred to me during the whole time this nonsensical theatrical farce filled the news sites.

LarryG April 2, 2016 11:38 AM

The tech giant now wants to know how the FBI cracked its seemingly secure device, and Apple is using another active court case—this one in New York—to do it.

The economics do not add up here. If Apple really cared, they’d just offer $1B from their cash reserves to acquire the entity that cracked the phone. Time is money in the high-tech race. But then, this is more about public relations.

"Yoshi" April 2, 2016 1:56 PM

To people who choose to quote others, please don’t do it out of the full context of what each person said. You can’t just cherrypick other people’s comments without destroying the meaning of what they said in full. When you cherrypick, it turns the comments section into a wall of back and forth infighting which isn’t useful for anybody to gain or share any information.

Here’s the full extent of what I said with one minor correction for clarity:

“I’m in favor of law enforcement and counter-terrorism. Saving lives is more important than pampering consumer mental insecurity.

If people live dangerous lifestyles that require high encryption, they should get off of the iOS / iPhone / iPad platform and onto something else, or just live with it. Maybe they need to re-examine their lifestyle if they are reliant upon consumer-grade products for security.

It makes sense to keep some decryption techniques available to law enforcement and counter-terrorism at all times… so they can do their work. Saving and protecting lives is their work.

There are aspects to cryptology that go way beyond just the mathematics and the computer science. People feeling upset about these technical decisions could perhaps benefit from re-examining for themselves the many definitions of secrecy, illusion, security, anonymity, privacy, etc. Nothing happens in a social vacuum. American consumers feeling persecuted perhaps need to reprioritize their ethics.

It doesn’t seem like the FBI did anything wrong at all in my humble opinion.
I’m thankful that they exist, and I’m glad they have some workable options.
I also congratulate them on the Silk Road break they made several months ago.

I’m really not interested in protecting the conveniences of terrorists and criminals.
Probably I’m not the only one with this stance. ”

When you quote people out of context in order to break down communication you end up like this guy: http://ritholtz.com/2015/06/that-guy-from-the-internet/

Gerard van Vooren April 2, 2016 2:16 PM

@ “Yoshi”,

“Probably I’m not the only one with this stance.”

Correct. There are always people who pick a stance. If the Gestapo existed today, that institution would also have people backing it up.

The only question you have to ask yourself is: is it a wise stance, one established after careful consideration and without bias?

Clive Robinson April 2, 2016 3:44 PM

@ “Yoshi”,

Probably I’m not the only one with this stance.

Having demonstrated that one of your fundamental assumptions is false, I have to ask why you have not paused and reviewed your stance.

I can also show that others of your fundamental assumptions are likewise false, which is rather worrying for your argument, as it is a good indicator that your stance is not reasoned but either assumed or regurgitated from things you have heard from others.

However, aside from reconsidering your stance, you might also want to look up a good treatise, not on “authoritarians” (which are numerous) but on “authoritarian followers” (of which there are only a couple).

An implicit belief that authorities should have whatever powers they wish for is not good. It shows both a lack of reasoning and a lack of historical knowledge. As history has repeatedly shown giving authorities unlimited power can only end badly for both sides. It’s why the US Constitution exists, and why there is the old saw of “Power corrupts, absolute power corrupts absolutely”. I suggest you have a serious think on why that might be.

Nicholas Moore April 2, 2016 8:08 PM

@ Yoshi
“To people who choose to quote others, please don’t do it out of the full context of what each person said.”

I’ll bite.

@ Yoshi
“When you cherrypick, it turns the comments section into a wall of back and forth infighting which isn’t useful for anybody to gain or share any information.”

But it is this constant struggle between “ego and the alter ego”, until a moment of peace can be reached, that brings out more of one’s thoughts and ideas. As far back as wo/men were introduced to communication, we have been enamored by the recorded evidence of constant struggles between good and evil. The wise have learned to form dichotomies and manipulate them to advantage, because otherwise the changing of the guard would be at a standstill rather than “progressive” toward a vision.

@ Gerard van Vooren
“The only question you have to ask for yourself is that is it a wise stance, is it a stance that is established after careful consideration and without bias?”

Only it’s impossible to take an unbiased stance, by the laws of physics; there are only stances that are relatively less “biased”.

@ Clive Robinson
“It’s why the US Constitution exists, and why there is the old saw of “Power corrupts, absolute power corrupts absolutely”.”

As you implied, “authority” and “the public” are words for collectives, not only groups but also individuals. Each whim of an individual can make for a different outcome, which is why the Constitution and two others are all about checks and balances, as Our Founding Fathers worked hard to install true democracy and freedom. But human nature is a complicated matter, and as in any wo/man-made system, vulnerabilities and exploits are discovered both from outside and within. When times are good, abundance leads to sharing, but as the moral high ground is lost, things gravitate towards the dark side. The Laws of Physics may not be changed, but as we all witness, the Laws of Wo/Man are whimsical.

moz April 8, 2016 3:49 AM

I’m going to double down on my guess that this is Cellebrite.

When we discussed who was helping, Bruce said it couldn’t be Cellebrite because they would have done it already. He has since added a link suggesting the FBI ruled out Cellebrite from reading the manual, without consulting them directly.

At the time I thought that Cellebrite’s system was ruled out because of the new hardware security, but the fact that this is an iPhone 5C means there is no Secure Enclave; the hardware is clearly a type that Cellebrite claims to be able to attack, and adapting their hardware attacks to the iOS 9 software would require only “relatively simple” reverse engineering of any changes in data-format layout. Cellebrite probably had the feature partially developed but not published, and delivered it to the FBI.

So if I get this right, the entire FBI vs. Apple affair is probably just a mistake by the FBI: they thought that iOS 9 automatically implied the new security features, when actually those come with the iPhone 5S and later hardware. The FBI saw the opportunity for a test case and only later found out that Cellebrite would be able to help them, after they had gone public and it was too late to avoid publicizing the situation.

More Machiavellian people might think the FBI wanted to test legal theories in a situation where they had plausible deniability, but could withdraw at any point by suddenly “finding out about” Cellebrite’s new features.

In any case, there’s no reason to think this is beyond Cellebrite’s capabilities, and there is now an explanation of why the FBI might have claimed to be unable to open the phone even when Cellebrite actually could do it.

AvidAppleReader April 8, 2016 8:18 PM

In Friday’s print Wall Street Journal (and this link):

http://www.wsj.com/articles/roots-of-apple-fbi-standoff-reach-back-to-2008-case-1460052008

Roots of Apple-FBI Standoff Reach Back to 2008 Case
Child sex-abuse prosecution is believed to be first time a federal judge ordered Apple to assist in unlocking iPhone
In recent months, Apple has resisted at least a dozen requests by the Justice Department for orders compelling the company to assist in bypassing phone security. Photo: Michaela Rehle/Reuters
By Joe Palazzolo and Devlin Barrett
April 7, 2016 2:00 p.m. ET

The roots of the current standoff between Apple Inc. and the Federal Bureau of Investigation reach back to 2008, with the unexpected discovery of a suspect’s iPhone apparently forgotten inside a bag of diapers.

Lawyers and investigators involved in the 2008 prosecution of Amanda and Christopher Jansen, a young married couple from Watertown, N.Y., remember it as one of the most horrific cases of child sex abuse they had ever seen.

History may remember it for another reason. It is believed to be the first case of a federal judge ordering Apple to assist the government in unlocking an iPhone—and the technology giant not only complied; it helped prosecutors draft the court order requiring it to do so.

The recent dispute over the locked iPhone 5C of Syed Rizwan Farook, one of the San Bernardino shooters, ended last week with the announcement that an undisclosed third party had shown the government a technique for accessing the phone’s data.

Since the Watertown phone was opened, Apple has helped federal investigators access more than 70 phones, according to a government court filing. But in recent months, the company has resisted at least a dozen requests by the Justice Department for orders compelling the company to assist in bypassing phone security, according to a February court filing by an Apple lawyer.

Court documents and interviews with those involved in the Watertown case shed light on a bygone era of cooperation between Apple and the government, before the two sides parted ways on issues of data security and customer privacy.

The 2013 leaks by Edward Snowden about government surveillance programs prompted the tech industry both to shore up its products’ security features and to view the government’s requests for information more skeptically.

In a decision that reached the very top of the company, say people familiar with the situation, Apple in 2014 tightened its phone encryption, making it hard for even the company itself to unlock an encrypted phone. Today, Apple and other tech firms argue that compelling them to write new software to open devices would create new security flaws for millions of their customers.

Apple first began selling iPhones in 2007, about a year before New York State Police executed a search warrant at the Jansens’ home.

Authorities suspected Mr. Jansen was in possession of child pornography, according to documents filed in the case.

But during a search of the home for evidence of child pornography, Ms. Jansen revealed to an investigator that she and her husband had drugged and raped Mr. Jansen’s 5-year-old daughter and 8-year-old stepson, who stayed with the couple over the summer of 2008. Ms. Jansen also admitted sexually abusing her 1-year-old daughter with Mr. Jansen.

The Jansens were arrested on Sept. 18, 2008. When workers from child protective services came to take the infant, the mother gave them a diaper bag. Inside that bag, the foster parent assigned to care for the child discovered an iPhone, according to court documents.

Federal authorities joined the case in December 2008. Before seeking a federal search warrant for the iPhone, investigators consulted with Apple, according to a Justice Department brief filed last year in a separate case in Brooklyn.

The company wanted a court order authorizing it to crack a customer’s passcode. But it was otherwise cooperative: An Apple lawyer supplied the Justice Department with language to use in the agency’s legal request for the order, according to the brief.

Lisa Fletcher, a federal prosecutor in Syracuse, said in her Dec. 15, 2008, request that no specific statute authorized a company like Apple to help law enforcement. But, she continued, the court could order Apple to help the government under the All Writs Act, an old federal law that judges had used in the past to conscript telephone companies into helping federal agents install and operate call-tracking devices.

U.S. Magistrate Judge George Lowe signed the order within hours of the Justice Department’s request. A New York State Police investigator then took the iPhone to Apple’s headquarters in California, according to court documents and a person familiar with the case.

Apple engineers bypassed the phone’s passcode in the investigator’s presence, according to court documents filed in Brooklyn.

People familiar with that and other phone-opening cases said the Watertown case is the first to their knowledge in which an All Writs Act order was used to crack an iPhone, though definitive records on that score don’t exist.

At the time, said people familiar with the matter, it wasn’t considered a big step worth noting, because government authorities had long used the All Writs Act to get companies to help them with various devices and technical issues.

Once opened, the iPhone in the Watertown case revealed damning evidence of the couple abusing the children, including text conversations between the Jansens about specific acts they wanted to commit, as well as messages they exchanged during the abuse, according to sentencing documents.

Both husband and wife pleaded guilty to federal charges in October 2009 and were sentenced the following year to life in prison without parole.

At the time, there was little indication that All Writs Act orders requiring Apple to help them open phones would become the subject of a high-stakes legal fight between the world’s largest tech company and the U.S. government—a conflict that remains unresolved and is expected to escalate in the coming months.

“Apple was cooperative and very helpful in assisting us in obtaining important information in this case,” said U.S. Attorney Richard Hartunian, whose office prosecuted the case against the Jansens.

Write to Joe Palazzolo at joe.palazzolo@wsj.com and Devlin Barrett at devlin.barrett@wsj.com
