Judge Demands that Apple Backdoor an iPhone

A judge has ordered that Apple bypass iPhone security in order for the FBI to attempt a brute-force password attack on an iPhone 5c used by one of the San Bernardino killers. Apple is refusing.

The order is pretty specific technically. This implies to me that what the FBI is asking for is technically possible, and even that Apple assisted in the wording so that the case could be about the legal issues and not the technical ones.

From Apple’s statement about its refusal:

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

Congressman Ted Lieu comments.

Here’s an interesting essay about why Tim Cook and Apple are such champions for encryption and privacy.

Today I walked by a television showing CNN. The sound was off, but I saw an aerial scene which I presume was from San Bernardino, and the words “Apple privacy vs. national security.” If that’s the framing, we lose. I would have preferred to see “National security vs. FBI access.”

Slashdot thread.

EDITED TO ADD (2/18): Good analysis of Apple’s case. Interesting debate. Nicholas Weaver’s comments. And commentary from some other planet.

EDITED TO ADD (2/19): Ben Adida comments:

What’s probably happening is that the FBI is using this as a test case for the general principle that they should be able to compel tech companies to assist in police investigations. And that’s pretty smart, because it’s a pretty good test case: Apple obviously wants to help prevent terrorist attacks, so they’re left to argue the slippery slope argument in the face of an FBI investigation of a known terrorist. Well done, FBI, well done.

And Julian Sanchez’s comments. His conclusion:

These, then, are the high stakes of Apple’s resistance to the FBI’s order: not whether the federal government can read one dead terrorism suspect’s phone, but whether technology companies can be conscripted to undermine global trust in our computing devices. That’s a staggeringly high price to pay for any investigation.

A New York Times editorial.

Also, two questions: One, what do we know about Apple’s assistance in the past, and why is this one different? Two, has anyone speculated on how much this will cost Apple? The FBI is demanding that Apple give them free engineering work. What’s the value of that work?

EDITED TO ADD (2/20): Jonathan Zdziarski writes on the differences between the FBI compelling someone to provide a service versus build a tool, and why the latter will 1) be difficult and expensive, 2) get out into the wild, and 3) set a dangerous precedent.

This answers my first question, above:

For years, the government could come to Apple with a subpoena and a phone, and have the manufacturer provide a disk image of the device. This largely worked because Apple didn’t have to hack into their phones to do this. Up until iOS 8, the encryption Apple chose to use in their design was easily reversible when you had code execution on the phone (which Apple does). So all through iOS 7, Apple only needed to insert the key into the safe and provide FBI with a copy of the data.

EFF wrote a good technical explainer on the case. My only complaint is with the last section. I have heard directly from Apple that this technique still works on current model phones using the current iOS version.

I am still stunned by what a good case the FBI chose to push this with. They have all the sympathy in the media that they could hope for.

EDITED TO ADD (2/20): Tim Cook as privacy advocate. How the back door works on modern iPhones. Why the average American should care. The grugq on what this all means.

EDITED TO ADD (2/22): I wrote an op ed for the Washington Post.

Posted on February 17, 2016 at 2:15 PM

Comments

Erik February 17, 2016 2:22 PM

Also worth noting that Congressmen Thomas Massie and Justin Amash have supported Apple’s efforts via their Facebook pages.

QnJ1Y2U February 17, 2016 2:27 PM

The most encouraging thing about this (alluded to in the tidbits.com essay): Apple thinks there is market value in protecting their users’ security. And in this case, really protecting it, not just pretending.

Matt February 17, 2016 2:30 PM

Wait… couldn’t Apple do the backdooring once, in its own facility, for that specific iPhone, under the auspices of FBI officials, and give them the decrypted data, and then destroy the tools used to backdoor the phone? Technologically, what Apple is saying makes no sense.

B February 17, 2016 2:33 PM

100% political; the FBI does not technically need Apple’s help here and Apple can technically comply with this order.

Mike February 17, 2016 2:43 PM

@Matt: Sure they could do that. And once they’ve done that, they’ve set precedent that it can be done again in the future – even if they destroy the backdoored version of the OS, they can just recreate it as many times as the court would like to require. Eventually, they’ll get tired of recreating it, and keep one around. Just one… which escapes, or is copied. Etc.

Once you pay Danegeld, you never get rid of the Dane.

Arthur February 17, 2016 2:49 PM

Here is what the FBI wants:

Judge Pym wrote:

Apple’s reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE; and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.

http://www.cultofmac.com/412738/apple-must-unlock-the-iphone-5cs-encryption-or-else/
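
A rough sketch of what items (1) and (3) buy the FBI: with the auto-erase and the software-imposed delays gone, the only per-guess cost left is the hardware key derivation, which Apple’s iOS Security Guide says is calibrated to roughly 80 ms per attempt. The Python below is back-of-the-envelope arithmetic on that assumption, not a claim about the actual tooling:

    # Brute-force timing, assuming ~80 ms per passcode attempt (the
    # key-derivation cost cited in Apple's iOS Security Guide).
    ATTEMPT_COST_S = 0.08

    for digits in (4, 6):
        keyspace = 10 ** digits
        worst_case_h = keyspace * ATTEMPT_COST_S / 3600
        print(f"{digits}-digit PIN: {keyspace:,} candidates, "
              f"worst case {worst_case_h:.2f} hours")

    # 4 digits: ~13 minutes worst case; 6 digits: under a day. The wipe and
    # the escalating delays, not the cryptography, protect a short PIN.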

Daniel February 17, 2016 2:56 PM

That TidBITS essay is massively disappointing. It is riddled with circular logic, but most importantly it is based on facts that are either totally false or framed in ways that are entirely misleading.

As one example, “law enforcement has historically had access to anything we have ever recorded or communicated. Until now.”

That is manifestly untrue. He writes as if the 4A never existed. Indeed, if we go back to the 1800s and early-to-mid-1900s, there was precious little that law enforcement could get.

What is really going on culturally is that the scientific and technological revolutions, from fingerprinting to DNA to mobile communications, have generally been a one-sided affair that has benefited centralized control of people, places, and things. Arguably this is not surprising, and may even have been a manifest necessity, because during that same time the world’s population exploded. What encryption represents is a push back against those centripetal forces. It’s a good place to push back, and the technical argument (backdoors help both good guys and bad guys) is sound.

It doesn’t do anyone any good to misstate the historical record as the TidBITS essay does, as it makes what people are asking for seem more radical than it really is. People who are in favor of encryption want to stop a runaway train, not prevent the train from operating at all.

Just passin' thru February 17, 2016 3:06 PM

Over at TechDirt, Danny describes why this is an impossible request…

https://www.techdirt.com/articles/20160216/17393733617/no-judge-did-not-just-order-apple-to-break-encryption-san-bernardino-shooters-iphone-to-create-new-backdoor.shtml#c851

I note in passing that Apple has said in the past that this is impossible for them to do, but Tim Cook’s recent statement implies that they can.

http://arstechnica.com/gadgets/2016/02/tim-cook-says-apple-will-fight-us-govt-over-court-ordered-iphone-backdoor/

I wonder what the basis is for the legality of a civil court order such as this one…

  • it is one thing to produce something you have in response to a subpoena, or
  • to desist from some future illegal activity of which you have been found guilty,

    but it seems quite a different thing to me to be required to DO something that is against your own legal and best interests. (It is legal to pursue your legitimate economic interests, and not do anything that would undermine them.)

    Daniel February 17, 2016 3:07 PM

    @Arthur.

    So what the FBI wants is (a) a process to make cracking the phone possible and (b) assurances that if they successfully crack the encryption, any data will be preserved and not auto-deleted either during or after the process of cracking.

    That is what every criminal wants too. The fact that what the FBI wants is a way to make cracking easier vs. a direct back door is a distinction without a real-world difference, especially for an organization with the expertise and massive resources of a TLA.

    Jim Dandy February 17, 2016 3:07 PM

    This is fundamentally a problem with encryption. The information is simply hidden in the cipher text and the key is just another piece of information. This train wreck has been coming for a long time.

    Anonymous Coward February 17, 2016 3:12 PM

    If Apple is technically capable of providing the FBI with such a backdooring tool, does this not mean that the backdoor is already in place?

    Yet Another Bruce February 17, 2016 3:13 PM

    I like the way Apple is working hard on security. I wonder if they could maintain a whitelist of legit cell sites as a way to thwart cell site simulators (e.g., Dirtbox, Stingray).

    Tim February 17, 2016 3:14 PM

    Time for OpenKey (http://crp.to/ok): open-source encryption where only you own the keys and can validate there is no backdoor. One useful feature for places where encryption is illegal is its plausible-deniability mode.

    AES February 17, 2016 3:17 PM

    I’m surprised there’s no reference to Apple’s letter:

    http://www.apple.com/customer-letter/


    February 16, 2016

    A Message to Our Customers

    The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.

    This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.

    The Need for Encryption

    Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.

    All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.

    Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

    For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

    The San Bernardino Case

    We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.

    When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

    We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

    Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

    The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

    The Threat to Data Security

    Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

    In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

    The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

    The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

    We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

    A Dangerous Precedent

    Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

    The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.

    The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

    Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.

    We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.

    While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

    Tim Cook

    evil dirk February 17, 2016 3:25 PM

    The problem with devices, especially those that can connect via WiFi or USB, is that their device drivers can also be exploited. I had a very recent account-breach attempt via an exploit targeting an out-of-date driver left behind by an old device I have not used in over a year. Luckily I spotted the attempt quickly, as I happened to be checking the security logs just as it tried to run.

    MLauer February 17, 2016 3:27 PM

    This public spat is 100% political, because the FBI doesn’t need Apple to get the data on the phone. This is part of the FBI’s campaign to soften up the public regarding privacy by implying that Apple is obstructing justice in a horrific terrorism case… “See, we need Apple to backdoor your device so you won’t be gunned down by maniacs!” Apple also wins, because they can say they are so committed to their users’ privacy that they will defy the FBI in a horrific terrorism case.

    Me February 17, 2016 3:28 PM

    Even if Apple refuses to comply, if the delete after 10 failed password attempts feature can be bypassed by a specially made OS update, then the security of the iPhone is already compromised.

    If this brute force protection feature is bypassed—now, or in ten years—anyone who used a simple passcode (most people) and lost physical access to their device, or a backup of their device, could later have their data accessed.

    If it can happen, it probably will.

    Just passin' thru February 17, 2016 3:38 PM

    @Matt

    Once Apple creates the tools or iOS version that circumvents Apple’s encryption, a court order or even a mere subpoena or national security letter will obtain it for the government… so there is no more security at that point. Can you say Lavabit?

    AES February 17, 2016 3:43 PM

    There are a few obvious features I have taken from the story that Bruce or others may wish to comment on:

    • The handset is an iPhone 5C (i.e. no TouchID or Secure Enclave)
    • The handset was running iOS 9
    • The handset was provided by a local state department (the terrorist’s employer)!
    • If an effective MDM policy was in place his employer could have unlocked the device
    • Apple have already provided his iCloud data to the FBI
    • Metadata from Verizon would have given a lot away (SMS, cell site locations etc.)
    • Apple can sign and install new iOS firmware (i.e. comment out the data erasure code)
    • The PIN was a simple numeric code (i.e. 4 digits) and not a complex passphrase
    • If the chip is removed and put onto a debug board then breaking the full encryption would be necessary and would take many years. (Apple entangles a unique ID on the chip along with the user passcode and some other things; see the sketch below.) However, on-the-device cracking only requires trying every four-digit combination.
    • If the firmware is modified to allow unlimited passcode attempts on the device (and/or to remove the arbitrary, software-controlled (not hardware) time delays) then the actual delay per attempt would be milliseconds (not seconds, minutes or hours).
    • If the terrorist had used a complex passphrase then Apple would be able to provide absolutely no assistance. On new iPhones this is far easier because of TouchID. This unlocks the device at the press of a finger, but the full passcode is required upon reboot of the device, after 48 hours, or after 5 incorrect TouchID attempts.
    • Going forward Apple may wish to entangle the user’s passphrase with the boot-loader and/or prevent new firmware being installed until the device has authenticated.

    Users wishing to secure their devices from external interference would do well to:

    • Disable iCloud (or use a zero-knowledge encrypted cloud service)

    • Don’t backup to iTunes unless you use system FDE (it stores a binary authentication token)
    • Use a complex passphrase (alphanumeric >16 characters, numbers, symbols etc.)
    • Use a modern 64-bit iPhone with the latest firmware (9.2.1) and keep it up-to-date
    • Don’t use a jailbroken device – this removes significant in-built protection
    • If you suspect your device will be seized then power it down
    • Ideally disable pairing using the Apple configurator tool

    The above steps prevent logical, physical and over-the-air acquisition. They also make exhaustive key search the only (and highly impracticable) option.
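
    A minimal sketch of the UID entanglement mentioned in the list above (made-up names and parameters; this is not Apple’s actual KDF). Because the device-unique UID never leaves the hardware, a chip-off attacker faces the full key space, while an on-device attacker only has to iterate the passcode space:

        # Illustrative stand-in for a UID-entangled key derivation.
        import hashlib
        import os

        UID = os.urandom(32)  # fused into the AES engine; readable by no one,
                              # including Apple

        def passcode_key(uid: bytes, passcode: str) -> bytes:
            # PBKDF2 is an assumption; the iteration count stands in for the
            # ~80 ms on-device calibration.
            return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 100_000)

        key = passcode_key(UID, "1234")  # on-device: only 10**4 candidates
        # Chip-off: without the UID, the search space is the full derived key,
        # which is why the list above says it "would take many years".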

    It’d be great if Bruce, Apple and anybody else committed to privacy would publicise the above steps so that users can make an informed choice on whether they want to lock down their device from invasive searches.

    Well done to Apple for protecting users’ privacy!

    Matt February 17, 2016 3:51 PM

    @Mike: But if Apple can do it… couldn’t anyone do it? If there’s a secret key that only Apple has that prevents anyone else from doing it, couldn’t that key also leak? Aside from a little implementation work (trivial for the NSA or the like), what’s the difference between the key leaking and the tool leaking?

    j8v3 February 17, 2016 3:55 PM

    If everyone USED GOOD PASSWORDS, this whole thing wouldn’t be an issue!

    It’s because of lazy/stupid people that we’re in this mess. They want insecure 4- or 6-digit PIN codes to unlock their phones because it’s convenient, and they complain about having to type anything even remotely complex like “password123”, yet they expect to have strong security and privacy that is resistant even to the US government. These fools bring us all down by demanding the lowest level of security and expecting the highest.

    This whole idea that you can secure a system with only a 4 or 6-digit PIN is a dream. If everyone used good passwords or passphrases, Apple’s assistance would be of little help to the FBI.

    Apple should at least make iOS use alphanumeric passwords by default instead of numeric PIN codes, and warn users who opt to use a PIN that it’s not very secure. This way, iOS would have secure defaults and if people want the weaker security of a PIN instead, that’s their own informed choice to make.

    AES February 17, 2016 4:03 PM

    @j8v3

    I agree with you on the issue of secure passwords for the current generation of devices but, as I’ve alluded to, if Apple were, going forward, to “entangle the user’s passphrase with the boot-loader and/or prevent new firmware being installed until the device has authenticated”, then a user COULD get away with having a 4/6-digit passcode PROVIDED that the incorrect-password counter was implemented in hardware (and not software).

    If you were to do a chip-off acquisition then you’re stuck trying to break AES which, as we all know, isn’t going to happen any time soon. This is because Apple incorporates unique (non-recorded) IDs on-board their cryptographic module.

    By doing this you’d get convenience and great security (even with a 4/6-digit passcode). However, until/unless those changes get implemented, the only proper option for those concerned is using a strong passphrase and powering their device down in an unsafe situation.

    Daniel February 17, 2016 4:13 PM

    I think the whole PIN vs. passphrase debate is a red herring. The FBI’s position is as follows.

    Option One: We want a backdoor
    Option Two: If we can’t have option one, we want God Mode.

    In this case God Mode means that the FBI has the expertise and the resources to break the device. The intuitive appeal of God Mode is that it takes away the “good guys bad guys” argument, because the average criminal/hacker doesn’t have God Mode.

    So while it is true that in this case they might be able to break the PIN trivially, everything that we have learned about how users set passwords over the last 30 years tells us that even with a passphrase they will still be able to break it in 95% of cases in a trivial fashion.

    What is really damaging to the FBI’s cause is the data-wipe procedure, because it makes the strength of the passphrase practically irrelevant. What I would like to know is whether AES’s claim above is really true: would shifting the wipe from software to hardware actually impose a meaningful burden on the FBI, or would it merely mean that Apple would be forced to write a backdoor in machine language rather than higher-level code?
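
    A hedged sketch of the software/hardware distinction being asked about, in illustrative Python rather than iOS internals: a wipe policy enforced by the OS image is just code, and code Apple signs can be replaced by a re-signed image, whereas a counter inside a hardware security element sits outside anything an update can touch.

        # Toy model of a software-enforced wipe policy; all names invented.
        def wipe_keys() -> None:
            print("effaceable key storage erased")

        class SoftwareWipePolicy:
            """Lives in the updatable OS image, so a re-signed image can drop it."""
            def __init__(self) -> None:
                self.failed = 0

            def record_failure(self) -> None:
                self.failed += 1
                if self.failed >= 10:
                    wipe_keys()  # enforced by code, hence removable by code

        # Moving this counter into the secure element itself is the
        # "meaningful burden" in question: no signed image could bypass it.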

    zzz February 17, 2016 4:19 PM

    Why doesn’t the judge instead require them to hand over user data in response to warrants, since Apple has root access to all Apple devices that are connected to the Internet?

    j8v3 February 17, 2016 4:19 PM

    Also, this whole story is a charade to distract us from the fact that Apple has been in bed with the government in the past.

    Apple was a partner in the NSA’s PRISM program, voluntarily providing the NSA with access to customer information — no warrants or court orders involved. Now, Apple regrets that partnership because it hurt their image and profits. They hope we’ll forget all about PRISM if their marketing department gives the impression that Apple now cares about customer privacy over profits.

    Also, Apple doesn’t like to admit it in their customer letter, but they’ve already given the San Bernardino terrorist’s unencrypted iCloud data to the FBI. Data on iCloud is encrypted with Apple’s key, not customers’ keys. So much for Apple’s claims that even they can’t access customer data themselves.

    Let’s not forget that Apple is a corporation that will do ANYTHING to protect its profits, with or without customer privacy.

    AES February 17, 2016 4:28 PM

    @j8v3

    I agree with you about Apple’s previous conduct, but it seems like they’re making positive reforms. At the moment they do seem to be creating the most secure devices compared to Android, Windows Phone or BlackBerry.

    As Snowden tweeted:

    This is the most important tech case in a decade. Silence means @google picked a side, but it’s not the public’s.

    Apple do make it clear what they can/can’t do in their security guide.

    https://www.apple.com/business/docs/iOS_Security_Guide.pdf

    Also here’s a detailed, official website from Apple that has different countries’ Security Configuration Guides:

    https://support.apple.com/en-us/HT202739

    bud February 17, 2016 4:34 PM

    Even if Apple refuses to comply, if the delete after 10 failed password attempts feature can be bypassed by a specially made OS update, then the security of the iPhone is already compromised.

    A secure “delete after 10 attempts” feature is impossible in the first place. The general idea in cryptography (Kerckhoffs’s principle) is that the key is the only secret. Once someone gets the ciphertext, they can make copies and try to crack them as much as they want. This makes Apple’s statement worrying, because if they don’t have the key they shouldn’t be able to help at all.

    Perhaps they could use some tamper-resistant hardware like a TPM to store part of the key. But those don’t provide any information-theoretic security, they just add engineering complexity to hacking attempts. The FBI can probably defeat those better than Apple (“anyone can create a system he/she cannot defeat”).
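
    A small sketch of bud’s point, under the assumption that the key is derived purely in software from a short PIN (toy KDF, made-up salt and PIN): with a raw copy of the storage, the retry counter never runs and the whole PIN space falls offline.

        import hashlib

        def derive(pin: str, salt: bytes) -> bytes:
            # Toy KDF; the low iteration count just keeps the demo fast.
            return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 1_000)

        salt = b"stored-next-to-the-ciphertext"  # salts are not secret
        target = derive("4971", salt)            # hypothetical user PIN

        # Attacker with a copied disk image: no wipe, no delay, only enumeration.
        recovered = next(pin for pin in (f"{i:04d}" for i in range(10_000))
                         if derive(pin, salt) == target)
        print("PIN recovered offline:", recovered)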

    daniel (not Daniel) February 17, 2016 4:34 PM

    The OpenKey plausible deniability feature is problematic for several reasons, discussed many times. A HW key is fine for common access control but conceptually does nothing to address the Apple problem.

    AES February 17, 2016 4:38 PM

    @Daniel

    “What I would like to know is whether AES’s claim above is really true…if shifting the wipe from software to hardware would actually impose a meaningful burden on the FBI or whether all it would mean is that Apple would be forced to write a backdoor in machine language rather than a higher level code.”

    We’ve heard that the Obama administration has ruled out forcing tech companies to incorporate backdoors into software (though the FBI have vowed to keep up the pressure).

    Impossibility is a defence for failing to comply with an order issued pursuant to the All Writs Act. Therefore if Apple and others stand firm and make their products even more secure I can see us reaching a stage where compliance with a warrant is physically impossible.

    Congress will then have two choices – legislate to make backdoors obligatory or do nothing and companies will have a valid defence.

    That would be the “meaningful burden” you talk about. Of course if a true backdoor is created then the debate of passcode/passphrase becomes a red herring but unless the law is changed (or until a hardware counter is implemented) then the stronger the passphrase the better.

    me February 17, 2016 4:39 PM

    @j8v3

    “Also, Apple doesn’t like to admit it in their customer letter, but they’ve already given the San Bernardino terrorist’s unencrypted iCloud data to the FBI.”

    Do you have a source link you can post? What I read was that the FBI was able to access his iCloud data. But I didn’t read that Apple had delivered the data.

    AES February 17, 2016 4:41 PM

    @me

    In reference to your question to @j8v3 I can answer it:

    “When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.”

    Now have a read of the Apple Law Enforcement Guidelines 😉

    http://www.apple.com/privacy/docs/legal-process-guidelines-us.pdf

    daniel (not Daniel) February 17, 2016 4:47 PM

    @bud – you are exactly correct. You have stated the glaring problem here and I am surprised how many keep dancing around it. How could Apple say all along that their security was designed so that even they could not break in? Obviously, if that were true they wouldn’t be in this situation.

    HW keys just add another layer. HW keys have been around a long time. They were RSA’s bread and butter. I used very secure PCMCIA cards to log on ten years ago. But it’s the same problem. It doesn’t make any difference if your password is ten pages long.

    Recent advances in remotely recovering encryption keys may seem exotic but it’s going to get better and better.

    j8v3 February 17, 2016 4:48 PM

    It seems people are forgetting that Apple products are proprietary, not free, in the current situation with the government/FBI. It’s no mystery why this situation is happening: this is exactly what you get with proprietary computer systems!

    Due to potential gag orders, there could already be backdoors in iOS and you’d never know it, or Apple could be ordered in the future to place a backdoor and you’d similarly never find out. With proprietary products, you simply do not have the freedom to ensure there are no backdoors because you can’t inspect or change the code yourself. You’re blindly trusting Apple and, by extension, the government(s) in which it does business.

    Even if Apple succeeds in this case, and doesn’t have to produce a backdoor for the FBI, their products will still be proprietary. How do you know there aren’t other backdoors already there? Or future backdoors? You’ll still have no way of knowing for sure what’s going on in your iPhone and no way of fixing it.

    People have been supporting this mess with the FBI by buying proprietary systems. And proprietary systems favor shadiness and backdoors. This is what you get. I agree with everyone that it’s unfortunate, but it shouldn’t come as a surprise.

    j8v3 February 17, 2016 4:59 PM

    @me

    Apple encrypts iCloud data with their own key, not users’ keys.
    https://theintercept.com/2014/09/22/apple-data/

    The FBI has the terrorist’s iCloud backups. The only way they could’ve gotten this data is if Apple gave it to them.
    http://www.macrumors.com/2016/02/16/apple-ordered-unlock-san-bernardino-iphone/

    “Authorities said they were able to access several backups of Farook’s iCloud data, which were saved a month before the attack took place. Prosecutors argued that the evidence in his iCloud account indicated he was in communication with both his victims and his wife, who assisted him in the attack. They allege he may have disabled iCloud data saves after that point to hide further potential evidence.”

    j8v3 February 17, 2016 5:07 PM

    @me

    One more thing: maybe Apple didn’t necessarily “give” the terrorist’s iCloud data to the FBI. Maybe the FBI took it by accessing the iCloud servers remotely through PRISM. Either way, Apple isn’t being very forthcoming about this.

    me February 17, 2016 5:21 PM

    @j8v3

    Thanks for those links.

    I recognize that Apple has keys for iCloud user data, and has the capability to access this information. But it would be useful to confirm that Apple did decrypt that data in the San Bernardino case as part of their understanding of what complying with a warrant entails.

    Apple does indicate that they cooperated in full with the warrant, but they don’t specify what kind of information they actually provided to the FBI.

    It is conceivable that the FBI accessed iCloud data without assistance from Apple.

    EvilKiru February 17, 2016 5:23 PM

    @daniel: @bud is not “exactly correct”, because the FBI isn’t asking Apple to decrypt the phone and Apple continues to assert they can’t decrypt it.

    The FBI is asking Apple to create custom firmware that will allow the FBI to try to brute-force the PIN without having to worry about whether or not the phone has been configured to auto-erase after 10 failed PIN attempts (they don’t even know whether or not that feature is enabled). None of this involves breaking any encryption.

    To top things off, it’s highly doubtful the FBI will discover anything they can’t already discover by getting a secret warrant to tap into the NSA’s massive collection of cellphone metadata.

    Dirk Praet February 17, 2016 5:26 PM

    This implies to me that what the FBI is asking for is technically possible, and even that Apple assisted in the wording so that the case could be about the legal issues and not the technical ones.

    The entire case hangs on whether or not Apple is technically capable of doing so for just this one device.

    If not, and iOS has to be reengineered to do so, then first of all this would amount to an “unreasonable burden” on Apple as in United States v. New York Telephone Co. (1977). On top of that, a general backdoor would be blatant overreach because it would impact not only one particular device, but those of millions of other people too. Last but not least, such an order would be unconstitutional in that forcing Apple to push backdoored updates would constitute “compelled speech”, which not only is in violation of the First Amendment but would equally raise Fourth and Fifth Amendment issues.

    It is also worth noting that in October 2015, Magistrate Judge James Orenstein of the US District Court for the Eastern District of New York delivered a verdict in a similar case in which the DoJ invoked the All Writs Act to unlock a seized mobile phone, saying that Apple could not be automatically conscripted in government investigations because it is “a private-sector company that is free to choose to promote its customers’ interest in privacy over the competing interest of law enforcement.”

    Unless Apple indeed already has a (secret) way of breaking the device’s encryption, I’d recommend that Tim Cook take this case all the way to SCOTUS. And yes, it would be very useful if other tech CEOs like Sundararajan Pichai would speak out on the issue too, so that whatever the outcome, at least their users know where they stand on matters of privacy and security.

    @ j8v3

    Even if Apple succeeds in this case, and doesn’t have to produce a backdoor for the FBI, their products will still be proprietary. How do you know there aren’t other backdoors already there?

    It is often thought that all it takes to force a backdoor into a product or service is a National Security Letter (NSL), but even the government admits that they can only be used to obtain limited information, which does not include forcing anyone to introduce backdoors. It’s one of very few issues I have come to agree with @Skeptical. Comey’s ongoing backdoor lament and this particular case would seem to confirm this.

    Any backdoor found in US products is probably the result either of interdiction or active “joint ventures” (which puts me even more at odds with @Rolf Weber).

    aardvark 1 February 17, 2016 5:41 PM

    The people who support the US government having backdoors need to ask themselves if they would feel the same if China or North Korea were able to backdoor phones to get at dissidents. My guess is they would not support this, probably because they would be on the side of the dissident and against the “evil” regime.

    The problem with this thinking is that it assumes that the US government will never use its power for “evil”. But that simply cannot be guaranteed. The McCarthy era is but one piece of historical evidence that the US government can attack innocent citizens and will do so at times. Giving them more power is not safe because they cannot be guaranteed to always be “good”.

    The Bill of Rights exists to protect people from their government as our founders knew that governments with unbridled power are far more of a danger to their citizens than foreign adversaries.

    If we have the right to bear arms to protect ourselves then surely we have the right to strong encryption.

    CallMeLateForSupper February 17, 2016 6:00 PM

    “”Apple privacy vs. national security.”

    I hope that is not CNN’s assessment. I hope it reflects “Apple says privacy and FBI says national security”.

    CNN knows less about security and encryption than most Congress critters do. I don’t know which is worse: media not covering important issues because they don’t understand them and thus underappreciate their importance, or media addressing issues and doing a poor job because they don’t know what they’re talking about. Both situations are [insert-your-negative-adjective].

    Sancho_P February 17, 2016 6:12 PM

    Worth mentioning that a special “update” (?) can be loaded onto a locked device without the owner’s consent. Game over.
    I do hope it works only with “old” devices.

    It’s not about Apple.

    AFAIK Tim Cook understands the memory contents of his beloved phones as part of the owner (part of the owner’s brain).
    The phone’s content per se can only hurt the rightful owner.

    There is no universal (often called fundamental) right of privacy (humans can’t grant universal rights),
    however, our thoughts are free in the universe.
    This is the only “right” we as individuals could try to claim.

    Freedom of mind is the freedom we have.
    One can’t be charged for bad thoughts.
    To share them might be a crime.
    To store them securely, for no one else to read, is it right or wrong?

    But no problem, if we ask the public Tim Cook will have to leave the US jurisdiction.

    Now it’s about Apple.

    daniel (not Daniel) February 17, 2016 6:18 PM

    @EvilKiru – you miss the point. The information IS IN THE PHONE. Quit pretending like it’s in some alternate universe because it’s encrypted.

    bud IS correct, and the only one who mentioned information-theoretic security. Were that protecting the information, it would not make any difference whether Apple cooperated or not.

    Thoth February 17, 2016 6:28 PM

    @all
    The better architecture would be to have a hardware-protected chip (TPM, HSM…). An interesting note is that the hardware protection the smartphones use (Secure Enclave, Samsung Knox…) is derived from the ARM TrustZone architecture, which is NOT tamper-resistant, unlike most hardware TPMs or HSMs. This means a Secure Enclave or Samsung Knox can be bypassed by physically probing and decapping the chips, which a TPM or HSM with its tamper-resistant features can resist to some extent. TrustZone does have hardware accelerators and logical protection for crypto, but not on a physical level.

    The better architecture to protect a device is to use multiple pieces of security parameters to protect itself. One example is to allow an NFC-enabled smartphone to pair with a smartcard holding a split key for unlocking and decrypting the phone. This means that even if the iPhone’s OS can be overwritten to carry a backdoor that brute-forces the PIN-derived crypto key, the attackers still have to brute-force the smartcard to make it spit out the second half of the crypto key. A properly secured, open-source smartcard applet can easily be created using the JavaCard API, with a PIN-try limit that blocks the smartcard, and thus the second half of the key, after too many wrong attempts. Given the ubiquitous nature of smartcards in the open market, and the ease of developing your own open-source security applet to load onto a wide range of ISO- and GP-compliant smartcard platforms, this would provide more secure multi-factor authentication.

    The same attack of replacing signed software does not work on a smartcard, because the only controls are to load and unload an applet; no “update applet” operation exists. The only way to “update” a smartcard applet is to unload/delete it, which triggers zeroization of the hardware keys as part of the unloading process, and thus renders the “signed replacement software” attack vector useless.

    Pairing the phone to a smartcard means you always need to hold the phone next to a card, but that should not matter if you use a phone casing with a few card slots sewn onto it. For the login interface, an 8-digit or longer PIN would be required, where the first 4 characters would be used for the phone’s native unlock while the last 4 characters would be fed to the contactless smartcard to authenticate the card and release the second key half.
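
    A minimal sketch of that split-key idea, using pure-Python stand-ins rather than real JavaCard code: the phone holds one XOR share of the master key, and the card releases the other share only for the correct card PIN, locking itself after a few wrong tries.

        import os

        class ToySmartcard:
            """Stand-in for a PIN-gated applet with a hardware try counter."""
            def __init__(self, share: bytes, pin: str, max_tries: int = 3):
                self._share, self._pin, self._tries = share, pin, max_tries

            def release_share(self, pin: str) -> bytes:
                if self._tries <= 0:
                    raise RuntimeError("card locked; share effectively zeroized")
                if pin != self._pin:
                    self._tries -= 1
                    raise ValueError("wrong PIN")
                return self._share

        master = os.urandom(32)                      # key that unlocks the phone
        share_phone = os.urandom(32)
        share_card = bytes(a ^ b for a, b in zip(master, share_phone))

        card = ToySmartcard(share_card, pin="7215")  # PIN is illustrative
        rebuilt = bytes(a ^ b for a, b in
                        zip(share_phone, card.release_share("7215")))
        assert rebuilt == master  # neither share alone reveals the key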

    Lawrence February 17, 2016 6:51 PM

    Does anyone actually think that China or Putin’s Russia would let Apple sell their “secured” iPhone there without a backdoor… “to comply with local laws”?

    Saying there is no backdoor is just Marketing Speak to show they “care” about their customers. What lies are these Apple Junkies fooling themselves with here?

    ianf February 17, 2016 6:53 PM

    @ AES

    Great, unambiguous, end-usable analysis/advice. Only…

    Users wishing to secure their devices from external interference would do well to:

      – Disable iCloud (or use a zero-knowledge encrypted cloud service)

    Such as? (and, apart from Dropbox, are there any other fully automatic cloud backup alternatives to Apple’s iCloud?)

      – Don’t backup to iTunes unless you use system FDE (it stores a binary authentication token)

    Again, you lost me on the “system FDE” together with local iTunes backup. Please elucidate.

      – If you suspect your device will be seized then power it down

    Will that prevent e.g. the Israeli border security from hoovering up the data contents off a locked phone, or is resetting it, then reinstalling a fresh system on it in such cases still to be preferred?

    [Earlier Nicholas Weaver’s advice to LEAs: https://medium.com/@nweaver/how-to-arrest-someone-with-an-iphone-79e0e010bd8]

    Dirk Praet February 17, 2016 6:53 PM

    @ Thoth

    The better architecture would be to have a hardware protected chip (TPM, HSM…).

    However much I agree with your technical analysis, a DoJ win in this case will set a precedent that in due time will also cover hardware protections like the ones you suggest.

    Athinira February 17, 2016 7:16 PM

    @ ianf
    ‘FDE’ means ‘Full Disk Encryption’. What he is essentially saying is that if you plan to backup your iPhone to your computer, you should make sure that this backup is protected as well but secure encryption.

    As for securely encrypted devices, powering them down is enough to keep them safe.

    @ Dirk Praet
    That is incorrect. Legal precedent has already been set that clearly says that (1) a company cannot be compelled to provide assistance that is impossible for it to provide, and (2) a company cannot be compelled to design its products to include back doors.

    So essentially, all a company needs to do is design their systems so that they themselves cannot break or weaken them on an individual basis, and this precedent means nothing for such systems, since the entire catch in this case is that there actually IS a way for Apple to weaken the system in its current state.

    Thoth February 17, 2016 7:36 PM

    @Dirk Praet
    That’s where the part about open-source smartcard applets comes in: you only load the code you have reviewed to handle the authentication and crypto. Of course you can backdoor the smartcard chips… which have thousands of variants and brands manufactured around the world. Unless the basic argument is that all hardware and chip manufacturers are malicious, and all governments are warhawks and malicious as well, in which case no amount of electronic-level security would be better than plain paper and pencil, which a hidden camera could sniff your writing from 🙂 . You are not safe anywhere, everywhere….

    Have you considered embedding very tiny backdoors into individual transistors 🙂 . That means physically making a transistor-level PCB computer with your own hands would also not be safe. Who knows if the circuits in data diodes are backdoored as well, or even the resistors used to actualize one-way communication. It would be fun to know if transistors and resistors have backdoors in their epoxy packaging 😀 .

    Time to buy a bunch of acid, acetone, craft knives, gloves, high magnification microscope, soldering iron and petri dishes to decap the chips and hardware for inspection !

    @all
    Off topic but related. Talking about decapping chips, does anyone have a safer acid mixture, or a safer method using things easily bought from a supermarket, for removing epoxy? All I am left with for decapping a smartcard IC chip is to remove the epoxy and peer into the exposed IC.

    Anon February 17, 2016 7:36 PM

    First of all, has no-one learned that Open Source does NOT make software better or more secure? glibc is the latest casualty.

    The Apple vs. FBI case has zero to do with encryption, and everything to do with security and privacy.

    The FBI are NOT attempting to break encryption. They are asking Apple to do something that, as far as anyone is aware, has not occurred to date: they are asking Apple to directly modify their own software, add new functionality specifically designed to break the security of their products, and directly assist the FBI in breaking security/privacy.

    This is a huge problem, and means that any company could be asked to modify any software “in the name of national security”.

    What is also interesting, but seems to have been missed, is that if it required hardware modification, this would either be impossible or not legal. Why is this?

    AES February 17, 2016 7:39 PM

    @ianf

    Cloud backup

    Spideroak and Tresorit are two tried and tested zero-knowledge cloud providers which integrate nicely into iOS and also Android, Mac OS, Linux, Windows (and in the case of Tresorit also Windows Mobile). There are others out there but these are two that I’ve worked with.

    Both services require that you set your own encryption and they utilise the on-board iOS encryption to protect your cloud master key. They both support auto-synchronisation and provide a solution as near as possible to the native iCloud (including a WYSIWYG file manager interface; if you choose to use it) with the added benefit that only you can decrypt your data.

    Obviously if your phone is compromised/hacked then you’ve got a problem. In any event, choosing a cloud service depends entirely on your threat model – convenience vs. security – but either solution is better than iCloud.

    iTunes

    Most readers of this blog will be using FDE/WDE, but the average computer user doesn’t, or they use the built-in Windows 10 encryption (which stores the recovery key in OneDrive), or they make basic errors that compromise their security.

    When you connect your iPhone to iTunes (on your computer) the device ‘pairs’ with your system – provided you allow your handset to ‘trust’ the system. On every subsequent occasion your handset is connected, the phone automatically unlocks itself. This is very useful if an adversary wishes to gain access to your iPhone – just connect the phone to your computer and you’re in. If you’re using FDE/WDE then they’re not going to be able to do this.

    At the very least, if you’re not using FDE/WDE, then you want to encrypt your iTunes backup (see the link below for instructions). It goes without saying that the other information on your computer is likely to be of more use than merely your backup.

    https://support.apple.com/en-us/HT205220

    However if you’re not using FDE/WDE then you should seriously consider NOT allowing your iPhone to ‘trust’ the computer (even if your backup is encrypted), because an adversary can simply hook the phone up and then download the data from your device afresh! Elcomsoft provides software that does this in a forensically sound manner, although it is trivial to extract the information yourself.

    The same applies to other devices that you ‘trust’. Some cars require that you ‘trust’ them which gives another vector for an adversary to gain access to your device.

    Device off

    The current generation of 64-bit iPhones (6S) running the latest version of iOS (9.2.1) can’t have data physically acquired by cable. The earlier 32-bit devices, and jailbroken devices of either architecture (32/64-bit), can have their data extracted.

    Once the iPhone is powered down (or five incorrect TouchID presses are registered) the encryption keys are erased from memory. The only things that may be extracted are SMS messages (although not WhatsApp or other data services), which are received when the phone is powered on but not yet unlocked.

    It’s best to switch the device fully off at a border. If being forced to surrender your passcode/passphrase is a concern then you should pair-lock your device to make the artefacts on your phone ‘impossible’ to transfer to a computer – of course this wouldn’t protect you from somebody reading the data directly from the phone. See the article below:

    http://www.zdziarski.com/blog/?p=2589

    In summary if you think your phone will be seized then power it down and make sure you have a strong passphrase.

    Carlton February 17, 2016 7:44 PM

    When the government demands the creation of a tool that can break the door locks to your house it is grossly misleading to frame the debate as ‘privacy’ against ‘security’.

    In the first instance, access control is a major security issue.

    The second point is the assumption that breaking current security will make us all safer. No one is being made safer by disabling the security of our phones.

    L. W. Smiley February 17, 2016 7:52 PM

    Can a locksmith be deputized (against their will) to pick a lock for the police to execute a warrant, whether or not the locksmith designed and built the lock in the first place? Or is this:

    Posse comitatus — the common-law or statute law authority of a county sheriff, or other law officer, to conscript any able-bodied man to assist him in keeping the peace or to pursue and arrest a felon, similar to the concept of the “hue and cry.”

    Of course in the case of the locksmith and door, he may be endangered if the door or lock were booby trapped. I’m sure the phone has been examined and determined safe.

    Boory February 17, 2016 7:54 PM

    I thought law enforcement could already desolder and dump flash storage and that brute forcing the PIN was just to save time.

    Thoth February 17, 2016 8:05 PM

    @Anon
    Open source can be secure or insecure. There are high-quality open-source implementations and low-quality ones. Of course open source gives more confidence, as you can inspect the code, which is part of a security model… inspect the implementation before using it!

    Most open-source code is unverified, and my latest Friday Squid entry talks about formal verification (like what @Nick P does) to make code more verifiable, with fewer of the Heartbleed and glibc vulnerabilities that come from code lacking proper formal verification. The good old “fire-and-forget” attitude of developers and engineers who do not incorporate assurance and formal verification is troubling.

    Encryption plays a big part in securing privacy, and the Govt (not just the FBI, I guess) vs. Apple can be put in a political context: the FBI trying to show they are the Boss and everyone must listen to them (making a political statement by using Apple as an example).

    Regardless of whether modifying software or firmware is allowed or legal, all that matters is that Apple’s security implementation (and similarly Google’s Android and many others), which does not use multiple split authentication factors and keys, is a huge weakness, as it presents a single point of failure.

    Relying on a single factor for authentication, namely trusting signed packages (which, as @Clive Robinson mentions, mean nothing), can be compromised by signing a modified package containing a vulnerability and pushing it down to the device.

    If a formally, or at least semi-formally, verified codebase with formalized logic were published in an open-source context, it would become a very powerful tool, with the backing of clearly written and formalized logic, and users could be well informed about what they are using.

    Peter February 17, 2016 8:10 PM

    I wonder if Apple are opposed to backdoors in general, or just to those they don’t hold a patent for?

    Cryptographic system using chaotic dynamics – United States Patent 7,734,048 :

    “The invention is a cryptographic system using chaotic dynamics. A chaotic system is used to generate a public key and an adjustable back door from a private key. The public key is distributed and can be used in a public key encryption system. The invention can also be used for authentication purposes. The adjustable back door of the invention can be used in conjunction with the public key to derive the private key. The degree of difficulty involved in deriving the private key is dependent on the adjustable back door. That is the value of the back door can be adjusted to vary the difficulty involved in deriving the private key.”

    https://cryptome.org/2013/09/apple-backdoor.pdf

    EvilKiru February 17, 2016 8:14 PM

    @daniel: Of course the data’s in the phone. Where did I go so wrong in my attempt at communicating that you thought I was stating something contrary or “magical”?

    I was trying to point out the fallacy of @bud’s statement that “This makes Apple’s statement worrying, because if they don’t have the key they shouldn’t be able to help at all.”

    Niko February 17, 2016 8:38 PM

    I wonder if Apple could get 95% of what it wanted if it came up with a backdoor that required physical access to the phone. If they did that, then the "sophisticated hackers and cybercriminals" argument goes away, since those attackers aren't going to go around stealing individual iPhones. Their "all or nothing" approach might lead to less security for their users in the long run.

    panax February 17, 2016 8:57 PM

    Apple is being coerced into signing malware under duress. The FBI will have a signed version of the OS which they can load onto any compatible iPhone, because the signature is valid. This scenario is essentially the same as if Apple's private signing key had been leaked and compromised. Apple should have implemented their design in a way that allows the revocation of certificates in the event that their key becomes compromised, shouldn't they? If not, then this is a fault on Apple's part. What would they do if their signing key had been stolen? Be like Microsoft and continue using it as if it hadn't happened?

    Apple should be able to:
    1) give a signed, modified version of the OS to the feds;
    2) generate a new certificate that they will use to sign all future software;
    3) release an update to all possibly affected devices which replaces Apple's previous root certificate with the new certificate, thus revoking the old certificate and preventing the feds from using their modified, backdoored version of the OS on any device which has been patched.

    Does anyone know enough technical details about the iPhone 5c to determine if this is feasible? Or is the Apple root cert read-only, with no way to update or revoke it?
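
    (A rough sketch of steps 2 and 3, assuming, hypothetically, that the device's trust store is mutable; as the question notes, if the root is burned into read-only hardware none of this works. All names here are illustrative.)

        # Sketch: rotate the trusted root so images signed by the old (leaked or
        # coerced) key are rejected on every patched device.
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric.ed25519 import (
            Ed25519PrivateKey, Ed25519PublicKey)
        from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

        class TrustStore:
            def __init__(self, root_pub):
                self.root_pub = root_pub                 # the only trusted root

            def image_ok(self, image, sig):
                try:
                    self.root_pub.verify(sig, image)
                    return True
                except InvalidSignature:
                    return False

            def rotate(self, new_root_raw, sig):
                # A rotation order must be signed by the CURRENT root, which is
                # then revoked by being replaced.
                self.root_pub.verify(sig, new_root_raw)
                self.root_pub = Ed25519PublicKey.from_public_bytes(new_root_raw)

        old, new = Ed25519PrivateKey.generate(), Ed25519PrivateKey.generate()
        store = TrustStore(old.public_key())
        fbi_os = b"backdoored-image"
        fbi_sig = old.sign(fbi_os)
        assert store.image_ok(fbi_os, fbi_sig)           # accepted before the patch
        new_raw = new.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
        store.rotate(new_raw, old.sign(new_raw))         # step 3: pushed to all devices
        assert not store.image_ok(fbi_os, fbi_sig)       # old-root image now rejected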

    Daniel February 17, 2016 9:11 PM

    @Niko

    Of course, but that would limit the FBI too, in unacceptable ways. That's why the FBI wants Apple to /write a program/. Once the software is out there, the FBI can use it anytime they want; they don't even need a warrant, a national security letter will do. For surveillance to be economical it must posit a one-to-many relationship at some level. Requiring a physical phone in a physical hand is a one-to-one relationship that amps up the cost of surveillance to unacceptable levels.

    The FBI doesn’t care about this case from an investigative point of view; they care about the legal precedent this case will set. Never let a crisis go to waste.

    Niko February 17, 2016 9:20 PM

    @Daniel

    That might be the FBI’s position, but the judge in paragraph 4 seems almost to be asking Apple to come up with a technical compromise. If Apple made the motion, it seems like they would have a good chance of success with that judge.

    Thoth February 17, 2016 9:28 PM

    @Niko
    This forum had discussions regarding physical backdoors in the past and usually it turns into a flaming thread 🙁. You can search the archive for Nick P, Clive Robinson, Wael and myself; we had some discussions back then.

    Anon February 17, 2016 9:55 PM

    @Thoth:

    I appreciate that having access to source code is a good thing, especially when you want to audit it for security purposes, but my point was more general: people think Open Source is better/more secure simply because it is Open Source. That is blatantly not true, but too often I read that here in the comments (an example exists in the comments of this blog post).

    There was a link to comments posted about "systemd" in the Squid thread. The irony pouring out of the site was outrageous! Everyone was moaning that systemd was getting buried into Linux, but not one suggestion of modifying the affected packages and getting rid of it! "All talk and no trousers", to use a British expression. In other words, they use Linux for the cool factor, but don't have a clue how to write a single line of code to fix the very thing they claim to love: openness and the ability to freely modify the software!

    Not laughed that hard in a while.

    Back on topic: I think the signing key to verify updates is in the firmware. Doesn’t iOS also verify the image prior to boot? Like TPM? It must be possible to change it, otherwise how are units jailbroken?

    Tim Cook said in an interview that if they wrote the software the FBI wants, the situation would be dire. They don't want to create it in the first place.

    The core question is: can a company (any company) be compelled by court order to create software for the state? The implications of that are equivalent to the invention of the atomic bomb.

    grendelkhan February 17, 2016 9:56 PM

    Sundar Pichai has responded on Twitter:

    Important post by @tim_cook. Forcing companies to enable hacking could compromise users’ privacy / We know that law enforcement and intelligence agencies face significant challenges in protecting the public against crime and terrorism / We build secure products to keep your information safe and we give law enforcement access to data based on valid legal orders / But that’s wholly different than requiring companies to enable hacking of customer devices & data. Could be a troubling precedent / Looking forward to a thoughtful and open discussion on this important issue

    Anon February 17, 2016 10:27 PM

    Looks to me like Google is engaging in damage limitation, rather than actually coming out fighting against this.

    Thoth February 17, 2016 10:30 PM

    @Anon
    In a technical sense, if the root TPM key is owned by the user and the user signs his own OS to securely boot, that is fine in theory. But in practice the TPM is used as a lock-in "security" feature, and the same goes for the ARM TrustZone-style secure boot that iOS and Samsung Knox rely on. They rely on the Apple or Samsung private keys, and this breaks the secure boot model, because now the code's privilege lies not with the owner but with Apple and the likes, and thus can be compelled. Think of it along the lines of the pre-end-to-end era, where all your messages relied on the server's private keys to relay and secure them (iCloud is a good example). What the US Govt is asking for is similar to corrupting a trusted channel. Hardware keys cannot be changed, as a good number use EPROM-programmed keys, but what you can do is get to the low levels and run as root, if you can access the phone's low-level functions.
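
    (To illustrate the secure-boot point with something concrete: a toy version of TPM-style measured boot, where each stage extends a register with the hash of the next stage, so any modified image yields a different final value and secrets sealed to the expected value stay locked. This mirrors the TPM extend operation in spirit, but it is a sketch, not any vendor's implementation.)

        import hashlib

        def extend(pcr: bytes, measurement: bytes) -> bytes:
            # TPM-style extend: PCR_new = H(PCR_old || H(component))
            return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

        chain = [b"bootloader-v1", b"kernel-v1", b"os-image-v1"]
        pcr = bytes(32)                        # PCRs start at all zeros
        for component in chain:
            pcr = extend(pcr, component)
        golden = pcr                           # value the owner seals secrets against

        # A tampered OS image (even a validly signed one) changes the final PCR,
        # so anything sealed to `golden` refuses to unseal.
        pcr = bytes(32)
        for component in [b"bootloader-v1", b"kernel-v1", b"os-image-BACKDOORED"]:
            pcr = extend(pcr, component)
        assert pcr != golden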

    Observer February 17, 2016 10:44 PM

    @AES

    * Metadata from Verizon would have given a lot away (SMS, cell-site locations, etc.)
    * Apple can sign and install new iOS firmware (i.e. comment out the data-erasure code)
    * The PIN was a simple numeric code (i.e. 4 digits) and not a complex passphrase
    * If the chip is removed and put onto a debug board, then breaking the full encryption would be necessary and would take many years. (Apple entangles a unique ID on the chip along with the user passcode and some other things.) However, on-the-device cracking only requires trying every four-digit permutation.
    * If the firmware is modified to allow unlimited passcode attempts on the device (and/or to remove the arbitrary, software-controlled (not hardware-controlled) time delays), then the actual delay would be milliseconds (not seconds, minutes or hours).
    * If the terrorist had used a complex passphrase, then Apple would be able to provide absolutely no assistance. On new iPhones this is far easier because of TouchID, which unlocks the device at the press of a finger, but the full passcode is required upon reboot of the device, after 48 hours, or after 5 incorrect TouchID attempts.

    etc

    What technicians call handsets, everyone else calls smart phones.

    You said it right there, as others have said, the FBI can access this data. As others have said, “this is political”.

    They can simply remove the software lockout threshold and brute-force every key attempt.
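
    (Back-of-envelope numbers for that claim, assuming, per Apple's published iOS security documentation, a passcode-derivation cost tuned to roughly 80 ms per attempt once the software escalation delays and the 10-try wipe are removed.)

        # Worst-case on-device brute force at ~80 ms per attempt (assumed figure):
        PER_TRY_S = 0.080
        for digits in (4, 6):
            tries = 10 ** digits
            secs = tries * PER_TRY_S
            print(f"{digits}-digit PIN: {tries:>7} tries, "
                  f"~{secs / 60:.0f} min ({secs / 3600:.1f} h)")
        # 4-digit PIN:   10000 tries, ~13 min (0.2 h)
        # 6-digit PIN: 1000000 tries, ~1333 min (22.2 h)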

    (I doubt TouchID is all that strong, or that the latest system can withstand brute-force attacks, as well as many other possible implementation attacks, including combinations of software and hardware attacks.)

    It is hard to perfectly design such systems, mistakes get made, and testing is even more difficult.

    It is not the vendor who has the money and resources for such bug finding, but governments. They also have the time.

    @all

    There are a lot of political points here:

    1. Apple was engaged in PRISM and never came clean, but the documents proved the case. This story acts as a diversion away from that.
    2. The USG failed in the San Bernardino case, so this argument deflects the blame and diverts attention elsewhere. Apple is made a scapegoat; they get to keep their jobs and modify nothing meaningful.
    3. In contrast to random nutjobs, Islamic terrorism as a hot-button issue is deeply waning. This issue helps keep it front and center.
    4. A lot of people in these fields think they are working together, bonded against the masses who 'complain too much'; this keeps them bonded. It gives them an identity. Strengthens their group. In an 'us vs. them' environment, everyone outside is the enemy.
    5. Very likely, someone is already "in", and it is not the FBI.

    If Apple does go this way, who is next? And what message does this send globally about the security of the company? Or of any American company? It is trivial to come up with separate encryption systems to protect data.

    What are they going to do, try and prevent that, too?

    Will the Apple store kick out any new encryption software that makes it difficult to break?

    Why buy such things from America when you can get them cheaper, and more secure, from other countries? Will they try and outlaw that, too?

    I liked what Bruce posted some days ago, “I thought we already agreed closed economies do not work”.

    The way it was, encryption was treated like a weapon. That did not fly, and now we have the internet. Massive international commerce and other communications. Without encryption, that would not be possible.

    So, they can not go that way.

    What they can do is create strata of society who can be the super-hackers and view any data, anytime, from anyone. The high priests and priests and cardinals and pope: where you had religion as government. The Middle Ages model, which evolved from the pseudo-"Caesar as god and/or demigod" model.

    There is a direct tendency for authorities to try and become the Gestapo, the Stasi, the KGB.

    Like water seeks its own level.

    Ultimately, the authorities are acting as fearleaders for the terrorists. The San Bernardino case was a drop in a bucket, but they are doing all they can to incite the crowd to fear the terrorists' power.

    The two work together. That is always a danger.

    Doctors profiting from sickness. Cops profiting from crime.

    Too easy for cops to start thinking, "Hey, if there is more crime, I can make more money and have more power."

    Nothing scares the status quo more than the capability for the population to communicate safely.

    They can video tape, audio record, and anonymously put up information, anywhere, anytime.

    In Chicago, something like 72% of cops disabled the audio on their dash cams.

    The citizens have to worry about hidden video cams and rare cop cars. Corrupt cops have to worry about anyone, anywhere, anytime and their smart phone.

    Corrupt politicians, same thing.

    The real diversion may be, the real surveillance of tomorrow is not Them. It is us. And they see that coming. And are scared. They want to win that before it comes to shore.

    Too little, too late.

    Next big story is always coming right along.

    Citizens are not each a ready journalist, quite yet. They are not quite as informed (facial recognition becoming ubiquitous and trivial), not quite as aware (that there is big money in accidental news stories, like your local sheriff doing something kinky in a bar), and not quite so obvious. But it is coming along, nicely.

    Gerard van Vooren February 18, 2016 1:23 AM

    @ Anon,

    First of all, has no-one learned that Open Source does NOT make software better or secure? glibc is the latest casualty.

    Read the GPL. It says clearly that it comes without any warranty. But I don't think this thread is about open vs. closed source.

    The Apple vs. FBI case has zero to do with encryption, and everything to do with security and privacy.

    Let's add trusting trust to that. Can you trust the FBI completely, looking at their history? Can you trust every individual working there? Why couldn't an Aldrich Ames work there?

    This is a huge problem, and means that any company could be asked to modify any software “in the name of national security”.

    Which national security? That of the US? What about iDevices being sold in Russia, Germany or Brazil? Should the FBI still have access to those devices, or their local equivalents?

    You see, that’s where trusting trust goes off the scale.

    j8v3 February 18, 2016 2:48 AM

    @Gerard van Vooren
    @Anon

    This has everything to do with proprietary/non-free vs open/free software. See my comment on this same thread here. In free software, you can inspect your software and fix it, but not with iOS.

    rj07thomas February 18, 2016 2:56 AM

    I think this whole thing is a bit of a red herring anyway. People's privacy isn't being leaked through their phones; it's being stolen from healthcare providers, the DoJ, Facebook…

    keiner February 18, 2016 5:21 AM

    @Linda

    Absolutely correct; the sham debate on this blog is part of the matrix of obfuscation of the totally obvious:

    Electronics is pwned by NSA/CIA/FBI.

    End of story…

    AES February 18, 2016 6:09 AM

    @Observer

    (I doubt TouchID is all that strong, or that the latest system can withstand brute-force attacks, as well as many other possible implementation attacks, including combinations of software and hardware attacks.)

    TouchID is designed for convenience – it allows a user to use a strong passphrase at boot/after 5 incorrect attempts.

    Most people don’t want to key in really long passwords each time they want to check their emails/text messages/make a call/etc. TouchID works by unlocking the handset using the stored hardware encryption key and user key combined (a simplified explanation).

    TouchID allows users who otherwise wouldn’t use a cryptographically secure passphrase to now use one knowing that they will only have to enter it under certain conditions (which I went into earlier on).

    If absolute security is required then disable TouchID and manually enter your complex passphrase each and every time.

    For most users, I suggest my earlier advice will provide more than adequate protection, and it will also mitigate the possibility of any effective brute-force attack.

    I doubt we’d be having this conversation if a complex passphrase had been used on the phone but I do agree that it’s political.

    "Use a complex passphrase (alphanumeric, >16 characters, numbers, symbols, etc.)"
    “If you suspect your device will be seized then power it down”

    Gordon February 18, 2016 6:58 AM

    Lots of ill-informed technical comments here, but rather than focus on (or add to) those, has anyone looked at the cost side?

    The order requires that Apple provide a cost estimate for the work. I can't remember if it says so explicitly, but I believe this is because Apple will be reimbursed for the work; so what is the cost?

    There is an example earlier in the comments of a locksmith compelled to open a lock he may have made/fitted.

    What is actually happening here is more akin to a lock manufacturer, whose business depends to a significant degree on the claim that their lock cannot be defeated, being compelled to defeat it. If they do open the lock, they are shown to be liars, open to civil suits, and their business model is severely damaged.

    So what is the cost of compliance to Apple?
    Phone encryption is a selling point because people want a phone that doesn’t give up their secrets when lost/stolen
    or they want a phone that doesn’t give up their secrets when the government gets it
    or they have an aversion to state overreach, and think that governments should work for the data they steal.

    The first of these is a simple matter of programming, since most thieves don’t have the wherewithal to break into properly designed and implemented software encryption.

    So the cost to Apple is somewhere between the total development cost of the Secure Enclave minus the cost of a relatively simple system (this is the minimum cost) and the total profit on all iOS sales where strong security was a significant factor. Let's say 10 billion thingies. This would make the FBI decide how much they want their PR.

    On the actual cost: since this is malware, Apple would need to do more testing than they would on a normal release, to ensure it is proof against hacking the 'only on this one device' bit of the code. Including beta testing and a third-party bug bounty, I think I would want to spend a couple of years testing something so dangerous (joke). But anyway, not just a week from an intern; probably at least a million, including disruption to normal workflow from development, testing, release, the code-signing authority, the authentication server…

    In the interest of fairness, the security that Apple are being asked to defeat is obsolete, but that may be too nuanced for Apple’s market, so this isn’t sufficient to disregard the argument.

    On the other hand, there is a risk that commercialising the argument damages the moral argument that the government should not be able to compel innocent bystanders to act against their just beliefs.

    Dirk Praet February 18, 2016 7:10 AM

    @ Athinira

    That is incorrect. Legal precedent have already been set that clearly says that …

    Isn’t that pretty much what I said? Please re-read my comments.

    @ Thoth

    Of course you can backdoor the smartcard chips … which has thousands of variants and brands manufactured from around the world unless the basic argument is that all hardware and chip manufacturers are malicious

    Of course not. It would probably suffice to target those most widely used in and with popular devices like iPhones, which manufacturers could then be compelled to use in new product lines so as to make them compliant with "lawful requests". I have no doubt that, technically, and for some of us, it is perfectly possible to come up with bulletproof solutions, but LE and the IC will continue to attack them by any means possible as long as the law gives them the power to do so. And if they can no longer hack or backdoor them, the end game is that they will just outright ban them. That's why I think this fight first and foremost needs to be won at the legal level.

    CallMeLateForSupper February 18, 2016 7:45 AM

    tech: Man, this is some righteous encryption… wait!… I'm in! [TM]
    Comey: Go for the contacts list.
    tech: Empty.
    Comey: Check archived texts.
    tech: Wiped.
    Comey: Archived emails, then!
    tech: Wiped. Except for this one draft… to FBI! "Fuck those guys."
    Comey: Make a press release: "Today FBI technicians successfully accessed the terrorist's phone data and gained actionable intelligence."

    Bob Therrien February 18, 2016 8:15 AM

    Isn’t Data Wipe the equivalent of automatic destruction of evidence?

    Is anybody home at the fancy legal and technical institutions of Cambridge? I have been waiting over a day to see this “destruction of evidence” argument in the paper or somewhere. This blog is the closest yet — but still equivocated.

    Regis February 18, 2016 8:32 AM

    If Apple does this, what is there to stop the Chinese/Russian/(fill in the blank) government from furnishing one of their valid court orders to do the same thing to Apple products in their country? If a lost American diplomat's iPhone turned up, would the FBI feel the same if the FSB wanted it unlocked?

    Wes February 18, 2016 9:07 AM

    I’m confused about something that seems to be missing from all the coverage on this: how likely is the FBI to break the encryption, even if Apple does let them bypass the 10 password restriction?

    One would think that in a properly designed system, the password is put through some slow hash function before being used as a storage key. If that is the case, then a dictionary attack may be infeasible anyway, correct? Or do we suspect that Apple used a fast hash function?

    QnJ1Y2U February 18, 2016 9:40 AM

    @Wes
    The unlock code is thought to be a four-digit number, so even if Apple were using a hash that takes eight seconds (they're not), it would take less than a day to brute-force the entire space.
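
    (The arithmetic, for anyone checking: even at that deliberately pessimistic eight seconds per try, the whole four-digit space falls in under a day.)

        # 10,000 four-digit codes at a pessimistic 8 s per derivation:
        codes, per_try_s = 10_000, 8
        print(f"worst case: {codes * per_try_s / 3600:.1f} hours")  # ~22.2 hours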

    Oh nos February 18, 2016 9:54 AM

    @Bob Therrien

    No. At least in the USA, destruction of evidence requires a mens rea. This means that not only does one have to destroy evidence, one has to do so knowing that it was going to be used as evidence. Otherwise, every time someone deleted a file that was involved in a crime they would be guilty of destruction of evidence.

    ianf February 18, 2016 10:16 AM

    @ me Meanwhile, major media outlets “report” on the issue by parroting officials, effectively working to keep the general population as uninformed as they are

    Yes to the first clause, but your conclusion that the media are "working towards" some such goal is neither within their remit, nor necessary: it is the end effect of the mindlessly served pap. I deplore as much as anybody the evisceration of reason from text content, but let's keep in mind that the real and mythical papers we grew up with are no more in these times of free web content. Where a newspaper in the past might have had 10 editors, 5 fact checkers and proofreaders, they now have 2, 0, + a realtime SPELCHQR. They had to adapt or die, and we, the no-longer-news-buying public, have the lion's share of that blame.

    @ Thoth You can search the archive for Nick P, Clive Robinson, Wael and myself; we had some discussions back then.

    This is a bit administrivial, but still… you're effectively inviting the reader(s) to sift through the accumulated Schneier blog db to uncover some golden nuggets that may or may not contain (unstated) key phrases or keywords, a Sisyphean task for anyone but these hazy nuggets' authors. Clive Robinson uses the same look-back-only-I-know-what-for-but-am-not-telling technique a lot, which really tells us that (1) neither of you cares for your previous words of wisdom enough to maintain a private tagged index into your own postings; and (2) your time to research these URLs [one to many] is more valuable than the aggregated time of all the others [each can do his own research]. And yet we all communicate in a hypertextual medium with unambiguous, easy-to-record, item-direct-access addresses (yes, I do maintain a private searchable record of my own web comments/submissions [using Gmail's search engine, of all things]).

    @ Anon February 17, 2016 10:27 PM

      [There are so many “Anons” here that one can never know which one one engages with. For that reason alone, couldn’t you at least adopt some distinguishing numerical index, or different spelling of? Try (far as I know not taken yet) “Ah!Non!” or why not “Onan”?]

    Looks to me like Google is engaging in damage limitation, rather than actually coming out fighting against this [FBI vs. Apple].

    Not even damage limitation yet; pissing an ALSO-FORAGES-HERE mark in the legal territory. Google competes with Apple, but if the core of either's business model is threatened by the government, they'll close ranks.

    @ Gordon asks […] "has anyone looked at the cost side?"

    A very important point, which may yet prove to be the key to Apple's defense strategy: if they quote some $10B and a 3-year delivery frame for the "FBI-requested program", plus the cost of developing a new iOS version that will make all updated units impervious to the first, then have it rejected by the court as unreasonable, THEN the ensuing legal battle will be about how FBI's Comey and DoJ's Sheri Pym know Apple's business better than Tim Cook. I'd like to see that on C-SPAN ;-))

    Kohn February 18, 2016 10:22 AM

    This is theatre. Apple has already done this at least 70 times; this is the play-acting that's required for Apple to comply with mandated encryption backdooring without it damaging their business.

    “The government made us do it”

    At my age, the only surprise left to me is that it still surprises me how gullible people are.

    parabarbarian February 18, 2016 11:08 AM

    Seems to me that forcing Apple to create a modified version of the OS for government use would constitute involuntary servitude.

    Observer February 18, 2016 11:11 AM

    @Gordon

    On the actual cost: since this is malware, Apple would need to do more testing than they would on a normal release, to ensure it is proof against hacking the 'only on this one device' bit of the code. Including beta testing and a third-party bug bounty, I think I would want to spend a couple of years testing something so dangerous (joke). But anyway, not just a week from an intern; probably at least a million, including disruption to normal workflow from development, testing, release, the code-signing authority, the authentication server…

    A company like Apple would have maybe one or two security people who are really qualified to test the system. They would have many more development engineers, but engineers are not trained to do this. It is a very different type of work, though it can seem to others as if it is not.

    Apple would likely hire an outside, qualified team. And they are expensive: an eight-person team at $300–500 an hour. Two months' work.

    A bug bounty would have to be at least six figures. For every bug found. And as there are many moving parts, that would mean several likely bugs. In many cases, at least two types of bugs would be chained together: one a root-privilege bug, and the other the vulnerability that breaks the protection.

    Government – many different governments, many different departments – would easily pay six figures for this. And that is exactly the field this level of vulnerability analyst tends to find themselves in.

    That is, if they are not leading companies, working as very-high-dollar consultants, or otherwise busy on something else. That level of security researcher is very well employed, which only makes the price skyrocket all the more.

    If the code and hardware remained completely static, you would likely see vulnerabilities continue to come out for the next two years. Maybe one every six months. Not all the time.

    The code and hardware would not remain static, so vulnerabilities would continue to come up.

    The real cost, however, I believe, is in sending the message that the system is backdoored by the US Government. The US Government has declared – at the very least – just about every foreign citizen a legal and verifiable target. This is especially true for government and high-level corporate persons. They are the ones who seal deals for mass quantities of phones.

    This is also a slippery slope. If there is give here, where does it stop? This is not a new push; they have been trying to get every piece of hardware and software they can backdoored.

    That is how totalitarian nations operate.

    Why buy American technology if there is no guarantee for security?

    What does “American” even mean, if there is no focus on security for liberty of conversation, speech, trade?

    Just another failed system that ‘sounded good on paper’ (the constitution), but ‘didn’t work in practice’. Because, terrorists. And the politicians who used them to further their own selfish agendas.

    The Apple CEO says “they mean well”. Hell is paved with good intentions. And hell no, they don’t mean well. Stalin didn’t mean well, Hitler didn’t mean well. Totalitarianists do not mean well.

    Everyday criminals are way better people than wannabe tyrants and their goons.

    "Americans" who view the constitution legitimately as their enemy, as nothing more than toilet paper… they don't mean well. Especially not when they have the pretension of serving the people and are paid to do so.

    Sally Shears February 18, 2016 2:42 PM

    I’m reading/hearing legal and political points.

    I think this will be fought out in the court of public opinion. The issues are complex. I don’t know how I would start explaining this to friends, but feel an urgent need to do so.

    Kim Zetter's article in Wired and two articles in today's NYTimes provide a start.

    The big issue seems to me whether and to what extent the govt is demanding that Apple create and hand over a fully enabled master key to iPhones. I want to understand this better.

    Then: Do we want our govt to have such a key? Other governments? Anyone? Once out, it’s out.

    My sensitive personal info was exposed in the #OPMHack. The govt systemically underinvests in cybersecurity. Lots of issues.

    Sally Shears February 18, 2016 2:51 PM

    @Wes wrote:

    I’m confused about something that seems to be missing from all the coverage on this: how likely is the FBI to break the encryption, even if Apple does let them bypass the 10 password restriction?

    The user’s unlock code is combined with a hardwired unique device id to create the encryption key.

    So, once they have the unlock code, they have the decrypted contents.
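
    (A simplified sketch of that entanglement, not Apple's actual construction: the key is derived from both the passcode and a device-unique secret fused into the chip, so the encrypted flash cannot be brute-forced off-device. DEVICE_UID and the PBKDF2 parameters are stand-ins.)

        import hashlib, os

        DEVICE_UID = os.urandom(32)   # stand-in for the fused, non-extractable UID

        def derive_key(passcode: str) -> bytes:
            # Illustrative KDF; the iteration count models the per-guess cost.
            return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                                       DEVICE_UID, 200_000)

        key = derive_key("1234")
        # The same passcode on different hardware yields a different key, which
        # is why guessing has to happen on the physical device itself.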

    To another commenter, from articles, it’s unclear whether this phone has a four-digit, six-digit, or more complex passcode. This generation of iPhones starts with a default six-digit passcode.

    Too bad it doesn’t have touch-unlock. I’ll bet cadaver fingers would unlock a phone.

    Daniel February 18, 2016 3:24 PM

    @Sally Shears…

    Yes, it will be fought in the court of public opinion, with headlines like the one in USA Today: "Apple refuses to unlock killer's phone." There is already a concerted effort to paint Apple as the bad guy, or at least the unreasonable guy.

    So while the issues may be complex, the reality is that the court of public opinion is filled with low-information voters who will base their positions not on the resolution of complexity but on simple sloganeering. Trying to explain technological or privacy nuances to such people is pointless. Better to fight slogan with slogan. How about "FBI wants Apple to reveal your porn browsing habits"? (sarcasm)

    AES February 18, 2016 3:59 PM

    @Sally Shears

    It wouldn’t have made any difference if the device had TouchID. If you read my earlier posts you’ll understand that TouchID destroys memory-resident encryption keys after 5 wrong attempts or after 48 hours of inactivity or after the device has been rebooted.

    If the state (the terrorist’s employer) had a proper MDM policy in place then they’d have been able to unlock the device. I find it improbable in the extreme that he’d use a government-issued iPhone to communicate his criminal plans.

    From the news stories we’ve been told that he destroyed 2 other mobiles but not this one. Go figure.

    John Adams February 18, 2016 4:07 PM

    Bruce,

    I am surprised that not enough commentators, including yourself, have picked up on the fact that, call it what you want (backdoor, one iPhone 'unlocked', etc.):
    -the encryption cat is still out of the bag.
    -indeed, not even in this case is the FBI asking for a permanent backdoor, which I am sure (hope) neither Apple nor the US public would allow to pass
    -hence every criminal enterprise will simply use a password long enough to make it impractical to brute-force the phone and gather actionable intelligence (even, say, 5 years to crack a 60-bit-entropy password is too long to have an immediate payback); see the rough numbers after this list
    -most criminals will use even longer passwords, making it even harder to brute-force even a phone with such a custom SIF
    -TouchID allows quick owner access to a phone protected by a high-entropy password
    -criminals (and non-criminals passing US border control; thank you, Jonathan Zdziarski) can simply power down their devices to ensure that, even without auto-erase and with brute force via Lightning enabled, it would be impractical
    -and hence the biggest losers in this would be the majority of iPhone users who use simple 6-digit PINs, making them vulnerable to criminal (governmental) hacking.

    Yes, just the crims will have strong passwords on their iPhones and the backdoor will open the rest of the law-abiding public to potential hacks.
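
    (Rough numbers behind this, again assuming roughly 80 ms per on-device attempt once a custom SIF removes the delays; a 4-digit PIN is about 13 bits of entropy.)

        PER_TRY_S = 0.080                  # assumed hardware cost per guess
        YEAR_S = 365 * 24 * 3600
        for bits in (13, 20, 40, 60):
            years = (2 ** bits) * PER_TRY_S / YEAR_S
            print(f"{bits:>2}-bit secret: ~{years:.2g} years worst case")
        # 13 bits: minutes; 60 bits: ~2.9e+09 years. Only the short PINs most
        # law-abiding users choose are exposed, exactly as argued above.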

    AES February 18, 2016 4:17 PM

    @John Adams

    I too am surprised. I agree with your points (and echoed them further up) so I’ve put together a short list on how to harden your iPhone.

    Nick P February 18, 2016 4:38 PM

    @ Bruce

    It was a good response. Yet, there’s another angle posted here many are missing. They might actually be able to do it. They seem to be wisely dodging the possibilities in their responses. Only question is whether they’ll be forced to do something like in the link.

    @ Anon, Thoth

    Re secure backdoors

    The lawful intercept scheme I mentioned was here. It needs revision to make clear that I'm assuming, rather than wanting, that courts or legislators will eventually pass laws that apply warranted search to electronic devices with crypto. In other words, I began work on it as an insurance policy for the scenario where we lost in the courts but the judge could be persuaded with a middle ground. The very first link shows that was possibly the case with Lavabit, which inspired my idea. Plus, it's a problem hard enough to be worth my time as an intellectual challenge. 🙂

    In a follow-up, I further contended that security pros are being dishonest when they say they can't build a technical mechanism to selectively let people in. I pointed out that they say the opposite when people ask for the same thing using words like "virtual private network," "remote administration," "remote updates," and so on. Same thing. The main difference is the risk associated with managing the keys, both technical and legal. Clive wrote extensively about that starting here, showing it was very hard. Google his name within this domain with "key management" or "KEYMAT handling" to find plenty more posts.

    The best result of my thought experiment was Dirk Praet’s counterpoints that I can’t find the link to. One was the common worry that all kinds of governments will demand access to the system. The best one is one even laypeople could understand: every intelligence agency, organized crime group, and terrorist group in the world would target wherever the keys were stored. Saying one can securely protect them is betting against every hacker, spy, and thug in the world. An unwise bet to place. I finished that one up by pointing out that the U.S. government, that part anyway, consistently failed to protect its secrets from foreign spies. Why would they do this differently?

    Gweihir February 18, 2016 5:40 PM

    We have several issues here:

    1. Can Apple legally refuse if the attack is above a certain difficulty/cost level for them?
    2. How difficult is it actually?
    3. Can Apple refuse if the damage to their reputation exceeds a certain threshold?

    I am sure Apple could get into that phone. But what if it requires a few months for a group of very smart people with some very expensive equipment (ion-beam workstation)? What if it takes longer? And what if they screw up and erase the secure key storage by mistake, a mistake entirely possible in such work?

    Now, the government certainly cannot force Apple to do unlimited work for them, certainly not with forced success (or else). And they certainly cannot force them to utterly destroy their own reputation. The question is where the limits are. And the FBI is trying to find out. I seriously hope they will be disappointed with the results.

    Niko February 18, 2016 6:44 PM

    @Gordon

    The locksmith analogy is an interesting theoretical, but there is no unbreakable physical lock in the real world. All locks and safes, except perhaps for the vaults at the largest big banks, are easily breached by anyone with enough time and the right tools. Safes exist mainly as a delaying tactic against thieves who want to be in and out of your home or office as fast as possible.

    Peanuts February 18, 2016 7:31 PM

    Let's talk iOS information leakage, shall we?
    iOS 9.2.1 leaks everything in the list below without hacks, jailbreak or exploits:

    The character set you use, according to your password complexity.

    If you have set an alphanumeric password of numbers only, when you attempt the unlock it shows the number keypad.

    Add a letter, and the unlock shows the full keyboard.

    And it will leak if you're using the simple numeric passcode: it auto-completes at a length of 4.

    Siri is a sieve of leakage; no need to unlock to use her. If you activate the loose-lipped ship-sinker while locked, no passcode attempt needed, she will happily talk to strangers.

    For example:
    Message to "dad": she will gleefully ask which dad and list them.
    Much worse, say "edit Dad's phone number": the idiot will show you the name, address, city, state, zip and phone numbers.

    At least while betraying you she so kindly adds that she can't change Dad's number. But if you're FBI you didn't want to change Dad's phone number; you just wanted to meet Dad and give him the comfy chair.

    Now every single contact on your Siri-enabled device can be brute-forced off any iPhone with Siri enabled, no passcode required.

    Found these backdoors around iOS 9.
    Have a nice day

    Peanuts

    Plus, while on Siri as a method of hacking: if you were a hinky actor type you could, say, disable or enable {wifi, airdrop, Bluetooth}; it's done with no passcode required.

    Thoth February 18, 2016 8:03 PM

    @all
    John McAfee … yes, the McAfee founder … has decided to hack the iPhone for the FBI, since Apple is reluctant to backdoor it, setting himself a challenge of 3 weeks to crack the iPhone; otherwise he will go on TV and eat a shoe …

    In one stroke, John McAfee, if he manages to hack into the targeted iPhone, might save Apple from the mess of court orders while giving the FBI what they want, with a twist (since the original intention of the FBI was probably to embarrass and stain Apple's name in the ongoing Crypto Wars).

    Since the targeted iPhone doesn't use an ARM TrustZone-equipped environment (no "Secure" Enclave), it would make the job easier for John McAfee, as the security on it would be much weaker.

    A little tip for John McAfee: get low-level and below with physical copper wire probes, and you would be surprised how many side-channel possibilities exist (with or without ARM TrustZone). Dismantling the iPhone casing and tapping the ARM chip is by far the best way to get whatever you want out of it, but since iOS has been known to have software vulnerabilities, software attacks would also work.

    Link: http://arstechnica.com/staff/2016/02/mcafee-will-break-iphone-crypto-for-fbi-in-3-weeks-or-eat-shoe-on-live-tv/

    Peanuts February 18, 2016 8:11 PM

    Wonder if that will be with or without drugs, or if he will be sober; hookers or not; pay-per-view or free TV?

    That's a reasonable question; you know you were wondering.

    Peanuts

    Mane February 18, 2016 8:14 PM

    Bruce,

    Apple's refusal to comply means that they believe they most probably can compromise their own devices. Otherwise, why not say that they simply cannot?

    It baffles me how Apple's spin doctors manipulate both the media and some people on this thread into thinking that their response actually means their devices are more secure.

    Wael February 18, 2016 8:55 PM

    @Thoth, @ianf the gold digger who's looking for nuggets

    This forum had discussions regarding physical backdoors in the past and usually it turns into a flaming thread

    Yea! Flame wars happen when you say things like this:

    In a technical sense, if the root TPM key is owned by the user and the user signs his own OS to securely boot, that is fine in theory. But in practice the TPM is used as a lock-in "security" …

    How many times did I tell you to read the specifications before you say things like that, huh?

    Wael February 18, 2016 9:07 PM

    @Thoth,

    setting himself a challenge of 3 weeks to crack the iPhone; otherwise he will go on TV and eat a shoe …

    Rumor has it that he ordered a pair of shoes made out of beef jerky. He'll wear them to the TV station, too (for deception). I mean, so what if they taste like mushrooms and tinea pedis? Still better than eating a real shoe, dontcha think?

    Marcos El Malo February 18, 2016 9:19 PM

    @AES
    “If the state (the terrorist’s employer) had a proper MDM policy in place then they’d have been able to unlock the device.”
    Actually, the County of San Bernardino was his employer, not the State of California. Not sure that it ultimately matters which level of government, but there you go.

    @ianf
    If you ever plug your SoS comment data into a wiki, I propose you name it Wikisquidia.


    Forcing a person or company into involuntary servitude to hurt their own best interests is not stepping onto a slippery slope. It’s already sliding down the mountain. Basically, the FBI is demanding that Americans relinquish their 1st, 4th, and 5th amendment rights. And if the government can direct what a company can or cannot build, that would include weapons manufacturers. Buhbye 2nd amendment.

    Niko February 18, 2016 9:59 PM

    You really have to wonder what planet the EFF is on with comments like these:

    Compliance with the judge’s order will also set a dangerous precedent globally.
    Authoritarian regimes will be further inspired to attack the journalists, human rights advocates and other members of civil society who rely on strong encryption for their work.

    Authoritarian regimes already have extremely broad lawful-intercept regimes, beyond what the FBI has ever dreamed of; they have their own SIGINT agencies if those fail; and if all else fails, rubber-hose cryptanalysis.

    Never Say Never February 18, 2016 10:09 PM

    @carlton

    The second point is the assumption that breaking current security will make us all safer. No one is being made safer by disabling the security of our phones.

    Not necessarily. Imagine if we did remove all encryption from all phones. I imagine there would be, at the very least, dozens of people who would not succumb to the temptation to commit various crimes thinking that the technology of their phones would help them keep their secrets. I'm not saying we should throw the rest of the world under the bus just to keep those folks out of jail, but… not everybody loses.

    Rich Hairry Tomcat February 18, 2016 10:28 PM

    @SallyS

    I’m reading/hearing legal and political points.

    I think this will be fought out in the court of public opinion. The issues are complex. I don’t know how I would start explaining this to friends, but feel an urgent need to do so.

    I'll help. Start with "It's going to get worse before it gets better." Then tell your friends to watch more sci-fi. Trek, Dark Angel. Throw in The Wire for non-sci-fi good measure, though remind your friends that The Wire was all pre-Snowden; even so, it provides valuable insight towards gleaming the cube. The Cube is also a good primer on human nature as it relates to government control and math. Gleaming The Cube as well. Pump Up The Volume. Leonard Cohen music. Research the history of cannabis persecution. As Max would tell Joshua, "People are afraid of what they don't understand." A truly insignificant portion of the population understands any of these issues in remotely enough depth to make an optimal public policy decision. As such, democracy is not going to be our friend here for quite some time. Probably at least a generation or two.

    Observer February 18, 2016 10:31 PM

    @Nick P

    In a follow-up, I further contended that security pros are being dishonest when they say they can't build a technical mechanism to selectively let people in. I pointed out that they say the opposite when people ask for the same thing using words like "virtual private network," "remote administration," "remote updates," and so on. Same thing. The main difference is the risk associated with managing the keys, both technical and legal

    Not untrue, but…

    • VPNs are not singular-key, access-all. One VPN key won't access everyone else's VPN, everywhere.
    • Same with remote administration, which is understood to be very delicate security-wise.
    • Remote updates have a similar problem to the above. There can be one key to manage all, kind of. But it is already on the system. The US did compromise the integrity of a seemingly unbreakable key system which MS implemented, in the Stuxnet attack. That did not give them the capability to break everyone's remote updates, everywhere. At all.

    In the case of just cracking one phone, or even if they supplied a backdoor to the disk encryption, it might be similar to MS updates, kind of.

    Or kind of similar to server certificates.

    But the similarity there does remain: singular vulnerabilities found in those systems are very bad.

    And they are found and will be found.

    As for stealing certificates, even from major servers: if someone wants it badly enough, they will certainly get it.

    The best result of my thought experiment was Dirk Praet’s counterpoints that I can’t find the link to. One was the common worry that all kinds of governments will demand access to the system. The best one is one even laypeople could understand: every intelligence agency, organized crime group, and terrorist group in the world would target wherever the keys were stored. Saying one can securely protect them is betting against every hacker, spy, and thug in the world. An unwise bet to place. I finished that one up by pointing out that the U.S. government, that part anyway, consistently failed to protect its secrets from foreign spies. Why would they do this differently?

    This is resoundingly true.

    Only this “battle” is one that can be lost and the war won’t be finished. What they want are ubiquitous backdoors that give complete access, everywhere. That is the war. This is just one battle.

    Politically, if they win this, they may have some traction. People will, however, simply use separate encryption systems or destruction systems, if they feel the pressing need.

    The message will go out that American systems are backdoored. No choice or chance about it. That is a terrible message to send, and anyone who needs better security will get it.

    Cynically, the only security value the debate gives is the false impression that everything is not already eminently hackable.

    (I sincerely do not believe the pundits are aware of this, nor should such fools be made so, lol.)

    Observer February 18, 2016 10:37 PM

    @Rich Hairry Tomcat, etc, 'politics'

    Where do the candidates stand on encryption?

    http://politics.slashdot.org/story/16/02/19/0019218/where-do-the-presidential-candidates-stand-on-encryption

    In a divided election year, encryption brings parties together — against technology. That’s the sobering finding based on transcripts from the remaining presidential candidates, all of whom came out against cryptography and for government backdoors to varying degrees. It’s a testament to the post-Snowden era (and Apple’s fight against a court order to backdoor an iPhone) that every candidate has been asked about the issue multiple times, but only one candidate even acknowledged that backdoors cause great security concerns for the public.

    They are probably deciding to keep quiet to avoid annoying portions of the population, while the submitter probably likes the one who acknowledged the problem. Sorry. That aside. Disturbing.

    On your cynicism: I do think the general populace gets the problems. There are many problems in these regards deeply affecting core portions of the population. Then there is the remaining popularity of a wide variety of dystopian futures, which actually have a pretty wide audience.

    The rest probably suspect there may be government nerds who want to view their private parts, if they can. And they may instinctively be opposed to that possibility.

    Buck February 18, 2016 10:44 PM

    @Thoth

    In one stroke, John McAfee, if he manages to hack into the targeted iPhone, might save Apple from the mess of court orders while giving the FBI what they want, with a twist

    Hmmm… So far, I hadn’t actually considered the wild-card in all of this. Especially without that twist, I’ve been wondering where the NSA comes up in this; for now, here is my tentative list:

    • The NSA had absolutely no reason to suspect anything out of the ordinary for our suspects; and additionally, they do not make a habit of hoovering up everyone’s private keys (just in case)
    • The NSA could indeed have provided the decryption keys to the FBI, but for some reason they now choose not to do so
    • The contents of the phone have already been decrypted and shared with the FBI, but a legal precedent is still useful (with or without finesse/guidance/push from the NSA)

    I have no reason to believe any of the above is more likely than the others, but I have been left wondering how the FBI could possibly hope to come out ahead now, in light of everything else we know…

    Nick P February 18, 2016 11:11 PM

    @ Observer

    Remember, I agreed that the access level via key sharing was a greater risk than the comparable technologies. The gripe I have is that both types of software allow selective access. That's what technologists keep saying is impossible in the backdoor debate. Then they do it in other areas. So, I call it out.

    If anything, they should focus on the real risk: governments can’t protect the keys whose release can cause much more damage than terrorists.

    @ Buck

    It's worth thinking about such things, given there's a precedent. Remember that, pre-Snowden, the FBI fought court battles over various encryption schemes (including iPhones) talking like it couldn't access them. Snowden leaks showed the NSA had a way in, plus cooperated with the FBI and DEA via parallel construction. They couldn't use the capability unless they could 100% deny it via a different method of collecting the same evidence. They could be doing that now.

    That these things are actively being used to target Americans with due process bypass is also worth bringing up every time the topic comes up. The leaks show cooperation with all kinds of agencies and deception to courts. We don’t have to say that hypothetically we could be hit outside terrorism: it’s already happening.

    Peanuts February 19, 2016 12:33 AM

    The government back door advocates treat encryption software and what they perceive as its design and purpose as identical to that of ransomware.

    As if the cat can be stuffed back into the bag.

    One pundit tonight, Ralph Peters, said "encryption levels should be licensed", like munitions in his mind.

    Scary shit to hear really uninformed morons get an 8-12 million person bully pulpit without pushback.

    I hope the next president is tech literate, and not jussst literate enough to screw over us and the remaining generations of free people in perpetuity.

    Carson is the only one I've seen with enough neurons firing that I'd trust him, but he's currently down a bit in the polls. Sad.

    Seems rather hopeless with the current administration, which does not encourage healthy collaboration or innovation in the slightest.

    Peanuts

    Amber February 19, 2016 4:11 AM

    Surely all Apple need to do is IT-standard malicious compliance:

    They outsource the design of the opsys mods to an offshore team in India who then contact the Judge for his SPECIFIC user needs.

    Apple then programs exactly what the offshore team have had the judge sign off on as the fully detailed technical requirements.

    It’ll be as successful as most IT projects.

    65535 February 19, 2016 6:10 AM

    Since I am at the bottom of this thread and almost every angle has been covered, I'll just make some gut-level guesses. I'll go with Occam's Razor, which can be interpreted as stating that among competing hypotheses, the one with the fewest assumptions should be selected.

    @B

    “100% political; the FBI does not technically need Apple’s help here and Apple can technically comply with this order…”

    Maybe and maybe not.

    I don't disagree with Dan Guido that it can be done – but I am skeptical it can be done safely, secretly, and without leaking copies of the "root update patch via a USB cable" while in open court [many technical experts will be needed for Judge Pym and the defense attorneys to examine the actual process; this is an invitation for one of the many parties to make a copy of said "malware update" and use it in the wild]. Also, breaking the "Secure Enclave" chip will probably reveal sensitive trade secrets to said parties. I am against the Riverside Court's order.

    https://regmedia.co.uk/2016/02/17/apple_order.pdf

    As some people have mentioned, the FBI wants easy "economical access" to all encrypted devices, and to get it through a legal route [instead of parallel construction… which is used to varying degrees of success by the DEA, NSA, FBI, local police and so on].

    Although Apple has been shown to cooperate with the NSA and the FBI, it wants to shed that underhanded spying image, improve the "secure branding" of its products, and make more profits. Apple does cooperate as much as possible with the Government and even handed over Farook's iCloud data. That is logical.

    Tim Cook’s open letter:

    “..the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority… This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.”

    “The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge. Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government…” –Tim Cook [and/or his PR guys]

    http://www.apple.com/customer-letter/

    I agree with Tim Cook.

    This blatant over-reach by the Government is very damaging to the People, the Fourth Amendment, and Apple's business plan. It is a worthwhile fight. I also agree with Bruce and Congressman Ted Lieu.

    The court order mentions Verizon as a target – but surely Verizon must have handed over all of the CDRs and the associated metadata. The FBI knows who placed and received said cell phone calls and their locations.

    As to the many possibilities for other agencies and their competing motives that Nick P and Buck bring up – who knows.

    “The NSA had absolutely no reason to suspect anything out of the ordinary for our suspects; and additionally, they do not make a habit of hoovering up everyone’s private keys (just in case)

    "The NSA could indeed have provided the decryption keys to the FBI, but for some reason they now choose not to do so

    “The contents of the phone have already been decrypted and shared with the FBI, but a legal precedent is still useful (with or without finesse/guidance/push from the NSA)”

    https://www.schneier.com/blog/archives/2016/02/judge_demands_t.html#c6717428

    My hunch is the NSA wants to distance itself from this debate because it breaks the law with impunity and has been dox'd [and taken a beating in the process].
    The same goes for the DEA, who probably use parallel construction and extra-legal paramilitary activities all the time.

    Below is a compact summary by The Register: “Here’s a clear, technical Q&A”
    http://www.theregister.co.uk/2016/02/17/apple_iphone_5c/

    The twist of Mr. McAfee offering to break the iPhone is interesting. It provides an insight into who will help the Government and who will not.

    The last thing that could go wrong with this case – as others have pointed out – is the possibility that the FBI wins and then loses in the long run. Surely, Russia, North Korea, Iran, and other countries will demand a similar copy of the hacked iOS image, and the cat will be out of the bag, so to speak. All Americans with iPhones will then face the prospect of being hacked when traveling abroad [including politicians, lawyers, judges, banking executives and so on].

    I agree this court case is bad and should be fought against at every level.

    keiner February 19, 2016 6:10 AM

    Has Apple done it before?

    www.thedailybeast.com/articles/2016/02/17/apple-unlocked-iphones-for-the-feds-70-times-before.html

    Is this all just show to make Apple users feel safe? 😀

    ianf February 19, 2016 7:29 AM

    @ Peanuts […] hopes the next president is tech literate, and not just literate enough to screw over us and the remaining generations of free people in perpetuity.

    Is that an expression of your belief in Presidential Omnipotence? Because, clearly, you expect the next tenant of the White House @ 1600 Pennsylvania Ave., Washington, D.C., to personally unscrew something or other, rather than screw it down real hard for what sounds like Eternity. Because, as Dear Leader, that’d be in his/her purview and responsibility. Is that what those elections are all about… strange people, Americans.

    @ Amber proposes that “Apple engages in IT-standard by-the-book compliance: outsource the design of the requested mods offshore to India, have that team ask the judge for SPECIFICS; then program EXACTLY what she signed off as the fully-detailed technical brief.”

    Apple wouldn’t offshore such a sensitive iOS design, even a one-time backdoor, but they could well afford to import and establish a special temporary division of call-center-like star programmers right there in California for that very purpose (and also save a lot on telephone charges). It could work. Is someone from Apple’s board of directors reading this? Seems like they ought to be.

    Curious February 19, 2016 7:53 AM

    I wonder, what is Apple’s relation to NSA in all of this? Presumably Apple would be totally subservient to NSA, assuming it makes sense that NSA has dealings with Apple one way or another.

    Would it be unreasonable of me, to think that NSA already has commanded Apple every which way to undermine every possible aspect of Apple security generally speaking?

    I don’t own any Apple products, but I can’t help being a sceptic and thinking that Apple’s relation to national security and the NSA is as interesting as its relation to law enforcement and the FBI.

    Nick P February 19, 2016 8:59 AM

    @Nick P

    Yes, I agree with your sentiment on that. I agree that it is important to be entirely ‘to the point’ and as accurate as possible in this debate.

    Very good direction, apologies for being unclear on that.

    @RSA

    Also agreed on yours. I was very impressed by your point, and attempted to note this by observing that people who know phones call them handsets, as you did, and may have difficulty calling them “phones” or “smart phones”.

    Observer February 19, 2016 10:27 AM

    On the articles posted by Bruce:
    https://www.lawfareblog.com/not-slippery-slope-jump-cliff
    https://www.lawfareblog.com/apple-selling-you-phone-not-civil-liberties
    https://benlog.com/2016/02/18/on-apple-and-the-fbi/
    http://www.nydailynews.com/news/national/apple-unlocked-70-iphones-refusal-article-1.2536178

    I liked the lawfareblog ones, and there are some solid ‘devil’s advocate’ positions there. I can be a privacy advocate, but I am, by trade, a hacker. I have found security vulnerabilities in smart phones before. From that perspective, I prefer the government – the FBI, domestic law enforcement, specifically – to hack the phones themselves.

    Three main reasons for that:
    1. the cost, as Nicholas Weaver pointed out, which I am very well aware of and have been saying anonymously and privately for years. This does help ensure it is used sparingly.
    a. a factor not mentioned is that if the security vulnerability is discovered, the attacker has a very good chance of losing that vulnerability permanently. The same factor applies to using the latest and greatest surveillance tech on mundane cases, or cases where the target has known capabilities to discover and reuse or otherwise disseminate such technology. (Something, to a degree, almost anyone has these days, via youtube, twitter, facebook, etc.)
    2. telling everyone you are mandating backdoors in American software and hardware products says “do not buy American”. Eventually, that is guaranteed to hurt the American economy. In general, that is the worst message to send to the global buying audience.
    3. telling targets you are mandating backdoors makes those backdoors useless
    a. to this, some say there is some preventive value in letting people know they are being watched: it causes them to behave better. That may fit some scenarios, but not these scenarios. These scenarios call for secret surveillance. Otherwise, those who need to hide their data will simply use other software, other hardware. It won’t stop them from doing what they are doing, just push them even further underground, and will largely only affect the everyday citizenry.

    On principles:
    1. this is not how free nations should operate; with these moves they are leaving free-nation principles behind and moving towards totalitarian ones

    These nations may have been initially founded on race. That is not how they are anymore. Now, they are founded on sublime principles. Principles which require upholding to maintain their existence.

    That is first and foremost where the authorities of these nations should remain vigilant. No one should need to remind them of the authority to say this, because it is very clear in the founding documents of any such nation.

    All of the authors talked about how this was political and a slippery slope.

    There are finer points, not mentioned, as well:

    1. the FBI, like true intel organizations, has the capability to hack the phone; personally, my main concern is that they hack, say, presidents, congresspeople, corporate VIPs. As it stands, ‘if the vendor did not help us do it, we did not do it’ remains an excuse for them. That should be taken away. It is not true now, and would not be true. You should assume compromise. You should also assume abuse. Otherwise, you may not make plans to detect it, and to deal with it.
      a. pretending they cannot is disingenuous and deceptive, which sends a bad message to society about law enforcement, who should be shrewd, but not disrespectful of rightful laws
    2. one poster mentioned ‘this may enable foreign governments to hack people’s phones when they are visiting their country’. Correction: they can hack your phone now, if you are domestic, or then, if you are in their country. They have no legal obligation to follow domestic US laws. Nor, in that case, does US intelligence.
    3. phones may be hacked by OTA updates via the vendor, i.e., the ISP/telco, because carriers have root-level applications on the phone by default. They may also be hacked by the manufacturer, which likewise has root-level applications on the phone. Further, phones have a very wide attack surface, and while it may be difficult to find certain types of vulnerabilities (‘all-in-ones’), finding vulnerabilities that obtain user-level access is not that hard; finding privilege-escalation vulnerabilities is also not that hard; chaining user access with privilege escalation, then, is a pretty easy route, and you can expect all major governments to have that
      a. in general, getting a MITM position on phones is not difficult: stingrays can be built or bought by people with little money, and it does not require much technical expertise because the software is off the shelf (and in fact open source). “Owning” legitimate base stations is not a feat either. Anyone can see they are all over the place and have little security. And governments and telcos effectively own all base stations anyway.
      a1. once MITM is achieved, the attack routes for system compromise are many; strong implementation of certificate systems can help, but that is far from perfect and there are countless third-party applications

    Marcos El Malo February 19, 2016 1:10 PM

    @keiner
    A previous comment contains a link that debunks the story of Apple cracking iPhones for the government. I encourage you to read the thread before posting.

    @Amber — death by management! That’s a great idea! The judge specified neither a time-frame nor how many engineers Apple must devote, giving Apple an opportunity to drag this out if the appeal doesn’t go its way.

    However, the next danger would be if the court appoints a compliance monitor, basically a government overseer to ensure that Apple is putting all possible effort into the project. At this point, the court effectively controls the company — a horrible prospect. As horrifying as it is, it follows logically from the court’s decision. And there is precedent for this, although one would need a very tortured interpretation of the precedents.


    Regarding whether the FBI already has the info from the NSA, it might be that the FBI doesn’t want to ask because they are rival bureaucracies (in addition to the evidentiary problems). I would hazard a guess that the FBI absolutely hates being dependent on the NSA.

    While we absolutely don’t want the FBI to become a vassal organization to the NSA, shifting the nation closer to totalitarianism is a terrible way to do it.


    In my humble opinion, the FBI needs to focus on domestic terrorism. They really don’t need additional powers to fight that, because 1) domestic terrorists broadcast their intentions and plans on social media and 2) they are easy to infiltrate and/or disrupt. The Cliven Bundy gang was stupid, but not exceptionally so.

    State Security Trumps All February 19, 2016 1:41 PM

    However, the next danger would be if the court appoints a compliance monitor, basically a government overseer to ensure that Apple is putting all possible effort into the project. At this point, the court effectively controls the company — a horrible prospect.

    I would call this prospect the normalization of the Stasi/PRISM agent with the faked resume on staff at the company.

    Sancho_P February 19, 2016 3:24 PM

    Re: McAfee

    Oh yes, then he will start the show by saying:

    “Sorry folks, sorry your honor, now that the phone is bricked.
    But someone is happily helping me with the shoes, right here in the studio, big applause, here he is: Tim Cook ! ”

    🙂

    +++

    Re: Installing a special SW/FW without the consent of the device’s owner:

    Apple has lost the case anyway.
    An unauthorised update is a game stopper.
    It may stop any and all phones from working. Error 53+ ?

    A cyber-stopper, a kill switch for any “friendly” nation? NK, China, Russia, …?

    And wait until the kids find out, it will strike back to the US.

    Observer February 19, 2016 3:32 PM

    @Evil Kiru

    Cringely article:

    Though I have not much followed the Supreme Court, I noticed in this article about Scalia that he was actually strong on individual rights, à la libertarian:

    http://arstechnica.com/tech-policy/2016/02/through-the-ars-lens-looking-at-justice-scalias-opinions-dissents/

    Weak on personal rights like sexual choices, though.

    I noticed some commentators also claimed that Scalia probably wouldn’t have voted against the FBI here – that it wouldn’t be like him.

    If the above ars article accurately portrays his viewpoint, then he certainly would have.

    If this were gay marriage or the right to ‘fuck as you please we don’t need cops in the bedroom’, no.

    Daniel February 19, 2016 3:48 PM

    For those who are keeping track, two different law professors have joined the fray.

    Michael Dorf has this post

    http://www.dorfonlaw.org/2016/02/apple-fbi-and-all-writs-act.html

    in which he argues Apple should lose…

    and Orin Kerr has this post….

    https://www.washingtonpost.com/news/volokh-conspiracy/wp/2016/02/18/preliminary-thoughts-on-the-apple-iphone-order-in-the-san-bernardino-case-part-1/

    wherein he argues that it is much too soon to tell who has the better argument.

    Nathanael February 19, 2016 4:07 PM

    In business terms, Apple cannot comply; doing so would destroy their business. Their options are to win the case or to go overseas and shut down all US operations (apart from mailing phones to the US, and daring Customs to intercept them).

    CallMeLateForSupper February 19, 2016 4:56 PM

    @Bruce
    “The FBI is demanding that Apple give them free engineering work.”

    That comes as news to me. Of the dozen or so articles I’ve read about the FBI putting the bite on Apple, only one mentioned cost, and that was in passing and not dollar-specific.

    All Writs sucks pond scum enough; the recipient of such an order should not also get the double-whammy of eating the cost of complying.

    (But all the hoopla is a waste of good wrath, right?, because McAfee’s walk-on-water whiz-bangs will have that puny phone popped any minute now.)

    Jordan February 19, 2016 5:24 PM

    Don’t you have to unlock the phone before you can install new software? Isn’t that Security 101?

    panax February 19, 2016 7:55 PM

    The Apple ID password for the San Bernardino phone was changed within 24 hours after the government gained possession.

    http://www.techinsider.io/apple-the-fbi-screwed-up-san-bernardino-investigation-2016-2

    “The fact that the password was reset means that Apple was unable to retrieve info from the iPhone’s unencrypted iCloud backup like it has for past investigations, according to reporters Apple spoke with. If the password hadn’t somehow been reset while in law enforcement custody, the FBI likely wouldn’t need Apple to create a tool that lets it brute force hack the iPhone’s lock screen passcode and gain access to the device’s encrypted contents.”

    http://abcnews.go.com/US/san-bernardino-shooters-apple-id-passcode-changed-government/story?id=37066070

    “Apple could have recovered information from the iPhone had the iCloud password not been reset, the company said. If the phone was taken to a location where it recognized the Wi-Fi network, such as the San Bernardino shooters’ home, it could have been backed up to the cloud, Apple suggested. ”

    “A federal official familiar with the investigation confirmed that federal investigators were indeed in possession of the phone when the reset occurred.”

    It seems it was the employer who reset the password.

    http://gizmodo.com/the-san-bernardino-terrorists-icloud-password-was-accid-1760158613

    Niko February 19, 2016 8:45 PM

    @Sally Shears

    If Apple were honest, they would admit they designed iOS 8 specifically to defeat a US warrant or legal order. Whether that’s good or bad depends on how much you trust law enforcement, but all this talk about criminals is a red herring. It would be fairly easy to design a phone and corresponding “backdoor” that only worked if you had physical access to the phone. In fact, that appears to be the case with pre-iOS 8 phones, as they required LEOs to ship the phone to Cupertino. Identity thieves aren’t going around stealing individual phones. You alluded to the OPM hack. Hacking centralized databases with thousands to millions of records is a far more efficient way for criminals to get your data. The other issue is lost phones. If you have to hack Apple to get at their “backdoor firmware”, that should rule out script kiddies or whatever random person happens to find your phone and tries to pawn it.

    Observer February 20, 2016 12:11 AM

    @Sky

    Didn’t the NSA already have backdoors in place on Apple devices?

    The most likely form of backdoor the NSA would have is one (or more) they have found: unintentional vulnerabilities.

    It is possible they have been able to put in intentional vulnerabilities.

    This would be unlikely to be available to the FBI, and it may very well not be designed to grab data at rest anyway.

    Grabbing data at rest is mostly a law enforcement concern in this situation.

    (The difference is subtle. Basically, an unused, unlocked phone versus one being unlocked and used regularly.)

    Technically, the FBI does have the capability, at least to hire consultancies, to crack the phone.

    Demanding Apple do this is a political move to try to get closer to having whatever backdoors they want, wherever they want them.

    They would prefer to have mandated backdoors everywhere, in everything.

    Which is absolutely not acceptable outside of a North Korea or Saudi Arabia.

    Observer February 20, 2016 12:35 AM

    @Niko

    If Apple were honest, they would admit they designed iOS 8 specifically to defeat a US warrant or legal order. Whether that’s good or bad depends on how much you trust law enforcement, but all this talk about criminals is a red herring. It would be fairly easy to design a phone and corresponding “backdoor” that only worked if you had physical access to the phone. In fact, that appears to be the case with pre-iOS 8 phones, as they required LEOs to ship the phone to Cupertino. Identity thieves aren’t going around stealing individual phones.

    I disagree, partly, and here is why:

    This is not an “I hate cops” situation; otherwise American intelligence would not be coming out for Apple, which [effectively] they are. While that may be a “NOBUS” scenario, it is also true that Americans travel, a lot. And their phones can be purloined, temporarily or permanently, while overseas by foreign intelligence. Even in a friendly country, an adversarial country can often reach the phone quite easily.

    And with many friendly countries, there is no treaty.

    The older phones should not be thought of as “being designed for easy access”, per se; rather, they simply were older phones and did not have the same security foresight put into them.

    Some have mentioned how Apple has enforced end-to-end encryption and data-at-rest encryption since the Snowden files were released, and how that was a publicity move. Yes, it was, partly, but it is also a natural transition. What causes companies – more than anything else – to focus more on security, historically? Released vulnerability information. And that is exactly, effectively, what the Snowden files were.

    No one is saying “script kiddies” here. Though script kiddies, by definition, historically, can “push-button” exploit kernel-level vulnerabilities someone else has already found just about as easily as user-level ones.

    The major “criminal” threat that does exist, the everyday one, is highly organized, well-funded organized crime… and nation-state-sponsored activity. Which might literally come through plausibly deniable organized-crime actors, or simply through foreign intelligence.

    Further, encrypting hard drive systems for data at rest is now standard: across Windows, across Macs, across Linux. A major driver for this is: portable devices routinely have sensitive information on them, and are routinely either lost or stolen.

    It is very easy to misplace or have a phone stolen.

    We used to see cases, almost daily, where a laptop was stolen with sensitive information on it, and that information had to be assumed given up.

    Now, if a laptop or phone is stolen, the data is often presumed protected, despite the theft.

    Exactly because of these protections.

    Which is more common? Murder cases or terrorist cases where the remaining information is on that phone, or lost phones with sensitive data? By far, the latter. Even though many murder cases have been cracked because of data that was left on computers.

    IMO, the best course of action for the FBI is to obtain more funding for high level hacking, and implement that. And keep their mouths shut about what they can or can not crack.

    There are plenty of areas where they should open their wide mouths and talk and argue; this is not one of them. They are best keeping their heads down, and putting money and resources towards being competent at hacking. They do have a legal right to do this in confined instances. This is certainly one of them.

    The problem may be that they are “merely” one division of what is ultimately a department of lawyers. Politics is the major guiding factor here. Lawyers do not think like intelligence. One editor responded that the FBI is likely at odds with its parent department, the DoJ; this is certainly untrue. The DoJ is obviously guiding them on this.

    It is shortsighted and anti-intelligence.

    Observer February 20, 2016 12:41 AM

    ^^

    The problem may be that they are “merely” one division of what is ultimately a department of lawyers. Politics is the major guiding factor here. Lawyers do not think like intelligence. One editor responded that the FBI is likely at odds with its parent department, the DoJ; this is certainly untrue. The DoJ is obviously guiding them on this.

    And, yes, I am very aware that Donovan and some other early leaders of CIA were lawyers, and leaders of FBI counterintelligence have been lawyers. So not that lawyers are necessarily far from the action of intelligence, lol, but simply that lawyers and intelligence are often very much at odds.

    Niko February 20, 2016 1:15 AM

    @Observer

    While many Americans may travel overseas, few of them are important enough to be targeted by foreign intelligence services. For the few who are, get two phones/laptops: one you use as your primary phone at home and another you use when you travel in hostile countries.

    Observer February 20, 2016 1:43 AM

    @Niko

    While many Americans may travel overseas, few of them are important enough to be targeted by foreign intelligence services. For the few who are, get two phones/laptops: one you use as your primary phone at home and another you use when you travel in hostile countries.

    First of all, I have a problem with anyone calling themselves “Niko”, because to me, that is a lanky, beautiful German chick who was a good friend of Andy Warhol. But she spelled her name with a “c”, not a “k”. That aside – I assume you cannot drink as much vodka as she could, god rest her soul…

    https://en.wikipedia.org/wiki/Nico

    That is great advice, but outside of intelligence, VIPs won’t do that.

    They are babies and don’t take threats seriously.

    And you are mistaken. “Hostile” countries… China is hostile? China is the US’s greatest trading partner. Israel is hostile? Israel is the US’s firmest ally in the Middle East. Germany or France is hostile? What about Japan? These are allies. And they will spy on our VIPs.

    People think that if you are in… say, the UK… that is OK. Because China or Russia won’t hit your people there. Or come from there.

    But it is standard for intelligence to use intermediaries.

    If you want to meet a spy – an agent, as the CIA calls them – from a major nation, you meet them foremost in a third-party country, not in their own country.

    But, hey, believe what you want to believe.

    Everyone does.

    There is no truth, because people believe what they want to believe, based on their preferences.

    Buck February 20, 2016 3:20 AM

    @Observer

    (Going by many of your other comments here, I do believe that we are in general agreement on this particular point of discussion. Regardless, please allow me to elucidate my confusion about the following quote from you — even if only for the sheer sake of keeping discussion alive 😉)

    This would be unlikely to be available to the FBI, and it may very well not be designed to grab data at rest anyway.

    No qualms from me about the latter clause (although it does raise some other good questions)…

    Yet, hasn’t EO 12333 specifically been interpreted to supposedly take control over exactly this sort of case? Not the sharing of valuable vulnerabilities of course, and I can understand the reasons for that… But, the data itself?

    It’s not like that would ever have to be presented in court (unless the DOJ is now trying to get precedent to sentence dead bodies). More realistically, they may have suspected that others were involved – but, if that were true, those individuals are probably already under strict surveillance and left without any ability to act dangerously.

    Now, what my most cynical-self suspects is that the PCLOB is finally about to take a look at yet another publicly-known authority, and those authorities feel that a replacement for this intel-source must be legitimized somehow; but unfortunately, such things tend to require a crisis for action. My optimist-self hopes it doesn’t have to come to that (and that you’re actually wrong about one of your other ideas)…

    Curious February 20, 2016 4:30 AM

    “Secret Memo Details U.S.’s Broader Strategy to Crack Phones” (19. Feb 2016)
    http://www.bloomberg.com/news/articles/2016-02-19/secret-memo-details-u-s-s-broader-strategy-to-crack-phones

    “In a secret meeting convened by the White House around Thanksgiving, senior national security officials ordered agencies across the U.S. government to find ways to counter encryption software and gain access to the most heavily protected user data on the most secure consumer devices, including Apple Inc.’s iPhone, the marquee product of one of America’s most valuable companies, according to two people familiar with the decision.”

    Maybe the White House is responsible for the FBI wanting Apple to develop a break-in tool; if Apple had been OK with that decision in secret, then the US government wouldn’t have developed a backdoor, but Apple would have.

    Observer 3 February 20, 2016 5:11 AM

    @Curious

    Apple had asked the F.B.I. to issue its application for the tool under seal. What does this mean, is this true, and where/how did the NY Times get this information? It sounds a little weird to me, with Apple allegedly having asked the FBI this way – or have I misunderstood the context perhaps?

    It means that Apple was fully willing to assist, provided the FBI kept quiet about Apple’s cooperation in the matter. Page 30 of the PDF document below shows that it was kept “under seal”.

    http://www.wired.com/wp-content/uploads/2016/02/SB-shooter-MOTION-seeking-asst-iPhone.pdf

    MarkH February 20, 2016 6:03 AM

    A naive question, from someone who doesn’t know the handset internals (apologies if this was already answered):

    Why doesn’t the FBI make a forensic copy of the handset flash?

    I suppose that the FBI may have judged that the risk of physical damage to the hardware is too great. But if this could be safely done, then it would open two lines of attack:

    SIMPLE ATTACK

    Rig an externally controllable flash circuit to the handset, in order to exhaustively search the passcodes.

    Variant A: Flash is externally writable

    A1. If the “failed passcode attempts” counter can be located on the flash, reset the counter to zero after each try.

    A2. If for some reason it is a problem to reset the attempts counter, restore the flash after every n attempts, where n is of course not more than the threshold for flash erasure, and is chosen to minimize the time per average attempt.

    Variant B: Flash circuit disables writing from the handset

    In a typical flash, this would require interception of special sequences used to trigger writing to the flash. Flash erasure (and probably updating of the failed passcode attempts count) by the OS would no longer be possible.

    According to multiple press accounts I saw, the passcode search space is 10,000. If this is true, any variant of the simple attack could unlock the handset within a few weeks.
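
    To make the control flow concrete, here is a toy simulation of the simple attack in C. The handset is mocked in software, and the rig’s counter reset (Variant A1) is a plain function call; the real hardware interfaces are of course unknown to me, so treat every name here as hypothetical:

        /* Toy model of the "simple attack": the rig rewrites the
         * failed-attempt counter in flash after every try, so the
         * ten-try erasure threshold is never reached. */
        #include <stdio.h>
        #include <string.h>
        #include <stdbool.h>

        #define ERASE_THRESHOLD 10
        static const char SECRET[] = "7341"; /* the unknown 4-digit passcode */
        static int fail_counter = 0;         /* lives in flash on a real phone */
        static bool erased = false;

        static bool handset_try(const char *code) {
            if (erased) return false;
            if (strcmp(code, SECRET) == 0) return true;
            if (++fail_counter >= ERASE_THRESHOLD) erased = true; /* key gone */
            return false;
        }

        /* Variant A1: the rig rewrites the counter's flash cells to zero. */
        static void rig_reset_counter(void) { fail_counter = 0; }

        int main(void) {
            char code[5];
            for (int pc = 0; pc < 10000; pc++) {
                snprintf(code, sizeof code, "%04d", pc);
                if (handset_try(code)) {
                    printf("unlocked with %s after %d tries\n", code, pc + 1);
                    return 0;
                }
                rig_reset_counter(); /* defeat the erasure countdown */
            }
            puts("search space exhausted");
            return 1;
        }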

    SOPHISTICATED ATTACK

    According to more than one comment above, the encryption key is derived in part from a “chip ID”, embedded I presume in the CPU.

    The boot code space of the flash might be loaded with a small program which either reads out the chip ID, or if this is not feasible, replicates the key derivation function over the set of possible passcodes.

    This would require knowledge of the key derivation function, which I suppose could be reverse engineered without great difficulty.

    Once the special boot code has been executed, the resulting keys can then be tried on the forensic image.
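
    For the sophisticated attack, the off-device search might look like the following sketch. I must stress the assumptions: the stand-in KDF here is PBKDF2 from OpenSSL, purely for illustration – Apple’s actual derivation function is entangled with hardware on the phone and is not public – and the UID value is made up:

        /* Offline key search over the 10,000-passcode space, assuming the
         * chip ID has been read out and the key derivation function has
         * been replicated. Illustrative stand-in only.
         * Build: cc kdf.c -lcrypto */
        #include <stdio.h>
        #include <openssl/evp.h>

        static const unsigned char UID[] = "0123456789abcdef"; /* hypothetical chip ID */

        static void derive_key(const char *passcode, unsigned char key[32]) {
            /* Stand-in KDF: PBKDF2-HMAC-SHA256 salted with the chip ID. */
            PKCS5_PBKDF2_HMAC(passcode, 4, UID, (int)(sizeof UID - 1),
                              10000, EVP_sha256(), 32, key);
        }

        int main(void) {
            char code[5];
            unsigned char key[32];
            for (int pc = 0; pc < 10000; pc++) {
                snprintf(code, sizeof code, "%04d", pc);
                derive_key(code, key);
                /* here: try the candidate key against the forensic image */
            }
            puts("derived 10,000 candidate keys");
            return 0;
        }

    Each candidate key is then tried against the forensic copy, never the handset itself, so the attempts counter never comes into play.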


    Now, I can imagine non-technical reasons why the FBI would prefer to force Apple to unlock the handset. But is there any technical reason why the FBI can’t use attacks like those above, in order to retrieve the data — without help from Apple or anybody else?

    If anybody here can provide a link to an article which already answers this, or knows the handset internals sufficiently to explain what the obstacle is, I will be eager to learn!

    Curious February 20, 2016 7:06 AM

    @Observer 3

    Not being a journalist or anyone who would keep track of all the details and intricacies of this situation between Apple and the FBI, I can’t help but wonder: if Apple initially thought it was basically OK to help the FBI break into the iPhone (or perhaps some version of it) as long as nobody got to know about it, then perhaps the White House wanted to throw Apple under the bus, so to speak, if Apple didn’t fully cooperate with the FBI, or didn’t publicly accept the idea of providing what amount to manufacturer’s backdoor tools for their electronics.

    I can’t help feeling that I am stretching for some kind of great scandal idea here, but I like being cynical, so I tend to assume the worst of people. Though it is presumably more or less obvious whether an idea like this makes good sense once all things are taken into consideration.

    Curious February 20, 2016 7:13 AM

    To add to what I wrote:

    Imagine the awkward situation any corporation would be in if they ever had hidden deals with the government. Simply by assuming that a company’s global trustworthiness and popularity are at stake with any revelation of, or admission to, directly undermining their own security by cooperating with a government, I think a government could use that predicament to basically blackmail a corporation into doing whatever it wants, whenever it wants. 🙂

    Clive Robinson February 20, 2016 8:51 AM

    The issue has several aspects that are not really being talked about.

    Firstly, that “expendable DNA” is secure against rubberhose / moonshot injection / thermorectal / $5 wrench cryptanalysis, in the good old-fashioned “dead men don’t talk” method of secret keeping.

    But perhaps more importantly we have,

    1, The phone user (deceased).
    2, The phone owner (user employer).
    3, The FBI (holder of the phone).
    4, Apple.

    Now it’s fairly clear that the user is a dead end as far as enquiries go. So the FBI had two options: sanction the owner or Apple.

    From a technical viewpoint the sensible option would be to approach Apple…

    But instead the FBI went to the phone owner, got the paperwork sorted out, and then it all went horribly wrong for some reason that has not been disclosed…

    So the FBI wants Apple to pull their a55 out of the fire…

    What is unclear is whether the state of the phone is known; the FBI/owner muckup may well have already lost the key…

    Either way, for evidentiary reasons the phone is “toast” due to the FBI muckup, because the “evidence has been tampered with” in a way that could be challenged in court should it ever be used that way. Thus it is now decidedly unclear whether the phone is of any use anyway for the supposed FBI purpose of gathering evidence (though if not badly corrupted it may still be of use for intel purposes).

    Thus, knowing what was at stake, people should be asking: ‘Why did the FBI make the very bad choice in the first place of using the owner’s deficient “technical resources”?’

    The answer to that question might shine a very powerful light on a can of worms the FBI might not want to see the light of day…

    New Update February 20, 2016 10:26 AM

    Albert Gidari, Director of Privacy at Stanford Law School, has written a very interesting article quoting Section 1002(b)(1) of CALEA. It means the government’s argument is useless:

    (1) Design of features and systems configurations. This subchapter does not authorize any law enforcement agency or office

    (a) to require any specific design of equipment, facilities, services, features, or system configurations to be adopted by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services;

    (b) to prohibit the adoption of any equipment, facility, service, or feature by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services.

    John F February 20, 2016 11:46 AM

    To what extent does the Fourth Amendment to the Constitution have a bearing on this issue? It would seem that an iPhone is an ‘effect’ as intended in the Amendment.

    Buck February 20, 2016 2:01 PM

    IANAL, but I am under the impression that the CALEA statute does not apply to a company such as Apple. At least, they are not considered to be a “manufacturer of telecommunications equipment”.
    From 47 U.S. Code § 153 – Definitions [my emphasis added]:

    (52) Telecommunications equipment

    The term “telecommunications equipment” means equipment, other than customer premises equipment, used by a carrier to provide telecommunications services, and includes software integral to such equipment (including upgrades).

    Clive Robinson February 20, 2016 3:27 PM

    @ Buck,

    The term “telecommunications equipment” means equipment, other than customer premises equipment,

    What you are quoting is badly worded and meant originally for POTS equipment, often called “land lines”.

    In Europe, where GSM originates, the differentiator is known as the “demarc”, short for “demarcation point / barrier”. In essence, on the line side is “service provider” equipment and on the other side “customer premises equipment”.

    For obvious reasons, with mobile phones there is neither a line nor premises, and technically the “demarc” occurs inside the mobile phone. Thus the radio equipment, signaling equipment, SIM etc. are under the control of the service provider, not the service subscriber.

    The fun starts when there is no physical demarc like a socket, which is the case with “smart phones”: some have separate CPUs and OSs for the corresponding sides of the demarc, others don’t. It’s not clear-cut in the case of the Apple phone, as some CPUs etc. are shared.

    I’m assuming that, as the call has terminated and the records are on the user end of the comms chain, if pushed the judge would probably decide that the data is on the “user side” of the device, not the service provider side, and Apple would have to make a convincing argument otherwise.

    The mistake Apple has – supposedly – made is not to have built the try counter and corresponding try timeout in hardware. Something they will no doubt sort out in the very near future.
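
    To see why hardware matters here, change one line of the toy model in MarkH’s comment above: if the failure counter is a monotonic count in dedicated silicon rather than a value in flash, there is simply no reset path for the rig to exercise, and the ten-try budget really is ten tries. Again, this illustrates the design point only; it is not a description of Apple’s actual Secure Enclave:

        /* Toy contrast: a monotonic hardware counter survives any NAND
         * rewrite, so the flash-restore trick from the earlier sketch
         * has nothing left to restore. Illustrative only. */
        #include <stdio.h>
        #include <string.h>
        #include <stdbool.h>

        #define ERASE_THRESHOLD 10
        static const char SECRET[] = "7341";
        static unsigned hw_counter = 0; /* monotonic register: no reset path */
        static bool erased = false;

        static bool handset_try(const char *code) {
            if (erased || hw_counter >= ERASE_THRESHOLD) return false;
            if (strcmp(code, SECRET) == 0) return true;
            if (++hw_counter >= ERASE_THRESHOLD) erased = true;
            return false;
        }

        int main(void) {
            char code[5];
            for (int pc = 0; pc < 10000; pc++) {
                snprintf(code, sizeof code, "%04d", pc);
                /* the rig's reset_counter() now has nothing to write to */
                if (handset_try(code)) { printf("unlocked: %s\n", code); return 0; }
            }
            puts("locked out for good after 10 failed tries");
            return 1;
        }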

    deLaBoetie February 20, 2016 4:30 PM

    There are some analogies with the (vague, incomprehensible, awful) provisions in the UK’s draft Investigatory Powers bill. That could certainly be read as requiring any manufacturer or software vendor whose product has communications capability (although the Apple case is about how to access device-local data) to assist government requests to modify the software to allow hacking.

    Coupled with a draconian gag order, and not even a promise to recompense you the full amount for your trouble.

    But here’s the rub – there is no way the government will reimburse you for the reputational damage and the corrosive loss of sales, particularly outside US jurisdiction. What’s even worse is that the gag orders will rightly cause customers to assume that there is such a hack, and your company is damaged whether it did or didn’t compromise its systems. In any case, the truth will out sooner or later, so the (possibly fatal) damage to reputation is assured.

    What’s very weird about this case is:

    1) The lifting of the seal on the court requests

    2) Why isn’t it obvious that the request is burdensome? We’re talking about the most valuable global company, whose brand is a major part of that value, and about something which significantly damages that brand. How is that not burdensome? The direct cost of doing the bodge is peanuts by comparison.

    Buck February 20, 2016 5:06 PM

    @Clive Robinson

    What you are quoting is badly worded

    No doubt about that! I understand your point about the “demarc” and it is compelling.
    For the benefit of others’ understanding, there is still another relevant definition in this code [again with my emphasis added]:

    (16) Customer premises equipment

    The term “customer premises equipment” means equipment employed on the premises of a person (other than a carrier) to originate, route, or terminate telecommunications.

    It’s not really clear to me what ‘originate’ and ‘terminate’ could mean in this context. Are these terms well-defined through case-law? Have any precedents already been set in applying CALEA to ‘handset’ manufacturers? Same question again, but this time for ‘smart’ phones?

    Skeptical February 20, 2016 5:45 PM

    So much for the “security for all or for none” argument. Let’s suppose that what the Justice Department is asking for is feasible. Then the vulnerability in the model and OS of the phone in question exists. Only the good graces of Apple (that stalwart defender of human rights and freedom, who so bravely rushed into the PRC to both make and sell their products) and its security measures prevent the vulnerability from being exploited.

    According to the “a door for someone is a door for anyone” argument, this means that we should expect this vulnerability to be exploited, and widely exploited, very soon. It means that what the court does here should be of little consequence.

    Let’s put this even more starkly: the argument implies that it is irrelevant whether a company is bound to assist a government in exploiting a vulnerability. After all, once there is a door, the game is over.

    Unless of course, the theoretical existence of a vulnerability isn’t enough. Unless suddenly the protections offered by an entity safeguarding certain key material and technology essential to the exploit actually can matter in assessing the security of a system.

    And this leads us to the unpleasant conclusion that – yes, I’m sorry – the “lawful access” debate is not quite so open and shut. If Apple’s proprietary code and signatures constitute significant security, then “lawful access” is not a question that can be answered in the abstract.

    Apple has put on a magnificent con. This is the same Apple that had no qualms about cutting the jobs and living standards of its employees in its quest for cheaper labour – never mind if its new workers lack legal protections and work under conditions that would infuriate any of you. This is the same Apple that has no qualms about entering markets where it will fall under the laws of authoritarian and anti-democratic governments – none of which will care in the least what the US federal courts think can be demanded of Apple. A company in China is subject to the laws of China.

    For Apple to wrap this misbegotten marketing campaign in protestations of its love for democracy and its patriotism is, quite frankly, sickening. This is a multinational company that regularly pushes the outer limits of the law to maximize its profits. It has loyalties to no one. Its highest value is better denominated in currency than described in the language of ethics.

    Let’s be clear here.

    What the court has ordered is not a departure from existing law. Courts have long held the power to compel the reasonable assistance of third parties connected to a matter in order to ensure that court orders have effect. This is a product Apple designed, to which Apple holds the particular knowledge requisite to enable the unlocking of that product; this is a product that is the subject of an entirely lawful investigation, in which the public interest in gaining access to the product is compelling.

    This has no bearing on what China or Russia will demand of Apple. What they demand has nothing to do with the US courts.

    Instead, this absurd theatre, with Apple casting itself as the champion of human rights, is the end-product of the application of a simplistic ideology to complex policy problems combined with the craven pursuit of profit. So far as I can tell, Apple has likely wasted thousands of hours of work by those working diligently to investigate a terrorism case, to say nothing of the time and energy it has detracted from an already overburdened judicial system. What Apple is doing neither improves security for anyone, nor facilitates a reasonable discussion.

    Support for Apple in this instance seems so obviously reactionary that one wonders whether anyone will have the courage to admit a mistake. Principles can be misapplied, and good arguments can be misused. The civil liberties community is in danger of sacrificing a fair amount of credibility in so reflexively embracing Apple’s asinine misadventures.

    Mark Mayer February 20, 2016 6:22 PM

    @Curious
    It’s my understanding that confidential filings are the norm for cases like this and not at all uncommon in non-investigatory cases (such as garden-variety civil litigation between tech companies). It happens in two ways: the parties mutually agree or, absent agreement, one party asks the judge to put the documents under seal. It’s very common for judges to grant this unless there is a compelling reason not to do so. Sometimes third parties, such as journalists, petition to have court filings unsealed in the public interest. Anyway, this is not the smoking gun some people believe it to be, and it’s rather thin evidence that Apple was willing to comply in secret.

    @Buck
    Originate = make a call
    Terminate = receive a call (or maybe hang up?)
    Think of it in terms of end points.

    @Skeptical
    It’s important to note that the vulnerability the FBI wants doesn’t exist. The DOJ is demanding that Apple re-engineer its product to introduce the vulnerability. I think the distinction between whether it’s feasible to create a vulnerability and whether one is known to already exist is an important one. I’m not sure where your argument goes if you make that distinction.

    The thing that is really unprecedented here is the idea that the DOJ can compel Apple to create a forensic/hacking tool that damages the functionality of Apple’s products. They’re not compelling Apple to hand over information; Apple and every other U.S. company already does that when the FBI or another LE agency shows up with a warrant or court order. On earlier models Apple could unlock the phone and hand over the data, i.e., they provided a lawful service under legal obligation. That was prior to iOS 8 and iOS 9. The security features implemented in these more recent OSes prevent Apple from simply unlocking the handset, which is why the FBI wants Apple to create a tool to pry apart the security implementation.

    So, are you comfortable with the DOJ dictating how you implement your security?

    Sancho_P February 20, 2016 6:38 PM

    @Buck, Clive Robinson, re CALEA

    It seems CALEA
    – is meant for interception of communications, not data at rest,
    – is directed at telecommunications carriers,
    but
    – is “badly worded” (?) so as to include “manufacturer of telecommunications equipment”,

    so Mr. Gidari wrote:

    ”CALEA is not limited in its applicability to telecommunications carriers at all as the government has represented to the court.  It applies to manufacturers and providers of telecommunications support services as well.  Apple is a manufacturer of telecommunications equipment, namely the S5 phone in the government’s possession.  Apple is entitled to the protections and limitations of CALEA just as it must comply with manufacturer requirements in the statute.”


    @Skeptical

    Yes, probably a huge door, let’s hope they will close it,
    also for our beloved OSs !


    @Moderator: unused

    Buck February 20, 2016 7:01 PM

    @Mark Mayer

    Originate = make a call

    Terminate = receive a call

    Well, obviously! But, who makes the call? Who receives those calls?? Is it the phone-dialer, the telco who sends the signal to initiate phone-calls, or maybe even the only company that’s supposed to have the one key that can authorize any such transaction??? I dunno… :-\

    AvidReaderAppleVsFBI February 20, 2016 9:44 PM

    I’ve enjoyed reading many of the posts here and on Friday’s squid:

    https://www.schneier.com/blog/archives/2016/02/friday_squid_bl_514.html

    regarding Apple and the FBI. In addition, many of the links have been informative, too.

    It was interesting to see that there appear to be some events scheduled for next Tuesday (second to last paragraph in the following link)

    https://www.eff.org/deeplinks/2016/02/apple-americans-and-security-vs-fbi
    Regardless, some of us don’t march very well, and many people are influenced more on a one-on-one basis.

    Third, assuming relatively secure devices (like Apple devices) are a net positive for the military and/or intelligence communities, it would be wonderful if greater pushback against law enforcement’s needs and/or wants were forthcoming from those communities.

    Fourth, coincidentally and somewhat off topic, the FBI figured prominently in a documentary film about the Black Panthers last Tuesday, the same day Tim Cook’s letter appeared, I think.

    http://theblackpanthers.com/home/

    more info about the film and background information:

    http://www.pbs.org/newshour/rundown/the-5-best-takeaways-from-the-black-panthers-documentary/

    https://en.wikipedia.org/wiki/COINTELPRO

    Hal O'Brien February 21, 2016 3:57 AM

    Two things I’ll mention:

    First, the suspects are dead. That means whatever data the FBI are looking for cannot possibly be to build a case against them. Since that’s true, the only reason to access the data is to prevent future crimes. Once you set that precedent, you might as well try to breed the pre-cogs of Minority Report. Since every single piece of data owned by anyone could contribute to a future crime, it would all be accessible via court order.

    Secondly, the FBI appear not to have thought through what putting a back door on iPhones would mean for Little Brother. Wikileaks will be very happy when someone starts breaking into Federal iPhones bought off the GSA Schedule.

    Roger Wolff February 21, 2016 5:17 AM

    One of the arguments of the EFF is that “security software is hard” and that “you can’t skip QA before signing”.

    Those, IMHO, are weak arguments. Security software is hard because it is immensely hard to prevent leaking of information that should remain hidden. Not an issue in this case. An “ifdef” can easily disable the “wait after ten tries” code.
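
    To illustrate what I mean (purely hypothetically – nobody outside Apple has seen the real code, and the real check compares derived keys rather than a stored string):

        /* If the escalating retry delay is an ordinary software check,
         * a single build flag compiles it out: cc -DFBI_BUILD ...
         * Hypothetical sketch; Apple's actual code is unknown. */
        #include <stdio.h>
        #include <string.h>
        #include <stdbool.h>
        #include <unistd.h>

        static const char SECRET[] = "7341"; /* stand-in for the passcode check */

        static bool attempt_unlock(const char *code, unsigned failed_so_far) {
        #ifndef FBI_BUILD
            if (failed_so_far >= 5)
                sleep(60u << (failed_so_far - 5)); /* escalating delay */
        #endif
            return strcmp(code, SECRET) == 0;
        }

        int main(void) {
            printf("%s\n", attempt_unlock("0000", 0) ? "open" : "locked");
            return 0;
        }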

    And signing, as argued by the EFF, must not be taken lightly. However, for a “limited distribution” case like this, the “target platform” is exactly known (i.e. QA needs to focus on this one phone, not “all phones out there”) and it needs to work just once. You’re not inconveniencing millions of people with a botched upgrade. (There is just ONE thing you do not want, and that’s bricking the target phone.)

    Things for Apple to learn from this:
    * Do NOT allow upgrades to the security firmware. It needs to work now and in the future.
    * Firmware upgrades should only be allowed on unlocked phones.

    Clive Robinson February 21, 2016 6:50 AM

    @ Roger Wolff,

    there is just ONE thing you do not want and that’s bricking the target phone

    Indeed, it’s the only requirement that must not in any way be broken, which is why the EFF amongst others claim, in your words,

      One of the arguments of the EFF is that “security software is hard” and that “you can’t skip QA before signing”.

    There are only two things we know with a reasonable degree of certainty: the FBI want Apple to put code on that phone, and the phone is in an unknown state. This is because the FBI have hidden the fact that they have already been messing around with it – according to the phone’s owner, who made changes under the FBI’s direct instruction.

    Thus it’s hardly surprising Apple don’t want to touch the phone, because, as others have pointed out, it looks like the FBI are setting Apple up for political reasons.

    But you are making assumptions in your statements about the way Apple have written the “key protection mechanism”. That is, you are assuming it is, in effect, simple structured sequential code amenable to simple changes.

    The reality is that, as Apple have actually taken steps to protect the key with the mechanism, there is a reasonable probability they have also taken steps to protect the mechanism itself in various ways. If they have done this, then the code is very unlikely to be amenable to a few quick “ifdef” changes.

    As I’ve pointed out – on the squid thread – there is over a quarter of a century of “copy protection” and other anti-debug/trace techniques that I would expect a security expert to be aware of. Such techniques are not constrained to just “malware”; commercial code uses them as well to protect “trade secrets”. I’ve used them several times in my career with the likes of electronic locks, radio devices of various types and other devices with very high IP investment. I’ve also taught some of the techniques to others so they can protect the valuable IP of “trade secrets” in their products.

    As for your two “things to learn” points, we don’t actually know that Apple hasn’t already done these.

    Apple’s phones have to pass all sorts of quite stringent requirements such as “R&TTE”, part of which means that you don’t allow modification to the “air interface” code etc. This is because it would break certification, which would be an eye-wateringly large financial burden on Apple, requiring a full product recall to fix. Which is why some mobile smart device designers opt to use “immutable binary blob” code on segregated CPUs etc. to prevent this eventuality. So it’s a technique I would expect Apple to be well on top of in one way or another, and to use. If, as I suspect, they do have parts of the code base that can only be changed by connecting the phone up to the production-line programmer, they may have included parts of, or all of, the “key protection mechanism” in them; it would after all be in line with their general policy of protecting user data.

    The problem is we don’t know what protection mechanisms Apple have put in place, and that has created “a vacuum of information”; thus way too many people are falling into the potential trap of believing that what the FBI is requesting the court to get Apple to do is possible. In reality, either the FBI does not know what it is doing, as it ignored Apple’s previous advice, or it is being deliberately malicious for political reasons; to date the latter appears the most probable from the behaviour that has so far come to light.

    CallMeLateForSupper February 21, 2016 8:58 AM

    Our Bruce, in Washington Post:
    “Why you should side with Apple, not the FBI, in the San Bernardino iPhone case – Either everyone gets security, or no one does.”
    https://www.washingtonpost.com/posteverything/wp/2016/02/18/why-you-should-side-with-apple-not-the-fbi-in-the-san-bernardino-iphone-case/

    Conor Friedersdorf in THe Atlantic:
    “The Conscription of Apple’s Software Engineers – The FBI wants to force tech workers to write code that they believe to be unethical, dangerous, and harmful to their country.”
    http://www.theatlantic.com/politics/archive/2016/02/the-conscription-of-apples-software-engineers/463338/

    ianf February 21, 2016 10:54 AM

    This Medium-short essay, “Why Tim Cook is so furious”, written by iOS/Mac developer Gernot Poetsch, contains a straightforward elucidation of how iOS’s onboard multilayer encryption works, and why Apple simply cannot afford to comply with the court’s order—whether compelled to do so, or even asked nicely… because doing it in whatever manner would effectively destroy their core business: Apple’s customers are its end-users, not the USGov! So here the FBI may have shot itself in the foot (unless that was their intention all along… stranger things have happened.)

    Gernot also adds an important addendum[sic!] to @AES’ and others’ earlier iPhone field hardening advice:

      […] Touch ID makes a long, secure passcode convenient. It saves and enters the passcode for you, but if you’re in a place where you worry about [“security services”/ border interference, etc.], press it with any wrong finger 5 times, and Touch ID will be completely deactivated until you enter the passcode manually anew. [Then power the device down].

    Mo’ @ medium.com/p/be24163bdfa.

    ianf February 21, 2016 11:08 AM

    Speaking of Apple’s cleverly designed Touch ID, I’d have welcomed one more layer of utility:

      a “guest” (OR an “under duress”) second profile invoked with a specific finger—which would unlock the device with just some preselected subset of apps and non-critical data [mp3, epub, etc], while wholly preventing access to the rest of it. This alternative unlock needn’t be subjected to the same “reset after 5 incorrect finger presses; after 48 hours of device rest; and after reboots” rule as the ordinary Touch ID.

    (Just as at the dawn of the business computer age, when people used work desktops for various NSFW things, many such programs had a static spreadsheet image/ equiv. that could quickly be invoked to cover whatever was underneath on the screen, and provide a degree of ocular “alibi”.)

    Because the way things are going, soon we may risk impounding of any device that Very Knowledgeable At Combatting Terrorism Border Guards somewhere can not compel us to light up so that they may impromptu-inspect it for VERBOTEN contraband.

    Daniel February 21, 2016 11:59 AM

    @AvidReader

    I’m less impressed by Sotomayor’s observations than you are. Yes, it’s true that limited resources in theory constrain any actor, even a behemoth like the federal government. But this theory ignores two realities. The first reality is the sheer enormity of the resources the government can bring to bear. So what if the federal government can’t shoot all the fish in the barrel – it doesn’t need to – if it shoots 50%, that’s enough to leave the other 50% terrified.

    The second problem with this theory is what I sometimes call the “individual vs. the NSA problem”. The government doesn’t simply have huge resources generally; it has God-like resources versus any single actor in the system. So if a person has a target on their back, it is nigh impossible to avoid taking the hit.

    So I am dubious that the theoretical limit on the resources that the NSA possesses bedevils its practical efforts in any meaningful way.

    AvidReaderAppleVsFBI February 21, 2016 3:15 PM

    @Daniel

    After reading your post several times my overall response is “that’s true.”

    Pushback against big government (the FBI or DOJ), regarding “Judge Demands that Apple Backdoor an iPhone”, may require pushback from big business (Apple and others), big government (the military, the intelligence communities, state and local law enforcement, and others) and/or a host of other individuals and organizations. Resistance to forced backdooring of hardware could lead to strange bedfellows.

    Regardless of one’s employment status or who one works for (if at all), such a backdoor could facilitate wholesale (not targeted) surveillance at a reasonable cost within the U.S. borders. (I’m trying to stay away from tangents like what already exists, classified vs. unclassified statutes and their interpretations, budget impacts, power impacts, the Patriot Act, 702, 12333, parallel construction, forced updates, and the like.)

    Regarding the lower-left quadrant, there is a zone in Justice Sotomayor’s “Surveillance Scale” where “resistance will become light (or nonexistent), due to a program’s low public visibility, its low cost, or both. This is the sweet spot that advances in surveillance technologies give the government the opportunity to shoot for.” From below Figure 4 in:

    https://cyberlaw.stanford.edu/blog/2015/08/government-cheating-sotomayor-surveillance-scale

    Perhaps that potential sweet spot shouldn’t be given to the DOJ and FBI through forced backdooring of hardware at this time.

    Two more things:

    1) Regarding possible unintended consequences:

    “… the FBI appear not to have thought through what putting a back door on iPhones would mean for Little Brother. Wikileaks will be very happy when someone starts breaking into Federal iPhones bought off the GSA Schedule.” (@Hal O’Brien)

    2) Regarding possible thoughts to live by:

    something like “If you are not in a straight jacket then you are not paranoid enough.” @Wael (from some other thread)

    Wael February 21, 2016 4:55 PM

    @AvidReaderAppleVsFBI,

    something like “If you are not in a straight jacket…

    That’s right! Make no mistake! Paranoia is exactly what makes @Nick P such a great “Security guy”.

    But there is a subtle difference between people who’re in a straight jacket because they “think proactively” about “unthinkable” possibilities, and those that really need to be checked into an asylum with a heavy duty Kevlar straight jacket 😉

    Mark Mayer February 21, 2016 5:17 PM

    @Clive

    When in reality the FBI obviously does not know what it is doing, as it ignored Apple’s previous advice, or the FBI is being deliberately malicious for political reasons; and to date the latter appears the most probable from their behaviour that has so far come to light.

    Why can’t the FBI be both incompetent and malicious? 😉

    [To be fair, I’d say that at least in this case the DOJ (the FBI’s parent organization) is the one acting with malice in a well-planned political/PR campaign, but maybe the distinction isn’t important. Absent this case against Apple, the FBI screw-up wasn’t consequential. There is a slim chance that there is useful new data on that phone, and an even slimmer one that it is of major import.]

    ianf February 21, 2016 5:43 PM

    @ Wael, straitjacket, alt. straightjacket, but always ONE WORD, else you’re talking about the fall season’s fashion in straight jackets.

    Wael February 21, 2016 6:18 PM

    @ianf,

    straitjacket…

    Thank you, sir! Vocabulary book updated. To show my gratitude I would have ordered a straitjacket for you, but my hands are tied.

    Dirk Praet February 21, 2016 6:50 PM

    @ NotYouAgain, @ All

    Transferred from Friday Squid Blogging: Up Close and Personal with a Giant Squid:

    Gotta love the grugq for this one:
    https://medium.com/@thegrugq/feeble-noise-pollution-627acb5931a2#.fo3uojiqq

    Probably the best analysis I’ve read so far. A must-read for anyone even slightly interested or involved in the matter. In short: this is nothing but a purely political move on the part of the FBI.

    @ All

    Why the All Writs Act cannot apply.

    Excellent additional legal argument. Thanks for the pointer.

    @ Skeptical

    What the court has ordered is not a departure from existing law

    As I said in an earlier post: the entire case hangs on whether or not Apple is technically capable of doing so for just this one device. If not, then it IS a departure from existing law all right, and as per @thegrugq’s spot-on analysis, for purposes that have nothing to do with further investigation of the company iPhone of one of the SB shooters.

    @ Wael

    But there is a subtle difference between people who’re in a straight jacket because they “think proactively” about “unthinkable” possibilities, and those that really need to be checked into an asylum with a heavy duty Kevlar straight jacket…

    Comparing @Nick P. to Donald Trump indeed is a bit of a stretch 😎

    Atlas February 21, 2016 6:57 PM

    If I were Apple, I’d let them shut me down to teach the government an “Atlas Shrugged” moment. The lesson would be worth it.

    Principles matter or they don’t.

    The law in conflict with itself is hardly a settled matter; giving more power to those utterly drunk on it won’t make them sober, it will just justify future acts by the bully that are meaner and nastier.

    A.

    Wael February 21, 2016 7:22 PM

    @Dirk Praet,

    Comparing @Nick P. to Donald Trump indeed is a bit of a stretch 😎

    Correct! Hence the disclaimer “But there is a subtle difference…”. @Nick P is good 😉

    ianf February 21, 2016 7:35 PM

    Were @Atlas Apple, he’d let the USG shut him down to teach it an “Atlas Shrugged” moment. The lesson would be worth it.

    Who/what ARE you—an adolescent enthralled by that Ayn Rand, intellectual snake-oil saleswoman from beyond the grave? Time to grow up; start by popping some pimples (squeezing out zits) or something.

    Mark Mayer February 21, 2016 9:23 PM

    @Atlas

    The scenario you describe is virtually impossible because, as CEO, Cook doesn’t have that authority. The chairman of Apple doesn’t have that authority, nor does the full board. It would take a majority vote of all holders of voting-class shares. Apple is not going to protest by shutting down and moving to Galtlandia.

    Here’s the most Tim Cook can do if Apple loses all appeals: he can resign. Perhaps, if he is threatened with “comply or be arrested for contempt of court”, he can choose jail. But that’s basically resigning + jail.

    Or he can make the best of a bad situation and perhaps funnel lots of money into lobbying to have the law changed.

    Nick P February 21, 2016 11:05 PM

    @ Wael, Dirk

    Oh hell naw! Im gone for 1 long shift 2 come back 2 an attack delivered with respect n shit everyone forgive Wael for being three verses and a solid refrain short of real nerdcore rap even tho’ he been working hard at it… still dropping top rhymes as pro as my gramma’ spellin’ and punctuation 🙂

    Note: And what the hell does that hilarious cow have to do with anything!? Better not be about my essays.

    “Comparing @Nick P. to Donald Trump indeed is a bit of a stretch 8-)”

    Appreciate you noticing that only one of us looks and speaks like a horse. Wait, a mule, due to one key trait they always have. (evil grin)

    “Correct! Hence the disclaimer “But there is a subtle difference…”. @Nick P is good ;)”

    Alright, that’s what I’m talking about. You get semi-proper grammar now to lower your blood pressure from the OCD. 🙂 Yes, this difference also exists between high-assurance and mainstream security types. A recent example is all the focus on hardware. A hack or two led most people to think: firmware in this device, so firmware in those devices; what about chips; IOMMU everything; TrustZone. I ask, “Wait, the problem was the logical or electrical implementation of the hardware. It caused problems. So, what are all the methods for specifying, implementing, verifying, and testing those? And what esoteric issues do we run into?” That led to my foray into all things ASIC, with 100+ papers on how to do them right. Every now and then, one can see the difference.

    Here’s an excerpt from an online conversation where an assembly elitist tried to call out safe-language users as not working with an ideal model of computation. My limited HW skills paid off:

    Him: “The good reverend Laphroaig preaches:

    If the 0day in your familiar pastures dwindles, despair not! Rather, bestir yourself to where programmers are led astray from the sacred Assembly, neither understanding what their programming languages compile to, nor asking to see how their data is stored or transmitted in the true bits of the wire. For those who follow their computation through the layers shall gain 0day and pwn, and those who say “we trust in our APIs, in our proofs, and in our memory models and need not burden ourselves with confusing engineering detail that has no scientific value anyhow” shall surely provide an abundance of 0day and pwnage sufficient for all of us.”

    Me: “An assembler elitist with a semi-fallacious argument. Let’s rewrite that in the view of a lower-level elitist to show it still looks true, shows love for assembler as foolish pride, and still fails to matter in the face of good, high-level tools.

    If the 0day in your familiar pastures dwindles, despair not! Rather, bestir yourself to where programmers are led astray from the sacred RTL/Transistor language, neither understanding what their assembly languages and microprograms compile to, nor asking to see how their data is stored or transmitted in the true bits of the CPU’s network-on-a-chip and memory plus analog values and circuitry many run through at interfaces. For those who follow their computation through the layers shall gain 0day and pwn, and those who say “we trust in our assemblers, our C compilers, our APIs, in our proofs, and in our memory models and ISA models and need not burden ourselves with confusing engineering detail that has no scientific value anyhow” shall surely provide an abundance of 0day and pwnage sufficient for all of us.

    Source: LISP, Forth, and Oberon communities who did hardware to microcode to language & OS all integrated & consistent. :P”

    Him: “I surmise the good reverend elevates Assembly not because it is fundamental, but because it is a level deeper than the domain of coders who yield unto us exploitable codes. Verily, I demand of ye, produceth thou the exploit of an Assembly 0day that was wrought from Transistor language!”

    Me: “Rowhammer. :P”

    BOOM! Eat that! The Apple case involves the same mistake: people are focusing on the software too much. No, the thing runs on something called transistors. Even IBM’s legendary cryptoprocessor had methods of defeating it. You can bet whatever crap is thrown together for cheap ARM chips can be bypassed. The obvious method is compelling disclosure of the info on the HW, software, and any tamper-resistance. Or reverse-engineering that using other phones. One derives the attack that way, tests it with some phones, and then uses it on the target. It costs money, but the attack is reusable: the rest is marginal.

    Of course, this case is clearly not about terrorism at all so much as legal expansion. If it were about terrorism, the FBI could do the job themselves, because that security functionality is nowhere near as safe as it appears. Yet most of those charged with making “secure” phones or whatever are assuming an impenetrable black box in these discussions, with a focus on software attacks. They’re still not getting it. By “it”, I mean real security: a holistic concept that works from the ground up, applying everything we’ve learned. They learn the tactics of attack but not the science of effective defense. True almost every time. 😉

    Nick P February 21, 2016 11:14 PM

    re secure phones

    Anyone interested in what would go into a secure phone can look at my most recent write-up on the topic. It covers two prior designs plus Wael’s breakdown that it built on. It uses, but doesn’t cite, a number of advances in hardware-based security. My next one will have more focus on dealing with issues at logic gates, the analog interface, and so on. I still assume the adversary being in physical possession or knowing emanation attacks means total compromise.

    ShootMeNow February 21, 2016 11:51 PM

    @ianf Stunned at the multiple insults and cyber-bullying some are provoking from you. On more than one post, at two different posters, on the two Apple threads. And while none of the posters referenced you, it was your task to take them on personally.

    Usually trolling is a tag-team effort to keep the riff-raff in line, their self-righteous asses posting stupid complete thoughts, but you’ve got this, man.

    No matter how many people are turned off and reject Schneier due to your patrolling, we know it’s the Internet, and you’re doing a great job of being our thought police and reminding every visitor of it.

    Clive Robinson February 21, 2016 11:53 PM

    @ Mark Mayer,

    Here’s the most Tim Cook can do if Apple loses all appeals: he can resign. Perhaps, if he is threatened with “comply or be arrested for contempt of court”, he can choose jail. But that’s basically resigning + jail.

    He currently has a little more “wriggle room” than that, and it’s something I suspect multiple high-tech company executives are thinking or talking about; no doubt some have already started the process for other reasons –such as customer trust– to ensure their shareholders get the returns.

    Basically you restructure the company across jurisdictions in a way that makes the FBI / DOJ / Executive bullying ineffective and gives customers reasons to trust the company more than others.

    As a rough rule of thumb –though American exceptionalists disagree– even executive and legislative power has limits, one of which is “to be constrained within its own jurisdiction” (which is one reason why certain parts of the CIA etc. exist).

    With the right organisational structure it would not just be company profits held outside the US jurisdiction, but the actual conversion of the IP into goods as well.

    It really is something the FBI / DOJ / Executive need to think about, because Corps have already “off-shored” / outsourced low-end jobs abroad, which can be shown to have hit both the US home economy and the tax take.

    So if the Corps decide to off-shore steadily higher-level jobs, then the US tech sector will diminish to the point that the only silicon being moved in Silicon Valley will be the desert sand as it gets blown through yet another ghost town.

    The thing is, as many European countries can tell you, once a tech sector leaves your shores it usually does not come back, because the human expertise has gone with it; thus any new IP benefits the new host nation and tends to lock the Corp in there.

    China knows this, which is one of the reasons it played games with scarce raw commodities it has a near monopoly on, such as rare-earth metals. The result was that tech moved onto Chinese shores and the knowledge needed for new IP moved with it.

    As far as government bullying goes, Atlas does not shrug; he puts on his hat and coat and walks out the door to where the climate is more conducive, not even leaving an empty overcoat behind. Meanwhile the likes of Comey are so busy fighting turf wars in their own back yard that they don’t realise what an empty dust bowl they have made it. Unlike others, Comey and his ilk don’t actually look over the fence to see that grass is definitely greener than dirt. So they don’t realise why those who do look, such as Corps, simply walk out the gate and leave the urchins to faux-fight in the dirt they have created. But for Comey and his ilk, that’s OK, because they are Kings of their own dirt piles, and as long as their dirt pile is the biggest in the yard they feel they have achieved something. But as has been pointed out before, being dirt poor means you find out the hard way that you cannot eat dirt no matter how much of it you have, so that dirt pile, no matter how big, is a Pyrrhic victory at best.

    The funny thing is that the US Gov is actually making the ability of Corps to off-shore much easier with the TPP and related trade treaties, which also limit Executive power within US jurisdiction. Whilst the TPP is a bad thing in general, as a weapon it is double-edged and can be as easily wielded against its creators as against any other government. This is because the judicial process involved is independent of the jurisdiction, and those involved in it know, as do IP lawyers, which side of the bread has not just butter but jam as well, and it’s the large Corps that pay top dollar.

    Interestingly, Donald Trump is against such treaties; he may just be stabbing in the dark –his favourite occupation– or there might actually be an inkling going on under that “pony mane”.

    Thus the next US Executive –whoever it might be– may find that the ancient Chinese curse of “May you live in interesting times” has settled in on the roof of 1600 Pennsylvania Ave, and is going to take more than the usual discouragement to get it to shift.

    Bill February 22, 2016 1:25 AM

    Luckily, data on iPhones is safe from brute-force attacks:

    “The birthday paradox is one reason why larger key sizes are necessary
    for security. If we move our attention from DES to an AES 128-bit
    key, there are approximately 3.402 * 10^38 possible keys. Applying
    the birthday paradox gives us 1.774 * sqrt(3.402 * 10^38), or
    32,724,523,986,760,744,567 keys that need to be attempted to have a 50
    percent chance of finding a match. This number is large enough that
    it is computationally infeasible to break it through brute-force
    methods, even with the birthday paradox.” (Modern Cryptography, p. 324)
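
    The quoted arithmetic checks out; below is a quick sanity check (mine, not from the book). Note that the commonly cited birthday-bound constant is sqrt(2 ln 2) ≈ 1.1774 rather than the 1.774 printed in the quote, which changes the count only by a small factor:

      import math

      # Sanity-check the birthday-bound arithmetic quoted above.
      keys = 2 ** 128                    # AES-128 keyspace
      print(f"{keys:.3e}")               # ~3.403e+38 possible keys

      # Trials for ~50% odds of a match, using the constant as printed (1.774):
      trials = 1.774 * math.sqrt(keys)   # sqrt(2**128) is exactly 2**64
      print(f"{trials:.4e}")             # ~3.2725e+19, the figure quoted

      # With the textbook constant sqrt(2 * ln 2) ~= 1.1774 the count is ~2.17e19;
      # either way it is on the order of 2**64 -- computationally infeasible.
      print(f"{math.sqrt(2 * math.log(2)) * math.sqrt(keys):.3e}")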

    Wael February 22, 2016 1:48 AM

    @Nick P,

    can look at my most recent write-up on the topic.

    A little shy of two years ago! Amazing how fast time goes by. The older you get, the faster it goes… It’s accelerating!

    My next one will have more focus on …

    Looking forward to it!

    I still assume the adversary being in physical possession or knowing…

    In physical possession of “what” or “whom”? You’ll need to detail the object of “possession”: device, user, law, resources, organizations, and a rubber hose to go with them.

    Wael February 22, 2016 2:07 AM

    @Nick P, @Dirk Praet,

    Oh hell naw! Im gone for 1 long shift 2 come back 2 an attack

    What can I say! You snooze, you lose!

    And what the hell does that hilarious cow have to do with anything!? Better not be about my essays.

    Oh, no. It has nothing to do with you. I thought it was funny. I challenged a few people: if they could watch this video and not smile, I’d pay them $10. No one passed the challenge.

    Yet, most of those charged with making “secure” phones or whatever are assuming an impenetrable black box in these discussions with a focus on software attacks.

    My characterization is a bit different. It’s more that most of those charged with making “secure” phones [..] focus only on their area of experience (and I don’t say expertise). A cryptographer will focus on the complexity and hardness of algorithms and protocols; a software person will look at “secure” code-cutting, toolchains, and which language is more secure; a hardware person will look at clocks, crosstalk, race conditions, and all the other HW-related issues, …

    What’s sometimes missing is an understanding of the larger picture, fundamental principles, what can and cannot be expected to be achieved in a theoretical setting or a real life environment with all its interdependent parameters, technical and non-technical.

    Curious February 22, 2016 8:06 AM

    Heh, I edited the sentence, but now “the Apple” doesn’t make good sense. I had probably written “the iPhone” before ultimately changing the sentence. This kind of editing flaw happens often with me; I’m too quick about posting my comments.

    Sancho_P February 22, 2016 5:59 PM

    @Curious

    Yes, the worldwide kill switch called “update” isn’t often mentioned in the media.
    Yet we have the same in all our systems, from server to desktop to RasPi to IoT.
    Clearly, in this story the “without the owner’s consent” part is the misery, but if we take it one step further it is not.
    I left Windows when SP3 suddenly became “mandatory” with XP; you couldn’t run any new SW on SP2 (or without .NET xyz, and …).
    This is a kind of artificial obsolescence of intangible products that may render the whole HW useless.

    OK, only if the owner (¿really the owner, or the licensee?) agrees to update the device can it be of further use.
    Does any “owner” know what the download will do to the device or data?
    How often do I install SW against the system’s warning on my Mac?
    Yes, we have to and we will consent, on a daily basis, and they are not accountable.

    Regarding artificial obsolescence, let’s take it to the next level:
    How long will it take until your device stops working “for safety reasons” because you do not allow it to call home to mummy?

    So your car will simply refuse to start if service is overdue.
    Your phone / PC won’t work any more if you don’t update.

    Goodbye to air-/energy-gapped systems.
    Good night to NK or China if they don’t comply; the timer is already installed (Win 10?).

    Oh, and good night Washington, if a bright kid in the UK finds out how to change the timer.

    Matt R. February 23, 2016 9:59 PM

    In this day and age of bot armies and VMs, I wonder why the FBI doesn’t just create 1000 clones of the phone, then have the 1000 clones each try 10 codes before they lock up. Surely the FBI has the technical skills to harvest the raw encrypted data from the phone. Virtual machines have been around since the old days of the IBM 360; surely someone can produce a VM that completely emulates an iPhone 5c, or any iOS device for that matter. No doubt the NSA has supercomputers that could easily host 1000 VMs. I’d be surprised if they haven’t already, for this very purpose.
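
    The arithmetic behind the clone idea is simple; the hard engineering question the comment glosses over is whether the encrypted flash and the device state tied to it can actually be cloned and replayed. A back-of-the-envelope sketch:

      import math

      # Back-of-the-envelope: clones needed to cover the whole passcode space
      # when each clone tolerates 10 wrong guesses before locking or wiping.
      GUESSES_PER_CLONE = 10

      def clones_needed(passcode_digits: int) -> int:
          keyspace = 10 ** passcode_digits   # all-numeric passcodes assumed
          return math.ceil(keyspace / GUESSES_PER_CLONE)

      print(clones_needed(4))   # 1000 -- the figure in the comment (4-digit PIN)
      print(clones_needed(6))   # 100000 for a 6-digit PIN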

    Niko February 23, 2016 11:08 PM

    @Clive

    Mark’s right. Tim Cook, and for that matter the board of directors, lacks the legal authority to reincorporate Apple; that would require a shareholder vote. Since re-incorporation would force all shareholders to realize their taxable gains, you need an incredibly compelling business case to get the shareholders to accept that tax pain, and for various reasons Apple the corporation may not be able to realize any tax savings from an inversion. It’s pure fantasy to think that Apple is going to reincorporate overseas.

    65535 February 24, 2016 7:08 AM

    Responding to the minuscule possibility that Farook used the Telegram messaging app on his work iPhone 5c, an Ars Technica poster notes:

    “Speculating on installed software is silly; Apple already knows what software was installed. (Excluding developer-installed; but they’d know about that usage too.)” – Mark086, Ars Technica

    See:
    http://arstechnica.com/tech-policy/2016/02/if-fbi-busts-into-seized-iphone-it-could-get-non-icloud-data-like-telegram-chats/?comments=1&post=30690531#comment-30690531

    I would further speculate that although the Telegram app uses encryption, Call Detail Records [CDRs] and metadata would have left a trail letting the FBI know whether this Telegram app was used – or not.

    I doubt there is significant data on Farook’s cell phone, which is owned by San Bernardino County. This is contrary to the FBI’s explanation that “a whole lot more data” could be accessed with Apple’s hacked software, which is supposed to run in RAM. The FBI appears to be on a fishing expedition.

    Ars Technica poster Rosyna:

    “It should be noted that Apple gave the list of App purchases to the FBI.

    “Therefore, the FBI can do a union of the apps with backup data on the iCloud backup and the apps purchased/downloaded from the registered AppleID on the phone to get a list of possible services the terrorist used. (Because not all apps are capable of contacting others)… With that list, the FBI could then subpoena the services involved to get records. Most services would be able to comply with existing tools.” – Rosyna, Ars Technica

    See:
    http://arstechnica.com/tech-policy/2016/02/if-fbi-busts-into-seized-iphone-it-could-get-non-icloud-data-like-telegram-chats/?comments=1&post=30690571#comment-30690571

    Also see:

    http://arstechnica.com/tech-policy/2016/02/if-fbi-busts-into-seized-iphone-it-could-get-non-icloud-data-like-telegram-chats/?comments=1

    And

    http://arstechnica.com/tech-policy/2016/02/if-fbi-busts-into-seized-iphone-it-could-get-non-icloud-data-like-telegram-chats/

    Wikipedia description on Telegram app:

    “Similar to services like WhatsApp, Telegram accounts are tied to telephone numbers and verified by SMS or phone call. Users can add multiple devices to their account and receive messages on each one. Connected devices can be removed individually or all at once. The associated number can be changed at any time and when doing so, the user’s contacts will receive the new number automatically. In addition, a user can set up an alias that allows them to send and receive messages without exposing their phone number. These aliases or usernames are also linked with an official https://telegram.me/username URL (where “username” is the chosen username). Usernames can be changed at any time. Accounts can be deleted at any time and they are deleted automatically after six months of inactivity by default, which can optionally be changed to 1 to 12 months. Users can replace exact “last seen” timestamps with fudged messages such as “last seen within a week”.” – Wikipedia

    https://en.wikipedia.org/wiki/Telegram_%28application%29

    [and]

    https://en.wikipedia.org/w/index.php?title=Telegram_%28software%29&redirect=no

    If there are any experts on the Telegram app please give us more information.

    Clive Robinson February 24, 2016 7:57 AM

    @ Niko,

    That would require a shareholder vote. Since re-incorporation would force all shareholders

    Who said anything about “re-incorporation”?

    It’s far from necessary; if that were the case, 99% of the current tax-dodging arrangements by Corps through the likes of Eire and Luxembourg would not be happening.

    Without going into many tedious explanations: they “spin off” the likes of R&D for the security aspects into another (semi-)independent business unit in another jurisdiction.

    This sort of IP dodge is currently used to shift profit out of high tax jurisdictions into low tax jurisdictions via “Royalty Payments”.

    For the dodge to work tax-wise, the (semi-)independent entity has to have full control of the IP, so that the Corp business units have to go to it and “ask nicely”, not say “you will”.

    Remember that the AWA is limited to US jurisdiction, and thus has reach limitations over and above the “undue burden”.

    That way, all the court can do is compel the business unit in its jurisdiction to “ask nicely”, to which the royalty entity in a different jurisdiction can simply “just say NO”, or ask absolutely outrageous fees and royalties plus binding contracts with major penalty clauses from the FBI / DOJ which, if they breach…

    But more interestingly, they can force the phones concerned out of US jurisdiction by saying “only at our research lab in Elbonia” or some such.

    Yes, there could be downsides, like losing US Gov contracts, but it would require the democratically elected US politicos to pass a change in law to force ALL phone suppliers to put in the required access. Whilst the USG got away with making it “worldwide” once, over GPS chips that subsequently enabled “tracking”, on the old “Health and Safety” and “think of the children” excuses, it is by no means certain that would happen again. Thus there would just as likely be “US phones” with obvious back doors and “other nation phones” without backdoors that would enter the US by various ways as the security-conscious worked out how to do it whilst on holiday etc.

    The thing is, the US is not the major market it once was; there are only 300 million or so US citizens. Europe is around twice that, which means the West is only around 1/7th of the world’s population. If you then consider the BRICs, although on an economic downturn, they do have around half the world’s population under their influence market-wise…

    But also consider it was the US that started the TPP treaty nonsense that allows corps to get judgments against states; I would be fairly sure there is a way there by which a clever set of corporate legal eagles can put the screws on any government, including the USG.

    If Corporates wanted to play rough when the FBI and DOJ try to get cute, then at the very least they could tie them up for years to come in various ways.

    The downside of this, of course, will be the drop in the US tax take. If Corporates reorganise to take their IP off shore, then US high-tech jobs on six-figure-and-above salaries would start becoming a thing of the past. The smart ones would move abroad, as is now happening a lot in science and research, and whilst “out” find reasons to stay out. Some, as a couple of my friends have done, will dump their US nationality for tax and/or ethical reasons, oh, and better working conditions and even love, marriage and family (as one friend in Europe put it, “After being here for a year I woke up one morning next to an angel, and realised I was willingly kidnapped and handcuffed by a small band of gold, so decided to make it all legal and become a useful citizen so I could stay”)…

    As somebody else has mentioned, the FBI’s Comey thinks he’s the body around which the rest of the world turns… he’s very likely to be disabused of that notion if he pushes too hard and becomes an embarrassment to those that currently let him have his little daydreams.

    Nick P February 24, 2016 11:39 AM

    @ Niko

    What Clive is describing is what I designed and once built for hosting centralized services at risk of local governments’ snooping. Designing for nation-state resistance is tricky in general; that just makes it harder. Essay is here.

    Darren Chaker February 25, 2016 12:39 PM

    This effort to force Apple to create code that doesn’t exist, in order to thwart its own security measures, will fail and create precedent that protects us all. Support the ACLU, EFF and similar organizations, who will be on the front lines of this fight, as it will secure our Fourth Amendment rights.

    Don’t believe the hype – this case is about using a tragedy to support discarding privacy for all of us. Once it’s established that Apple can discard the encryption, then Google is next on the decryption hit list, then desktop encryption, whole-disk encryption, and anything else government cannot get into – hence it will force people to use foreign-based products and hurt American jobs, competitiveness, etc. Demonizing Apple makes it appear there are lurking terrorists. If we were so concerned about terrorism, there’d be far greater border security, and filtering of people like the San Bernardino attackers, who were approved to come to the USA when they should not have been.

    Best to all, Darren Chaker

    ianf February 25, 2016 1:54 PM

    @ Darren Chaker: HEAR, HEAR!

    [Repeating myself from 5 days ago] “… one has to wonder: does the FBI r.e.a.l.l.y. imagine that it could force perhaps the most amiable high-profile US tech company into handing them their crown jewels?”

    ronald March 1, 2016 5:36 PM

    I’m late to this comment, but here goes anyway:
    The gov’t wants what it wants, and they want it for FREE.
    The FBI wants Apple to write code: encryption code, the hardest code to write. This takes labor and effort and time. To be fair and polite, did the FBI offer to pay Apple for their efforts?
    NO! Not a chance.
    Like any situation: say please and show up with a big check, and you might get what you want.
    Also:
    Mobile phones are an international phenomenon. There are countries where tyrants will execute others if they somehow decode a citizen’s cell phone and find something written that they don’t like. Encryption protects those guys too!

    ianf March 1, 2016 6:11 PM

    @ ronald is late to this comment but here goes anyway [goes intentionally suppressed]

      Yes, you are. To have any impact now, you need to print out your opinion/advice, address it to Director Comey c/o FBI, Washington, D.C., USA, and put it in the mailbox (don’t forget the return address in case the FBI would like to ask for clarification). Put some baking soda in the envelope to ensure that your contribution will be treated seriously (baking soda is legal). And good luck talking sense to Mr. Comey.
