Schneier on Security
A blog covering security and security technology.
July 29, 2009
iPhone Encryption Useless
Interesting, although I want some more technical details.
...the new iPhone 3GS' encryption feature is "broken" when it comes to protecting sensitive information such as credit card numbers and social-security digits, Zdziarski said.
Zdziarski said it's just as easy to access a user's private information on an iPhone 3GS as it was on the previous generation iPhone 3G or first generation iPhone, both of which didn't feature encryption. If a thief got his hands on an iPhone, a little bit of free software is all that's needed to tap into all of the user's content. Live data can be extracted in as little as two minutes, and an entire raw disk image can be made in about 45 minutes, Zdziarski said.
Wondering where the encryption comes into play? It doesn't. Strangely, once one begins extracting data from an iPhone 3GS, the iPhone begins to decrypt the data on its own, he said.
Posted on July 29, 2009 at 6:16 AM
Isn't Zdziarski missing the point? The encryption comes into play when you remote wipe your phone (available through MobileMe). What happens then is that the iPhone's encryption key is wiped, disabling all access to the phone's data. This procedure doesn't take nearly as long as actually wiping all data from the phone, although the effect is the same.
Apple's homepage states: "iPhone 3GS offers highly secure hardware encryption that enables instantaneous remote wipe." (http://www.apple.com/iphone/iphone-3gs/more-features.html)
And that's the point. Making the remote wipe feature faster.
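[Editor's sketch] The "wipe the key, not the data" idea is easy to see in a few lines of Python. This is a toy illustration only: a hash-based XOR keystream stands in for the device's real cipher, and all names are made up. The point is that once the key is destroyed, the ciphertext left on flash is unrecoverable, which is why the wipe is near-instant.

```python
import hashlib
from secrets import token_bytes

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudorandom bytes from the key (toy stand-in for AES-CTR)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # an XOR stream cipher is its own inverse

key = token_bytes(32)  # the on-device key
ct = encrypt(key, b"credit card: 4111 1111 1111 1111")

# "Remote wipe" destroys only the key. The ciphertext stays on flash,
# but without the key it is indistinguishable from random noise.
key = None
```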
From the article:
"He added that the ability for the iPhone to self-erase itself remotely using Apple’s MobileMe service isn’t very helpful, either: Any reasonably intelligent criminal would remove the SIM card to prevent the remote-wipe command from coming through. (In a past Wired.com report, Zdziarski said the iPhone’s remote-wiping ability pales in comparison to Research In Motion’s BlackBerry, which can self-delete automatically after the phone has been inactive on the network for a preset amount of time.)"
Huh, this reminds me of when I took that 'class' on full disk encryption. They said some customers wanted FDE drives that didn't require a password; when asked why, they said they wanted to be able to cryptographically erase their data.
Guess Apple must have been one of those customers.
No ... I think he was talking from the perspective that the iPhone is designed around the idea of a secure file system. That portion is badly broken -- that's how jailbreaking/unlocking is done.
Just last week, I exchanged an iPhone 3GS, and so I manually 'wiped' the phone before taking it into the store. I was amazed that it took just a few seconds, instead of the 2+ hours that I had read about online. Apparently, what the manual wipe actually did was just discard/erase/wipe the key to the filesystem. Pretty cool.
Meanwhile, Bruce's post reminds me of how vulnerable I may be if I misplace the phone...
I'd suggest that the encryption in the phone does exactly what Apple intended. Indeed, as Apple apparently took suggestions from their business users about what features they needed, I'd suggest that Apple implemented exactly what most business users wanted.
Most missing phones are not stolen, they're misplaced or lost. Of the phones that *are* stolen, the goal is the phone, not the data. Any 'reasonably intelligent criminal' doesn't care if you wipe it or not.
So we're left with deliberate theft for the contents of the phone, eg: espionage. If you have that kind of data, what are you doing with it on your phone in the first place?
I think this is all a tempest in a teapot.
I was thinking ok so assume open line coms security practices.
Then I read what Lance Kidd former CIO of Halton Company had said,
“Your organization has to be culturally ready to accept a certain degree of risk, I can say we’ve secured everything as tight as a button, but that won’t be true…. Our culture is such that our general manager is saying, ‘I’m willing to take the risk for the value of the applications.’”
Oh dear, apparently this might be because, according to Kidd,
"A security expert performed an evaluation of Halton, and he said it was possible for any hacker to find an infiltration no matter the level of security.
Whilst true, it's like physical security: you don't leave all your doors and windows open when you "take a walk in the park" or are "out for lunch".
Further Kidd thinks,
“It’s like business continuity, you prepare for disasters. You prepare for if there’s an earthquake and the building breaks down, and you prepare for if there’s a crack in [information] security.”
What would that plan be? To "shut the stable door" when you see "the horse has headed for pastures new"?
When are people going to learn that ICT Sec is not like "business continuity" for normal insurable risks covering physical loss?
Business continuity is like preparing to go walking in remote hills; ICT Sec is like planning to engage the enemy in the dark.
It's way, way more difficult as it involves not just downtime and loss of insurable assets, but loss of organisational reputation and deals, and potential class action and criminal prosecution from distinctly unfriendly people.
Further, ICT Sec failures tend to be "all or nothing", not like bin diving or other forms of industrial spying etc. where it's snippets of info.
A failure of Comms Sec gives almost instant sight of the latest thinking and direction of an organisation at the highest levels.
Apple and the other smartphone manufacturers are positioning these devices as alternatives to notebook computers. In effect, we'll soon all be walking around with pocket-sized general-purpose networked computers. From old habit we still think of these as phones, whence your notion that "the goal is the phone, not the data". But they are not mere phones any more.
This being the case, every corporate/government embarrassment you've read about in the past few years, where someone loses a laptop with reams of proprietary|legally-encumbered|privacy-sensitive data, is merely a foretaste of what's to come, when institutions allow employees to load such data onto these easily lost devices. When that happens, the data is in fact more valuable than the device, at least to some folks.
Remote-wipe won't cut it here, especially as it is so easily circumventable. The data must not be readable without authentication, and that must be a strong guarantee. Apple's efforts in this case are too amateurish to be taken seriously. This, it seems to me, is of a piece with Apple's more general attitude towards security practice, which clearly still takes second priority to usability in every sector of their business.
Not "a tempest in a teapot". Rather, a wildly mis-hit effort targeting an enterprise market they clearly don't understand.
So the encryption isn't actually used for encryption? Is that what I'm hearing?
The only reason Apple put in encryption and calls it encryption is so they can quickly and remotely "wipe" the drive just by deleting a world-readable, non-password-protected key?
Is that about right?
Watch the videos about this feature. It requires no "jail breaking" to get a raw, unencrypted image of the data from the iPhone. It doesn't matter if the phone is passcode (4-digit pin) protected.
Everyone is missing the point: Apple is about "it just works".
Security can't "just work" -- the user has to be informed and intelligent.
Therefore, Apple doesn't do security -- not in any sense but the simplest "let's avoid viruses and stolen passwords kind of way". They're a consumer product company -- not a client product company.
The article is very sparse on data, but as I read it, the author had access to a phone not protected by a pass code in order to install his tools.
As I understand the working of the 3GS encryption, it is essentially a full-disc encryption of the entire flash memory with the key stored in special memory on the processor. That means the encryption does not protect against security holes in the OS, which is what was demonstrated. But if Apple is smart enough and also stores the device pass code in the processor's special memory, you would not only need to listen on the processor-memory bus, but also replace the memory content while the processor is running in order to circumvent the pass-code lock. Certainly doable, but not really cheap.
Now a phone syncs with iTunes even when the pass code lock is active, but this only works when the computer is already paired with the iPhone, otherwise you need to unlock the pass code before you can get access.
So unless we hear a report where a pass code locked iPhone 3GS has been hacked, I suspect this to be an exaggerated claim.
So Apple put in a feature it was told people wanted (remote wipe), and didn't get it right. I'm not particularly surprised. Lots of people screw up in their first try at something, and security is particularly easy to screw up.
The iPhone was originally (AFAICT) for people who wanted an easy-to-use smartphone, who don't want it password-protected all the time, and who don't want their data to be wiped automatically if they're out of range of cell towers. For example, I take mine sailing, and can be out of range of cell towers for a couple of days or longer. It's still useful as a calculator and book reader and camera and other things, and I'd be unhappy if it wiped itself. The Blackberry is more normally seen as a business tool, and not as a really fun little device. (Also, what's the longest time allowable to prevent people from getting the data, and shortest time to avoid having temporary outages trash data? Methinks there's some false security involved here.)
So, Apple would need to add some sort of business mode, which is probably a software update away. Some businesses need to call Apple and tell them exactly what they want.
What bothers me more is people who would walk around with critical data on their phone.
It strikes me as odd that usability has overridden security (mind you, this is Apple with the real "hot phones" that discolour the case).
It sounds like a "specification" error.
That is, the data is encrypted on the device, the key is protected, and pass-phrase locking etc. is there (i.e. all the needed bits).
However, if the phone is unlocked or put on a "known cradle", then for "usability" the encryption is transparent to the cradle...
I guess this is for auto-updating of things like calendars and desktop syncing etc., where file transfer goes on in the background simply from "docking" the device.
It's the sort of "usability" error that crops up all the time when encryption is added at a later date as opposed to designed in from the start.
This just goes to show you: physical access is root access. Security is mostly an illusion.
"So Apple put in a feature... and didn't get it right... Lots of people screw up in their first try at something."
You beat me to it 8)
With regard to,
"Also, what's the longest time allowable to prevent people from getting the data, and shortest time to avoid having temporary outages trash data? Methinks there's some false security involved here."
Remember Blackberry keep the data keys on servers.
So wiping the key to encrypted data after 20 mins or so of "Loss Of Signal" would not actually be a problem, as it could be auto-reloaded on "Acquisition Of Signal".
If you then kept a copy of the key encrypted via the hash of a pass phrase, then if the unencrypted key was required before AOS, the user types in the pass phrase to "work offline".
I think even Apple could get this right if somebody told them ;)
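[Editor's sketch] The scheme described above (server-held key, wipe on loss of signal, auto-reload on reacquisition, passphrase-wrapped offline copy) can be modeled in a few lines. This is a hedged toy in Python; all class and method names are invented, and the key wrapping uses a plain hash mask where a real design would use a proper KDF and authenticated encryption.

```python
import hashlib
from secrets import token_bytes

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

class Device:
    """Toy model: the server holds the data key; the device re-fetches it
    on signal and keeps a passphrase-wrapped copy for offline use."""

    def __init__(self, server_key: bytes, passphrase: str):
        self.key = server_key  # working copy of the 32-byte data key
        # Offline copy: key masked with a hash of the pass phrase.
        self.wrapped = xor(server_key,
                           hashlib.sha256(passphrase.encode()).digest())

    def loss_of_signal_timeout(self):
        self.key = None  # wipe the working key after ~20 min of LOS

    def acquisition_of_signal(self, server_key: bytes):
        self.key = server_key  # auto-reload from the server on AOS

    def unlock_offline(self, passphrase: str):
        # Recover the key from the wrapped copy to "work offline".
        self.key = xor(self.wrapped,
                       hashlib.sha256(passphrase.encode()).digest())
```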
What are the odds that, if Apple have mucked up the "transparent docking" usability issue, they have also mucked up one or two other things?
Like for example using the wrong encryption modes so an attack in depth due to standard file format info is possible...
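[Editor's sketch] One way a wrong mode leaks: in ECB, identical plaintext blocks always produce identical ciphertext blocks, so the repeated structure of a standard file format shows straight through the ciphertext. A toy Python illustration (an HMAC-based per-block transform stands in for the real block cipher; it is not decryptable, only the leakage pattern matters here):

```python
import hashlib
import hmac

def toy_ecb(key: bytes, plaintext: bytes, bs: int = 16) -> list:
    """Deterministic per-block transform standing in for AES-ECB."""
    blocks = [plaintext[i:i + bs] for i in range(0, len(plaintext), bs)]
    return [hmac.new(key, b, hashlib.sha256).digest()[:bs] for b in blocks]

key = b"0123456789abcdef"
# A file format with a repeated fixed structure (headers, padding, ...):
pt = b"HEADERBLOCK 0001" + b"secret payload.." + b"HEADERBLOCK 0001"
ct = toy_ecb(key, pt)
assert ct[0] == ct[2] and ct[0] != ct[1]  # the repetition shows through
```

An attacker who knows the file format can spot and correlate those repeated blocks without ever recovering the key, which is why randomized modes (CBC with random IVs, CTR with unique counters) are used instead.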
It's protected by a four-digit PIN and the security is weak? I'm shocked, shocked.
Like Bruce, I'm not sure we have enough details on this yet. To me, the biggest detail is: how do Apple and its typical iPhone customer _want_ this to work? Just because it seems insecure with respect to how we think it should work, that doesn't mean this is a failure. One might respond that this is then a failure of specification rather than one of implementation, but a failure nonetheless. I think that most of us may be poorly placed to evaluate the specs on this device. Our judgment is not the business's judgment. If one thinks this behavior should be configurable, one is overlooking both the primacy of configuration as a source of insecurity and the no-hacking-no-customization-you'll-get-what-we-say-and-you'll-like-it ethos that Apple has always had.
It is the recovery mode that is causing the issue, by providing access to the flash data without authentication. It should provide two options: 1) access once the pass code is entered, or 2) access to a wiped disk.
It reminds me of an old problem Palm had, where you would gain access to the device using the debug console. They fixed it, let us hope Apple will fix this problem rather quickly.
This is a rather elementary security mistake, and I hope that given that Apple's marketing material claims protection against brute force access, people burned by it will successfully sue Apple for compensation.
@Malvollo and others
Sure, but it is quite simple to enforce a more secure passcode. In fact, you can set up policies for this in an Enterprise environment.
From the Enterprise Deployment Guide for iPhone:
"Protect your enterprise data by configuring device passcode policies. Set the minimum number of characters, number of complex characters required, passcode expiration, device lock interval, and maximum failed attempts."
So the problem is not that you are limited to 4-digit passcode - you're not.
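[Editor's sketch] The policy knobs the deployment guide lists (minimum length, complex characters, expiration, maximum failed attempts) amount to a simple validation rule set. A hedged Python sketch with invented names, purely to make the knobs concrete; it is not Apple's implementation:

```python
import re
from datetime import date, timedelta

# Hypothetical policy; the fields mirror the deployment guide's list.
POLICY = {
    "min_length": 8,
    "min_complex_chars": 2,     # non-alphanumeric characters required
    "max_age_days": 90,         # passcode expiration
    "max_failed_attempts": 10,  # wipe counter, enforced at unlock (not shown)
}

def passcode_ok(passcode: str, set_on: date, policy=POLICY) -> bool:
    """Check a passcode against the length, complexity, and age rules."""
    if len(passcode) < policy["min_length"]:
        return False
    if len(re.findall(r"[^A-Za-z0-9]", passcode)) < policy["min_complex_chars"]:
        return False
    if date.today() - set_on > timedelta(days=policy["max_age_days"]):
        return False
    return True
```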
"If a thief got his hands on an iPhone, a little bit of free software is all that's needed to tap into all of the user's content."
This seems to be the crux of the problem but it brings forward a major issue in phone security.
When you can install a custom kernel and then still secure shell into a device with access to the data you are talking about a single-user architecture.
I used to have to fight against this continuously with product marketing at mobile companies. They invariably assume a mobile phone is single user, and they love this assumption because multi-user security is more expensive to develop and maintain properly.
A far more secure treatment of the device (that favors the manufacturer) should be to lock-out access to keys when the kernel is changed, or at the very least it should prevent access to the keys (and thus unencrypted data) without specific user authorization. The latter would favor the consumer and be more in line with other systems.
By way of example, when an unknown user updates the kernel on a Macintosh and then tries to ssh to it and extract the data in unencrypted format...
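[Editor's sketch] The "lock out key access when the kernel is changed" idea is essentially an attestation gate in front of the key store: release the key only if the running kernel hashes to a trusted value and the user has authorized access. A hedged Python sketch with invented names (real systems compare against a vendor signature rather than a bare hash):

```python
import hashlib

# Trusted measurement of the vendor-signed kernel (illustrative value).
TRUSTED_KERNEL = hashlib.sha256(b"vendor-signed kernel image").hexdigest()

class KeyStore:
    """Releases the data key only for an unmodified kernel plus user consent."""

    def __init__(self, data_key: bytes):
        self._key = data_key

    def release_key(self, kernel_image: bytes, user_authorized: bool) -> bytes:
        if hashlib.sha256(kernel_image).hexdigest() != TRUSTED_KERNEL:
            raise PermissionError("kernel modified: key access locked out")
        if not user_authorized:
            raise PermissionError("user authorization required")
        return self._key
```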
Apple may have implemented the FDE as a quick way to wipe data, but they certainly seem to be promising more. From their site (http://www.apple.com/iphone/business/integration/):
"...your data is secure with support for encrypted data in transmission, HARDWARE ENCRYPTION FOR DATA AT REST, and encrypted backups in iTunes."[my emphasis]
Further down on the same page:
"iPhone 3GS protects your data through encryption of information in transmission, at rest on the device, and when backed up to iTunes...In the event of a lost or stolen iPhone, you can even clear all data and settings by issuing a remote wipe."
Personally, that last paragraph makes it sound as if remote wipe is an added benefit to encryption.
Plus, even if not, the entire thing is disingenuous at best. What are the chances that people would associate "FDE" with "remote wipes ONLY?" Why even mention FDE? FDE is mentioned because people think "FDE = encryption," not "FDE = remote wipes," and Apple wants people to think that their iPhones are as good at data security as a rival smartphone-maker that actually has FDE implemented correctly.
Seems to me that Apple screwed up. Based on the comments here, though, I'm pretty sure Apple's security team will catch up with their marketing department's claims.
Clive Robinson: The discoloration was because of a bad cover the specific user purchased for his iPhone. Not due to the materials in the iPhone itself.
I think Apple should fix this by really making it configurable (its what the IT-admins want anyway, right?).
Something presented at Black Hat against TrueCrypt FDE:
"Stoned is able to bypass the full volume encryption of TrueCrypt. "
Horst: Very interesting! One interesting thing about bootkits using the MBR is that computers using (for example) the GUID Partition Table (http://en.wikipedia.org/wiki/GUID_Partition_Table) would be immune (until people start targeting GPT as well) while still retaining compatibility.
Or am I mistaken?
Apple says "Jailbreaking iPhone could pose threat to national security"
First reported by Wired.com, Apple's comments explained that jailbreaking allows hackers to alter the phone's baseband processor (officially called the BBP chip), which is the chip that enables the phone to connect to cell towers.
Apple stated in its filing that by changing the BBP's code, "More pernicious forms of activity may also be enabled. For example, a local or international hacker could potentially initiate commands (such as a denial-of-service attack) that could crash the tower software, rendering the tower entirely inoperable to process calls or transmit data. In short, taking control of the BBP software would be much the equivalent of getting inside the firewall of a corporate computer--to potentially catastrophic result."
Okay, leaving aside why we let a company deploy a technology that COULD jeopardize national security, why would you as a company give any paying customer a tchotchke that lets them inside your "firewall" on the understanding that they "won't" try to?
Is this the Agatean Wall all over again?
@Daniel Wijk: See http://www.h-online.com/security/...
"At present, only machines running the traditional BIOS are vulnerable. The attack is unsuccessful when the BIOS successor the Extensible Firmware Interface (EFI) is at work on the motherboard."
Whether EFI is (still) immune because of GPT, I don't know.
"by hacking the BBP software through a jailbroken phone and taking control of the BBP software, a hacker can initiate commands to the cell tower software that may skirt the carrier's rules limiting the packet size or the amount of data that can be transmitted, or avoid charges for sending data ... In short, taking control of the BBP software would be much the equivalent of getting inside the firewall of a corporate computer - to potentially catastrophic result."
Horst: Just as I thought, thanks!
Shoot me an email if you still want more details. I'd be glad to provide you with some of the tools I give to law enforcement for obtaining raw disk, removing passcode, etc.
Hi Jonathan, I have an iPhone 3GS that is locked with a passcode. Could you please help me out with some details of how to rectify this? Thanks.