How Apple Continues to Make Security Invisible

Interesting article:

Apple is famously focused on design and human experience as their top guiding principles. When it comes to security, that focus created a conundrum. Security is all about placing obstacles in the way of attackers, but (despite the claims of security vendors) those same obstacles can get in the way of users, too.


For many years, Apple tended to choose good user experience at the expense of leaving users vulnerable to security risks. That strategy worked for a long time, in part because Apple’s comparatively low market share made its products less attractive targets. But as Apple products began to gain in popularity, many of us in the security business wondered how Apple would adjust its security strategies to its new position in the spotlight.

As it turns out, the company not only handled that change smoothly, it has embraced it. Despite a rocky start, Apple now applies its impressive design sensibilities to security, playing the game its own way and in the process changing our expectations for security and technology.

EDITED TO ADD (7/11): iOS security white paper.

Posted on July 5, 2013 at 1:33 PM


kevinv July 5, 2013 2:16 PM

@NobodySpecial – were they wrong? iOS isn’t invulnerable to malware, but it doesn’t seem to spread, and there doesn’t seem to be as much of it as there is for Android. And there is little proof that anti-malware software would’ve reduced any of what is out there.

Arnold Schwarzenegger July 5, 2013 2:16 PM

iOS suffers from a full sdcard encryption bypass, as you can openly buy forensic tools to access devices. If they provided a serious implementation, they would probably take over every medical office with tablets for billing. A doctor could run their entire practice with a tablet, taking automated electronic appointments, storing records, and billing. Guess they aren’t interested, which leaves Android to take that market eventually.

NobodySpecial July 5, 2013 2:23 PM

No it’s not clear that anti-virus would help, but the “we are secure, there is no need for antivirus” is still worrying. If there was a virus would a company with this attitude announce a fix or just legal the problem away with DMCA/libel suits?

The bigger problem is that the nature of viruses has changed. They are no longer about kids making a splash and causing random damage. If there were a way for an iPhone virus to listen in on your keystrokes or track your websites/calls, it might remain secret for a very long time if it was only used on a few select victims.

Grizwald July 5, 2013 2:34 PM

@Nobodyspecial – Antivirus isn’t prevented on Android but that has done nothing to improve the situation there.

Antivirus is just not a good solution, and it burns computing resources and creates a false sense of security. That is why Apple doesn’t allow it.

Anon July 5, 2013 3:07 PM

Oh please… Apple doesn’t want antivirus software installed because every pop-up would be a slap in the face to their “we don’t need AV” marketing line…

tim July 5, 2013 3:19 PM

@Arnold Schwarzenegger

Have you been to a medical office recently? All I’m seeing are iPads and iPhones. Apple has left it to third parties to deliver vertical software solutions. And they are delivering.

Michael Simpson July 5, 2013 3:54 PM

@Arnold Schwarzenegger…..

You wrote: “iOS suffers from full sdcard encryption bypass as you can openly buy forensic tools to access devices. ”

You are wrong. If you want to use iOS full encryption with strong security, all you have to do is change it in Settings/General/Passcode. Turn off the simple 4-digit PIN and you can enter a long, strong passcode of up to 37 characters, drawn from 77 possible alphanumeric/symbol characters.
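The practical difference between the two modes is keyspace size. Here is a quick sketch comparing a 4-digit PIN to a longer passcode; the 77-symbol/37-character figures are the comment's own claim, not a verified Apple specification, so treat the numbers as illustrative:

```python
import math

def keyspace(alphabet_size: int, length: int) -> int:
    """Number of possible passcodes of exactly `length` characters."""
    return alphabet_size ** length

# 4-digit numeric PIN: 10 symbols, 4 positions
pin_space = keyspace(10, 4)
print(pin_space)              # 10000 candidates
print(math.log2(pin_space))   # ~13.3 bits of entropy

# Using the comment's 77-symbol alphabet, even a modest 10-character
# passcode dwarfs the PIN's keyspace:
long_space = keyspace(77, 10)
print(math.log2(long_space))  # ~62.7 bits of entropy
```

The point is not the exact alphabet size: every added character multiplies the search space, so a long passcode moves an attacker from thousands of guesses to an astronomically large number.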

There is NOT a simple SD card encryption bypass. Once the strong passcode is enabled with full tablet encryption, it’s done. These devices are HIPAA compliant and are used in tens of thousands of doctors’ offices and hospitals; American Airlines now uses iPads with strong security in the cockpit.

In a recent online test of how well resets and wipes actually work – guess which had ZERO artifacts left after the factory reset/wipe? iOS. There were all kinds of things left behind by Android.

Facts, Arnold. Facts.

Nebulus July 5, 2013 5:15 PM

For me it’s really simple: Apple wants full control of their devices, so I have no intention whatsoever to use any of them, no matter how secure their products might be.

Clark and Son July 5, 2013 5:27 PM

Is the gist here that Apple systems don’t get malware at the same rate as others because they make pretty hardware? I don’t know what ‘make security invisible’ means in a practical sense but by those words it means obscurity — at the moment. It could mean in actuality that Apple builds from the ground up with security in mind. I’m not convinced as recent news items say otherwise and I’ve had my own recent experiences with unbeknownst malware becoming evident with a second opinion. Perhaps I’m naive but my security suite doesn’t really get in the way and I’m not anti-password (just understand the limitations and where I need to focus attention due to risk). I’m now interested in encrypting everything because I’m not a big fan of guns. I’m hoping encryption is a 1st amendment issue.

Nebulus July 5, 2013 8:12 PM

@Nebulus really weird that a positive article about Apple makes you feel like you need to justify the choices you’ve made.

Ac2 July 5, 2013 11:45 PM

Given recent NSA revelations and iDevices failing the mud puddle test, I can’t see how storing my passwords in the iCloud can be considered a good idea…

Figureitout July 6, 2013 12:31 AM

Making the user experience clean and all; I can admire the simplicity (I printed a box for a PIR sensor using this same philosophy). But it’s like high-level coding: you won’t know what’s happening in the event of an attack, and these devices are so complex that they shouldn’t be contributing to this falsehood. One can’t even harvest many reusable parts from a dead smartphone; maybe some precious metals.

And in Bruce’s latest Google talk, it was pretty chilling when he compared Apple to China in how they lock down the tech.

hawkse July 6, 2013 4:38 AM

@Michael Simpson:
Interesting. Honestly. Do you have any good links to your facts? Would like to read more.

Delphi Ote July 6, 2013 8:36 AM

Antivirus on a mobile device? Really? Really?!?! When did the comments around here get so inane?

Signature detection is easily circumvented. Profiling code is extremely resource intensive and has high false positive rates. Even antivirus on a PC is outdated. People never used antivirus on servers, because it doesn’t work. Hell, antivirus products have even been a vector of attack. If you’re attempting to execute code you don’t trust, you’re already pwned. Period.

Has there ever been a major iPhone virus or worm? No. Do they need antivirus? No. Whitelisting and sandboxing are working much better. Leave the 90s behind.

Shady Device July 6, 2013 9:45 AM

@michael simpson

If you want facts, read viaForensics’ 80-page report on mobile security, which explicitly recommends you never store anything critical on an encrypted iOS device, because there are easy methods to just bypass it completely which governments and LE can buy. (If they can buy it, so can criminals with clever social engineering.)

A 7970 GPU running at 750 MH/s can also reduce the number of remaining guesses to fewer than 10 within one hour for a password of any size, due to the low KDF iteration counts on both iOS and Android. However, this isn’t even needed, since I watch Customs here plug in phones and bypass it in seconds, same with BlackBerry encryption regardless of password strength, which is why my country forbids medical records kept on iOS, BlackBerry, and all stock Android devices unless you make a custom ROM that increases iterations to slow down cracking.
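Whatever the exact figures, the arithmetic behind this claim is easy to check: an offline attacker's effective guess rate is the raw hash rate divided by the KDF iteration count, so a low iteration count hands a fast GPU the whole keyspace cheaply. A hypothetical sketch, where the hash rate is the comment's GPU number and the iteration counts are illustrative assumptions rather than measured device values:

```python
def exhaust_time_seconds(raw_hash_rate: float, kdf_iterations: int,
                         keyspace: int) -> float:
    """Worst-case seconds to try every passcode, assuming the attacker
    can run the key-derivation function offline at full speed."""
    guesses_per_second = raw_hash_rate / kdf_iterations
    return keyspace / guesses_per_second

rate = 750e6  # 750 MH/s raw hash rate (the comment's GPU figure)

print(exhaust_time_seconds(rate, 1_000, 10**4))     # 4-digit PIN, weak KDF: ~0.013 s
print(exhaust_time_seconds(rate, 100_000, 10**4))   # same PIN, 100k iterations: ~1.3 s
print(exhaust_time_seconds(rate, 100_000, 77**10))  # long passcode: ~1e15 s
```

Raising the iteration count slows every guess equally, but as the last line shows, only a large keyspace (a long passcode) pushes the exhaust time beyond practical reach.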

Jim July 6, 2013 11:47 AM

Anyone who is not a knuckle-dragger understands the possibility of usable security. I think usable security is only made a myth by those who want to send a message of dominating, sociopathic behavior.

But, Apple has a lot to prove there. They are good at design, but they are still human beings. And as others have pointed out, they have spread their legs for the more noxious elements of the US Government.

Europe, and the rest of the world, is starting to think of US products as the US Gov hypocritically thought of Huawei.

If you want software that has guaranteed intentional security vulnerabilities, so some cretins in a post-modernist Hoover’s shop are able to come in and do as they please, go with Apple for security. 🙂

(Thanks kids of Hoover and Hitler and Stalin… R333l Americans. :/ )

Michael Simpson July 6, 2013 12:25 PM

@hawkse: A good place to start would be the iOS Security Whitepaper. It explains the difference between the 4-digit PIN and the strong security (it’s a good read for a lot of things).

MIT Technology Review

iMessage Trips Up Surveillance

@Shady Device: All I can say is “Via Forensics” is not using current hardware/software, or they’re comparing basic iOS PIN security versus strong encryption.

The AES encryption, and its implementation in the iPad/iPhone, has been shown time and again to be every bit the security and privacy workhorse. Many times, companies like Elcomsoft will claim they can “break” this or that when, if you read the fine print, it’s only when the device is on, the key is still in memory, and the user hasn’t signed out. Remember, encrypted devices are just like any kind of physical safe: if it’s already open, you’re in.

There are so many more articles I could post, but I’m sure Bruce has a limit on the number of URLs before a comment is kicked out as spam.

I’ll end with this video showing results from devices supposedly “wiped” when, in fact, they are not. Again, the iOS devices were the ones where wiping is implemented in such a fashion that the forensics guy said “nothing” remained.

UselesSWanker July 6, 2013 1:03 PM


If there was a way for an iPhone virus to listen in your keystrokes or track your websites/calls it might remain secret for a very long time if it was only used on a few select victims.

There was. Here’s a PDF from BlackHat 2010 about some of the issues.

Robert Thille July 6, 2013 1:18 PM

One thing I’d love to hack into my iPhone is an electronic switch that disconnects the data lines when the screen is locked. Then if I plug my phone into a ‘hostile’ charger, or someone (law enforcement or otherwise) plugs it in, they wouldn’t be able to pull any data off it without first unlocking it with my code (or physically undoing my hard-hack).

Brandioch Conner July 6, 2013 1:27 PM

@Shady Device

If you want facts read viaForensics’ 80-page report on mobile security, which explicitly recommends you never store anything critical on an encrypted iOS device because there are easy methods to just bypass it completely which governments and LE can buy.

The question would be whether they can do that remotely.

Security is not about becoming invulnerable. That is impossible.

Security is about reducing the number of people who can EFFECTIVELY attack you.

Natanael L July 6, 2013 5:20 PM

On Apple’s comment “we can’t read iMessage conversations”:

In short: if you lose your device AND forget the password, the key to the messages should be lost. And yet, if you do a password reset and enter the new password on a new device, all the old messages come back.

Either they weren’t encrypted or Apple held the key.
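Natanael L's inference can be sketched as a toy model: a key derived purely from the user's password cannot survive a password reset, so if old messages come back anyway, the provider must hold the key (or the plaintext). This sketch uses PBKDF2 purely for illustration; the parameters are assumptions, not Apple's actual scheme:

```python
import hashlib
import os

def derive_key(password: str, salt: bytes) -> bytes:
    # Client-side key derivation: the key exists only where the password does.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = os.urandom(16)
old_key = derive_key("old-password", salt)

# Mud puddle test: lose the device, then reset the password on a new device.
new_key = derive_key("new-password", salt)

# The new key cannot unseal anything encrypted under the old one...
print(old_key == new_key)  # False

# ...so if old iMessages reappear after the reset, they were either not
# encrypted under old_key, or the provider kept a copy of the key/plaintext.
```

In other words, message recovery after a password reset is only possible if some recoverable copy of the key material lives outside the user's control.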

someone July 6, 2013 9:27 PM


I dislike Apple and their products a lot but in all fairness the point has to be made:

“A preliminary black-box test seems to indicate old iMessages, text messages, and e-mails are stored in iCloud and can be restored using the iForgot mud puddle recovery test,” Soltani said. “It definitely appears that iMessages are restored from iCloud backup, not the iMessage service.”

The way I read this, it means the message service is probably safe. What was restored was copied from the user’s unlocked phone by the backup routine and stored in the cloud, separate from the message handling. So their iCloud encryption or key handling seems to be farked.

I consider this similar to users using TC FDE but then employing third-party backup software which runs on the user’s unlocked OS, grabbing the plaintext after the FDE decrypted it. If you shovel otherwise-secure FDE data during runtime into something less securely implemented, all bets are off.

It seems that if you stay clear of backups on iCloud (if iOS grants you this much freedom at all), the iMessage setup still holds up its “end to end with no backdoor to be exploited by employees” promise.

Nick P July 6, 2013 11:05 PM

@ someone 9:27pm

“The way I read this it means the message service is probably safe.”

You make decent points. Probably is a gamble, though. Companies that prefer not to gamble with the security of their assets should use a communication method with certain end-to-end security. And only the usual 0-days and administration issues to worry about.

“hold up its “end to end with no backdoor to be exploited by employees” promise.”

What about contractors or national security letters? 😉

Nebulus July 7, 2013 8:45 AM

@Nebulus (this is beginning to look weird 🙂 ): I was simply pointing out that no matter how much security Apple is offering, it is not necessarily an incentive for me to buy their products, because that security is offset by a lack of control.

Brandioch Conner July 7, 2013 2:00 PM

From the first line of that link:

Apple receives so many police demands to decrypt seized iPhones that it has created a “waiting list” to handle the deluge of requests, CNET has learned.

So that changes it to a physical security issue. Rule #1 – there is no security without physical security.

As for the remote exploits, that has not been shown yet. And given the millions of iPhones in circulation with millions of credit cards linked to them that should make them prime targets. Which does not seem to have happened.

NobodySpecial July 7, 2013 9:04 PM

One would hope that it is possible to have data security without physical security; presumably armies lose bits of equipment in enemy territory fairly regularly.

Sofakind July 8, 2013 11:37 PM

For Arnold and anyone else, “If you give up physical security you are toast!”

That’s it; it is very basic. If someone else has your hardware for long enough, they will get in eventually. It may require freezing the RAM and extracting the data before it fades into the ether. It may require, 25 years from now, the invention of a quantum computer running infinitely many and all possible cases simultaneously and wiping out your big primes that used to be hard to factor. It could be a sieve no one has ever discovered that trivializes the previously unbreakable/unbroken. Outside of assured perfect security such as OTPs, you are toast if you give up physical access, end of story.

Bottom line: if you give up physical security of the computer/device, then yes, perhaps forensic tools will get in, but that’s not the manufacturer’s fault; that is the user’s fault.

I believe even Bruce would agree with me on this point.

Spaceman Spiff July 8, 2013 11:54 PM

If it isn’t open, it isn’t secure! I think Bruce would agree that there is “no security by obscurity”! (Sorry, Bruce, for the possibly out-of-context quote :-) ) If it can’t be generally vetted openly by many different parties, then it cannot be secure. Ipso facto!

Figureitout July 9, 2013 1:36 AM

Spaceman Spiff
–I disagree; to what degree do you take “being obscure”? Give me your keys, let me see your systems, let me observe your lifestyle and find the holes/install some goodies. No? Stop being obscure. If it can’t be understood, then there’s some security, because someone is going to have some late nights at the very least.

That quote is most relevant to the math, the algorithms; not the big picture.

Clive Robinson July 9, 2013 9:57 AM

@ Spaceman Spiff,

    If it can’t be generally vetted openly by many different parties, then it cannot be secure. Ipso facto

Err no, if that were true all the products from the likes of the NSA, GCHQ, et al, would not be secure.

By and large, just as much bad code happens in Open Source as in Closed Source. Even code that goes through a run-of-the-mill code review process is fairly uniformly bad when it comes to security, simply because the number of expert reviewers is very, very small compared to the number of code cutters.

There is an argument to be made that having access to the source does not actually improve security (except for simple specification/coding mistakes). This is simply because of the underlying tool chains, code libraries, and DLLs used by developers these days, and because code review for security should be done on the actual linked object code in an appropriate disassembler.

If this argument is correct then it does not matter if the high level code is Open or Closed source.

However, there is a partial counter-argument: those writing code that will be seen by many eyes will take more care with the code they produce, simply to avoid embarrassment.

Even if that is not true, it’s certainly the case that writing code for many others to collaborate with tends to produce clearer, cleaner, better-structured, and better-documented APIs.

Thus the argument is not really about Open/Closed Source, but about the coding style of those writing the code and their ability to design interfaces well, as an enabler to working effectively in collaboration with virtually unknown people who may not even speak the same language.

I guess you could say “Clean design gives clean results, messy design gives messy results”.

paranoia destroys ya July 9, 2013 11:18 PM

I’ll let others debate the whys of security and extent of vulnerabilities for the Mac. But a commonly repeated statement never made sense to me.

“Apple’s comparatively low market share made its products less attractive targets.” Twelve to fifteen years ago, in the era before OS X, Apple had a high rate of exploits, even though its market share was smaller than it is today.

