New iPhone OS May Include Device-Unlocking Security

iOS 12, the next release of Apple's iPhone operating system, may include features to prevent someone from unlocking your phone without your permission:

The feature essentially forces users to unlock the iPhone with the passcode before it will open a data connection to a USB accessory whenever the phone has been locked for more than one hour. That includes the iPhone-unlocking devices made by companies such as Cellebrite and GrayShift, which police departments all over the world use to hack into seized iPhones.
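The gating logic described here amounts to a timestamp check. A minimal sketch of the policy (hypothetical names; Apple has not published its actual implementation):

```python
from datetime import datetime, timedelta

# One-hour window, per the article's description of the feature (assumed constant name).
USB_LOCKOUT_WINDOW = timedelta(hours=1)

def usb_data_allowed(last_unlock: datetime, now: datetime) -> bool:
    """Return True if a USB accessory may open a data connection
    without the passcode being entered first."""
    return now - last_unlock < USB_LOCKOUT_WINDOW

now = datetime(2018, 6, 12, 12, 0)
print(usb_data_allowed(now - timedelta(minutes=30), now))  # True: unlocked recently
print(usb_data_allowed(now - timedelta(hours=2), now))     # False: passcode required
```

Under this policy, a cracking box plugged in more than an hour after the last unlock gets no data channel at all, which is exactly the property Duff describes below.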

"That pretty much kills [GrayShift's product] GrayKey and Cellebrite," Ryan Duff, a security researcher who has studied the iPhone and is Director of Cyber Solutions at Point3 Security, told Motherboard in an online chat. "If it actually does what it says and doesn't let ANY type of data connection happen until it's unlocked, then yes. You can't exploit the device if you can't communicate with it."

This is part of a number of security enhancements in iOS 12:

Other enhancements include tools for generating strong passwords, storing them in the iCloud keychain, and automatically entering them into Safari and iOS apps across all of a user's devices. Previously, standalone apps such as 1Password have done much the same thing. Now, Apple is integrating the functions directly into macOS and iOS. Apple also debuted new programming interfaces that allow users to more easily access passwords stored in third-party password managers directly from the QuickType bar. The company also announced a new feature that will flag reused passwords, an interface that autofills one-time passwords provided by authentication apps, and a mechanism for sharing passwords among nearby iOS devices, Macs, and Apple TVs.
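One of the features above, flagging reused passwords, only requires grouping saved logins by password. Apple has not described its mechanism; a naive, purely illustrative sketch might look like:

```python
from collections import defaultdict

def find_reused(credentials):
    """Group saved (site, password) pairs by password; any group with
    more than one site indicates reuse worth flagging to the user."""
    by_password = defaultdict(list)
    for site, password in credentials:
        by_password[password].append(site)
    return {pw: sites for pw, sites in by_password.items() if len(sites) > 1}

creds = [("example.com", "hunter2"),
         ("bank.example", "hunter2"),
         ("mail.example", "x9!kQ")]
print(find_reused(creds))  # {'hunter2': ['example.com', 'bank.example']}
```

A real password manager would compare credentials inside an encrypted store rather than over plaintext lists, but the detection idea is the same.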

A separate privacy enhancement is designed to prevent websites from tracking people who use Safari. It specifically prevents share buttons and comment code on webpages from tracking people's movements across the Web without permission, and from collecting a device's unique settings, such as fonts, in an attempt to fingerprint the device.
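Fingerprinting of the kind Safari now blunts works by combining many individually innocuous settings into a near-unique identifier. A toy illustration (the setting names are hypothetical, and real trackers collect far more signals):

```python
import hashlib

def fingerprint(settings: dict) -> str:
    """Combine device settings into a stable identifier. Sorting the keys
    makes the hash independent of the order the settings were collected in;
    this is the kind of passive identification Safari's protections target."""
    canonical = "|".join(f"{k}={settings[k]}" for k in sorted(settings))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

device = {"fonts": "Helvetica,Arial,Menlo", "screen": "375x812", "tz": "UTC-5"}
print(fingerprint(device))
```

Apple's countermeasure is to report generic values (e.g., only built-in fonts) so that many devices hash to the same identifier.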

The last additions of note are new permission dialogs macOS Mojave will display before allowing apps to access a user's camera or microphone. The permissions are designed to thwart malicious software that surreptitiously turns on these devices in an attempt to spy on users. The new protections will largely mimic those previously available only through standalone apps such as Oversight, developed by security researcher Patrick Wardle. Apple said similar permission dialogs will protect the file system, mail database, message history, and backups.

Posted on June 12, 2018 at 6:23 AM • 23 Comments

Comments

me • June 12, 2018 6:48 AM

Before people start arguing that this move was made only to prevent police devices from working, I'll cite Matt Blaze's tweet:

https://twitter.com/mattblaze/status/1004006362085560321
You might also call what Apple did "fixing a serious security vulnerability", unless of course you think that the only purpose of data security is to thwart law enforcement.

me • June 12, 2018 6:52 AM

I'm happy to hear that Apple is going in the privacy direction.
They can do it since they sell hardware (= make money from it) and not only software (= make money by selling people's data).

I don't have an iPhone and I don't think I'll buy one:
I prefer freedom over security. I want to be able to install any app that I want, not only the "approved" ones, so that if one day Google/Apple wake up and decide to ban ad blockers, VPNs, or whatever, I can still use them.
I know that is risky, that I (or someone else) could install malware.
But I'm happy with the risk; again, for me freedom is much more important than security.
Same goes for mass surveillance.

Jeff • June 12, 2018 7:27 AM

@me
Re freedom over security -- I'm thinking about moving to the opposite side of this. I don't use an iPhone because I don't want to be told what I can and can't do with my devices, but if Apple really does create a secure platform that does what I want, and they don't abuse me afterward, I'd consider it. I guess it's whether you want to be owned by someone you know or by a hacker from Russia. For my phone, I can see why picking Apple might be the right answer.

For my laptop, however... completely different story.
Jeff

0laf • June 12, 2018 8:18 AM

I do have an iPhone, and I chose it because it lets me do what I need to do with less data harvesting than the alternative, which is Google.

Google doesn't allow you to do what you want either. So I'm not sure what would fulfill your needs.

Impossibly Stupid • June 12, 2018 8:49 AM

@me

I prefer freedom over security, I want to be able to install any app that I want, not only the "approved" ones

Then you actually prefer security, because the only thing that affords you the freedom to do what you want with your device is the security that keeps it from being controlled by other people. The odds of waking up to find the manufacturer has blocked some benign software are pretty low, but the odds of waking up to a malware infection on an insecure device are pretty high.

And if push comes to shove, any "unapproved" app could be compiled and installed on your phone directly. That's how I planned to do an end run around the App Store when I briefly had trouble trying to release one of my apps. Apple's walled garden doesn't actually restrict your freedoms any more than other mobile devices do; it just makes it harder for jerks to abuse you.

me • June 12, 2018 10:29 AM

@Impossibly Stupid
>Then you actually prefer security, because the only thing that affords you the freedom to do what you want with your device is the security that keeps it from being controlled by other people.

True, but I don't see Android as super-insecure. I'm a big fan of the PC and I don't use a smartphone a lot; I have few apps installed and I'm fine, and JS is blocked in the browser.
As long as the only problem is people installing fake WhatsApp because the market is full of fake clones, or similar problems installing apps, I don't have any problem.
If there is ever widespread use of exploits that don't require user interaction, I'll consider switching to Apple.

And anyway, after keeping my phone offline for 24h I modded it with CyanogenMod, for various reasons. For example:
*every time* I turned on location it asked "do you want to share location with Google or only turn on GPS location?", and if you tick "don't show again" you are left only with "accept", as "don't accept" is grayed out.
Is this a real choice? So I rooted it.

Just another example:
I don't even have a PIN on my phone; you can swipe it and it's unlocked. So this Apple feature is useless to me.
I don't see any reason to add a PIN/password to access my phone, *in my specific case/threat model*.

Oliver • June 12, 2018 10:38 AM

Hi Bruce,
anything that thwarts the efforts of the jack-booted thugs in their surveillance is a good thing!
Kudos to Apple for finally doing something effective about it!
Cheers, Oliver

albert • June 12, 2018 11:13 AM

@me,
"... i have installed few apps and i'm fine, js is blocked in the browser...."

How do you view the twitter link you provided?

I have JavaScript blocked, and Twitter asks "Would you like to proceed to legacy Twitter?". I say 'yes' and I get "403 Forbidden: The server understood the request, but is refusing to fulfill it."

. .. . .. --- ....

Phil • June 12, 2018 11:38 AM

It would be nice if Apple engaged with standards organizations. Each of the last several major releases of iOS has caused major changes for developers wanting to use standard federated authentication methods from the IETF, W3C, and OpenID.

While protecting privacy is laudable, breaking authentication frameworks that enhance security and privacy seems counterproductive to the goals Apple is pursuing.

Who? • June 12, 2018 1:15 PM

Hey guys, am I missing something? Apple is a proud member of the PRISM club!

They are doing what they do best, playing both sides: they sell "secure" devices to customers while allowing unrestricted access without warranty to government.

Who? • June 12, 2018 1:18 PM

Bad wording, I should have written: "...while allowing warrantless unrestricted access to government."

dewayne • June 12, 2018 8:49 PM

albert, I get the same thing ("legacy Twitter" always gives a 403—and really, they decided it was easier to keep two parallel codebases than to figure out how to use progressive enhancement for a 140-char message?).

If you turn off stylesheets on the "we've detected" page you'll see the content there, after 10+ pages of chuffah. View / Page Style / No Style in Tor Browser.

Does it Mr. Duff? • June 12, 2018 9:18 PM

Or does it increase sales volume for GrayShift and Cellebrite, since their units are now needed within one hour of every potential warrantless search?

Gerard van Vooren • June 12, 2018 10:22 PM

@ Phil,

"Each of the last major releases of ios has caused major changes for developers wanting to use standard federation authentication methods from IETF, W3C, and OpenId."

Especially OpenID, which could have "ruled the world". But we all know by now that the "winning product" is the mobile phone, isn't it? Besides that, I have given up on all if not most of the IETF and especially the W3C. These clowns still think (well, do they actually think at all?) that extensibility is the best feature, while the opposite is true. And I seriously believe that they are incapable of redesigning this whole mess.

Thoth • June 13, 2018 1:01 AM

@all

And soon Cellebrite et al. will release new updates yet again to acquire access back into those shiny new upgraded iPhones.

When will people ever learn that if you have something sensitive, you should separate it as far as possible from modern electronics?!

Forget about security of secrets on smartphones. That's just a pipe dream and a marketing show.

me • June 13, 2018 2:31 AM

@albert
Here it asks the same thing, but it works.

Anyway, I was writing from a PC.

Rocklobster • June 13, 2018 12:43 PM

@Bruce I'm glad you mentioned Cellebrite. I tried bringing that to people's attention on one of the security forums, but the moderators deleted my post, which was critical of the phone manufacturers: if Cellebrite can use the same method on so many different phones to bypass user lockouts by taking a hex dump of the contents via USB, it can't be anything but a built-in backdoor.

albert • June 13, 2018 1:17 PM

@dewayne,
Thanks...

@me,
I should note that I always have JS disabled. I didn't try with JS enabled; it's Twitter :(

. .. . .. --- ....

Clive Robinson • June 14, 2018 6:32 AM

@ All,

Another view but from the UK,

http://uk.businessinsider.com/apple-will-make-it-harder-for-police-to-access-locked-iphones-2018-6

It's more of the same from the US, including a "Dumb Hick Cop" trotting out a cross of the jaded old Going-Dark and Think-Of-The-Children lies.

However, when you get down to the mention of the University of Surrey (a place I know of old), you get a slightly different perspective on the GrayKey that is partially right but still wrong.

The argument is that the GrayKey cannot be used for mass surveillance because it has a "device present" requirement. Whilst that is true, it ignores a couple of issues:

1, Customs/borders/check points.
2, Random criminals including police gaining access without consent or knowledge of the owner.

Whilst the first can in a number of cases be avoided (don't travel), avoiding the second, criminals such as police, private eyes, identity thieves and others, cannot be achieved without a lot of diligence, a degree of OpSec, and knowledge of the law.

As the second group is way more prevalent than many would think, not addressing the issue, or worse, deliberately side-stepping it, is not sensible.

However, the final point, that any security on mobile phones is transitory, is true for all consumer-level products currently. Put simply, they have not been designed correctly to protect private or confidential information at much more than the "annoying little brother" attacker level: the level of the diaries with cheap locks that little sisters once bought, thinking they would keep their brothers out...

Which brings me back to the point I keep making about not keeping private or confidential information on electronic devices not designed correctly for security. Designing such devices is not easy; it requires considerable knowledge and experience, and often results in expensive equipment [1].

Thus, to avoid the expense, a simple mitigation is the correct use of encryption by both the operator and the system. One approach that rests the security on the operator's traditional OpSec is "hand ciphers", where only the ciphertext goes onto electronic devices for communication or storage.
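To illustrate the hand-cipher workflow (in code purely for demonstration; a classical Vigenère is far too weak for real use, and the point is that the operator can do this with pencil and paper so that only ciphertext ever touches the device):

```python
def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    """Classical Vigenere cipher over A-Z: shift each letter by the
    corresponding key letter (subtract the shift when decrypting)."""
    sign = -1 if decrypt else 1
    out = []
    for i, ch in enumerate(text):
        k = ord(key[i % len(key)]) - ord('A')
        out.append(chr((ord(ch) - ord('A') + sign * k) % 26 + ord('A')))
    return "".join(out)

# Only this ciphertext would be typed into, or stored on, the phone.
ct = vigenere("MEETATNOON", "LEMON")
print(ct, vigenere(ct, "LEMON", decrypt=True))
```

A seized or remotely compromised phone then yields only ciphertext, shifting the attacker's problem from device forensics to cryptanalysis of whatever hand cipher the operator chose.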

Which brings up another issue, there are three basic things you can do with information,

1, Communicate it.
2, Store it.
3, Process it.

As a rule of thumb it is best to limit the functionality of a device. Thus a communications device should not be capable of storing or processing information, and a storage device likewise should not communicate or process. Not shown on the list is the line where encryption can be used as a security measure. Of the three, currently only "information processing" cannot be efficiently or practically done whilst the data is encrypted [2].

[1] The price of security electronics is generally way higher than it should be, in part because the cost of "product one" is high and has to be amortized over as few as a hundred production units. But this has a knock-on effect: due to the high price, the purchaser expects the equipment to be "serviceable and repairable", which means you cannot use the more cost-effective security measures that are unserviceable by design (various encapsulation methods). Thus the designers have to employ not just more expensive methods but also further methods to mitigate the deficiencies of those methods (think security screws on the case needing further break-circuit switches and other reliable case-opening detection systems, with backups in case they fail).

[2] Some information processing can be done fairly easily on encrypted data, such as modulo addition under an additive stream cipher. But all such schemes have limitations that need to be resolved on decryption of the information.
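The footnote's example can be demonstrated directly: under an additive stream cipher, adding a value to the ciphertext modulo 256 adds the same value to the recovered plaintext, a limited form of processing on encrypted data. A short sketch:

```python
import secrets

def encrypt(plaintext: bytes, keystream: bytes) -> bytes:
    """Additive stream cipher: c[i] = (m[i] + k[i]) mod 256."""
    return bytes((m + k) % 256 for m, k in zip(plaintext, keystream))

def decrypt(ciphertext: bytes, keystream: bytes) -> bytes:
    return bytes((c - k) % 256 for c, k in zip(ciphertext, keystream))

key = secrets.token_bytes(4)
m = bytes([10, 20, 30, 40])
d = bytes([1, 1, 1, 1])
c = encrypt(m, key)
# Adding d to the ciphertext (mod 256) adds d to the recovered plaintext,
# without ever decrypting: (m + k + d) - k = m + d (all mod 256).
c_plus = bytes((ci + di) % 256 for ci, di in zip(c, d))
print(decrypt(c_plus, key) == bytes([11, 21, 31, 41]))  # True
```

The limitation Clive notes shows up immediately: modular wrap-around (e.g., 250 + 10 becomes 4) must be accounted for when the result is finally decrypted.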

Thoth • June 14, 2018 6:31 PM

@all

If Apple goes to the length of aiding totalitarian regimes by removing anti-censorship apps, VPN apps, and even certain media content that is controversial in the eyes of these regimes, how can they be trusted not to hide nasty surprises and provide aid in return for cash from those willing to pay into their black account books?

Clive Robinson • June 15, 2018 12:32 AM

@ Jonathan Wilson,

And it looks like the hackers building devices for the cops believe they have a workaround for this new feature:

They might or they might not, it depends on if Apple "are jerking their chain" or not.

The only code the hackers/crackers can currently test against is Apple's beta release, which can, and probably will, be changed before the final release.

Thus it is altogether possible that Apple has put in some of the USB-locking functionality but not all of it, either by accident or by design. But even if they have not, there is still a window of opportunity for Apple to fix things before the final release.

Back in the late 1990s I noticed an issue with online security at banks, in that they were in effect "training the crackers". I have posted my observations on this blog in the past.

Put simply, the banks kept releasing small incremental improvements in their security, and the changes were too small. That is, whilst they sort of fixed one type of security issue, often it was by "rearranging the deck chairs", thus mutating rather than fixing; they also did not fix all the security faults they knew of.

Thus attackers that had reserves of funding could climb the attack tree a bit to get to the new level of "low-hanging fruit". As a side effect, the crackers became more expert in the systems they were attacking and developed new methods and tools which would speed up future attacks. I noted back then that it was the same process as learning to climb a mountain. Most people cannot climb a mountain, but they can walk up a hill; thus they can train by successive increments, get fitter, learn new methods, and acquire new tools, until they can climb even Everest. Because of this, I noted that the banks should really not raise the bar by tiny increments but significantly in one go, so that the crackers would run out of reserves before they could come up with new attack methods and tools.

Since then things have moved on, and it is now the technology companies not raising the bar on security improvements sufficiently, so the forensics companies in effect get trained to climb mountains.

I suspect that Apple in particular might have noticed this issue (it would be hard not to these days). Thus they might decide to run a feint improvement in the beta code, reserving the real improvement for the final release. It's certainly something I would consider in their place, as they would be more than entitled to regard the forensics companies as the "parasitic criminals" they actually are [1], and thus fair game.

I know it sounds like "security through obscurity", but it is better to look at it as "buying time". Those who have been around a while will know that 100% secure is an unobtainable goal, especially on consumer-grade equipment. Thus consumer-device security is a succession of races between the product designers and the forensics companies, of which the device designers have the choice of where to set the start of each race: not just for themselves, but for the forensics companies and the consumers who buy their products.

That is, there are three parties in the race, and the ideal for the designers and consumers is that the forensics companies never get to finish any race before the next race is started. That is, they are not allowed enough time to find a fault, design a response into their product, and get it to market with sufficient time left to make a profit on the process.

One way for the product designers to do this is not to give the forensics companies a head start over the consumers. That is, do not give them access to the new security code early, in the alpha and beta test phases, but only in the consumer upgrade release.

Unfortunately, the forensics companies have one time advantage crackers did not. The law has a significant time advantage, in that the opportunity to prosecute is measured in years, not the few months of product life a cracker has. Thus if a law-enforcement agency grabs your phone, they can leave it on the shelf for years waiting for a forensics company to come up with a workable attack...

[1] Yes, the forensics companies are both criminals and parasites. As I keep pointing out, "technology is agnostic to use"; it is the "directing mind" that decides the use, and the legislation that decides whether that use is illegal. The US in particular put very broad-scope legislation in place, like the DMCA, to protect IP holders against those the IP holders regarded as criminals, and thus made circumvention of protection features a criminal act. Thus the forensic companies are knowingly committing a criminal act by producing and selling their IP-protection circumvention tools. The fact that the original product designers get no benefit from the forensics market makes the forensics companies "parasites" as well.



Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Resilient.