Comments

dvv September 19, 2014 1:11 PM

Even if they cannot do that now, they will be able to do it again soon, and for reasons they won’t be able to discuss.

Danile September 19, 2014 1:23 PM

FWIW Android is doing the same thing.

http://www.pcmag.com/article2/0,2817,2468839,00.asp

Myself, I trust neither party. The companies have to have some way of responding to NSA letters. If they think they can get away with washing their hands of the problem, I think they are wrong. Congress will pass a law if they have to.

Note that I am not making that as a normative statement but as a predictive one. The NSA is not going to roll over while Apple and Google place NSA-free devices in the hands of the unwashed masses.

Chelloveck September 19, 2014 1:48 PM

How long until Congress passes the THINK OF THE CHILDREN AND KITTENS act, requiring a way for law enforcement to gain access? Or until Apple/Google are charged with contempt for “refusing” to unlock the phones?

x17AF September 19, 2014 2:07 PM

@Danile: I think the idea should be that you SHOULD trust neither party. I’m guessing that Google’s claims will be easier to verify by checking the implementation in AOSP (though I think most people are not running AOSP phones, so it’s not like you can compile the carrier-based firmware from source to verify).

That said, I think it’s a step in the right direction to require anomalous behavior on the part of either Apple or Google in order to break these things, since every time they try to use it surreptitiously, they are risking discovery. It’s not the kind of security I’d like, but it’s certainly better than what we have now.

Hal O'Brien September 19, 2014 2:16 PM

“The companies have to have some way of responding to NSA letters.”

Hm. Yeah, that’s a weird one, given the only reason NSA letters exist is to provide cover for the idea NSA can’t just look at what they like on their own. The recursion, it burns.

ramriot September 19, 2014 2:23 PM

Also note that the Canary section of their regular security report is now missing. Can we take it that the above statement has been written for them while the CEO is at gunpoint? Perhaps not.

On another note, it may well be true, provided that by the time of asking the phone has been without power for at least 30 seconds. Being in lock mode or in standby does not cut it, as some of the key material would still be in RAM and they have a way of brute-forcing the passphrase without the phone triggering a wipe.

Also, that does not stop them from being required to back-door a device going forward. That is something they will deny, but a careful read of their own security white paper makes clear it is possible; e.g. see how they control the list of public keys used for sharing secure iMessages, transparently to the user. They can totally introduce their own key into the stack.

It all comes down to this: when under threat of LEO confiscation, ensure that your devices are switched OFF, or that you have picked a STRONG passphrase, and that you do not trust closed security models.

Curious September 19, 2014 2:24 PM

Not being a techie (and so take this with a grain of salt so to speak), I can’t help but think of Microsoft’s ‘Windows’ as a means for spying on its users and I don’t see how my sentiment will change anytime soon. 😐

I suppose common sense has it that there might perhaps be some kind of risk (of being detected) if someone were to use the Windows OS for either changing or accessing someone else’s computer, but for me it is all so obscure, and its workings seemingly so complex, that I wouldn’t know what to believe if anyone tried to reassure me of just how implausible it is for MS to sort of start tampering with my pc.

I sort of feel that my own pc isn’t really mine, and I don’t like that.

From the days of 8-bit September 19, 2014 3:00 PM

@Curious: “Not being a techie (and so take this with a grain of salt so to speak), I can’t help but think of Microsoft’s ‘Windows’ as a means for spying on its users and I don’t see how my sentiment will change anytime soon. 😐”

Pretty much all operating systems have vulnerabilities where people can slip in nasty software and spy on you or even give you a very bad day. This includes so-called isolated systems, as it depends on how “isolated” the system really is. If data is transferred to the isolated system via network, floppy, or USB drive, then it isn’t isolated at all.

Windows has always been a problem because it really started out as a hacked-up piece of garbage. It started out with an inherent “open trust” model, where one process could, using a standard command, cause another process to bring up and run some arbitrary code. (I have no idea why they thought that was a good feature.) It started out without any security features at all. Fact is, Bill Gates completely ignored networking, and it was an add-on, until the Internet became too big for Gates to ignore.

Microsoft has always been “challenged” about writing good code. The last time I worked at Microsoft, I was chewed out over code I didn’t write. The developers were doing things like converting a bit field to a list by XORing the bit field with 1 (the mask) and then shifting the mask to the right. Then they “fixed” it by shifting the mask to the left instead, but looping only sizeof(int) times (4 iterations, instead of 32). Finally, during a code review, I told them, “If you’re going to repeat the code of the function above the one you’re writing, could you at least use that as a template?” And the developers replied, “The function just above mine does it correctly? Cool!” Yes, the developers didn’t even look at the code just above where they were writing new code.
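For illustration, here is a minimal sketch (in C) of that last flavor of bug, a bit-field-to-list conversion whose loop runs sizeof(int) = 4 times instead of 32; the function names and surrounding code are illustrative, not the actual Microsoft code:

```c
/* Illustrative reconstruction of the kind of bug described above:
 * converting a bit field into a list of set-bit positions. */
#include <stdio.h>
#include <stdint.h>

/* Buggy version: loops sizeof(int) times (typically 4), so it only
 * ever examines the low 4 bits of the field. */
static void bits_to_list_buggy(uint32_t field, int out[], int *count) {
    *count = 0;
    uint32_t mask = 1;
    for (size_t i = 0; i < sizeof(int); i++) {   /* should be 8 * sizeof(int) */
        if (field & mask)
            out[(*count)++] = (int)i;
        mask <<= 1;
    }
}

/* Correct version: examine all 32 bit positions. */
static void bits_to_list(uint32_t field, int out[], int *count) {
    *count = 0;
    for (int i = 0; i < 32; i++) {
        if (field & (1u << i))
            out[(*count)++] = i;
    }
}

int main(void) {
    int list[32], n;
    bits_to_list(0x80000001u, list, &n);         /* bits 0 and 31 set */
    printf("correct: %d bits found\n", n);       /* 2 */
    bits_to_list_buggy(0x80000001u, list, &n);
    printf("buggy:   %d bits found\n", n);       /* 1 */
    return 0;
}
```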

No, I am not returning to Microsoft, ever. Those people are idiots.

TimH September 19, 2014 3:17 PM

Read the Apple statement more carefully: “Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data,” Apple said on its Web site. “So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.”

See the “in their possession” modifier? Apple are not saying they can’t read a device themselves, only that they can’t enable others to do so.

Will September 19, 2014 3:37 PM

One interesting aspect that doesn’t get discussed much is that Apple’s war chest is possibly much bigger than that of the entire Defense Industrial Complex.

Makes you wonder who can lobby hardest, and which way senators will bow if push comes to shove.

The Truth September 19, 2014 4:24 PM

@ From the days of 8-bit

Hi. If you could be so generous as to enlighten an ignoramus such as I, as to what/which pc operating system you use, so that I may one day be free of Microsoft’s continual reception of all of my secret activities.

Thank you.

Kai Howells September 19, 2014 5:10 PM

It’s my understanding that, to unlock a phone running < iOS 8, Apple don’t specifically have any magic tools to bypass the PIN code, assuming you have one set. If you don’t have a PIN set, then the contents of your device are effectively not encrypted and it’s open to one and all.

They can provide access to data on a PIN locked phone by booting it from a modified firmware image that ignores the passcode restrictions (specifically the number of failed attempts before the phone is disabled or wiped) and then they brute-force the PIN on the device.

It’s unclear how the changes in iOS 8 prevent this from happening. One of the changes that I am aware of is that the PIN unlock code takes around 80ms to run through one attempt – to a human that’s almost imperceptible, but when it’s repeated thousands or millions of times it slows things down somewhat. With a 6-character alphanumeric unlock code that’s something like 5 1/2 years to run through all combinations. This is not something that will get faster with increased computing power, as it has to run on the device in question, which has a fixed amount of processing power – as newer and faster iOS devices come out, they can tune the unlock code to always require a fixed amount of time to run.
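As a sanity check on those figures, the arithmetic fits in a few lines; the sketch below assumes, as the comment does, roughly 80 ms per on-device attempt and a 6-character, case-insensitive alphanumeric code (36 symbols):

```c
/* Back-of-the-envelope check of the estimate above.
 * Build with: cc passcode_time.c -lm */
#include <stdio.h>
#include <math.h>

int main(void) {
    double attempts = pow(36, 6);               /* 36^6 ≈ 2.18e9 possible codes */
    double seconds  = attempts * 0.080;         /* ~80 ms per key derivation */
    double years    = seconds / (365.25 * 24 * 3600);
    printf("codes: %.0f, worst case: %.1f years\n", attempts, years);
    return 0;
}
```

Which comes out to roughly 5.5 years for the full space, matching the estimate above.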

Citations are needed for the above information, but I can’t find the source article I read sometime over the past few days.

Hugh September 19, 2014 5:21 PM

Not sure this is as secure as it sounds.

Apple apparently says the data on the phone is encrypted with the user’s passcode (or a key protected with the passcode). So for a user with a 4-5 digit passcode, that’s only 10,000 to 100,000 combinations, even assuming people don’t use passcodes such as 1234. If the police are able to get a copy of the encrypted phone image, this may not take that long to brute force.

Now, if you just have the physical phone and can’t get at the encrypted Flash contents without getting locked out after 5 bad guesses, then this helps. But otherwise, maybe not.
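To put rough numbers on that difference: the sketch below reuses the ~80 ms-per-attempt figure from the earlier comment for on-device guessing, and assumes a purely illustrative 10,000 guesses per second for an attacker who can run the derivation offline against a copied image:

```c
/* Rough comparison of on-device vs offline guessing for a 5-digit code. */
#include <stdio.h>

int main(void) {
    double candidates = 1e5;   /* worst case for a 5-digit passcode */
    printf("on-device (80 ms/try):   %.1f hours\n", candidates * 0.080 / 3600.0);
    printf("offline (10k tries/sec): %.1f seconds\n", candidates / 1e4);
    return 0;
}
```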

Sancho_P September 19, 2014 5:50 PM

1)
Apple regarding iOS 8:
On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.

I must have missed something.
Why would this be any more protection than under iOS 7? Was the device passcode known to the server?
I didn’t read the term “encryption”, and “in their possession” sounds suspicious, indeed.

Yes, they start fighting all the gangsters, the good and the bad ones.

*** Thank you, Mr. Ed Snowden! ***

2)
On your flight from Frankfurt to Rio, with a stopover in London, when you say “Sorry, can not remember my passcode” …
– Do you think you would still catch your connecting flight?
When you then buy a new device in Rio, look for a model called “Miranda”.

It is sold as, but is not, a protection against gangsters.

3)
From the linked WP article:
“Our ability to act on data that does exist . . . is critical to our success,” Hosko said. He suggested that it would take a major event, such as a terrorist attack, to cause the pendulum to swing back toward giving authorities access to a broad range of digital information. [emphasis added]

OMG.

TLA: Suggestion understood, thanks.
—> “Now we can go down to business”:
http://motherboard.vice.com/blog/the-most-laudable-george-w-bush-supercut-ever-made-video

Bob S. September 19, 2014 7:49 PM

@Hugh

iOS8 has the option to automatically wipe its contents after 10 failed login attempts. Also, the PIN can be four or five digits or a password.

That’s fair. Guessing the PIN in only 10 tries is hard, even for a computer. Then there’s the added hurdle of needing the device itself in your possession, because (presumably) remote logins are not possible.

Will various criminals and governments launch intense attacks on this system and break it (quickly)?

Yes and maybe.

Things were easier when all we had to worry about was criminals.

Justin September 20, 2014 1:25 AM

Apple cannot bypass the security of iOS 8. “So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.”

Notice that Apple EXCLUDES the ‘cloud’ from their protection claims.

Under the Communications Assistance for Law Enforcement Act, the government REQUIRES a back door into ALL new phone technologies, and I am sure that this includes the ‘Apple Cloud’ storage system.

Translation: you are SCREWED ANYWAY, because an activity log of virtually EVERYTHING you do with your phone, and everywhere you go, along with copies of EVERY message, and EVERY voice mail, ALL of it, gets uploaded to the cloud.

So, far from making things better, they are worse, much worse … in fact MUCH, MUCH, MUCH WORSE, because now, under CALEA, the police and government can use their backdoor access into Apple’s servers to get EVERYTHING, whether they seize your phone or not.

Wow, that’s an improvement! – an improvement that is, if you want to live in the new-millennium equivalent of a Stasi or Gestapo style police state with really creepy George Orwell 1984 overtones.

Hugh September 20, 2014 2:11 AM

@ Bob S

Re: “iOS8 has the option to automatically wipe its contents after 10 failed login attempts. Also, the PIN can be four or five digits or a password.”

That was my point. You can’t guess the passcode if you’re entering passwords on the screen because you get locked out after 10 tries (older iOS versions also wiped the device after too many bad passcodes, I think). But if someone can crack the phone open and read the raw encrypted data of the Flash EEPROM chip, or get hold of an iCloud backup with equivalent encryption, then the encryption may not count for much unless you have an unusually strong passcode or an actual password.

Gerard van Vooren September 20, 2014 3:12 AM

@ Justin • September 20, 2014 1:25 AM

I am afraid that by using iWhatever, Google, Facebook, LinkedIn and the like, you are screwed anyway. There is really only one answer, and that is not using them.

With a Nokia 105 I am quite sure the US Gov can’t collect that much info about you. And keep it turned off by default. Carry it with you for use in an emergency and listen to the stored messages once in a while. That is also much safer, btw, when you drive a car.

Ask yourself: is the cloud a benefit or a burden? I agree that it is “handy” and “fun”, but the companies behind it are seriously evil. In my younger days I trusted Google, encouraged by their “Don’t be evil” slogan. But like many other slogans, that one is only hot air. Or do you still believe in “Yes We Can”?

As for operating systems, I have given up on Linux. I trust OpenBSD and MINIX 3 much more. That is not so much for their focus on security or reliability, but more for their focus on code correctness. It is just too easy to make a mess with C and too hard to get it right. Focusing on code correctness, or correctness in general (and that includes the cloud), is the right thing to do. That is, if you care about privacy.

Bob S. September 20, 2014 9:29 AM

@Hugh,

If “they” have someone’s device disassembled on the shop table, hooked up to a forensic EEPROM reader, I would have to guess that person has plenty to worry about in addition to not being able to call mom.

Conversely, if it’s in your pocket, the recent changes are a positive step in the right direction by Apple. Not enough to trust them, just agree it’s a step in the right direction.

OldFish September 20, 2014 9:48 AM

@Hugh & Bob S
“Wiping” a phone means deleting the keys used to encrypt the filesystem. It would take too long to actually wipe multiple GB of flash. This goes back to public info from at least four years ago.

Storage of filesys keys is, well, key.

The passcode merely authenticates the user to log in to their phone.
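As a rough illustration of why erasing keys is preferred over overwriting the whole flash, consider the arithmetic below; the 32 GB capacity and 25 MB/s sustained write speed are assumed figures for illustration, not measurements of any device:

```c
/* Why "wipe" means "delete the keys": erasing a small key blob is
 * effectively instant, overwriting all of flash is not. */
#include <stdio.h>

int main(void) {
    double flash_bytes = 32.0e9;   /* assumed device capacity */
    double write_bps   = 25.0e6;   /* assumed sustained write speed */
    double key_bytes   = 32.0;     /* a 256-bit filesystem key */

    printf("full overwrite: %.0f minutes\n", flash_bytes / write_bps / 60.0);
    printf("key erase:      %.6f seconds\n", key_bytes / write_bps);
    return 0;
}
```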

elkhorn September 20, 2014 11:11 AM

John Gilmore had this to say:

And why do we believe them?

  • Because we can read the source code and the protocol descriptions
    ourselves, and determine just how secure they are?
  • Because they’re a big company and big companies never lie?
  • Because they’ve implemented it in proprietary binary software,
    and proprietary crypto is always stronger than the company
    claims it to be?
  • Because they can’t covertly send your device updated software that
    would change all these promises, for a targeted individual, or on
    a mass basis?
  • Because you will never agree to upgrade the software on your
    device, ever, no matter how often they send you updates?

  • Because this first release of their encryption software has no
    security bugs, so you will never need to upgrade it to retain
    your privacy?

  • Because if a future update INSERTS privacy or security bugs, we
    will surely be able to distinguish these updates from future
    updates that FIX privacy or security bugs?

  • Because if they change their mind and decide to lessen our privacy
    for their convenience, or by secret government edict, they will
    be sure to let us know?

  • Because they have worked hard for years to prevent you from
    upgrading the software that runs on their devices so that YOU can
    choose it and control it instead of them?

  • Because the US export control bureaucracy would never try to stop
    Apple from selling secure mass market proprietary encryption
    products across the border?

  • Because the countries that wouldn’t let Blackberry sell phones
    that communicate securely with your own corporate servers,
    will of course let Apple sell whatever high security non-tappable
    devices it wants to?

  • Because we’re apple fanboys and the company can do no wrong?

  • Because they want to help the terrorists win?

  • Because NSA made them mad once, therefore they are on the side
    of the public against NSA?

  • Because it’s always better to wiretap people after you convince
    them that they are perfectly secure, so they’ll spill all their
    best secrets?

There must be some other reason, I’m just having trouble thinking of it.

herman September 20, 2014 11:24 AM

Well, what the CEO said is that the security is strong enough to keep Apple out.

He did not say that it is strong enough to keep the NSA or GCHQ out of the phones.

It is just a way for Apple to avoid having to waste time and money working on government data requests: Ooooh, sorry, we cannot help you. We are too stupid. You’ll have to do it yourself.

John Q. Security September 20, 2014 11:59 AM

I’m not sure that we should really care about what’s going on within the application processor when the baseband processor is wide open. So, does it really matter that they’re claiming they’re not able to provide the interfaces that would give certain entities access to these devices?

TimH September 20, 2014 8:34 PM

This is a good way for Apple to monetise the passing of customer data to Gov. Gov can either buy the data at some price from Apple, or get a warrant/subpoena/NSL to force the surrender of the data. Since it appears that Gov wants data for no sound (justifiable to a court) reason most of the time, I hear the sound of cash registers ringing at the fruit company.

Buck September 20, 2014 10:50 PM

@TimH

Close, but not quite right!

Gangs of criminal ‘hacker’ thieves will gather and store large databases of iOS users’ passcodes via various flavors of malware for use in a variety of nefarious schemes…

After they’ve had their fun, they’ll be busted by the feds (or STRATCOM), and those databases will somehow find a way onto the servers of more ‘friendly’ (and possibly domestic) corporations.

These special companies will then be able to sell their mysterious new wares to law enforcement agencies at outrageous costs!

Apple will be too busy monetizing the dupes who think they’re now hack-proof. 😛

Chris September 21, 2014 12:27 AM

“Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data…” Cute, so Apple’s competitors can bypass passcodes and access phone data?

How are any of Apple’s competitors able to bypass passcodes and thereby access user data? Android has never offered to save either the user passcode (the basis of the KEK) or the DEK. Apple did offer to store one of these. (On OS X they store the DEK.) Is Apple revealing something previously unknown about their competitors?
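For readers unfamiliar with the KEK/DEK split being referenced, here is a conceptual sketch using PBKDF2 from OpenSSL; it is not Apple’s or Android’s actual scheme, and every name, parameter, and iteration count in it is an illustrative assumption:

```c
/* Conceptual KEK/DEK sketch: a random data-encryption key (DEK) protects
 * the filesystem; a key-encryption key (KEK) derived from the passcode
 * wraps the DEK, so neither the passcode nor the plain DEK is stored.
 * Build with: cc kek_dek.c -lcrypto */
#include <stdio.h>
#include <string.h>
#include <openssl/evp.h>
#include <openssl/rand.h>

int main(void) {
    unsigned char dek[32], salt[16], kek[32];
    if (RAND_bytes(dek, sizeof dek) != 1 || RAND_bytes(salt, sizeof salt) != 1)
        return 1;

    const char *passcode = "1234";  /* illustrative 4-digit code: tiny keyspace */
    if (PKCS5_PBKDF2_HMAC(passcode, (int)strlen(passcode),
                          salt, (int)sizeof salt, 100000, EVP_sha256(),
                          (int)sizeof kek, kek) != 1)
        return 1;

    /* In a real design the DEK would now be encrypted ("wrapped") under the
     * KEK and only that wrapped blob stored. A short numeric passcode is only
     * defensible if the derivation is also entangled with a hardware secret
     * and rate-limited on the device itself. */
    printf("kek[0..3] = %02x%02x%02x%02x\n", kek[0], kek[1], kek[2], kek[3]);
    return 0;
}
```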

B. D. Johnson September 21, 2014 9:03 AM

Android did the same thing several years ago. Device encryption has been baked into Android since 2.3.4/Gingerbread (so for Android users you probably already have this if your device is less than 3-4 years old). It’s just not enabled by default.

Petter September 21, 2014 1:17 PM

For the newer iOS devices such as the iPhone 4S/5/5S, iPad 2/3, etc., the only way to get into them is – according to the market leaders in forensic software – getting hold of the physical lockdown plist containing the DeviceCertificate, HostPrivateKey, RootPrivateKey, etc.
These files are stored on the computer last used for syncing the device.

With those keys the forensic software will then gain “authorized access” to the phone and can extract whatever it likes.
A non-secured sync computer would be the weakest link in many cases and the best attack vector for the prying eye.

Syncing the iOS device to a computer with FDE and staying away from the iCloud backup is the best approach as of now.

Nate September 22, 2014 12:41 AM

So here’s something I just discovered. Take a look at page 53 of the June 2013 NIST Cloud Computing Standards Roadmap (Final edition): http://www.nist.gov/itl/cloud/upload/NIST_SP-500-291_Version-2_2013_June18_FINAL.pdf

It lists FIPS 185, Escrowed Encryption Standard, as a current ‘Approved Standard with Market Acceptance’ in encryption. Right alongside AES.

When I google FIPS 185 I get 1994 stories about the Clipper Chip. (A refresher for the newbies: http://groups.csail.mit.edu/mac/classes/6.805/articles/crypto/clipper94.html ) The stories say that despite massive federal pressure for adoption, the industry outcry was so great that FIPS 185 failed to gain traction and was ultimately retired.

So can anyone explain why an utterly failed, rejected-by-the-market, and politically embarrassing Clinton Administration-era standard would be embedded in the latest shiny Cloud security standards?

Are there some ancient legacy mission-critical devices or software out there today in federal/defense land that actually USE FIPS 185? A standard built to be backdoored, with the backdoor out in the open? A backdoor so badly designed that it was itself broken shortly after announcement.

So why is it even there?

nobody@localhost September 22, 2014 1:49 AM

@Nate

Funnily enough, I just raised the ghost of the Clipper Chip by name in another thread, in ancillary support of an argument against somebody who purports that communications can be both secure and wiretappable…

…and that same somebody (“Skeptical”) is rather transparently, shall we say, having unlimited time on a consistent basis to write long posts pushing a certain agenda using very well-known tactics (even the nym is textbook)…

I suggest to you that this thing you noticed needs to be examined in the squid thread (“other security news”)—and if it is what it looks like, it needs to be seen everywhere.

James September 23, 2014 3:11 PM

@Danile • September 19, 2014 1:23 PM

“Myself, I trust neither party. The companies have to have some way of responding to NSA letters. If they think they can get away with washing their hands of the problem, I think they are wrong. Congress will pass a law if they have to.”

No, there is no way Congress can force companies to install backdoored encryption without doing massive damage to overseas export.

A company is not required to backdoor encryption for the government, and that debate was settled in the 1990s with the clipper controversy.

A national security letter does not require even a corporation to structure its business in a way that is convenient to government investigations.

It only requires the corporation to provide certain information if it has this information.

There is no implicit data retention requirement or antistructuring requirement in the law, and such is impossible to enforce on opensource software.

Douglas McClendon September 23, 2014 4:41 PM

@James

“No, there is no way Congress can force companies to install backdoored encryption without doing massive damage to overseas export.”

Umm… Snowden? You can shift the blame away from Congress if you like, but this sounds pretty much like the story of what happened. Damage has already been done and factored into stock prices. It’s the “new normal”.

“A company is not required to backdoor encryption for the government, and that debate was settled in the 1990s with the clipper controversy.”

I’d like to believe that. What I believe is that the Snowden controversy settled the fact that it is not the right formulation of the issue. PRISM revealed that any sufficiently large corporation need not be aware of backdoors installed by government agents (perhaps posing as well-qualified interns).

“A national security letter does not require even a corporation to structure its business in a way that is convenient to government investigations.”

I find that hard to believe. Even if not literally, it’s another of those “but in practice the issue should be formulated differently” cases. I.e., I certainly believe, as some other comment here casually tossed out, that the US Gov is not remotely willing to allow non-defective-by-design communication technology to fall into the hands of the unwashed masses, even domestically. You can always claim that anyone is free to start up some actually good secure-comms product, but… good luck with that. The number of ways that the U.S. Gov can and will, ethically and unethically, legally and illegally, persuade such a company to come into the fold of those sufficiently obedient to the CIA and NSA’s missions…

“It only requires the corporation to provide certain information if it has this information.”

“There is no implicit data retention requirement or antistructuring requirement in the law, and such is impossible to enforce on opensource software.”

Impossible is just something you haven’t witnessed up close and personal with your own eyes. Doesn’t mean it isn’t happening. Look at the landscape. Someone here referred to Clapper’s testimony before congress going unpunished as “signalling all the way down”. I think that phrase sums it up well. We have been signalled what we are allowed to do, and what we are not allowed to do. We can try to do it anyway, and maybe the U.S. Gov can’t literally step on us like a bug, but what they can do subtly, over long periods of time, is actually far worse.

jamesww@mail2.com September 23, 2014 5:23 PM

@Douglas McClendon

At least now, the government does not get its way all the time.

If the law forbade strong unbackdoored encryption, this would be very bad for American corporations exporting to privacy-aware European nations.

I do not deny that the government may circumvent encryption in other, less overt ways, but if the law mandated backdoors in encryption products, US corporations simply could not offer their products to European customers without running afoul of local rules.

As soon as the government’s policy becomes official, a lot of consequences flow from this knowledge, which can no longer be denied and will be acted upon by other external actors.

The EU, as a consequence of the Snowden revelations, may well gut the Safe Harbor, and there are lawsuits pending against Facebook in the European Court of Justice and against the UK in the European Court of Human Rights.

Of course, everyone with an interest in security could have known that the NSA is spying on foreign nations and that US corporations are assisting in that policy, but no one wanted to rock the boat.
I think that forcing the government to be openly repressive is better because it denies the deep state the power to hide the rules from its citizens.

Nick P September 23, 2014 6:50 PM

@ james, Douglass

“A company is not required to backdoor encryption for the government, and that debate was settled in the 1990s with the clipper controversy.”

That’s what Skeptical kept arguing. Ironically, it was he who just posted evidence to the contrary in the Squid thread: his link to a declassified CIA document on crypto. The document says the Clinton Administration’s policy was sharing keys with a third party, and this satisfied all players, with big cooperation from many big companies. Mainly for export.

The thing that disturbs me is that they talk like it is separate from Clipper, it’s already implemented by many firms, it’s not mentioned in public that I know of, and I haven’t seen any evidence of a retraction. We have seen evidence via BULLRUN that the NSA is trying to backdoor products and weaken standards. I assume that legally the two might be related.

Secret Police September 23, 2014 11:11 PM

The police state will just lobby to have laws passed forcing decryption, much like the UK’s laws. Android should also take a page out of Kali Linux’s book and create a kill password that, if entered, deletes the key so nobody can ever unlock the data.
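A minimal sketch of that kill-password idea follows. It is illustrative only, not how the Kali/LUKS “nuke” patch or any shipping phone implements it; a real design would compare salted hashes in constant time and erase on-disk key slots:

```c
/* Duress ("kill") password sketch: entering the kill code destroys the
 * wrapped key material instead of unlocking it. */
#include <stdio.h>
#include <string.h>

#define KEY_LEN 32

static unsigned char wrapped_key[KEY_LEN] = { 0xAA };  /* pretend wrapped DEK */
static int key_destroyed = 0;

static void destroy_key(void) {
    memset(wrapped_key, 0, sizeof wrapped_key);  /* unrecoverable without backup */
    key_destroyed = 1;
}

/* Returns 1 if unlocked, 0 otherwise. Plain strcmp is for illustration only. */
static int try_unlock(const char *entered) {
    const char *real_code = "correct horse";
    const char *kill_code = "battery staple";

    if (key_destroyed) return 0;
    if (strcmp(entered, kill_code) == 0) { destroy_key(); return 0; }
    if (strcmp(entered, real_code) == 0) return 1;
    return 0;
}

int main(void) {
    printf("wrong code: %d\n", try_unlock("1234"));            /* 0 */
    printf("kill code:  %d\n", try_unlock("battery staple"));  /* 0, key wiped */
    printf("real code:  %d\n", try_unlock("correct horse"));   /* 0, too late */
    return 0;
}
```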

James September 24, 2014 1:31 AM

@Secret Police • September 23, 2014 11:11 PM

The UK has a law called RIPA which imposes criminal penalties for knowingly failing to disclose a key to law enforcement.

But even that is rarely applied because it must be proven beyond a reasonable doubt that the suspect knowingly failed to disclose the key.

If the government can prove you have the key, you can be convicted of failing to disclose it.

But if this proof is absent, you’ll not be convicted.

There are several scenarios where this burden of proof is a big issue.

If you have a Truecrypt volume, and the government can’t prove the existence of an inner volume, you can only be forced to disclose the password to the outer volume.

This actually happened in a terrorism case.

BiasedReprter October 2, 2014 5:51 PM

Apple is now encrypting data stored on the device by default, but for all the news stories about “locking out the NSA”, they are ignoring intercepts of phone calls, Google, internet activity, etc. I am glad to see Moxie’s Whisper Systems finally on the iPhone: http://www.wired.com/2014/07/free-encrypted-calling-finally-comes-to-the-iphone/

Marcy Wheeler wrote about the effects of the Supreme Court decision in Riley v California, how it protects data on devices from police searches, but likely has no effect on NSA dragnet. http://www.salon.com/2014/06/26/john_roberts_channels_aclu_why_right_wing_court_saved_cellphone_privacy/
