Apple's Cloud Key Vault

Ever since Ivan Krstić, Apple’s Head of Security Engineering and Architecture, presented the company’s key backup technology at Black Hat 2016, people have been pointing to it as evidence that the company can create a secure backdoor for law enforcement.

It’s not. Matthew Green and Steve Bellovin have both explained why not. And the same group of us that wrote the “Keys Under Doormats” paper on why backdoors are a bad idea have also explained why Apple’s technology does not enable it to build secure backdoors for law enforcement. Michael Specter did the bulk of the writing.

The problem with Tait’s argument becomes clearer when you actually try to turn Apple’s Cloud Key Vault into an exceptional access mechanism. Apple would have to replace the HSM with one that accepts, in addition to the user’s password, an extra message from Apple or the FBI (or an agency from any of the 100-plus countries where Apple sells iPhones) saying “OK, decrypt.” To do this securely, these messages would have to be cryptographically signed with a second set of keys, which would then have to be used as often as law enforcement access is required. Any exceptional access scheme built on this system would therefore need an additional set of keys to ensure authorized use of the law enforcement access credentials.
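To make the problem concrete, here is a deliberately simplified sketch of such a modified vault. Everything here is hypothetical (nothing reflects Apple’s actual HSM firmware), and an HMAC merely stands in for the asymmetric signature a real HSM would verify; the point is where the new key has to live and how often it gets used:

```python
import hmac, hashlib, secrets

# Hypothetical sketch of a CKV-style vault that ALSO honors a signed
# "OK, decrypt" message. HMAC stands in for the asymmetric signature a
# real design would use; the signing key must stay online and be used
# for every single law enforcement request.

LE_SIGNING_KEY = secrets.token_bytes(32)   # the new high-value secret

def sign_decrypt_order(device_id: bytes) -> bytes:
    """Each request requires a fresh use of the long-lived key."""
    return hmac.new(LE_SIGNING_KEY, b"OK-decrypt:" + device_id,
                    hashlib.sha256).digest()

class BackdooredVault:
    def __init__(self, device_id: bytes, user_password: str, secret: bytes):
        self.device_id = device_id
        self._password = user_password
        self._secret = secret

    def unlock(self, password=None, signed_order=None):
        # Original path: the user's password.
        if password == self._password:
            return self._secret
        # New path: a valid signed law enforcement order.
        if signed_order is not None and hmac.compare_digest(
                signed_order, sign_decrypt_order(self.device_id)):
            return self._secret
        raise PermissionError("access denied")

vault = BackdooredVault(b"device-1", "hunter2", b"escrowed key material")
order = sign_decrypt_order(b"device-1")   # LE_SIGNING_KEY exercised again
assert vault.unlock(signed_order=order) == b"escrowed key material"
```

Even in this toy version, LE_SIGNING_KEY must remain online and be exercised for every request from every jurisdiction: precisely the repeatedly used, high-value key that this essay argues cannot be protected.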

Managing access by a hundred-plus countries is impractical due to mutual mistrust, so Apple would be stuck keeping a second signing key (or a database of second signing keys) that must be accessed for each and every law enforcement request. This puts us back in the situation where Apple needs to protect another repeatedly used, high-value public key infrastructure: the same situation that has already resulted in the theft of Bitcoin wallets and of Realtek’s code-signing keys, and in Certificate Authority failures, among many other disasters.

Repeated access of private keys drastically increases their probability of theft, loss, or inappropriate use. Apple’s Cloud Key Vault does not have any Apple-owned private key, and therefore does not indicate that a secure solution to this problem actually exists.

It is worth noting that the exceptional access schemes one can create from Apple’s CKV (like the one outlined above) inherently entail the precise issues we warned about in our previous essay on the danger signs of flawed exceptional access systems. Additionally, the Risks of Key Escrow and Keys Under Doormats papers describe further technical and nontechnical issues with exceptional access schemes that must be addressed. Among the nontechnical hurdles would be the requirement, for example, that Apple run a large legal office to confirm that a request for access from the government of Uzbekistan actually involved a device located in that country, and that the request was consistent with both US and Uzbek law.

My colleagues and I do not argue that the technical community doesn’t know how to store high-value encryption keys; to the contrary, that’s the whole point of an HSM. Rather, we assert that with today’s technology it is impossible to hold keys safely in a way that also lets another party (i.e., law enforcement or Apple itself) access them repeatedly without a high potential for catastrophic loss, and that any scheme that runs into fundamental sociotechnical challenges such as jurisdiction must be evaluated honestly before any technical implementation is considered.

Posted on September 8, 2016 at 8:00 AM • 44 Comments

Comments

hawk September 8, 2016 8:39 AM

These types of explanations will cause problems. It’s not that they’re technically inaccurate or misleading in some way. But, you’re trying to argue a larger case and mixing technical hurdles with management challenges. Picture a room full of 10k lawyers – that’s the authority you’re dealing with. They will take this and prove that the technical hurdles are nothing more than management problems. It’s coming, guaranteed.

“Experts” at what? A two of spades will trump an ace of hearts. Go to the Hill with a bus load of crypto “experts” and try to explain why it is difficult to manage a system granting authorities access. They are going to whip you like a puppy.

hawk up a loogie September 8, 2016 10:33 AM

Whip you like a puppy, ha ha. The only puppies getting whipped on the hill are the castrated legislators who beg please pretty please for answers to their itty bitty polite questions. CIA threatens them, NSA lies to them and laughs, FBI bugs them and entraps them.

Daniel September 8, 2016 10:35 AM

@hawk

In my view it smacks of desperation, grasping at straws. If this is the best confirmation the FBI and its stooges can get for their biases they are really hurting.

Kino September 8, 2016 10:44 AM

This explains well why implementing backdoor(s) for political reasons is technically a bad idea in terms of security, due to the added complexity.

But it says nothing about whether Apple (or any other company) is actually stuffing (insecure) backdoors into their services anyway.

Kyle Wilson September 8, 2016 11:04 AM

Makes me wonder what would happen if Apple reincorporated in the Maldives or something similar. Would the new standard be the law of the requesting country and the law of their new home? Seems as if this could very rapidly lead to tax-shelter changes generating nearly open access to their key escrow (assuming that likely tax havens will also be cooperative about letting third parties have access to keys stored on their territory to avoid hassles).

eagle September 8, 2016 11:07 AM

@Kyle Wilson
If you want to access the American market you will comply with American laws and “proposals”.

steve September 8, 2016 11:20 AM

@hawk

Picture a room full of 10k lawyers – that’s the authority you’re dealing with.

Legislative law != Physical Law.

Reality doesn’t care if you legislate that Pi is rational (they tried – Indiana, 1897). If your legislators focus on what they want at the expense of what the world really is, you might have trouble.

May as well legislate that the Sun not progress through the Main Sequence, because the Red Giant transition is clearly a threat to National Security.

sow September 8, 2016 12:14 PM

The key backup system can’t be compromised. That’s fine.

The WRONG thing to do is to then assume the valuable encrypted data is safe from State actors, or even Apple.

The reality distortion field is still in full effect.

HRW September 8, 2016 12:40 PM

I read Steve Bellovin’s post and he said something that I think is crucial, especially in light of the recent Citizen Lab revelations.

“Apple’s real problem is that they’re trying to satisfy consumer needs while still defending against nation-state adversaries. I hope they’ve gotten it right—but I won’t be even slightly surprised if they haven’t.”

That insight is 100% correct. There are many human rights campaigners that balk at the iPhone because of its outrageous cost, when viewed from the eyes of the second or third world. Yet I think Bellovin’s point is the far more serious reason to be worried about the iPhone.

Daniel September 8, 2016 4:02 PM

@Ben

Thanks. The author of the post you linked to is a person so in love with what they perceive to be their own intelligence that they utter every ipse dixit as if it were truth. If you don’t understand that Latin term, what I am saying is that the entire article is an appeal to authority–the author’s own authority–and nothing more.

It’s not an opinion piece.
It’s not a technical piece.
It’s intellectual diarrhea.

ab praeceptis September 8, 2016 5:14 PM

I’m amazed. A quite heavyweight group of seriously bright experts getting caught in a nonsense maze.

For a starter: Do security agencies have a way to eavesdrop on any and every kind of communication? Why, of course they do.
So why is there a problem, and if there is one, what is it?

There is a problem because we cannot reasonably trust our security agencies (and the politicians) anymore. There is a problem because many perceive requests like the fbi’s as, basically, criminals asking for unfettered access to everything. This is not a political view. This is a statement based on many observations and events that actually did happen.
There is also a problem because their desire for “dark access” would need to be balanced by “white transparency”, i.e. by laws (actually obeyed) demanding that any eavesdropping by the agencies be followed up after a certain period of time by telling the eavesdropped citizens exactly from when till when, exactly what was eavesdropped, and why. Quite probably there would also need to be provisions for citizens to go to court to get the agencies scrutinized, and scrutinized hard and cold.

In other words the problem is that the very agencies (and state behind them) which desire unfettered access bluntly refuse any and every meaningful control and compliance scrutiny and enforcement.

Now to the more technical aspects. Probably I’m wrong, but I consider storing very sensitive information in the cloud to be about the peak of carelessness and stupidity. And again we fail to discuss the real problem and limit ourselves largely to discussing technicalities.

Being at that: How complicated can it be to create some kind of “encrypt me” program with an Apple user-friendly click-me button that does symmetric encryption before sending sensitive stuff across the wire? I’m no fan of web storage, even less so for sensitive material, but sometimes the need arises and it’s, of course, seductive to use the web as cheap offsite storage. There have indeed been situations where I did store something like a collection of passwords in the net. Years and years ago. No problem. Just blowfish (or whatever) the data, send it over the wire, store it somewhere and be done. Simple, secure.
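For what it’s worth, the “encrypt before you send” idea really is only a few lines. Here is an illustrative Python sketch (my own toy construction, not the commenter’s blowfish setup: a PBKDF2-derived key, a SHA-256 counter keystream, and an HMAC tag; real code should use a vetted AEAD such as AES-GCM from an audited library):

```python
import hashlib, hmac, os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy PRF-based stream: SHA-256(key || nonce || counter) blocks."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(passphrase: str, plaintext: bytes) -> bytes:
    salt = os.urandom(16)
    nonce = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)
    ct = bytes(p ^ k for p, k in
               zip(plaintext, _keystream(key, nonce, len(plaintext))))
    # Authenticate everything so tampering (or a wrong passphrase) is caught.
    tag = hmac.new(key, salt + nonce + ct, hashlib.sha256).digest()
    return salt + nonce + ct + tag

def decrypt(passphrase: str, blob: bytes) -> bytes:
    salt, nonce, ct, tag = blob[:16], blob[16:32], blob[32:-32], blob[-32:]
    key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)
    expected = hmac.new(key, salt + nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("wrong passphrase or tampered data")
    return bytes(c ^ k for c, k in zip(ct, _keystream(key, nonce, len(ct))))

blob = encrypt("correct horse", b"my password collection")
assert decrypt("correct horse", blob) == b"my password collection"
```

Encrypt locally, upload the opaque blob anywhere, and the storage provider never sees plaintext; the whole game then reduces to keeping the passphrase safe, which is the commenter’s point.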

All in all my take is that we should finally address the real problems rather than made up complicated self-created pseudo-problems.

Pilgrim in the Wilderness September 8, 2016 5:45 PM

You got to trust somebody, and of all the current IT players I trust Apple more than the others.

I am going to convert to Apple devices and software beginning now. They cost a lot, but in the long run they are the ones I trust the most.

I stupidly downloaded the most recent patch for Windows 10, having worked very hard recently to beat Windows into NOT communicating with mother 24/7…at all!

I dropped the shields, did the update, and the damned thing immediately started uploading data to MS constantly. To hell with MS. It’s become a tool of our corporate-military masters and they dare to make us PAY to be data targets.

I think MS is moving to become a tool of the military and corporations all over the world. It will be very profitable. They pay big money for good data. Billions of dollars. Maybe trillions.

I hope my escape won’t cost them too much. Well OK, I wish it would, but that’s the problem, they don’t give a damn about individual end users anymore.

john smith September 8, 2016 9:26 PM

So, as carrot commented on Matthew Green’s blog, what stops a FISA court from issuing an order to Apple, to secretly make copies of all private keys for NSA?

Or, another secret order that Apple must use “NSA-approved” HSMs?

Apple’s scheme is the perfect backdoor for “Law” Enforcement, with a fancy ceremony to mislead all the suckers. Just to rub their faces in it.

Mark September 9, 2016 2:18 AM

It’s almost as if Bruce hasn’t actually seen any of the Snowden documents.

It’s a completely closed system — something for which Green himself has criticised iMessage, another Apple service. Whose HSMs are they using? An American company’s, no doubt.

No source code, no design docs, no trust. Remember, Apple was named as an NSA partner in the PRISM programme.

Drone September 9, 2016 2:47 AM

Yeah right. And you expect me to trust a company that removes the standard headphone jack then forces you to buy a proprietary $160 set of headphones?

65535 September 9, 2016 3:28 AM

This is what Apple needs to clarify:

What does “…to run the [key] cards through a blender” actually mean?

As Mr. Green notes it is supposedly a “physical one-way hash function”.

What exactly is a “Physical” one-way hash function? Is it a hardware box? Is it a software program? Is it a combination of the two? And, exactly how secure is it?

More troubling is Mr. Green’s statement:

‘What stops Apple from just reprogramming its HSM?’

‘This is probably the biggest weakness of the system, and the part that’s driving the “backdoor” concerns above. You see, the HSMs Apple uses are programmable. This means that — as long as Apple still has the code signing keys — the company can potentially update the custom code it includes onto the HSM to do all sorts of things.

‘These things might include: programming the HSM to output decrypted escrow keys. Or disabling the maximum login attempt counting mechanism. Or even inserting a program that runs a brute-force dictionary attack on the HSM itself.’ –Green

https://blog.cryptographyengineering.com/2016/08/13/is-apples-cloud-key-vault-crypto/

This, in and of itself, is concerning because we don’t really know what this “physical hash/blender” is or how it works, let alone whether the HSM box can be programmed to copy said “admin smartcards” or image them in some fashion to be sent off to the NSA’s mother ship [or China’s mother ship or Russia’s mother ship and so on].

I think Apple has a few more technical questions to answer before they are considered “secure”.

Oox7aeki September 9, 2016 3:40 AM

@65535: The obvious interpretation is that they literally mean a blender, i.e. the smartcards are physically destroyed, putting the code signing key that they protect beyond use.

Dirk Praet September 9, 2016 4:27 AM

@ Pilgrim in the Wilderness

I am going to convert to Apple devices and software beginning now. They cost a lot, but in the long they are the ones I trust the most.

Before you go on an expensive Apple shopping spree, may I recommend paying a visit to someone who is already on OS X and asking him/her to install a Little Snitch trial. Then reboot and watch the gazillion calls home your trusted Apple friend is making too.

ab praeceptis September 9, 2016 5:13 AM

part 2, advocatus diaboli

Any information on whether and how apple enforces the SRP v(erifier) uniqueness constraint?

With millions of users that’s not trivial, and it gets even tougher since, with a gazillion potential users out there, situations might arise where, due to external factors (psychological, crisis, etc.), a very high density of new users per time unit arrives. Considering that HSMs are not made with high performance in mind, that might become an issue.

What is the quality of the randomness employed on both apple’s and the client’s side?

What verifiable and/or certified (and by whom?) information is available about the “private code” apple added to the code in the HSM?

Given to what lengths (particularly the us-american) governments have been known to go to get their will imposed (incl. gag orders) what guarantees can apple provide (and have in the first place) that they haven’t been made a player in a smokescreen game, namely one where the HSM is not trustworthy but backdoored (or weakened, or …)?

The HSM world is a rather small one with not too many players, many if not most of them being tight with governments and agencies for diverse reasons. Combining this interdependency with the legal means of a gov. (like gag orders), it seems not far-fetched, if not even likely, to assume that apple has been accepted or allowed to perform an impressive security theater game (probably with the best intentions on apple’s side) while gov. agencies have their backdoor access at a deeper level (i.e. the HSM).

Indices: There have been cases where (particularly us-american) gov. agencies have more or less quietly gotten themselves access to systems/devices.

The one cornerstone logic. Building everything around and on one cornerstone (here the HSM) constitutes a classical weakness unless that cornerstone can be soundly proven to be solid (which is very hard to do).

It seems reasonable to assume that a major crime operation (read: with some million $ thrown in for equipment and professionals) against the HSMs would pay off very handsomely and would hence not be out of this world. Not to get the secrets and encrypted data but to “blow up” the cornerstone (or to threaten to do so unless, say, 250 mio. are paid in ransom).
a) What protection does apple have in place?
b) What plan B does apple have in place, and is that plan B in conformity with all their promises?
c) In (the likely) case apple employs geographically dispersed HSMs, what is known about and what procedures are in place against tampering in a way that creates inconsistency between them?
d) What liability is accepted by apple? Are there binding obligations to compensate their customers in case something goes wrong? After all, if their system is so great they shouldn’t have a problem putting their money where their security assurances are.

We know that (particularly us-american) gov agencies want want want to get at decrypted data or at crypto backdoors. They have voiced that clearly enough. And we know that (particularly us-american) gov agencies and entities do have and do make use of any means to get what they want, incl. blackmailing, muting, and probably killing.

Finally we know that (particularly us-american) corporations have quietly and sometimes without being forced cooperated with gov. and conspired against their very customers. We also know that their talk (and theater) re. the well being of their customers means regrettably little.

Do I see reason to trust apple, based on what we know about them and their attitude? Not at all. My assumption would be that apple would sell out their customers every day of the week and twice on Sundays.

Jeff September 9, 2016 5:31 AM

@ 65535

“What does ‘…to run the [key] cards through a blender’ actually mean?”

If I read it correctly, I think it meant ‘whodunit?’ because they did not specify which one of their employees pushed the button. What that means in legal terms, I do not know.

I think, after following up on the twitter post, this is very interesting on all fronts, particularly the setting of a precedent. What do you think?

Alex September 9, 2016 7:37 AM

Considering all the things I have seen happen over the last 20 years, I have come to the conclusion that the greatest casualty in the world of computing has been trust.

Once something exceeds my technical abilities (which these days happens often) then I am dependent on trust.
Do you see the problem?

Kyle Wilson September 9, 2016 8:20 AM

@eagle …and to do business in China you need to follow Chinese law. These things shift over time. No reason to assume that any multi-national will continue to have the US first in the long term.

If a legal request from China comes in asking for the keys to ‘terrorist’ communications that is signed by their courts, how can anyone reasonably say no? If China is an important market and may sanction the company for non-compliance, does the company want to say no?

Once the mechanisms are in place, legally valid requests must be handled whether the company or any particular third-party agrees or not. That opens up a potentially endless can of worms while we just delude ourselves that only American agencies will have the opportunity to present requests.

keiner September 9, 2016 9:07 AM

@Dirk Praet

ehhmm, this Little Snitch:

“A firewall protects your computer against unwanted guests from the Internet.

But who protects your private data from being sent out?”

I thought a firewall works in both directions; the software thing on a Win machine can block nearly everything and ask before contacting Adobe etc. (OK, blocking Windows from phoning home has to be done in the perimeter firewall).

So this product is just some lines of code with some firewall rules?!?! Not that new to Windows users…

Dirk Praet September 9, 2016 10:31 AM

@ keiner

But who protects your private data from being sent out?

Little Snitch does both. It alerts you to any outgoing connection which you can then allow or deny, temporarily or permanently. So does Windows Firewall Control (WFC, by Binisoft) on Windows. Unfortunately, there’s no such thing on Linux, except in SubGraph OS.

ab praeceptis September 9, 2016 1:02 PM

As SubGraph OS happened to come up, I’d strongly suggest watching Andy Müller-Maguhn’s talk at THSF in May 2016. For those who don’t know him, AM Maguhn is one of the CCC founders. The Chaos Computer Club has a tremendous reputation and is one of the very few institutions in Europe that easily surpasses pretty much everything across the ocean. And CCC has an excellent – and pretty clean – track record of decades. In short, when leading CCC people talk one should listen carefully. You might also know CCC from the CCC congress, which is about the highest-quality major conference of its kind.

How do I justify the jump from SubGraph OS to a Maguhn Hacker days talk? Because SubGraph OS is, at least to a considerable part, sponsored by and linked to “Open Technology Fund” which is a part of what AM Maguhn calls “The Military Industrial Internet Freedom Complex”. And for a reason and based on extensive research.

In other words: It seems very likely that there is some kind of connection between that sgos project and the cia, soros and accomplices.

“But”, you might say, “Bruce Schneier is listed in otf”. Yes, indeed, he is. Obviously I can’t speak for him and his reasons but I can offer an educated guess: It wouldn’t be the first time that someone innocent, driven by the best intentions and lured by a nice surface, ended up in a bad neighbourhood. Also note that a brilliant mind (and profound interest) in math and bits does not necessarily correlate with the same expertise in other fields. Actually that would even be quite unlikely. If Bruce Schneier did profound and extensive research on everything he joins or engages in or even just somehow is connected to, he wouldn’t have any time left for his crypto research. But for otf and the like it’s obviously extremely attractive to have people like Bruce Schneier listed, who deservedly enjoys a high level of well-earned trust.

Whatever, I strongly recommend watching that AM Maguhn talk. Not ending up with an OS that quite probably is co-sponsored by the cia should easily be worth the time.

65535 September 9, 2016 2:23 PM

@ Oox7aeki

“…they literally mean a blender. i.e. the smartcards are physically destroyed…”

That is not much of an explanation.

You’re saying a plastic card with a mag stripe or chip is put into a vegetable blender and destroyed, or something similar? The obvious question is how you ensure the parts are not re-assembled. Next is the fact that a data structure has to go on these “smart cards”, so who is to say that data is not mirrored or copied and subverted in some fashion?

@ Dirk Praet

[Little Snitch] It alerts you to any outgoing connection which you can then allow or deny.

Good point about using Little Snitch. It can be very revealing.

@ Jeff

‘If I read it correctly, I think it meant ‘whodunit? [Who destroyed the keys]’… What that means in legal terms, I do not know.’

That is an interesting question. What if a plant from the NSA supposedly “did it” but made a copy? What would be the legal ramifications? I think Apple would be in the clear and so would the NSA.

Donald Not Trump September 9, 2016 8:41 PM

@ keiner

“So this product is just some lines of code with some firewall rules?!?! Not that new to Windows users…”

I can’t help but ask who protects us from the firewall? 🙂

Spot September 9, 2016 11:19 PM

@Mark

I sympathize with your sentiment. Hypothesis: Snowden didn’t actually exist in the original timeline. He was photoshopped in by God to balance the architect’s matrix equation.

Spend some time hardening windows 10 instead of pursuing free and open source alternatives. You’ll see what I mean.

Oox7aeki September 10, 2016 3:45 AM

@65535 I’m saying that the literal interpretation is the obvious one; beyond that I can only speculate.

At any rate, physical destruction of storage devices containing secrets is hardly a novel concept; if Apple want to achieve it then I see no reason to think they’ll fail.

Skeptical September 10, 2016 3:52 AM

I’m bemused by the emphasis on the “100 other countries…” to which Apple would need to provide access.

Does Apple currently disclose the nature of cooperation, including modification of services, that it provides to various foreign governments in order to be permitted to conduct business in those countries?

Does Apple disclose how the PRC requests communications access or records? Russia?

These governments already almost certainly have a level of access that the US does not and should not possess; and it seems remarkably unlikely that any lawful access mechanism enacted in the United States could somehow expand the access such governments already have obtained within their countries.

So far as lawful access mechanisms in the US and elsewhere… there seems to be little empirical data concerning the actual risks and benefits posed by any of a number of different possible approaches.

Hard to imagine that Apple would have constructed this particular security process without estimating certain parameters to justify the cost – including of course the risks. That might be useful information to share – assuming of course Apple hasn’t lost the key to it.

Jeffrey September 11, 2016 10:45 AM

I’m bemused by the emphasis on the “100 other countries…”

Thank you for pointing out the hilarity (no relation to Hillary) of it. Even if Apple products were banned from import, they would become popular contraband. Building a wall along the southern border won’t stop the flood of migrants; likewise, a small country in South America nobody’s ever heard of can’t stop its people from buying Apple products even if it revokes import licenses.

10 years ago, if you had told me Hillary would be the sane candidate in any election, I’d have thought you were joking. Likewise, you must think I’m joking when I tell you I’d vote for the insane one. Interesting times… 🙂

warning September 12, 2016 12:59 AM

This may be true, but I’d suggest you remember that if you monitor an iOS device’s network connections, with iCloud disabled, every service controllable in Apple’s settings UI that could possibly require access to Apple servers disabled, and overlooking Apple’s software updates (easily distinguishable by monitoring DNS requests), iOS still constantly talks to Apple servers. Also, pay attention when you purchase an iOS device at a physical Apple store: they push for personal info beyond the data they get from credit card purchases. Look at the receipt: they specifically link purchase PII with IMEI, serial #, etc.

All iOS devices, whether they have cellular capabilities or not, require “activation” which sends a hash of a combination of the device’s serial number, Model Number (e.g. MC135), Product Type (e.g. iPhone2,1), UUID, UCID, and where applicable, ICCID, IMEI, IMSI, Phone Number – and of course these are updated in Apple’s databases when new SIM cards are placed in the device. The latter cellular-only identifiers are also shared with your carrier over the cellular network in a relatively opaque exchange shown in this Apple patent (https://patentimages.storage.googleapis.com/pdfs/US20090061934.pdf).

Sure, this data is hashed, but Apple have already linked these identifiers to real identities at purchase, and hence, depending on how much, and what data Apple is collecting, are able to collect an incredible amount of information when all blatantly obvious privacy-invading “features” of the OS are disabled.

This profile that is taken of a given device during “activation” is then used for iMessage (when not signed into an Apple ID) and for receiving push messages via APNS***. The use of hardware identifiers with little user oversight shows that it is practically impossible to use iOS devices in a non-cloud state, even with iCloud disabled.

On top of this, every time an update for iOS is downloaded, it cannot be flashed unless the device is allowed to send identifying information to Apple servers (key exchange) to verify the installer and device on which the install is to take place – this is clearest when updating via iTunes. Even if you download a copy of the IPSW files separately, it is not possible to simply use GPG keys to verify the update locally – if Apple cared about privacy, they would allow users to do so, thereby allowing a device to be updated without sending identifying information in the process – you have to allow iTunes to connect to Apple servers to send the previously mentioned identifiers and verify the device and update.

Then, if you set up an Apple ID with the device, which you undoubtedly will, since in most use cases you’ll need 3rd party applications, this Apple ID and associated email is again linked with the previously mentioned hardware identifiers and in return the real identities collected at purchase, in a key exchange. Apple promotes the App Store to developers selling paid apps as a way to stop users stealing their work and to end users as a way to avoid malware.

Again, if Apple cared about privacy, they would link App Store accounts to resettable GPG keys (if reset, keys are wiped from store and device and then regenerated as appropriate – and when the device is signed out, the keys are wiped from both device and server) generated on the device that are in no way linked to hardware identifiers (and hence real identities), exchange the public keys with store servers to link them with the given account. This framework should allow Apple to maintain this setup without generating highly accurate profiles of real people – if the given user wanted it that way.
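The proposed lifecycle (generate keys on the device, link only the public part to the account, wipe and regenerate on reset) can be sketched roughly as below. Random tokens stand in for the GPG keypairs the comment envisions, and every name here is illustrative rather than any real Apple API:

```python
import secrets

class DeviceKeyring:
    """Stand-in for on-device key generation; tokens replace real GPG keys."""
    def __init__(self):
        self.keypair = None

    def generate(self) -> str:
        # In the proposed design this would be a fresh GPG keypair,
        # never derived from hardware identifiers or real identities.
        self.keypair = secrets.token_hex(16)
        return self.keypair

class StoreServer:
    """Links accounts to (public) keys only; no hardware identifiers."""
    def __init__(self):
        self.accounts = {}

    def link(self, account: str, pubkey: str):
        self.accounts[account] = pubkey

    def reset(self, account: str, device: DeviceKeyring):
        # Reset wipes the key on both server and device, then regenerates,
        # so the new credential is unlinkable to the old one.
        self.accounts.pop(account, None)
        device.keypair = None
        self.accounts[account] = device.generate()

device = DeviceKeyring()
server = StoreServer()
server.link("user@example.com", device.generate())
old = device.keypair
server.reset("user@example.com", device)
assert device.keypair != old                      # fresh key after reset
assert server.accounts["user@example.com"] == device.keypair
```

The design choice the comment is arguing for is visible in what the server stores: only a resettable credential, never an IMEI, serial number, or MAC address, so a reset severs the link between account history and device.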

iCloud of course is the next, ultimate step in a surveilled device which again, will link even more private information with the previously mentioned identifiers.

Apple has adopted exactly the same attitude with the Mac App Store, linking Apple IDs with the device’s MAC address and other hardware identifiers: just try removing your AirPort card, and you’ll see that all Mac App Store apps will refuse to open. The one difference is that on the Mac you still (for now) retain sufficient control to disable all cloud-linked services, and if you avoid the Mac App Store, use a small number of regularly updated OSS applications that you understand and are satisfied are sufficiently secure to run on your machine, and run them sandboxed, you are likely going to be OK. Otherwise, switch to Qubes.

Check out the iPhone wiki for some more information on this topic.

https://www.theiphonewiki.com/wiki/Activation_Token
http://theiphonewiki.com/wiki/Activation

***(which are hopeless to control – on another note, it is near impossible to disable push notifications for 3rd party apps while maintaining local notifications when apps are open, and hence rely on the 3rd party developer not to reveal sensitive information to Apple when sending push messages)

Clive Robinson September 12, 2016 7:49 AM

@ 65535,

With regard to “blending” plastic mag stripe cards, it’s a follow-on from destroying paper/card KeyMat.

Whilst I’ve not tried it with mag stripe “ABA Cards” or those with embedded electronics I can tell you about “blending to wood pulp” card and paper KeyMat.

The first thing you need to know is that they really mean “food blender”, not “food processor”. It needs to be the sort with a glass or stainless steel ingredient container with rotor blades in the bottom, directly driven by a 200-watt-plus motor (i.e. at least 1/4 horsepower) using either a direct shaft or steel gears. That is the sort of blender you find in professional kitchens or bars, where it will churn ice cubes, lobster shells, etc. to “slurpy” in seconds, time after time, all day and night. Anything less will not do the job.

The second thing to know is that it will be ineffective if you “use it dry”. You need a carrier liquid to allow the blade vortex to mix the blend effectively. For paper, card, and plastic destined for quick destruction, look to use a flammable light solvent like alcohol, but not too light, as there is a vapour-ignition issue.

The resulting slurry can then be poured into a suitable container and burnt, leaving only ash (I’ve found those soft-cardboard seedling pots, like egg boxes, to be effective). Then dispose of the ashes in a dispersed fashion.

Sort of the “boy scout” version of “Bash, Burn, and Bury” for crypto geeks.

However, if the ABA card contains a microchip, as smart cards do, you really should start with a center punch and hammer; @Thoth has described how to do this in the recent past. Oh, and a quick nuke in a microwave will not go amiss either.

!!! Warning !!! Nearly all plastics will, upon being burnt or charred directly or when nuked in a microwave, produce gasses with “Toxicological Disadvantages”. That means they could easily result in injury or death at the time, or later from cancer etc.; the same applies to the use of solvents. Nobody wants the price of keeping secrets to be an early demise, so take care.

Anon10 September 12, 2016 6:05 PM

@Kyle

If Apple’s lawyers and accountants found a way to save on their tax bill, by re-incorporating in the Maldives, they would have done so already.

Anon10 September 12, 2016 9:22 PM

To some extent, the articles miss the bigger picture. If the San Bernardino case taught us anything, it’s that the FBI doesn’t just give up when it can’t get the access it wants through a court order. It will invest in offensive cyber capabilities, either internally or by outsourcing to contractors. At some point, the FBI will probably start hoarding zero-days, if encryption of data at rest becomes ubiquitous and they don’t get “backdoor” access to that data.

Nick P September 12, 2016 9:29 PM

@ Anon10

“At some point, the FBI will probably start hoarding zero-days”

Are you making a prediction or talking about the past? Looks like the latter.

Thoth September 13, 2016 3:53 AM

@Clive Robinson

re: Smart Card “Blending”

Your best bet is a drill bit, or a simple pair of scissors or pliers (with sharp edges).

For those who have no idea what a smart card is and speculate about its internal workings: it’s simply a card with one or more chips, each about 2 to 3 mm in width and length.

Linked below is one of the smart card chips I personally extracted from one of my old payment cards. The image should be rather self-explanatory for those with a background in electronics and computing.

Smart cards come in contact and contactless types. Typically, if the card is a contact card, you take a drill and drill through the middle of the metallic contact pads, as the chip is embedded directly beneath them. My advice is a drill bit bigger than 5 mm in diameter, to cover the entire chip’s destruction. If you are paranoid, just drill the metal contacts until unrecognizable.

If you are too poor to own a portable drill, scissors or pliers with sharp edges will do. Cut an “*” (asterisk) shape across the metal contact from corner to corner, then make one final cut in the middle. It will nicely destroy the chip.

If the card has contactless capability, or has an expensive display screen or keypad, you need to take a knife to score the back of the card, locate any chip you can find, and then cut the chip in an asterisk shape or take a drill to it.

Typically a contactless card will have a contactless-capable chip embedded somewhere in a far corner of the card. You need to score open the outer layer of the card’s “skin”, which will reveal a dug-out cavity containing the chip connected to the contactless copper wire inlay. The link below has an image of a contactless chip card with a coiled antenna and a black chip.

You do not need to destroy the card by blending it, which can be messy from being too imprecise.

I have also attached my past commentary on Apple’s Key Vault scheme, which @Nick P kindly asked me to review. I find it rather appropriately designed and secure enough, though still needing more thought on Apple’s side. The glaring weakness is that they did not use a zero-knowledge method to authenticate; instead they made something of a mistake by feeding the user’s password into their HSM, which will cause “public knee jerks” to happen even if they have done their due diligence and have pure intentions.
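To make the zero-knowledge point concrete, here is a minimal sketch of the alternative: the device derives a verifier from the password locally, so the HSM only ever stores and compares a value it cannot reverse into the password. All names and parameters here are mine, not Apple’s, and a real design would use a proper PAKE (e.g. SRP) rather than a bare password hash; this only illustrates the principle that the raw password never needs to leave the device.

```python
import hashlib
import os

def derive_verifier(password: str, salt: bytes) -> bytes:
    """Stretch the password on-device; only the derived value leaves the phone."""
    return hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 100_000)

# Enrolment: device picks a random salt and sends (salt, verifier) to the HSM.
salt = os.urandom(16)
verifier = derive_verifier("correct horse battery staple", salt)

# Authentication: device re-derives and the HSM compares; it never learns
# the password itself, only whether the derived values match.
attempt = derive_verifier("correct horse battery staple", salt)
print(attempt == verifier)  # True
```

Note this sketch only keeps the password itself out of the HSM; defeating offline guessing by an eavesdropper additionally requires a full PAKE protocol.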

If you are fine with the Apple iPhone and its “blackbox” Secure Enclave, but you are not fine with Apple’s Key Vault HSMs, then something is very wrong somewhere. Why aren’t people complaining about the blackbox Secure Enclave more than the Key Vault HSM, when the Secure Enclaves are blackbox smart-card HSMs that sit so close to you and may have a better chance of being a theoretical backdoor than an online HSM service, which you could theoretically block via network security techniques?

I would personally argue that the Secure Enclave should be labelled a bigger security concern than the online HSM service, since the Secure Enclave is virtually unblockable and always present, while the online HSM service only requires network filtering to remove its presence.

There is too much misinformation about how HSMs and HSM-based Secure Execution Environments work, and too much raving by non-domain people who have never worked with a single HSM, giving incorrect information that is mostly FUD and speculation.

It’s a waste of time, and rather an act of ignorance about how a smart card works, to take a smart card and blend it in a blender (which is troublesome to clean up later) when a cleaner method is a pair of scissors to snip through the metal contacts. And yes, Apple people don’t seem to know what they are doing by putting cards into a blender (if that really happened).

I have also included (in my previous post, linked below) the consequences and problems that come with destroying administrative cards from HSMs.

The fact of the matter is that neither Matt Green nor Pwn All The Things has ever worked with HSMs; otherwise they would have given much more depth on the inside of how an HSM works and what exactly is wrong with the scheme when used on an HSM.

This issue is mostly hype and an attempt to grab attention, IMHO.

As @Clive Robinson has mentioned and many of us have pointed out, if any of you are so concerned about your personal data, you should not be doing sensitive stuff on smartphones. Instead, air-gap, energy-gap, and keep your devices in a highly controlled environment, like the practical locked-down environments for sensitive information processing that @Figureitout, @Markus Ottela, and I have attempted to create in our practical work.

Links:
http://imgur.com/a/l3al0
http://www.kartecard.com/card/iccard.htm
https://www.schneier.com/blog/archives/2016/08/friday_squid_bl_539.html#c6731270

Gray Fox November 19, 2016 5:49 PM

lol sure pal, Trust The Math.

Anyone who cares about Truthiness in crypto should be more alarmed by the dew-eyed lack of backdoor suspicion coming from Bruce “I sat on the FOXACID Manual for over a year” Schneier.

“It is difficult to get a man to understand something, when his salary depends on his not understanding it.” -Upton Sinclair

We can’t have crypto celebs going around clucking that the sky is falling and scaring the hen house into not buying $700 iPhones every year at a 38% profit margin, which would crash Apple stock and sink the Titanic pension funds worse than AIG.

After all, What’s Good for Cupertino is Good for America. Especially since 10% of USD fiat is held offshore in the Bermuda Triangle sinkhole to assist the Federal Reserve printing presses like a pirate’s fake peg leg propping up a hollowed out collapsed economy.

As for ab praecepti praising the convenient myth of the Chaos Computer Club being anything more than a great cover story for Deep State subversion, don’t forget the inconvenient truth that the BND infiltrated the CCC from day one in 1986.

A Strange Story by Bernd Fix

download.adamas.ai/dlbase/Stuff/VX%20Heavens%20Library/vbf01.html

Why waste black budget and scarce man hours on building your own cyberweapons when you can get it for free from a bunch of smart ass punk kids who rebel against the system by co-opting the manufactured propaganda of Cyberpunk Identity Politics as funded and staffed by covert Disinfo Officers who work for Deep State?
