Encryption in Cloud Computing

This article makes the important argument that encryption—where the user and not the cloud provider holds the keys—is critical to protect cloud data. The problem is, it upsets cloud providers’ business models:

In part it is because encryption with customer controlled keys is inconsistent with portions of their business model. This architecture limits a cloud provider’s ability to data mine or otherwise exploit the users’ data. If a provider does not have access to the keys, they lose access to the data for their own use. While a cloud provider may agree to keep the data confidential (i.e., they won’t show it to anyone else) that promise does not prevent their own use of the data to improve search results or deliver ads. Of course, this kind of access to the data has huge value to some cloud providers and they believe that data access in exchange for providing below-cost cloud services is a fair trade.

Also, providing onsite encryption at rest options might require some providers to significantly modify their existing software systems, which could require a substantial capital investment.

That second reason is actually very important, too. A lot of cloud providers don’t just store client data, they do things with that data. If the user encrypts the data, it’s an opaque blob to the cloud provider—and a lot of cloud services would be impossible.

Lots of companies are trying really hard to solve parts of this problem, but a truly optimal solution still eludes us.

Posted on November 12, 2012 at 5:47 AM • 58 Comments


Grahame November 12, 2012 6:02 AM

Seems pretty straight forward to me:

this kind of access to the data has huge value to some cloud providers and they believe that data access in exchange for providing below-cost cloud services is a fair trade

I think it’s my business model – getting services for less than cost – that gets upset.

You get what you pay for.

Mikko Jakonen November 12, 2012 6:15 AM

Interesting topic. However, it is missing a crucial part: the user, and therefore the user’s identity & access management. A user could have a role with an attached key for accessing the confidential data. While the agreement between user and service provider(s) exists, it allows the key(s) to be utilized transparently across the clouds.

I have one presentation on this available, unfortunately in Finnish only at the moment. It is on Slideshare under my name, in case someone is interested in running it through a translator.

Assaf November 12, 2012 6:20 AM

Doesn’t it all boil down to trust, anyway?

Take DropBox as an example. Let’s say they offered client-side encryption and never stored (or transmitted) clients’ keys. If they get hacked, someone might replace their client application with a malicious one that does give up the keys. Now, if you’re talking strictly about cloud storage, over a standard protocol – e.g. storing data in S3 over HTTPS – then, indeed, nothing is stopping the client from being responsible for encryption and key storage. But that’s a very narrow view of what constitutes a cloud service these days. Cloud services also handle computational tasks, communication between clients, etc. And so it really boils down to whether or not you choose to trust a cloud provider with your data. If you don’t trust a cloud provider who commits not to sell your data or mine it, then you shouldn’t be using it at all.

I thus see it less as a business model problem (there’s plenty of cloud services that are very profitable without having to sell off clients’ data) and more of a technical limitation. Implementing client-side encryption is tricky as hell and would also prevent providers from being able to offer certain features (e.g. communication, analytics – anything that has to run on the server side basically).

JohnW November 12, 2012 6:33 AM

I would have imagined one of the big problems with client side encryption was that it removed the possibility of data de-duplication between customers. If you can remove redundant data (particularly on file-locker or email storage type sites) then you can significantly reduce the cost of storage space and also improve access times.
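JohnW’s point is easy to demonstrate: a content-addressed store keeps one physical copy per unique file, however many customers upload it. A minimal sketch – the `DedupStore` class and its method names are hypothetical, for illustration only:

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical uploads are kept once."""
    def __init__(self):
        self.blobs = {}   # digest -> file bytes (stored once)
        self.refs = {}    # digest -> number of uploads referencing it

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.blobs:      # only new content costs storage
            self.blobs[digest] = data
        self.refs[digest] = self.refs.get(digest, 0) + 1
        return digest

store = DedupStore()
a = store.put(b"eclipse-3.7.zip contents")   # first customer uploads
b = store.put(b"eclipse-3.7.zip contents")   # the next 29,999 cost nothing
assert a == b and len(store.blobs) == 1      # one physical copy
```

Client-side encryption with per-user keys breaks this: each user’s ciphertext hashes differently, so every copy must be stored.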

jddj November 12, 2012 6:34 AM

Wouldn’t client-side encryption slow down things like Dropbox? From what I understand, it syncs and shares quickly by exchanging or storing only hashes of previously-seen file blocks. Only when it hasn’t seen a given hash before does it require end-to-end data transfer. For locally-encrypted data, almost all blocks would need end-to-end transfer.
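The shortcut jddj describes, and why client-side encryption defeats it, can be sketched as follows. The block size, the one-byte XOR “encryption”, and all names here are illustrative assumptions, not Dropbox’s actual protocol:

```python
import hashlib

BLOCK = 4   # unrealistically small block size, for illustration

def block_hashes(data: bytes):
    """Hash each fixed-size block, as a sync protocol might."""
    return [hashlib.sha256(data[i:i + BLOCK]).digest()
            for i in range(0, len(data), BLOCK)]

FILE = b"hello world, hello cloud"
server_seen = set(block_hashes(FILE))   # blocks the server already has

# Plaintext sync: every block hash matches, nothing to transfer.
to_send = [h for h in block_hashes(FILE) if h not in server_seen]
assert to_send == []

# Client-side "encryption" (a toy one-byte XOR) changes every block,
# so every block hash looks new and the whole file must be re-sent.
ct = bytes(x ^ 0x5A for x in FILE)
to_send = [h for h in block_hashes(ct) if h not in server_seen]
assert len(to_send) == len(block_hashes(ct))
```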

James Sutherland November 12, 2012 6:38 AM

By “cloud” it seems they pretty much mean Google’s e-mail service, which does indeed have a business model built on analysing the content to serve up ads, but barely qualifies as a “cloud” service in any meaningful sense: it’s just another hosted email system, like Hotmail/outlook.com/Office365.

Indeed, Amazon Web Services has already implemented at-rest encryption of their own in the S3 storage service, as well as HTTPS for uploading and downloading data – apart from the CPU overhead, this is no problem at all for Amazon, who neither know nor care what is in any block of data you’re paying to store there. (At least one of the popular S3 tools also has support for PGP-encrypting your data before it’s sent to Amazon, giving better protection still.) Rackspace would appear to be much the same, with comparable offerings and business models. My email provider has the unencrypted messages (they were, after all, all delivered that way over SMTP in the first place!) but doesn’t do ads: the service consists entirely of a highly-replicated IMAP store. Encryption wouldn’t impact their business model at all either – it would just be fairly redundant, since they’d need the keys as well to implement IMAP and SMTP for me!

The article is right about geographic/nationality restrictions being misguided and counterproductive, though. Are NASA’s emails really any more secure on a server in Miami than they would be in Toronto? Does banning Amazon’s British or Irish staff from doing the maintenance on servers in Virginia make them any more secure? Let’s ask Bradley Manning about that one…

js November 12, 2012 6:43 AM

Even if your business model does not depend on scavenging user data, client-side encryption leads to some complicated technical challenges.

Unless you’ve traveled to the future to bring back fully homomorphic encryption, how can you do email search in a web-based client? Spell checking or machine translation (which often requires huge data sets just to run) in your phone? Data deduplication or even checking that some client-side file is already “synced” when one plaintext can be encrypted in different ways depending on how your PRG is feeling today?

James Sutherland November 12, 2012 6:48 AM

JohnW: Yes, it would break inter-client deduplication, which Dropbox seems to do (i.e. if I happen to upload a file you’ve already uploaded, it just checks hashes then marks us both as sharing that data). It would probably increase Dropbox’s storage costs a bit to encrypt everything properly.

jddj: Not necessarily; yes, inserting or deleting one byte will move all the subsequent block boundaries and complicate things a bit, but there are things you could do to avoid that. Variable-size encryption blocks, for example.

(Indeed, the better online backup services already deal with exactly these issues; they don’t deduplicate between user accounts because of the key issue, and manage to sync changes up to the encrypted store quite effectively.)

Carlo Graziani November 12, 2012 6:51 AM

The irritating thing for me is that the business model of cloud provision creates a perverse incentive that prevents provision of a service/system that I would regard as killer useful: private home cloud.

The “cloud” idea has a great deal of intrinsic merit. Everyone nowadays has at least a half-dozen devices that, properly considered, in effect are (or contain) networked general-purpose computers — phones, hi-fi equipment, personal music players, laptops, tablets, etc. It is extremely useful to have those devices talk to each other and exchange data, and it is even more useful to be able to do this irrespective of a device’s current location.

However, what would be really great would be if we could have (a) a cloud service that we totally control, not intermediated by Google, Apple, Dropbox, Amazon, etc., and (b) a cloud service wherein we could establish fine-grained privacy/sharing policy, so that different data can be marked as appropriate for sharing with different groups of users/processes through different devices, in a natural and transparent way through some central privacy policy control tool. And of course, strong encryption could be larded through the entire system.

None of this is technically difficult — you can set up your own server over your home broadband line for when you’re outside your home, and just cut the big cloud providers out of the loop. And extensible data synchronization protocols exist. In principle, if someone saw a market in private home clouds for people who care about their privacy, a system of device clients connecting to a home cloud server shouldn’t be very hard to provide.

But that’s the problem: the “cloud” market is dominated by players who see no value in disintermediating themselves, because the business model requires inspection of people’s data. And most people fail to regard this as an unacceptable invasion of privacy in its own right, or to perceive the risks created by government demands to look at the data, or by criminal break-ins. So the market for home clouds fails to attract developers.

I hope this changes — I seem to remember reading about the EFF getting involved in sponsoring a project along these lines. But I’m not holding my breath. And, I’m not using i*, G*, or any other public cloud service for anything I care about in the meantime.

Edw November 12, 2012 6:52 AM

James > HTTPS encrypts the communication between you and the server, but the server decrypts what you send it. The server owner has potentially full access to the data.

Bruce > You can add https://spideroak.com/ to your list of cloud providers trying to help.

James Sutherland November 12, 2012 6:59 AM

js: Yes, without perfect homomorphic encryption, encryption is really only useful for protecting the data “at rest” – which is where a lot of these businesses are focussed anyway: Dropbox, Box, Crashplan, Amazon S3, Rackspace Cloud Files. Yes, it precludes deduplicating data between you and another user with different keys – otherwise, it would be a huge security hole in itself! (Just look whose user account shows a saving when deduplicated alongside a copy of the leaked memo/pirate software/etc…)

Encryption is no panacea: you still need to trust whoever is manipulating your data, it just avoids the need to rely on trust in your storage/transmission.

mb November 12, 2012 7:12 AM

You can encrypt and store data in the cloud. I do it now. It is not integrated with the cloud software, but any service that allows you to upload any type of file can have encrypted files uploaded.

Nelson November 12, 2012 7:12 AM

Why does nobody mention Wuala? Built-in client-side encryption and available on almost any platform. Plus, servers in the EU under Swiss law – what more do you even need?

Matt Palmer November 12, 2012 7:16 AM

Actually, you can still deduplicate with encrypted data. By chance, a recent posting to the Cryptology ePrint archive discusses this very issue, which they dub Message-Locked Encryption:


Essentially, it uses a hash of the message as the encryption key. Two identical messages will end up with the same key and therefore the same ciphertext.
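A toy sketch of that idea follows. This illustrates only the determinism property, not the actual scheme from the paper; the SHA-256-based keystream is an insecure stand-in for a real cipher:

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy counter-mode keystream from SHA-256 -- illustration only."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def mle_encrypt(msg: bytes) -> bytes:
    key = hashlib.sha256(msg).digest()   # the key IS a hash of the message
    return bytes(m ^ k for m, k in zip(msg, keystream(key, len(msg))))

# Two users encrypting the same file produce byte-identical ciphertext,
# so the provider can deduplicate without ever seeing the plaintext.
assert mle_encrypt(b"shared file") == mle_encrypt(b"shared file")
assert mle_encrypt(b"shared file") != mle_encrypt(b"other file!")
```

The price of that determinism is exactly the confirmation-of-file leak discussed elsewhere in this thread.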

Adrian O'Connor November 12, 2012 7:16 AM

I’ve been thinking about this a lot lately. I think there’s a need to provide non-cloud-based alternatives to services like Dropbox that work as well as the cloud versions, but without having to trust somebody else to store your data. Anyway, the biggest argument against this kind of encryption that I can think of is this: if a user loses their password (or key), you can’t reset it. That would be a massive downside for a lot of people.

kashmarek November 12, 2012 7:26 AM

The “cloud” embraces the “all your data are belong to us” approach. They (the all inclusive “they”) won’t let you see anybody else’s data (including your own perhaps) but they want everybody else to let “them” see your data.

Only put your crap in the “cloud”, not your data. In other words, treat the “cloud” like the sewer: one way out, away, never to be seen again.

Or, just wait until they provide “private cloud” services, USING YOUR OWN PC AS THE STORAGE MEDIUM, and charge you for it. hee hee hee.

Kreg Michael November 12, 2012 8:03 AM

SpiderOak and Tahoe-LAFS-on-S3 already offer provider-independent / zero-knowledge security.

Simon November 12, 2012 8:08 AM

I doubt it’s as simple as using encryption at rest – just keep your own keys. All these ‘me too’ providers want you to believe that.

Simon November 12, 2012 8:23 AM

I wasn’t born yesterday. Unlimited storage for $10/mo means VC-subsidized S3. And if it doesn’t work out… oh well.

Chris W November 12, 2012 9:07 AM

@Carlo Graziani

Check out ownCloud: a private cloud on your own home server or intelligent NAS. Not sure if it fits your requirements.

You’ll be surprised how many home-cloud solutions are out there; the problem is obviously marketing. And lack of finance usually makes those solutions pretty specific.
For mediastreaming over the web you already got several. (eg. plex)
Commercial alternative to ownCloud is for example Tonido.

Candidate solutions aside, I do think the ultimate service lies in a hybrid, where you have both encrypted and unencrypted files/services.
My requirement for a homecloud would be syncing with devices, and secure incremental backups to off-site locations (eg. friends using the same solution) or sharing specific folders with them (eg. personal dropbox).

I really have no trouble with parties like google aggregating data from me along with thousands of other users and selling those anonymous aggregated statistics to whatever company they like. After all, I’m getting a nice service in return. I’m more concerned with any unauthorized (by me) party gaining access to my specific data. Obviously I’m pointing at governments and hackers.

Nick P November 12, 2012 10:05 AM

Here’s a nice technology for people just using the cloud for backups.

“FadeVersion follows the standard version-controlled backup design, which eliminates the storage of redundant data across different versions of backups. On top of this, FadeVersion applies cryptographic protection to data backups. Specifically, it enables fine-grained assured deletion, that is, cloud clients can assuredly delete particular backup versions or files on the cloud and make them permanently inaccessible to anyone, while other versions that share the common data of the deleted versions or files will remain unaffected. ”


Academics have also been producing designs to allow trusted computations on untrusted OS’s & hypervisors via TPM. An issue I see with that is, assuming there’s really a TPM on the machine (wink), the cloud providers are likely to use vTPM’s instead of TPM’s. There’s price/performance motivations for this. A vTPM’s security != a hardware TPM’s security, though, good as it might be. Particularly, the security assumptions & models TPM-based designs are using might differ from the cloud provider’s implementation, thus invalidating security proofs or claims.


Simon November 12, 2012 10:22 AM

So, when FadeVersion ‘deletes’ an archive… it simply destroys the key? Isn’t that all ‘shredders’ do on encrypted drives?

AlanS November 12, 2012 10:33 AM

The article is on the site NextGov and directed at government customers, so I think the points about encryption and data mining, given the context, don’t make much sense, at least for US federal customers and their contractors.

Government customers aren’t using free cloud services, and the cloud services they use certainly aren’t allowed to mine the data. If you are a US federal customer or contractor you have to comply with FISMA. The feds are spending billions on cloud services at the moment, and not just with any old cloud service provider: they have to have a federal ATO, be on the GSA’s IaaS BPA, or be in the process of getting a FedRAMP provisional ATO. And FIPS 140-2 support is a requirement. And they have continuous monitoring. These types of customers do use Amazon, but they are on Amazon’s GovCloud (https://aws.amazon.com/govcloud-us/) or using Apptis, not your regular AWS.

And the author is from the Chertoff Group, which makes one wonder what their interest is. See https://www.schneier.com/blog/archives/2010/11/tsa_backscatter.html: “Michael Chertoff, former Department of Homeland Security secretary, has been touting the full-body scanners, while at the same time maintaining a financial interest in the company that makes them.”

BlueRaja November 12, 2012 11:24 AM

@JohnW: “If you can remove redundant data (particularly on file-locker or email storage type sites) then you can significantly reduce the cost of storage space and also improve access times.”

Bitcasa, Spideroak, and I believe Wuala already do this.

AlanS November 12, 2012 11:29 AM

The original article has more to do with issues of location and data sovereignty. This discussion has been going on for a while.

Read Jerry’s comment on the original article. He points out that the argument doesn’t make sense, as the cloud is being used to process data, not just store it.

NobodySpecial November 12, 2012 11:32 AM

So has Amazon solved homomorphic encryption – but only for its government customers – or is this just security through bureaucracy?

“You don’t need to worry about an employee leaking your data or a hack of our system, because we have a certificate.”

Brian November 12, 2012 11:42 AM


Spideroak does not deduplicate across accounts while Wuala does (not sure about Bitcasa). But as “Alan Fairless” posted, doing so while preserving all security properties is not possible (with current techniques). If a cloud storage service can deduplicate a file I upload, it can also (and does) tell me when someone else is storing that file. It’s relatively simple for me to use that property to tell me if someone using the service is storing ANY given file. That’s not a security property you’d expect secure online storage to have.
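Brian’s confirmation-of-file property follows directly from any deterministic, cross-account deduplication. A sketch – the function names and the double-hash “tag” are illustrative assumptions, not any real service’s protocol:

```python
import hashlib

def convergent_tag(msg: bytes) -> str:
    # Deterministic: anyone holding the plaintext can recompute the tag
    # the server uses for deduplication.
    return hashlib.sha256(hashlib.sha256(msg).digest()).hexdigest()

# Tags of files the service is currently storing (for some user).
stored_tags = {convergent_tag(b"contents of a leaked memo")}

def someone_is_storing(candidate: bytes) -> bool:
    """An attacker 'uploads' a candidate file and watches for dedup."""
    return convergent_tag(candidate) in stored_tags

assert someone_is_storing(b"contents of a leaked memo")   # confirmed stored
assert not someone_is_storing(b"an innocuous file")
```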

Brian November 12, 2012 11:45 AM

This general problem doesn’t sound solvable to me, at least not in all cases. Basic cloud storage can certainly be encrypted by the end user. But as the article (and Bruce) point out, if you want the cloud provider to DO anything with your data (which is true in many cases), they can’t see the data as just an encrypted blob.

wumpus November 12, 2012 1:14 PM

Axiom: You should never trust your cloud server. Even if your cloud provider is trustworthy, their bankruptcy judge could prevent them from shredding “their assets” and could hand them over to your competitor, since the prevailing practice of cloud providers appears to be data mining.

To point out the blindingly obvious: deduped “encryption” allows anyone to verify if you have stored a particular datum (for whatever size chunk is “encrypted”/deduped). It also strongly reminds me of CBC encryption (only worse).

$10 infinite storage only makes sense for anyone whose data is worth less than $10.

Jerry November 12, 2012 1:41 PM

Have you stumbled upon Askemos already?

It’s an architecture using a mixture of well-connected notary peers and personal peers on plug computers to create a trustworthy network without trusting any component completely. Including yourself. Logical objects representing users, databases, websites etc. are replicated to a controlled per-object set of peers. Updates are fault tolerant. (As of today’s implementations, byzantine agreement from assigned peers is used.)

Jeff Johnson November 12, 2012 2:50 PM

Users aren’t so great at key management. This problem could be overcome, but system crashes, reinstalls, migration to new systems, etc. require some explicit control over keys, including backup, and that will defeat many users.

Nick P November 12, 2012 5:37 PM

@ Simon

“So, when FadeVersion ‘deletes’ an archive… it simply destroys the key? Isn’t that all ‘shredders’ do on encrypted drives?”

There are actually several ways COTS products might delete sensitive data: a file delete operation; a delete followed by overwriting; or storing encrypted and then losing the key. Many products and “erase your drive” tips talk about overwrite strategies, built-in erase features, etc. There are also drives with crypto built in whose technique is unverifiable for us. There have been theoretical and practical attacks on all of these.

The technique I’ve always recommended is to store the files strongly encrypted, make sure the key never touches the disk (partly user-generated) & simply ditch the key to erase the data. It’s much easier (and cheaper) to erase a key in RAM than to overwrite or destroy physical storage media (e.g. degaussers, shredders, thermite). Additionally, getting rid of GB or TB of data is quite time consuming, unlike destroying 128-256 bits of key material.
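A minimal sketch of that crypto-erasure idea. The XOR keystream here is a toy stand-in for a real cipher, and a real implementation would also carefully zeroize the key memory rather than just dropping the reference:

```python
import hashlib
import secrets

class CryptoShredder:
    """Toy crypto-erasure: bulk data stays encrypted on disk; deleting
    the 32-byte key (held only in RAM) renders it all unreadable."""
    def __init__(self):
        self.key = secrets.token_bytes(32)   # never written to disk

    def _stream(self, n: int) -> bytes:
        out, ctr = b"", 0
        while len(out) < n:
            out += hashlib.sha256(self.key + ctr.to_bytes(8, "big")).digest()
            ctr += 1
        return out[:n]

    def encrypt(self, data: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(data, self._stream(len(data))))

    decrypt = encrypt   # an XOR stream cipher is its own inverse

    def shred(self):
        self.key = None   # destroying 32 bytes "erases" terabytes

s = CryptoShredder()
ct = s.encrypt(b"sensitive backup data")
assert s.decrypt(ct) == b"sensitive backup data"
s.shred()   # the ciphertext on disk is now unrecoverable noise
```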

Dirk Praet November 12, 2012 7:12 PM

Cloud services are nothing more than just another (relatively) new technology offering a number of advantages as well as drawbacks. As with so many others in the past, it is being hyped by marketeers as the next or current holy grail of computing, which it isn’t. For the average IT person with even half a brain, they are another tool in the arsenal, to be used wisely and in an informed way, depending on the specific requirements of a particular project or business. If data confidentiality is high on the list and the provider needs to hold your keys, you may wish to consider other solutions than those offered by most commercial cloud providers.

Dave November 12, 2012 8:49 PM

A solution (optimal or otherwise) will always evade us. Running my code and putting my data on someone else’s mainframe in a data centre^H^H^H^H^Hcloud-based system that’s potentially shared with code run by the very people I don’t want to have access to the data is an inherently unsolvable problem. The Orange Book guys spent 20-odd years picking away at it in a much simpler environment (carefully-controlled, fixed-functionality mainframes) and found it was an unsolvable problem, so I don’t see how the cloud guys are going to solve it now.

In any case the issue will resolve itself in a couple of years when the pendulum swings back away from centralised mainframes^H^H^H^Hcloud services again, as it has every 5-10 years for some decades now. We’ve then got another 5-10 years before it swings back the other way again.

Kasper Henriksen November 12, 2012 11:29 PM

Dave, what is a “data ccloud-based system” and a “centralised mainfrcloud service”?

Explanation: ^H only deletes one character. ^W deletes a whole word.

Adam November 13, 2012 10:14 AM

I hate Dropbox, Skydrive, Google Drive etc. for not supporting client side encryption. I expect they don’t do it so they can exploit data redundancy and reduce their server side storage requirements. e.g. if 30,000 people all have eclipse-3.7.zip in their drives they only have to store exactly one copy of it.

Still, that does not excuse not at least giving the option of client-side crypto built into the software. If I have sensitive data then chances are it has no redundancy anyway. So why not let me designate a Secure folder, where anything placed in it is scrambled with my key before being sent up?

But that said, I’m sure someone could write something akin to Dropbox where files are encrypted on their way into the Dropbox folder and decrypted on their way out. So I work with folder A, monitored by a crypto app. The crypto app encrypts the files in folder A and copies them to folder B, which Dropbox is monitoring. Dropbox syncs them to the cloud. If a file appears in the Dropbox folder, the crypto app decrypts it and copies the plaintext back to A.
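That folder-pairing scheme can be sketched in a few lines. The `toy_encrypt` keystream is an insecure stand-in for real crypto, and a real tool would watch the folders continuously in both directions rather than doing one pass:

```python
import hashlib
import pathlib
import secrets
import tempfile

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    """XOR with a SHA-256-derived keystream -- insecure, illustration only."""
    out, ctr = b"", 0
    while len(out) < len(data):
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(a ^ b for a, b in zip(data, out))

def sync_once(plain_dir: str, dropbox_dir: str, key: bytes) -> None:
    """One pass of the scheme: encrypt folder A into folder B."""
    for f in pathlib.Path(plain_dir).iterdir():
        enc = toy_encrypt(key, f.read_bytes())
        (pathlib.Path(dropbox_dir) / (f.name + ".enc")).write_bytes(enc)

key = secrets.token_bytes(32)
folder_a, folder_b = tempfile.mkdtemp(), tempfile.mkdtemp()
(pathlib.Path(folder_a) / "notes.txt").write_bytes(b"secret plan")
sync_once(folder_a, folder_b, key)

ct = (pathlib.Path(folder_b) / "notes.txt.enc").read_bytes()
assert ct != b"secret plan"                    # Dropbox sees only ciphertext
assert toy_encrypt(key, ct) == b"secret plan"  # XOR stream decrypts itself
```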

Dirk Praet November 13, 2012 1:40 PM

@ Adam

I hate Dropbox, Skydrive, Google Drive etc. for not supporting client side encryption.

That’s just a minor nuisance since BoxCryptor supports all three of them, and on a variety of platforms. AES256, and free for personal use.

Lark Allen November 13, 2012 3:17 PM

Scrambls is a free solution at http://www.scrambls.com which matches the proposed architecture – the user controls the encryption keys, and they are always separate from the encrypted data held in the cloud.


Scrambls has been available for about six months, allowing private social media posting on Twitter, Facebook, etc. With this announcement, Scrambls now supports encryption of files as well.

Nick P November 13, 2012 3:32 PM

@ Dave

“The Orange Book guys spent 20-odd years picking away at it in a much simpler environment (carefully-controlled, fixed-functionality mainframes) and found it was an unsolvable problem, so I don’t see how the cloud guys are going to solve it now.”

Good to see I’m not the only one looking for wisdom in all that old stuff. 😉 They got pretty far back then. Combine old & new, you can do something like they set out to do. The problem? It will be HORRIBLY slow, minimalist, & practically unusable. Wait a second, aren’t those the opposite of the adjectives used by cloud promoters?

Yes, reality has a harsh bite. Years ago, I promoted using small, cheap computers as individual application nodes. Security enforcement would happen at many levels using trusted components. Today, I still think that’s the only way to do some of what cloud back-end does in a trustworthy fashion. It’s just that the hardware & cloud provider incentives both make “secure, sharing of resources” an oxymoron.

Godel November 13, 2012 5:14 PM

@ Dirk Praet
“I hate Dropbox, Skydrive, Google Drive etc. for not supporting client side encryption.

That’s just a minor nuisance since BoxCryptor supports all three of them, and on a variety of platforms. AES256, and free for personal use.”

Cloudfogger is another alternative, also free for personal use.

Hitesh Tewari November 13, 2012 6:10 PM

Take a look at CipherDocs.com, a real-time encryption technology for cloud documents with seamless key exchange.

Dirk Praet November 13, 2012 7:01 PM

@ Nick P

Years ago, I promoted using small, cheap computers as individual application nodes. Security enforcement would happen at many levels using trusted components.


Adam November 15, 2012 10:24 AM

@Dirk Praet, I hadn’t heard of BoxCryptor but it sounds similar to what I was suggesting: encrypting files and passing them through to the actual folder. Too bad it’s commercial – I can see TrueCrypt getting something similar someday.

Jonathan Wilson November 16, 2012 12:44 AM

Somewhat related to this is the reports that the guy behind Megaupload is trying to launch a new file sharing site similar to Megaupload but with client-side encryption (where the server never sees anything other than an encrypted blob that they have no encryption keys for).

The theory is that if all the data is encrypted, it’s harder for the big media companies to shut it down (because it will be much harder for them to prove that the operators of the site had any knowledge of what their users are doing).

venkat November 16, 2012 4:14 PM

Personal/private cloud solutions like Tonido enable individuals to run their own cloud storage/sync system using their own hardware.

One will be surprised how simple these solutions are nowadays. For widespread adoption, we just need awareness that alternatives are out there.

Jeff November 18, 2012 1:48 PM

If one is using the cloud solely for backup, why not just manually encrypt any files before uploading, using open-source encryption software? I see that Boxcryptor and Cloudfogger are closed source, so what is the level of confidence in files created with them as opposed, say, to files created with Truecrypt?

mt January 3, 2013 3:13 AM

There is a beta service, AES.io. Its descriptions seem honest, and the implementation, as described, makes sense to me. Would anyone have any more knowledge of how secure this service may be?

One thing they mention is that users have to believe that the JavaScript client-side implementation is secure – if they mess it up, either accidentally or on purpose (they also mention a US court order to mess it up – could that even happen?), the system breaks right there. Is there a way of ensuring that such client-side encryption does not depend entirely on the good will of the service’s programmers?

Igor March 15, 2013 2:50 AM

I am currently testing a solution that offers client-side encryption. However, I was surprised that plain HTTP is used when files are copied from the client to the cloud. The vendor argues that transport security (TLS) is not implemented as the files are encrypted on the client anyway. What is your opinion on this?

Clive Robinson March 15, 2013 3:59 AM


The vendor argues that transport security (TLS) is not implemented as the files are encrypted on the client anyway

That means that only the file contents are encrypted, not the file name or other meta data.

Such meta data could provide an attacker with sufficient information to make meaningful deductions about what is going on within an organisation, so this is generally considered undesirable.

Igor March 15, 2013 7:42 AM


That’s a good point. However, according to the vendor, file names and meta data are encrypted with AES on the client side. The AES key is in turn encrypted with the user’s public key. So technically, the vendor is not able to read the data.

Clive Robinson March 15, 2013 9:34 AM

@ Igor,

according to the vendor file names and meta data are encrypted by AES on the client-side.

This may or may not be effective, depending on the mode AES is used in and any random numbers used once (hence “nonce”) as padding / IV etc.

Oversimplifying: with many of the standard modes, short messages become simple substitution ciphers under any block cipher if it is not implemented with care, which can be a significant problem as, in effect, information will leak.

Look at an old example where this happened: you have an old-fashioned text menu system, as beloved of TTY/VDU command-line systems. The menu requires a simple series of Y/N answers. The designer used a cipher chaining mode but always used the same IV and reset the chain on each transaction (i.e. each press of the return key). So the Y and N always encoded the same way under the same key.

Now if this menu format is known to an attacker (which is a reasonable assumption), it does not matter which key you encrypt the Y or N under: you end up with two substitutions, and it is easy to guess which corresponds to Y and which to N. So the attacker can work out fairly easily which menu option you are in…

So in a simple “code book” encryption system, or one designed by someone NOT familiar with the potential pitfalls of chaining modes or of using block ciphers as stream generators, you could end up with a simple substitution system or similar, leaking information.
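Clive’s menu example is easy to reproduce: with the key and IV both fixed, every answer encrypts to the same bytes, so the attacker only needs a lookup table. The hash-based “cipher” below is a toy illustration of the fixed-IV flaw, not a real mode:

```python
import hashlib

KEY = b"server side key"
FIXED_IV = b"\x00" * 8   # the design flaw: the IV never changes

def encrypt(iv: bytes, msg: bytes) -> bytes:
    # Keystream derived from key + IV; XORed with the message.
    ks = hashlib.sha256(KEY + iv).digest()
    return bytes(m ^ k for m, k in zip(msg, ks))

# Same key + same IV: each answer always maps to the same ciphertext,
# so the "cipher" degenerates into a substitution table the attacker
# can build by observation, without ever learning the key.
assert encrypt(FIXED_IV, b"Y") == encrypt(FIXED_IV, b"Y")
assert encrypt(FIXED_IV, b"Y") != encrypt(FIXED_IV, b"N")

# A fresh IV per message breaks the substitution pattern.
assert encrypt(b"\x01" * 8, b"Y pressed") != encrypt(FIXED_IV, b"Y pressed")
```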

Also, with regards,

The AES key is in turn encrypted by the user’s public key. So technically the vendor is not able to read the data

There are many issues with public key generation [1][2] and use [3]. And, like the use of modes for block ciphers, encrypting a short (by comparison with the PubKey length) symmetric key under a PubKey can have padding and other issues.

Thus even though the likes of SSL/TLS have known defects, they are likely to be a lot more secure than an ad hoc homebrew crypto system.

So you would need to check that any system used by the vendor was “standards compliant” or had been checked by someone suitably qualified (and they are rarer than hen’s teeth) to verify the protocols and methods in use.

[1] It has been shown repeatedly that most random number generators in use are really not up to the job of producing anything sufficiently close to the unpredictability requirements for cryptography [4][5].

[2] Adam Young and Moti Yung showed how the original primes could be leaked via a public key, in a way that is not possible to detect knowing only either the public or the private key, in a process they called kleptography.

[3] Don Coppersmith, one of the DES designers who originally discovered differential cryptanalysis (the T-attack), showed there are issues with RSA public keys that give rise to a class of attacks: http://en.m.wikipedia.org/wiki/Coppersmith's_Attack

[4] A known defect in the RNG in several embedded products such as routers meant that one of the random numbers used to select one of the primes used to generate a PubKey for the device was of an extremely limited range, which opened up an associated-key attack [5].

[5] One academic paper showed how a shortcut could be used to discover whether a public key shares a common prime with another PubKey, thus making the reward for factoring just one PubKey rather more significant [6].

[6] However, the defect in the RNG acts as a shortcut in its own right due to the limited number of candidates for one of the two primes. This has the effect of putting 1024-bit RSA keys generated on these embedded devices very much within factorable range.

Alejandro March 17, 2013 9:03 AM

I think this topic is great. To achieve this goal, homomorphic encryption IS NOT necessary at all. For me the solution is client-side encryption, that’s all. If it’s an IMAP service, you only encrypt the body of the message; if it’s a storage service like Dropbox, you encrypt the whole data – think of it like a TrueCrypt container in the cloud that only you can open with your key. When you want to send information, you encrypt it on the client side and send it to the container.

Clive Robinson March 17, 2013 3:58 PM

@ Alejandro,

To achieve this goal homomorphic encryption IS NOT necessary at all

Only if you use the cloud for storage.

But what about when you want to use the cloud for computation?
