Schneier on Security
A blog covering security and security technology.
August 14, 2012
I'm late writing about this one. Cryptocat is a web-based encrypted chat application. After Wired published a pretty fluffy profile on the program and its author, security researcher Chris Soghoian wrote an essay criticizing the unskeptical coverage. Ryan Singel, the editor (not the writer) of the Wired piece, responded by defending the original article and attacking Soghoian.
At this point, I would have considered writing a long essay explaining what's wrong with the whole concept behind Cryptocat, and echoing my complaints about the dangers of uncritically accepting the security claims of people and companies that write security software, but Patrick Ball did a great job:
CryptoCat is one of a whole class of applications that rely on what's called "host-based security". The most famous tool in this group is Hushmail, an encrypted e-mail service that takes the same approach. Unfortunately, these tools are subject to a well-known attack. I'll detail it below, but the short version is if you use one of these applications, your security depends entirely on the security of the host. This means that in practice, CryptoCat is no more secure than Yahoo chat, and Hushmail is no more secure than Gmail. More generally, your security in a host-based encryption system is no better than having no crypto at all.
Sometimes it's nice to come in late.
EDITED TO ADD (8/14): As a result of this, CryptoCat is moving to a browser plug-in model.
Posted on August 14, 2012 at 6:00 AM
Not sure I understand. Does this mean that if the CryptoCat and Yahoo hosts are equally "secure", the encrypted system is still only as good as the unencrypted one?
It is pretty clear that a secure-by-design service with a vulnerability is certainly no better than a non-secure service without one (if such a thing exists - but given that, say, the OS could be the same, a vulnerability there would be shared between them). Is that the point here?
Do note that Cryptocat 2 will be strictly a client-side tool (an OTR+XMPP client).
Thoughts on a local/native extension/plugin or OTR/XMPP version of cryptocat(2)?
Especially given the author has opened it up to criticism/advice?
I've seen some interesting discussions on mailing lists involving the players of this little stage show. The discussions are subject-lined "What I learned from Cryptocat." They are the best thing to come out of this debacle.
At 1st Reader: the problem with so-called host-based security is that the encryption is terminated at the host - you could say the provider. So your traffic is unreadable to anyone intercepting it, but on the server (say, Gmail) anyone with sufficient permissions can read anything being stored.
"your security depends entirely the security of the host" does not equal "your security in a host-based encryption system is no better than having no crypto at all".
Am I missing something?
A trustworthy company with some financial resource can do a better job of securing a system than most individuals.
The big issue any company trying to sell a service like this has is establishing trust.
Stripping away the irrelevant gender-bias accusations at the beginning of Singel's piece, I thought he was making a rather nuanced point that has been missed by much of the attending discussion: absent a realistic threat model, there can be no serious discussion of the security of a system like Cryptocat.
If your model is "the government is out to read your mail", then no, of course you can't rely on something like this. And, of course, that is the maximalist threat that informs the thinking of people in high-surveillance countries with civil rights issues, as well as cryptosec fundamentalists.
But there is another valid model for other people, along the lines of "my abusive ex-husband/boyfriend is trying to stalk me". For these sorts of purposes, a less impregnable solution can be acceptable, particularly if it is part of a tradeoff that yields greater ease of use for people living under this sort of threat who are not as technically-savvy as Soghoian et al.
Too often, computer/crypto security is discussed in absolute 1/0 terms, a framework encouraged by theoretical research into cryptanalysis and software security, which yields categorizations of "secure/insecure" without reference to use. I thought Singel's piece was a refreshing corrective to this mindset.
While I agree that all applications that depend on host-based security suffer from the same foundational weaknesses, to say that Hushmail is "no more secure" than Gmail or that "security in a host-based encryption system is no better than having no crypto at all" is patently false and misleading.
Cryptocat, Hushmail, and other host-based security systems offer great security and fill a number of security needs when used for the correct applications. You have to match the threat with the countermeasure.
Among a host of possible users: a victim of domestic abuse could use these systems to keep her communications from being monitored by her partner, a teenager who doesn't want his mother reading his e-mails to his girlfriend could use them, and most corporate whistleblowers would be able to use these types of systems safely.
No one can offer perfect communications security in the online world, but don't throw out anything that doesn't come up to the level of "perfect". There is a security continuum and the users have to be aware of what level they need.
I agree some tools may have been over-touted in the press, but please don't let the perfect be the enemy of the good.
So what's the threat model that Hushmail or Cryptocat is successful against? It sounds as if it's essentially "has physical access to your machine but no application password" along with "does not have the resources to install a keylogger".
And maybe a side order of "will not retaliate upon finding a password-protected app for which he/she does not have the password."
The name CryptoCat is totally awesome at least. Everybody uses Jabber/XMPP with Off-the-Record (OTR) encryption anyway, for free.
Hey Bruce... remember that NSA executive who leaked their massive spying program? He claimed he used Hushmail, a "highly secure email service," to whistleblow. I was shocked anybody at his level in the NSA would think Hushmail is at all secure.
"A trustworthy company with some financial resource can do a better job of securing a system than most individuals."
Can - but often doesn't. How many big-name companies have lost unencrypted, or encrypted but unsalted, passwd lists recently?
Then there are all the people the company will quite legally share your data with: the government in any country where they do business and any country where they would like to do business, their own advertising department, the ad departments of any company they would like to do business with, and the thousands of employees in their third-world call center.
And finally everyone on the internet that discovers the zero day exploit on their servers before they do.
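The point above about unsalted password lists can be made concrete. A minimal sketch using only Python's standard library (the iteration count and file names here are illustrative, not a recommendation for any particular deployment):

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor; tune to your hardware


def hash_password(password, salt=None):
    """Derive a salted digest with PBKDF2-HMAC-SHA256."""
    if salt is None:
        salt = os.urandom(16)  # unique random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest


def verify_password(password, salt, digest):
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)


salt1, digest1 = hash_password("hunter2")
salt2, digest2 = hash_password("hunter2")

assert verify_password("hunter2", salt1, digest1)
assert not verify_password("wrong", salt1, digest1)
# Same password, different stored digests -- a leaked list cannot be
# attacked with a single precomputed table:
assert digest1 != digest2
print("salted, iterated hashes behave as expected")
```

The per-password salt is what defeats precomputed (rainbow-table) attacks on a leaked list; the iteration count is what slows down brute force against any single entry.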
I'm sorry Bruce, but Patrick Ball did no better at explaining, in my opinion. His article seemed to me a fluff piece for his own product and organization: Benetech, the makers of Martus.
If you remove the fluff, then his core argument could be reworded as:
"Host-based" services require you to trust the host every time you access it, whereas the "secure software" they've been "building for 20 years" at Benetech only requires you to trust the host once (when you download it).
If a user has no technical understanding/ability to verify the code they receive, then the only difference between these two scenarios is how often you download the code (assuming you never need to update Martus, of course).
If Benetech has been compromised before you start running their software, you've lost just as big as if Cryptocat is compromised.
Seems to me the article would have been better written as warning that if you don't understand crypto and review the code you're using, you can't have confidence in your own level of security.
Of course, then he wouldn't have had the opportunity to promote his own software as more secure than the latest thing that's getting mention.
No, Connie, eliminating the need to trust the site to maintain adequate security forever is a large improvement in security, even though of course you're still vulnerable to any flaws or back doors in the original program in either case.
As had been said before: Don't think in 0/1 security.
First, it's easier to write secure software if the information you try to protect is distributed on every user's computer and not stored in one place.
Second, as Joe has pointed out, you need to trust them forever, and not just at the moment you download the software (though of course if it needs to be updated regularly, then you need continuous trust there, too).
Third, security holes in end-user software are hopefully detectable (at least if it is used by a large user base) - even if not every individual end user checks his or her code - if the data flow is mostly peer to peer (for example, email). I'd not count the XMPP model secure if your data is sent to a preconfigured host, as it is in most cases easy and undetectable to use a weak random number generator or a weak group for your crypto.
And fourth: explicitly stealing the user's data from his computer is a much higher moral barrier than "stealing" it from servers the attacker already owns. Also, many legal systems would force a provider to hand over data they already have to law enforcement, but not to write insecure software and push it to the end user's PC.
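The weak-RNG point above is easy to demonstrate. A minimal sketch in Python, where the seed stands in for anything an attacker can guess (the specific seed value is hypothetical):

```python
import os
import random

# An application that derives a "session key" from a predictable PRNG,
# seeded with something an attacker can guess (e.g. a timestamp):
guessable_seed = 1344902400
weak_key = random.Random(guessable_seed).getrandbits(128)

# The attacker reproduces the key by re-running the same deterministic
# generator over the small space of plausible seeds:
recovered = None
for seed in range(guessable_seed - 5, guessable_seed + 5):
    if random.Random(seed).getrandbits(128) == weak_key:
        recovered = seed
        break

# Key recovered without breaking any cryptography at all:
assert recovered == guessable_seed

# A CSPRNG offers no such handle -- its output is not reproducible
# from any guessable starting state:
strong_key = int.from_bytes(os.urandom(16), "big")
print("weak key recovered from guessed seed; strong key drawn from os.urandom")
```

This is also why the weakness is so hard to detect from outside: the ciphertext produced with a weak key looks exactly like ciphertext produced with a strong one.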
"His article seemed to me a fluff piece for his own product and organization: Benetech, the makers of Martus."
Reading your statements, I would have thought you meant he had a big financial conflict of interest selling some product. I just looked it up, and their main offering is free. He also notes it has a lot of uptake in the very types of places the Wired article promotes Cryptocat for. The take from that: he has experience in real-world usage of humanitarian technologies, and his conflict of interest is smaller than most.
I'll pass on offering arguments about the technical stance as others have replied to you.
@Paul "So what's the threat model that Hushmail or Cryptocat is successful against? It sounds as if it's essentially "has physical access to your machine but no application password" along with "does not have the resources to install a keylogger"."
How about "has control over intervening network segments between you and the host, but no control over the host"? Basically, the same as any SSL-based service.
Wow. Interesting to see 'host-proof hosting' -- a phrase that my colleagues and I at eVelocity coined more than a decade ago -- pop up in Patrick Ball's article. But as he points out, being truly host-proof was more of a concept than a reality and I believe that's still the case.
@ Bruce Schneier
"Sometimes it's nice to come in late."
As a side benefit, you can promote your site & publish the good information without Singel accusing you of being sexist for criticizing a tool. ;)
(I had a nice laugh at that little tangent.)
And to clarify my above comment, I agree with Patrick Ball that 'host-proof applications' is a better description of the security goal.
And to correct the second comment... that wasn't the Patrick Ball article that used the phrase 'host-proof applications'. In the words of the immortal Emily Litella, never mind. ;-)
Your arguments are unconvincing. I have never come across a piece of software (especially networked security software) that did not require constant updating. So again, the only difference is the frequency with which you trust the provider.
Furthermore, your argument that it is somehow more illegal or infeasible for an agency to force a provider to push a modified version of software to a specific user is just plain confused - the danger of using CryptoCat is supposedly exactly that!
I make no claim regarding his financial interest in his organization, which I'm sure you don't have a good handle on either. Regardless, he definitely appears interested in promoting it and himself as its creator. It's typical contentless web PR: he talks more about himself and his creations than the supposed subject of the article.
I agree it's an improvement to have to review code pushed to you less often than every time you want to use it, but pretending that downloadable software is somehow vastly different and inherently more secure is a false dichotomy. It can still have a backdoor, even one created just for you and only in your copy.
The obvious attack vector in this discussion is the "legal attack." No host-based system has any resistance to this type of attack. And these days, legal attacks don't just include lawful intercepts or wholesale data gathering by "big surveillance." A civil lawsuit over IP theft or an alleged breach of a company confidentiality agreement could compromise the system. Or a legal attack could be mounted by the entertainment industry if they decide this service benefits content pirates.
Unfortunately, it's hard to get away from having to trust a service provider. There's nothing preventing them from one day pushing a legitimate, signed update to you that silently leaks your keys. Unless the service supports an ecosystem of open-source clients with a peer-reviewed code base and transparent distribution, I don't see how any cloud service can ever promise any sort of protection from other than in-flight interception or casual snooping.
The summary by Ball is very good, and no one should trust these services if they are up against a competent adversary (and for the domestic example, it's hard to see what this provides, in *reality*, that another Gmail account doesn't).
Important: it's all well and good to argue semantics and details and threat levels and whatnot. But in this case, even if you meant well and just got misunderstood, recommending these services can get people *killed*.
Just think about that.
I cannot agree with Patrick Ball's generalization. Yes, it is certainly a class of applications that rely on "host-based security," but proper use has a lot to do with the security effectiveness.
To my knowledge, prior to 2005, Hushmail published their original source code and compiled applet for purposes of both peer review and strong hash verification.
Proper use would mean trusting the original applet code download through out-of-band confirmation or digital signature verification. The locally stored and trusted applet could be used, or a newly served applet could be verified against a trusted original. (The original Cryptocat download of a 'plug-in' will require out-of-band verification as well.)
The real question is whether all of this becomes any easier than just using your laptop with PGP or carrying around your own PGP-install USB device.
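The out-of-band verification described above boils down to a hash comparison. A minimal sketch in Python; the file name, its contents, and the digest channel are hypothetical placeholders:

```python
import hashlib


def sha256_of(path):
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


# Stand-in for the downloaded applet/plug-in:
with open("cryptocat-plugin.bin", "wb") as f:
    f.write(b"example plug-in contents")

# The expected digest must arrive out of band (read aloud over the
# phone, printed on paper, or published via a separately trusted
# channel) -- if it comes from the same host as the download, the
# check adds nothing:
expected = hashlib.sha256(b"example plug-in contents").hexdigest()

if sha256_of("cryptocat-plugin.bin") == expected:
    print("digest matches: this copy is the one that was published")
else:
    print("digest MISMATCH: do not run this copy")
```

Note that this only pins the download to whatever was originally published; it says nothing about whether that original code was itself trustworthy, which is the reviewer's separate problem.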
The security of CryptoCat is comparable to that provided by any client-side cryptography tool.
For example, GnuPG, TrueCrypt, and CryptoCat2 all run on your PC, with local code, interacting with the network only with encrypted data.
The data received from or sent to the network cannot subvert the application logic, because the application logic exchanges only sanitized data with the network.
By using a browser plug-in as the default choice, and possibly web-delivered code only for demo purposes, the level of security would be the same as Enigmail with GnuPG.
a) You download the encryption program
b) You use the encryption program
c) The encryption program communicates with the network by sending/receiving encrypted data
That's the future.
Imho we should consider that the security of downloading GnuPG from their website or installing CryptoCat Plug-in is the same.
So the security of the software delivery should not be put in doubt.
W3C is also working in that direction by providing native crypto APIs to facilitate implementation of JS crypto.
Fabio Pietrosanti (naif)
@ Fabio Pietrosanti
"For example GnuPG or TrueCrypt and CryptoCat2 all run on your PC, with local code, interacting with the network only with encrypted data."
"a) You download the encryption program
b) You use the encryption program
c) The encryption program communicates with the network by sending/receiving encrypted data"
Not quite. Cryptocat works with the browser. So, you need the browser itself to use it. You may also have to consider other browser extensions in the TCB, esp as people keep finding sandbox bypasses & attacks. This isn't the case with a standalone app.
Additionally, a standalone app can be further isolated with existing & future security technologies. The various separation kernels & approaches like Perseus Security architecture like to make security-critical code run side-by-side with untrusted, virtualized (or wrapped) code. Some designs even offer a trusted path or more robust GUI system. A standalone app, with minimal platform dependencies, can be run on one of these. It's also easier to use such an app with a tech like Flicker's trusted execution or user-mode Vx32 VM.
I invite all cryptographers of the future to not underestimate the power of WWW complexity to diminish the assurance of crypto products and nullify our ability to protect them with the fruits of defensive R&D.
"Imho we should consider that the security of downloading GnuPG from their website or installing CryptoCat Plug-in is the same. "
Closer to the truth. However, checking an executable might be as simple as verifying a signature. Installing a plugin requires trusting the browser too. Or handling plugins in non-intuitive way, defeating that advantage. Most WWW versions of security-critical apps essentially say "let's take this tool, put a pretty HTML/CSS interface on it, & throw a whole browser in its TCB." Seeing as browsers are regularly DEFCON fodder, it's hard to have as much confidence in the idea.
Definitely. The browser and browser-based security attempts won't go away. Chrome's NaCl sandbox is the best of them. I'd like to see continued improvements in crypto primitives, isolation properties, & ability to enforce POLA. Might become a good platform one day. The alternative is to make portable apps with simple installation & use. People are doing that, too, though. So, we both get what we want. ;)
Admittedly, the Quinn Norton article in Wired was completely over the top and made claims Cryptocat's author, 21-year-old student Nadim Kobeissi, never made himself. So was Ryan Singel's subsequent attack on Christopher Soghoian's criticism. If it weren't for Patrick Ball's far more nuanced story, I think I would pretty much have given up on the entire Wired Threat Level section altogether. It's not the first time they totally lose perspective and prefer sensationalism over facts, in exactly the same way Stratfor regularly does.
That said, I propose we all give Kobeissi a break and consider Cryptocat for what it is: a work in progress by a young man with his heart in the right place who is very open to constructive criticism. His moving away from the host-based browser model is a fine example thereof. I for one am happy that the guy is using his time and talents for this sort of thing instead of having himself enlisted by some financial institution where he could make a lot of money inventing the next complex mathematical algorithm for derivatives that will eventually crash the markets.
As suggested by other commenters, there is little point in thinking in black and white only. Applications like Hushmail, Cryptocat, Enlocked, and the like do have added value for some types of communications, as long as their users understand both their strong points and their weaknesses. A VW Golf may be a very interesting city car, but it may not get you anywhere on a Formula 1 racing track. Conversely, you don't need a Ferrari just to go shopping.
sorry to come in late, was offline for a while.
-- Martus is published under the GPL, it's free ($0) and libre. Benetech is a 501(c)(3) nonprofit; the various documents related to that are available online. My colleagues and I earn salaries, but we don't make anything from Martus, if that's your concern.
-- Backdoors: certainly users should confirm that the software they're getting is the software we released. For Martus, there's plenty of digital signature machinery built in to enable users to do that.
-- Could it be spoofed? Maybe, but it's not simple. Probably malware in Java would be the easiest attack on our model: if the user's computer is compromised in this way, there's not much we can do. Or users could build the jar themselves. Not easy, but entirely doable; we've heard of users doing it. We've done a lot of work on our build process recently to make it easier.
-- I disagree with you: I think that trojaning an easily authenticated installable is a lot harder than attacking a per-session downloadable web app. More to the point, in server-based security, the server holds your key, so they must know who you are when you use their system, which makes a targeted attack easy. In an installable like Martus, pidgin/OTR, or gpg, the developers have no idea who you are.
-- In practice, we've learned that most of our users tend to pass around CDs or jar files on USB for updates, rather than download, so it would be hard for us to tailor a trojan for a particular user. And a big part of our training is how to authenticate our jar, which we find our users enjoy.
-- Security updates: since we published v1.0, we've not had a security-related bug. We've had problems with various internationalizations (Martus runs in languages like Arabic, Thai, Burmese, Nepali, Russian). We release approximately quarterly with new features. Our complete release history is available online.
-- I used the Martus example to make the point that low-resource users can benefit from serious security, and we've been doing it for over ten years. The claim by some journalists that real security is too hard is correct, but my colleagues and I think the right approach for low-resource users who have little attention for security is to build real security into applications. So that's what we do. CryptoCat's research is interesting, but presenting what is essentially a research project as though it's secure software is not OK.
hope this clarifies.
1) Security experts applaud the migration of Cryptocat to a browser-extension model. What is there to celebrate? Now, in order to install a Chrome browser extension... you are required to have a Google account!
2) Even if this new Chrome requirement to establish a single, centrally monitored identity across all computers which you use wasn't bad enough by itself... Google sabotages new account creations for people who aren't easily trackable. (I have never managed to create a Google account using Tor + disposable e-mail address for years. The account disappears. Or other things happen.)
3) There is no mention of this significant fact on the Internet. You can trawl Google in vain. This fact has been mentioned only one time ever... and where? On 4chan! http://www.webcitation.org/6Abky8Bks
Unnatural absence of information.
4) The whole "host-based security" dispute looks like rubbish and attention diversion. Has everyone forgotten that you CAN download the cryptocat server code ONLY ONCE and install it on your own computer FOREVER? It then becomes "client-based security" effectively.
5) I'm not any security expert but I believe that HTTPS vulnerability is avoided by using Tor hidden service... If someone cares about that, he can install Tor Browser. No need to massacre cryptocat.
6) Security experts have had over 20 years (since the inception of IRC) to create a simple multi-user chat application with client-side encryption, so that no server admins could play God. (OTR is not multi-user.) Apparently there was no will to. The few attempts to create it have been subverted just like Cryptocat. You can hear a lot from them about how bad it is... one thing you won't hear from the same people is how to make it better. Security experts DON'T WANT security for the average Joe, only such security as is practical for military-grade applications... and impractical for everything else. They seem to enjoy the vulnerability of the masses. Even server-side encryption such as IRC-over-SSL has been introduced with great reluctance, as if in support of ISP surveillance. They relish the cleartext transparency of IRC, and Freenode's requirement to link your identity to a reputable IP and/or reputable e-mail. Corrupt bastards, not experts.
7) I'm no security expert (LOL, disclaimer) but it amazes me how these simple things "evade" the experts' attention.
8) The whole thing (the rollercoaster of positive and negative publicity) smells like a conspiracy to intimidate the young creator of Cryptocat into tearing his idea apart and saving the online chat status quo. (Reminiscent of the fatal pressure that another young idealist, Ilya Zhitomirskiy, was subjected to when he initiated Diaspora, an open-source alternative to Facebook, just before the Facebook IPO drive.) Chat software where you are (effectively) required to log into Google and be fully trackable by the overlord company is fundamentally different from the original idea of Cryptocat. I think that Cryptocat has been murdered.
9) You can't get friends in real life when your passions aren't typical enough. You must find them on-line. But, without safety from "casual snooping"... you just MUST NOT do so. The consequence of this little fact is very big: no alternative communities can emerge in the world until home-like level of privacy becomes widespread on-line. Privacy available only for Unix experts is not very useful in changing the world. The world order architects are aware of this, aren't they?
P.S. "Comment Blocked. Your comment was blocked from appearing on the site. This may have happened for one of the following reasons: * You posted without typing anything in the name field, or you simply typed 'Anonymous.' Please choose a name or handle to use on the blog and try again. Conversations among several people all called 'Anonymous' get too confusing."
I follow this dispute with great interest. So far I have learned that:
1. A possible compromise of the trusted host that delivers the JS code would be catastrophic. So, my mom should google, download, and install some ProtectMyCreditCard.exe locally, in addition to all those cool browser toolbars and tray icons. If something goes wrong, she will notice it much faster than we would find a backdoor in public JS code.
2. The inability to prevent an SSL MITM attack (with a CA likely owned by a "repressive government") makes Cryptocat insecure. Therefore, usage of SSL is considered harmful.
Now seriously: instead of criticizing the obvious weaknesses of niche products, how about offering a good-enough solution to the problem? The best we have is still X.509 and DNS. Honestly, this is not what I expect from the cryptography of the computer era.
I don't trust Google [properties] per se: they profit from user harm. "Do no evil" is comically conflated with proactive good. As the naive majority dissolves in wisdom, so fades Google into obsolescence. If only that could be effected rapidly. If Google is serious about their alleged non-evil, they would deploy mpOTR and ZRTP.
It is less than amusing that a plug-in boasting privacy has a Google e-mail prerequisite. No such thing is required for Firefox installation.
There needs to be a middle ground for Chromium users who are not compile-inclined. Trusted third-party compilers?
what of federation for others who deploy cryptocat2? selective federation?
S2S whitelists? pgp style trust?
JS vs JOSE?
(the original acronym, WOES, like bacon is more pleasing)
Kudos to cryptocat2 for integrating mpOTR before jappix, jwchat, jaxL, or sparkweb, and in far less time.
There is a chatting application called TorChat whose security and anonymity is based on Tor hidden services. Is there any review anywhere that would evaluate the application?
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.