Eric Schmidt on Secrecy and Security

From InformationWeek:

InformationWeek: What about security? Have you been paying as much attention to security as, say, Microsoft—you can debate whether or not they've been successful, but they've poured a lot of resources into it.

Schmidt: More people to a bad architecture does not necessarily make a more secure system. Why don't you define security so I can answer your question better?

InformationWeek: I suppose it's an issue of making the technology transparent enough that people can deploy it with confidence.

Schmidt: Transparency is not necessarily the only way you achieve security. For example, part of the encryption algorithms are not typically made available to the open source community, because you don't want people discovering flaws in the encryption.

Actually, he's wrong. Everything about an encryption algorithm should always be made available to everyone, because otherwise you'll invariably have exploitable flaws in your encryption.
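Kerckhoffs' principle in miniature: a sound design keeps all of its secrecy in the key, so publishing the algorithm costs nothing. The sketch below is purely illustrative (a toy one-time-pad-style XOR, not anything Google or a real product would ship); a real system would use a vetted cipher such as AES.

```python
# The algorithm here is completely public; only the key is secret.
# Illustrative sketch only -- a real system would use a vetted cipher.
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR: the construction is trivial and fully
    # disclosed, yet secure when the key is random, secret, and never reused.
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption reuses the same routine.
    return encrypt(ciphertext, key)

message = b"attack at dawn"
key = secrets.token_bytes(len(message))
ciphertext = encrypt(message, key)
assert decrypt(ciphertext, key) == message
```

The point is that an attacker who reads this code in full learns nothing useful without the key; hiding the code would add nothing but a false sense of security.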

My essay on the topic is here.

Posted on May 31, 2005 at 1:09 PM • 10 Comments

Comments

Ithika • May 31, 2005 7:44 PM

I thought the last word on this matter was written in the nineteenth century (Kerckhoffs)? Surely people aren't still advocating "security through obscurity"?

Bruce Schneier • May 31, 2005 9:17 PM

"I thought the last word on this matter was written in the nineteenth century (Kerckhoffs)? Surely people aren't still advocating 'security through obscurity'?"

It's actually much more complicated than that. There is a place for obscurity, but it needs to be used carefully and sparingly. Read the essay I wrote on the topic.

Thomas Sprinkmeier • May 31, 2005 10:25 PM

Mentioning "open source" is just a red herring. He's talking about (not disclosing) algorithms, not source code.

Unfortunately this will get interpreted as "Even Google's CEO thinks Open Source is a security risk," and enough PHBs will believe it.

Chris Walsh • June 1, 2005 12:59 AM

Schmidt seems to have misspoken. Of course you want people to discover flaws. You just want "the right people" to discover them. The most charitable interpretation I can make is that Schmidt feels that Google's internal community of experts is better able to identify these flaws than the general interested community would be, and that Google's algorithms are thus analogous to the panic button in Bruce's essay. I think he's wrong on that.

Tommy Pirbos • June 1, 2005 2:18 AM

Regarding "security through obscurity", I think it's not a matter of either-or. Sometimes it's very good security indeed not to be seen or noticed. Regarding security-related software, I totally agree that transparency is a necessary thing, but that doesn't automatically mean that obscurity is bad in other areas of computing or life.

Bob McGrew • June 1, 2005 3:00 AM

In context, he's advocating security through obscurity for link-spamming and related issues, not encryption. (Though he's clearly wrong about encryption.)

And for things like link-spamming which are really fraud issues rather than security ones, he's right - there's no magic bullet the way that there is with cryptography. Concealing your strategy so that your opponent is uncertain of it will get you better payoffs.

logicnazi • June 1, 2005 4:51 AM

There certainly are places where some amount of secrecy may be quite valuable, even for cryptographic algorithms. For instance, if you are the NSA, your in-house analytic capabilities probably greatly exceed any contribution you might get from interested outsiders. Furthermore, knowledge of the algorithm may greatly aid attacking countries, and it is perfectly reasonable that this aid causes more harm than the free eyeballs gained by releasing it.


As a more pragmatic example, suppose you are going to make an irreversible commitment to an encryption algorithm, e.g., DVD encryption, and for some reason you can't use a well-tested off-the-shelf solution. Now if your algorithm is only available in hardware, obscurity will likely slow any attacker somewhat and extend the lifetime of the device encryption (some benefit is also had putting it in software). Conversely, releasing the algorithm to the general community is not likely to attract much aid before it is deployed (would *you* do some free analysis for the MPAA to help make their encryption better?). Since the commitment to the algorithm may be irreversible, I hardly see how finding out what the flaw is in the algorithm once it is deployed helps them.

bahaw • June 3, 2005 12:22 AM

Eric probably has reasons similar to why the US government didn't release the Skipjack algorithm. Poor Kerckhoffs.


