Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can’t break.
In 2004, Cory Doctorow called this Schneier’s law:
…what I think of as Schneier’s Law: “any person can invent a security system so clever that she or he can’t think of how to break it.”
The general idea is older than my writing. Wikipedia points out that in The Codebreakers, David Kahn writes:
Few false ideas have more firmly gripped the minds of so many intelligent men than the one that, if they just tried, they could invent a cipher that no one could break.
The idea is even older. Back in 1864, Charles Babbage wrote:
One of the most singular characteristics of the art of deciphering is the strong conviction possessed by every person, even moderately acquainted with it, that he is able to construct a cipher which nobody else can decipher.
My phrasing is different, though. Here’s my original quote in context:
Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can’t break. It’s not even hard. What is hard is creating an algorithm that no one else can break, even after years of analysis. And the only way to prove that is to subject the algorithm to years of analysis by the best cryptographers around.
And here’s me in 2006:
Anyone can invent a security system that he himself cannot break. I’ve said this so often that Cory Doctorow has named it “Schneier’s Law”: When someone hands you a security system and says, “I believe this is secure,” the first thing you have to ask is, “Who the hell are you?” Show me what you’ve broken to demonstrate that your assertion of the system’s security means something.
And that’s the point I want to make. It’s not that people believe they can create an unbreakable cipher; it’s that people create a cipher that they themselves can’t break, and then use that as evidence they’ve created an unbreakable cipher.
EDITED TO ADD (4/16): This is an example of the Dunning-Kruger effect, named after the authors of this paper: “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments.”
Abstract: People tend to hold overly favorable views of their abilities in many social and intellectual domains. The authors suggest that this overestimation occurs, in part, because people who are unskilled in these domains suffer a dual burden: Not only do these people reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the metacognitive ability to realize it. Across 4 studies, the authors found that participants scoring in the bottom quartile on tests of humor, grammar, and logic grossly overestimated their test performance and ability. Although their test scores put them in the 12th percentile, they estimated themselves to be in the 62nd. Several analyses linked this miscalibration to deficits in metacognitive skill, or the capacity to distinguish accuracy from error. Paradoxically, improving the skills of participants, and thus increasing their metacognitive competence, helped them recognize the limitations of their abilities.
EDITED TO ADD (4/18): If I have any contribution to this, it’s to generalize it to security systems and not just to cryptographic algorithms. Because anyone can design a security system that he cannot break, evaluating the security credentials of the designer is an essential aspect of evaluating the system’s security.