Schneier on Security
A blog covering security and security technology.
December 7, 2005
Snake-Oil Research in Nature
Snake-oil isn't only in commercial products. Here's a piece of research published (behind a paywall) in Nature that's just full of it.
The article suggests using chaos in an electro-optical system to generate a pseudo-random light sequence, which is then added to the message to protect it from interception. Now, the idea of using chaos to build encryption systems has been tried many times in the cryptographic community, and has always failed. But the authors of the Nature article show no signs of familiarity with prior cryptographic work.
The published system has the obvious problem that it does not include any form of message authentication, so it will be trivial to send spoofed messages or tamper with messages while they are in transit.
But a closer examination of the paper's figures suggests a far more fundamental problem. There's no key. Anyone with a valid receiver can decode the ciphertext. No key equals no security, and what you have left is a totally broken system.
I e-mailed Claudio R. Mirasso, the corresponding author, about the lack of any key, and got this reply: "To extract the message from the chaotic carrier you need to replicate the carrier itself. This can only be done by a laser that matches the emitter characteristics within, let's say, within 2-5%. Semiconductor lasers with such similarity have to be carefully selected from the same wafer. Even though you have to test them because they can still be too different and do not synchronize. We talk abut a hardware key. Also the operating conditions (current, feedback length and coupling strength) are part of the key."
Let me translate that. He's saying that there is a hardware key baked into the system at fabrication. (It comes from manufacturing deviations in the lasers.) There's no way to change the key in the field. There's no way to recover security if any of the transmitters/receivers are lost or stolen. And they don't know how hard it would be for an attacker to build a compatible receiver, or even a tunable receiver that could listen to a variety of encodings.
This paper would never get past peer review in any competent cryptography journal or conference. I'm surprised it was accepted in Nature, a fiercely competitive journal. I don't know why Nature is taking articles on topics that are outside its usual competence, but it looks to me like Nature got burnt here by a lack of expertise in the area.
To be fair, the paper very carefully skirts the issue of security, and claims hardly anything: "Additionally, chaotic carriers offer a certain degree of intrinsic privacy, which could complement (via robust hardware encryption) both classical (software based) and quantum cryptography systems." Now that "certain degree of intrinsic privacy" is approximately zero. But other than that, they're very careful how they word their claims.
For instance, the abstract says: "Chaotic signals have been proposed as broadband information carriers with the potential of providing a high level of robustness and privacy in data transmission." But there's no disclosure that this proposal is bogus, from a privacy perspective. And the next-to-last paragraph says "Building on this, it should be possible to develop reliable cost-effective secure communication systems that exploit deeper properties of chaotic dynamics." No disclosure that "chaotic dynamics" is actually irrelevant to the "secure" part. The last paragraph talks about "smart encryption techniques" (referencing a paper that talks about chaos encryption), "developing active eavesdropper-evasion strategies" (whatever that means), and so on. It's just enough that if you don't parse their words carefully and don't already know the area well, you might come away with the impression that this is a major advance in secure communications. It seems as if it would have helped to have a more careful disclaimer.
Communications security was listed as one of the motivations for studying this communications technique. To list this as a motivation, without explaining that their experimental setup is actually useless for communications security, is questionable at best.
Meanwhile, the press has written articles that convey the wrong impression. Science News has an article that lauds this as a big achievement for communications privacy.
It talks about it as a "new encryption strategy," "chaos-encrypted communication," "1 gigabyte of chaos-encrypted information per second." It's obvious that the communications security aspect is what Science News is writing about. If the authors knew that their scheme is useless for communications security, they didn't explain that very well.
There is also a New Scientist article titled "Let chaos keep your secrets safe" that characterizes this as a "new cryptographic technique," but I can't get a copy of the full article.
Here are two more articles that discuss its security benefits. In the latter, Mirasso says "the main task we have for the future" is to "define, test, and calibrate the security that our system can offer."
And their project web page says that "the continuous increase of computer speed threatens the safety" of traditional cryptography (which is bogus) and suggests using physical-layer chaos as a way to solve this. That's listed as the goal of the project.
There's a lesson here. This is research undertaken by researchers with no prior track record in cryptography, submitted to a journal with no background in cryptography, and reviewed by reviewers with who knows what kind of experience in cryptography. Cryptography is a subtle subject, and trying to design new cryptosystems without the necessary experience and training in the field is a quick route to insecurity.
And what's up with Nature? Cryptographers with no training in physics know better than to think they are competent to evaluate physics research. If a physics paper were submitted to a cryptography journal, the authors would likely be gently redirected to a physics journal -- we wouldn't want our cryptography conferences to accept a paper on a subject they aren't competent to evaluate. Why would Nature expect the situation to be any different when physicists try to do cryptography research?
Posted on December 7, 2005 at 6:36 AM
• 63 Comments
You know, it's scary that these people can be so inept and still manage to get their "research" published. I have to find and read journals every day for uni work, security papers included. Is it the norm for such easily discredited research to be published? Makes me wonder how many papers I've read in the past were as fundamentally flawed as this one.
Posts like this one are the main reason I read this blog (and Cryptogram). Snake oil, especially if it is research, should always be debunked.
These types of posts are why I keep coming back to your weblog. Thanks.
This article by Spanish scientists in Nature is hardly the first one on this subject, according to some Googling. Science published some Georgia Tech research in 1998. There is a long list of physics papers on this, in all kinds of high-end scientific journals, which is linked from the Spanish site. Surely somebody in the crypto community has come across this? Below is a link on what Georgia Tech published in Science in 1998, with an explanation of what they have done. The Nature study is funded by European Union money, whereas the Georgia Tech work was funded by NSF and the Office of Naval Research.
Will be interesting to see who will win this debate. I tend to side with Bruce now...
The New Scientist article about the same topic (I am a subscriber) is no better than your description of the article from Nature. It left me with an impression that the author is a genuinely well-meaning and curious human being trying to report on a topic about which he or she is completely clueless.
I think people should be taught serious cryptography and security courses, not in university, but already in high school.
I think only widespread security courses like that in high school would seriously impact the proportion of clueless security thinking we're seeing.
Aside from Nature entering the security game for dubious but obvious reasons, these folks are caught in the same research economy as the rest of the US: needing to do basic and applied work all at once, and needing to publish.
The problem inherent in this "economic" pressure toward applied work at the expense of basic work is very well illustrated here, as the researchers may not have spent much time on the basic question of whether this approach is really interesting or not, subject to the interception problem (instantaneous interception by a variable or tunable device, or storage and exhaustive search).
Doesn't everyone go through the experience of creating an "unbreakable" crypto system that is fundamentally flawed? I know I did. I was 12 at the time and have had a long time to learn how much I didn't and still don't know.
The goal of cryptography is to transform a message into unintelligible gibberish. This is the level of understanding most people have. Without an understanding of how to attack cryptosystems, it's easy for someone to create something that they themselves cannot break. I think that is the case in this article.
As technical people, we tend to miss the point on things like this sometimes. Good technical solutions rarely will stand on their own merits. On the surface, it may look like a solution to a technical problem, but it's likely being driven by politics more than a need for a solution.
It might be worth sending your summary to the author in case they are truly interested in improving communication security. However, I wouldn't be surprised if your input on this was completely disregarded. I would bet that they have never heard of you and would not be swayed by a suggestion that they read some books on cryptography. It's likely that someone funded this research and a positive outcome results in more funding.
There's a subtle difference between falsifying data, creating real data that can be used out of context, and failing to ask pertinent questions. I see all three as basically the same - ways to avoid the truth. However, I've noticed in the business world, using data out of context and failing to ask questions are extremely common. I wish I could say academia was different, but I really don't think so.
So the abstract says that it is a communications system which is private and robust. As we can see, it's no more "private" than any other tappable line. However, I'd also like to take issue with the claim that it's robust. It sounds to me like the system relies on subtle variations in manufacturing conditions combined with "the operating conditions (current, feedback length and coupling strength)". It would seem that minor variations in environmental conditions at either end of this communications link could cause the system to fail. That does not sound, to me, like a particularly robust system.
Perhaps it's very secure because it doesn't work very often and so no one uses it.
Physicists are brought up in a culture that thinks EVERYTHING is physics; therefore, a physicist can opine and do research on ANYTHING with no further training. In general, I have a great deal of respect for physicists, but I've noticed this culture of arrogance many times.
BTW, economists and lawyers are embedded in similar cultures. Neuroscientists are getting that way, and evolutionary biologists, too. I'm a mathematician, and our culture is ignorance of the real world, not arrogance about our knowledge of it. Go team!
I guess that the underlying reason for this kind of snake oil is that physicists are taught to think that they know something about everything, especially in science. This leads to arrogance, as described above.
To be fair, Nature articles are often very, very theoretical. The fact that any real implementation of this "encryption" technique would be useless for security is less relevant to Nature's audience than the fact that these guys managed to exploit a new chaotic system.
Similarly, it's no surprise the recent article on honeypots required a few unimplemented details like an automatic virus detector and a patch distribution mechanism: those sorts of things belong in an engineering-oriented journal, or at least more applied science.
The point of these papers is not to explore any concepts that might get used in security software or hardware. The point is to explore fundamental concepts of the universe. Perhaps the recent article on virus detection will come in handy for building honeypots for biological detectors. Perhaps lasers that emit chaotically varying intensity will be useful as pseudorandom number generators in some sort of environment where normal electronics don't work. Perhaps neither paper will be useful at all, ever.
This is science as pure as it gets. It's disconnected from people's actual needs; that's basically the whole point. If you don't like it, read Spectrum or Dr. Dobb's Journal.
Thank heavens that someone is keeping track of absurd 'research' such as this. Keep up the good work!
Heard about the "Sokal Affair"?
The big difference is that Alan Sokal on purpose submitted a hoax article to test the quality of the review process and hence the credibility of the authors and publishers.
Everything IS physics, what isn't understood is that there is more to some things than /just/ physics.
I think that Nature certainly has a justification for printing this, or more accurately articles like this. There are interdisciplinary topics that span physics and cryptography. I give you quantum cryptography, for instance. There's a subject that is interesting to cryptographers, and yet most probably wouldn't have been conceived without some physicists. This article may have good physics in it (I don't know, I'm not a physicist), and I think that cryptography papers of such an interdisciplinary flavor have a fine place in such a publication. I think that Nature's only mistake was in not having a competent cryptographer among the people who reviewed the article. And it's always possible that they thought they did.
well, they weren't cryptographers and they got in a little over their heads, but...
that's the global trend now, writing about and doing things outside/beyond one's competence. exhibit "a": the bush administration.
I wouldn't go so far as to say that everything is Physics, therefore the Nature people feel competent. But the basic principle *is* physical -- after all, it's got lasers and interference and stuff. ;-)
So give them the last year of your blog for reading over the holidays. If they're Real Scientists, they'll publish a correction. (One should hope.)
Security courses in high school?
The elements of things should be learned in elementary school. By second or third grade kids can read well enough to understand the idea of a message, a garbled message, an obscured message, a hidden message, and so on. They can understand the concepts of alphabets per se, of substitution, of transposition. (At this age this material would be new and fun, and tricky!)
Think of the age group targeted by TV with magic decoder rings.
The early years are when kids should learn about security, about how securing is done and undone. (Bruce is thinking of writing a book for kids.)
Obviously, the article authors missed out on this stuff when they were kids.
"BTW, economists and lawyers are embedded in similar cultures."
I'm starting to believe the economists.
Well Bruce, I think you have missed the fairly obvious here. I mean, it is quite clear to me what they mean by "developing active eavesdropper-evasion strategies"
No doubt they are going to use rotating polarizing filters to encrypt the photons carrying the message stream. If I was to guess, I'd think they'd probably put it through 13 rotational sequences.
I wonder what they'll call that...
"So give them the last year of your blog for reading over the holidays. If they're Real Scientists, they'll publish a correction."
I sent the corresponding author the link to this thread. He may yet comment here.
"This is research undertaken by researchers with no prior track record in cryptography, submitted to a journal with no background in cryptography, and reviewed by reviewers with who knows what kind of experience in cryptography."
... and of course, propagated by journalists with no damn clue about anything. *sigh*
Journalists are in the same company as physicists, lawyers, and economists. When all you really know is journalism, everything looks like a story. And all stories look pretty much alike.
So, really, is this guy claiming to be able to do anything more than fingerprint lasers and then speculating wildly on what you could do with that ability?
The physicist I am most impressed by is the guy who first managed to produce a Bose–Einstein condensate - when asked what it was good for, he said something like, "When they first invented lasers they said they'd be the weapons of the future. Nobody mentioned eye surgery or information storage. So I'm just going to say I don't know what it's good for."
Being a physicist and an active researcher in the field of quantum cryptography, I regularly see this kind of snake-oil research, some of it published in very prestigious journals like Physical Review Letters, but it's a situation which is difficult to fight. We are lucky enough to have both physicists and cryptographers working on quantum cryptography, and this limits the problem. However, it's still difficult to say publicly that the direction followed by some teams is completely bogus, even if many people agree...
Bruce, it was courteous of you to contact the author first, but if he does not address your criticisms I hope you will submit a letter of your own to Nature. Mistakes in the literature of science are inevitable, but to a large degree what makes the system work is self-correction: anyone can load an error into the knowledge base, but anyone can fix it too.
"I think people should be taught serious cryptography and security courses, not in university, but already in high school."
Having T.A.'d pre-calculus courses in college, I'd have to say that you'd need to improve the general mathematics background of most high-school students to teach them even something simple like RSA. As for myself, I didn't encounter serious modern cryptography until graduate school.
"(Bruce is thinking of writing a book for kids.)"
I'll buy it. Two copies, one for each sibling. They'll use it to get things by me, but that will give me a motivation to improve my own understanding as well.
Crypto Book for kids...excellent idea.
I would buy it for mine the second it came out.
When you are teaching security to youngsters, you can use simpler systems to show what works and what doesn't, and try to teach general principles. I think you could explain the basics of RSA to middle school kids, as they should have had enough math by then. It is an interesting practical application of number theory.
Also, it would be nice to teach people that you can't trust client software to be secure when you don't totally control the machine it runs on. I still remember a PeopleSoft security trainer giving me an odd look when I explained that their security for two-tier connections was fundamentally broken, and then saying that the connection was encrypted so it must be secure. The fact that the software that knew how to do the decryption was running on a machine controlled by the person it was supposed to be defending against didn't register with her.
"I think you could explain the basics of RSA to middle school kids, as they should have had a enough math by then." - Anonymous
I'll guess that your country (or time period of education) has better early math education than mine; I attended one of the better pre-college math programs in the U.S., and I never formally studied anything using a modulus until a graduate-level math course (which taught that as part of the foundation for RSA - only about 1/2 the students had already covered it previously in their educations).
Now mechanically, you're correct- multiplication and remainders were grade-school material for me. However, I think that to teach RSA properly, you'd also want to teach:
1) Why it's secure.
2) What you'd have to do to make it insecure.
3) How to attack it.
All of which require math theory that I wouldn't have had a basis for until quite a bit later in my education.
In my experience, naive RSA is exactly the right amount of little knowledge to be a dangerous thing.
RSA, as first published, was hopelessly insecure. It wasn't until good padding schemes were invented that RSA became secure. I keep meeting people who understand the mathematics of RSA itself, and don't realize how much there is still to learn about the security of RSA. Understanding the proofs of security in the random oracle model that accompany OAEP+ or PSS is a much harder challenge - so hard that flaws in the proof of the predecessor OAEP went un-noticed for years.
If you want to teach people about crypto, please don't start by teaching them RSA, even though this seems to be the fashionable thing to do. Or at least teach Rabin, which is superior in every way.
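For concreteness, here is what "naive RSA" looks like, and two of the classic reasons unpadded textbook RSA is insecure. This is a toy sketch with tiny primes of my own choosing, not anything from the paper under discussion:

```python
# Toy "textbook" RSA with deliberately tiny primes -- illustrative
# only; real keys use primes hundreds of digits long.
p, q = 61, 53
n = p * q                  # public modulus, 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: e*d = 1 (mod phi)

def encrypt(m):            # anyone can do this (public key)
    return pow(m, e, n)

def decrypt(c):            # only the key holder can (private key)
    return pow(c, d, n)

m = 42
c = encrypt(m)
assert decrypt(c) == m

# Weakness 1: determinism. The same plaintext always produces the
# same ciphertext, so an eavesdropper can spot repeated messages.
assert encrypt(m) == c

# Weakness 2: malleability. Knowing only c and the public key, an
# attacker can forge a valid encryption of 2*m without decrypting.
forged = (c * encrypt(2)) % n
assert decrypt(forged) == (2 * m) % n
```

Good padding schemes like OAEP exist precisely to kill both properties, which is why "I understand the math of RSA" is not the same as "I understand the security of RSA."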
Going back a few years to when I was an astronomy research student, Nature already had a problem. Primarily a life-sciences journal, they had a fixation with publishing 'sexy' research from other disciplines. A sub-editor even came into our department to solicit the submission of 'sexy' astronomy. For some reason, researchers in those disciplines play along. Probably because if you get into Nature, you get into the news feeds and you can play 'famous scientist'.
"Crypto and lasers in one article? Hold the front page!"
"I never formally studied anything using a modulus until a graduate-level math course"
Ask a little kid two questions:
1. If today is Monday, what day will it be eight days from now?
2. If it is 9:00 now, what time will it be four hours from now?
Even if the kid has to look at an analog clock face or a wall calendar (both being modular meters and calculators), or count on his fingers, he is demonstrating a correct understanding of modular arithmetic.
When the concepts are formally introduced years later, the kid is lost. Obviously it is not the kid that's stupid, it's the teaching.
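The commenter's point translates directly to code: the clock face and the wall calendar are modular calculators, and the formal notation is just a relabeling of what the kid already does. A minimal sketch:

```python
# The clock face and the wall calendar as modular calculators.
days = ["Monday", "Tuesday", "Wednesday", "Thursday",
        "Friday", "Saturday", "Sunday"]

# 1. Eight days after Monday: arithmetic mod 7.
assert days[(days.index("Monday") + 8) % 7] == "Tuesday"

# 2. Four hours after 9:00: arithmetic mod 12.
hour = (9 + 4) % 12
assert hour == 1    # 13:00 on a 12-hour face reads 1:00
```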
Classy post. This makes me glued to your blog all day. Kudos Bruce!
Re previous posters:
I didn't mean to imply that school students should necessarily know or understand the full details of how RSA or any other such scheme works. Hey, I can't claim such depth of knowledge either. I think they should be taught the critical lessons of thinking like a security analyst - or an attacker - so that they will be able to recognize bogus security arguments when they see them, and so that, at the very least, they will get an idea of how much they DON'T know.
People who haven't had a chance to learn these lessons will (1) not detect bogus and harmful security arguments, and will perhaps even follow them, and (2) will think that security is trivial and that they are fit to design and evaluate a security design, which they most certainly aren't.
I concede the point that I used something like a modulus many times, from around 1st grade on. As another example, I knew what remainders were.
However, I did not use modular arithmetic as a formal structure until far later in my education. Specific basic questions I couldn't have answered:
In mod 8, what is the additive inverse of 2?
In Mod 12, what is 3*8?
In Mod 60, what is the multiplicative inverse of 12?
However, whether I was taught this or not is largely irrelevant. In my opinion, to teach any cryptosystem, you need experience in creating and attacking proofs. My training in proofs barely started in Middle school, and only became reasonable in High school, where I had a number of largely proof-based courses. My understanding is that (at least in the U.S.A.) many students don't have proof-heavy courses until college. The bit of T.A.ing I did as a graduate student appeared to confirm that.
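For the record, the three formal questions above can be answered mechanically once the concepts click. A sketch (note that `pow(x, -1, m)` for modular inverses needs Python 3.8+):

```python
# Answering the three modular-arithmetic questions mechanically.

# Additive inverse of 2 mod 8: the x with (2 + x) % 8 == 0.
assert (2 + 6) % 8 == 0           # answer: 6

# 3 * 8 in mod 12:
assert (3 * 8) % 12 == 0          # two nonzero values multiply to zero!

# Multiplicative inverse of 12 mod 60: none exists, because
# gcd(12, 60) = 12 != 1.  pow(x, -1, m) (Python 3.8+) agrees:
try:
    pow(12, -1, 60)
    raise AssertionError("12 should not be invertible mod 60")
except ValueError:
    pass                          # "base is not invertible"
```

The third question is the interesting one: explaining *why* there is no answer is exactly the proof-based material the commenter says arrives late in a US education.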
@ denis bider
"I think they should be taught the critical lessons of thinking like a security analyst - or an attacker"
Teach kids not only encoding and decoding, but how to attack encoded messages. When I was at the "secret-decoder ring" age, I was a lot more interested in how to decode someone else's secrets, because I figured The Enemy had smart guys who thought like I did, too.
There's a lot you can learn about codes by writing computer programs to attack them, too. If I'd had computers at that age, that's where I'd be. And not just brute-force attacks, but analysis, patterns, chosen plaintext, etc. There's a lot there in the "How to think about it" that doesn't take deep math.
And none of this really needs to be cutting-edge crypto. It'd be enough to have kids at age 12 learn that the codes they believed unbreakable at age 10 were simply not being attacked properly.
"When I was a child, I thought as a child..."
When I see "published in Nature", I think "biology", sooo...
When I first read the title of this blog-post, I thought "Oh, now Bruce has found something interesting about oil extracted from legless reptiles". I was shocked, SHOCKED to find it was merely lasers and cryptography.
@J. Random Programmer
"Teach kids not only encoding and decoding, but how to attack encoded messages."
Uber-bingo! Learning the skills to approach and attack *any* problem is one of the most valuable parts of an education. And you don't even have to wait for a kids crypto primer.
Sudoku (google or amazon it) is a number puzzle that involves using the existing numbers in the puzzle as clues and working out the remaining digits. Depending on the number of starting digits in the grid, puzzles vary in difficulty, so they can be scaled back to Grade 3 level. But they absolutely require a solution method, even if only a small one applied repeatedly.
Sponge-Bob and Dora Sudoku books are coming soon.
The really nice thing from a puzzle point of view is that there is no external element to the solution. A cross-word puzzle can become impossible due to gaps in your vocabulary. But a Sudoku puzzle always contains the seeds of its own solution (there is an open-source solver on sourceforge).
I use two or three puzzles on the subway in the morning - it's like sending your brain out for a morning run to make sure you are ready for the day.
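A Sudoku grid really does contain the seeds of its own solution: a minimal backtracking solver fits in a screenful. This is my own sketch, not the sourceforge project's code; the sample puzzle is a standard published example:

```python
# Minimal backtracking Sudoku solver: grid is a flat list of 81
# ints, 0 marking a blank.  Solves in place, returns True on success.
def solve(grid):
    if 0 not in grid:
        return True                      # no blanks left: solved
    i = grid.index(0)
    r, c = divmod(i, 9)
    for digit in range(1, 10):
        # digit must not already appear in row r, column c, or 3x3 box
        if all(grid[j] != digit for j in range(81)
               if j // 9 == r or j % 9 == c
               or (j // 27 == r // 3 and j % 9 // 3 == c // 3)):
            grid[i] = digit
            if solve(grid):
                return True
            grid[i] = 0                  # dead end: undo, try next
    return False

puzzle = [int(ch) for ch in
          "530070000" "600195000" "098000060"
          "800060003" "400803001" "700020006"
          "060000280" "000419005" "000080079"]
assert solve(puzzle)
```

The "guess, check, undo" loop is the same systematic method the puzzle teaches by hand, just made explicit.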
The key aspect of RSA is that it is an example of a trapdoor function, and its primary use should be as an example of such. Some problems with naive use of it should be mentioned, but depending on the audience you don't need to go into the gory details.
I think it is simpler to explain factoring than some of the other trapdoor functions that have been invented.
Bruce is usually good at getting to the socio-economic bottom of problems, but he's not reached it on this one. A more pertinent question is "how and why did the editors of Nature let this one go through?" Is their review board so narrowly disciplined that the people who review physics articles can't recognize that articles with cryptographic content need review by professional cryptographers, or is it just one of the inevitable random failures of the peer review system, as "Philip" or Joe Patterson suggest.
Peer review's self-correcting process makes this a fine opportunity for a rebuttal letter, comment, or even article by someone. Why not contact Philip Campbell, Nature's editor-in-chief, and ask what actually happened?
Well, you publish in Nature for one reason: sex appeal. That gets more funding. Both Science and Nature require sexy *results*, not methods, and there's almost no room in the papers for enough detail for the reviewers to be objective. Not many people look at the supplementary material.
This is not restricted to areas outside Nature's topic areas. Even some of those articles are dubious at best, and some just plain wrong. Like the paper on homeopathy.
Note that frauds in the general science community tend to publish almost everything in Nature and Science. Like that guy from Bell Labs. The internal review said we must assume that all his results were faked; only 2 articles, IIRC, were in Physical Review. The rest were in Nature and Science. Ironically, it was those 2 other articles that gave the game away: when another group was trying to replicate the results, they noticed 2 graphs that were identical, but were for different things.
A little OT. Sorry.
The New Scientist article does quote Kevin Short from the University of New Hampshire, who says that chaotic encryption is vulnerable; it references Physical Review Letters, vol 83, p 5389.
I have a lesson called "Penny Encryption" on one of my disks (somewhere) that I should dig up and put on my website. It's a one-time-pad lesson I did for second graders, using a penny and binary encoding. More than half of second graders can do it as-is. The ones who have trouble mostly struggle with the binary encoding, but can handle a lookup table just fine. They all get how you can make a message that's secret. I never did hear if they were passing secret notes.
Anyway, re: Nature, this is a good time for the old saying, "Biologists think they're chemists; chemists think they're physicists; physicists think they're mathematicians, and (of course) mathematicians think they're God."
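The "Penny Encryption" idea maps directly to code: one coin flip per message bit is a one-time pad. A sketch of one plausible version (the encoding details here are my guess, not the original lesson):

```python
# One coin flip per message bit = a one-time pad.
import secrets

def flip_pad(nbits):                     # the "penny" part
    return [secrets.randbelow(2) for _ in range(nbits)]

def to_bits(text):                       # text -> bits, LSB first
    return [b >> i & 1 for b in text.encode() for i in range(8)]

def from_bits(bits):                     # bits -> text
    return bytes(sum(bit << i for i, bit in enumerate(bits[k:k + 8]))
                 for k in range(0, len(bits), 8)).decode()

def xor(bits, pad):                      # encrypt and decrypt alike
    return [b ^ p for b, p in zip(bits, pad)]

message = "HI"
pad = flip_pad(16)                       # 16 flips for 16 bits
ciphertext = xor(to_bits(message), pad)
assert from_bits(xor(ciphertext, pad)) == message   # same pad decrypts
```

The second-grade version replaces `to_bits` with a lookup table, which matches the commenter's observation about where kids get stuck.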
OK, so the researchers missed some fundamental security concepts, and the system as it was conceived is not secure. However, could this not be considered an added security measure? I.e., a physical-layer form of security over which confidentiality, authentication, non-repudiation, etc. encryption keys are applied? Would such a system not be more secure than with encryption keys alone?
Teaching kids the basics of crypto is pretty easy - kids already do some of that (at least where I went to school), but they call it "problem solving". You go from "extend the sequence" to "each letter represents a digit in this sum" to "each letter represents a different letter". Add a step and start using something more tricky than a substitution cipher. Even a one-round multiply-and-modulo would help.
FWIW, I recall using modulo in primary school, as well as base-n math (we called it "clock arithmetic" and "base n maths", just to be inconsistent). It was quite fun once someone explained the idea of using letters as extra digits, although I recall getting a funny look when I asked if that meant the Babylonians had 60 different digits. This was in the 1970s in Nuke Freeland.
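A "one-round multiply-and-modulo" cipher can be made concrete as an affine cipher, a small step up from plain substitution. A sketch, with parameters chosen purely for illustration:

```python
# Affine cipher: E(x) = (a*x + b) mod 26.  a must be coprime to 26
# or encryption isn't reversible; a=5, b=8 chosen for illustration.
from string import ascii_uppercase as ABC

a, b = 5, 8
a_inv = pow(a, -1, 26)    # 21, since 5 * 21 = 105 = 1 (mod 26)

def encrypt(text):
    return "".join(ABC[(a * ABC.index(ch) + b) % 26] for ch in text)

def decrypt(text):
    return "".join(ABC[a_inv * (ABC.index(ch) - b) % 26] for ch in text)

assert decrypt(encrypt("CLOCKARITHMETIC")) == "CLOCKARITHMETIC"
```

The requirement that `a` be coprime to 26 is the same lesson as the clock-arithmetic inverses: not every multiplier can be undone.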
Ya know, if the CIA can have a section for kids, and if Sponge Bob can do Sudoku books, then there oughta be a basic Security Concepts curriculum oriented towards a range of youth levels.
I remember learning base-11 and base-12 in about 6th grade, but it was just Something Else to Forget until programming caught me up, and suddenly hex and octal and carry and overflow started to matter.
That's all mere mechanics, though, and is a far cry from being able to say whether some new secret-passing technology is actually practical as a new crypto technology.
For example, I don't see anyone these days advocating tattooing the shaved heads of couriers, letting the hair grow back, and then sending them on their way. Why not? Well, the message turn-around time is a bit slow, so although it might still have its uses (including inspiring a TV show), it's just not practical.
It's the mathematicians, evolutionary biologists, and economists who have it right.
Bruce, I agree with sennoma: you should submit a letter to the editors of Nature.
There is little point in teaching children actual cryptographic techniques. But I can see a game where kids assume the roles of Alices, Bobs, Eves, and Mallets, with the teacher being Trent, where they learn about passive and active intercepts ("no Eve, you can't open the envelope with the letter, only Mallet or Bob can do that!") and where they learn that being Eve or Mallet is a bad thing but people will do it anyway, so we need to protect against it.
After reading Applied Crypto, one of my friends coined himself the following signature:
"Let's start our social communications lesson. First we should divide into groups. Group one: Alice and Bob. Group two: Eve and Mallet. My name is Trent. You can trust me."
When I was twelve I passed a note in class to another kid, hoping the teacher would intercept it, since we knew if he did he would read it aloud. He did intercept it, and triumphantly went to read it aloud, only to find five-letter code groups.
We later learned through our spy in the office that the note had made the rounds of almost all the faculty, whose best intelligence was "Maybe it's a secret code."
Well, it was secret, about 150 years ago -- it was the railfence cipher used in the American Civil War.
I learned then that kids could learn this stuff quickly, and would readily attack a message or any kind of puzzle, while grownups resisted learning defiantly.
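For anyone who wants to try the same trick, the railfence cipher is a few lines of code (a sketch for two or more rails; function names are my own):

```python
def railfence_encrypt(text, rails=3):
    """Write the message in a zigzag down and up across the rails,
    then read the rails off row by row. Assumes rails >= 2."""
    rows = [[] for _ in range(rails)]
    row, step = 0, 1
    for ch in text.replace(" ", "").upper():
        rows[row].append(ch)
        if row == 0:
            step = 1              # bounce off the top rail
        elif row == rails - 1:
            step = -1             # bounce off the bottom rail
        row += step
    return ''.join(ch for r in rows for ch in r)

def groups_of_five(ciphertext):
    """Format ciphertext as the classic five-letter code groups."""
    return ' '.join(ciphertext[i:i+5] for i in range(0, len(ciphertext), 5))

print(groups_of_five(railfence_encrypt("WEAREDISCOVERED", 3)))
```

Which is to say, the "secret code" that baffled the faculty is an afternoon project for a twelve-year-old.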
My wife and I are mulling over various "extracurricular" lessons for our future kids. She grew up homeschooled, and I in a very off-the-beaten-path private school, where lots of odd lessons in things were encouraged.
Learning about security, and about cryptography (as separate items) really seems like it fits the bill.
But the issue I have is when it comes to cryptography, especially the multiply and modulus that seems to be the basis of most hashes and ciphers, I don't exactly "get" it in a way that would allow me to explain it to a kid. It makes sense as a tool I can use, but that kind of algebra and number theory I never had exposure to.
I have Applied Crypto, and I'm slowly working my way through it. Does it have this, or does it assume this knowledge? If assumed, any recommendations for good texts on that portion of math/number theory?
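For what it's worth, the multiply-and-modulus machinery can be seen end to end in a toy RSA walkthrough -- tiny primes, purely illustrative, nowhere near secure (this uses the standard textbook numbers, not anything from the Nature paper):

```python
# Toy RSA with textbook-sized primes -- illustration only, not secure.
p, q = 61, 53
n = p * q                   # 3233, the public modulus
phi = (p - 1) * (q - 1)     # 3120
e = 17                      # public exponent, coprime to phi
d = pow(e, -1, phi)         # private exponent: e*d = 1 (mod phi), here 2753

m = 65                      # the "message", any number < n
c = pow(m, e, n)            # encrypt: m^e mod n
assert pow(c, d, n) == m    # decrypt: c^d mod n recovers the message
```

Everything there is grade-school arithmetic plus "wrap around at n" -- the hard part of RSA is the theory of why the inverse exponent exists, not the mechanics.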
Actually, one can argue that this kind of fancy-sounding rubbish is what defines a Nature article. In my experience, it is hardly a rare exception to Nature's otherwise high quality.
The main problem of these "fiercely competitive" journals, such as Nature, is that they want to publish "major breakthroughs" in any field of science. So as an author, to get your paper published, e.g. in Nature, thorough work (and, as I would argue cynically, correctness) is actually much less important than the way the research can be presented (I could choose another word here) as a "breakthrough". Also, the Nature editors are much more important in the process of acceptance than in more specialised (physics) journals. There are quite numerous rumours that the editors tend to accept some papers because of their alleged "breakthrough" importance even if the reviewers are not that cheerful a crowd.
Nature has a comment section, so Bruce could definitely try to get a rebuttal published.
There is little point in teaching children how RSA works, except perhaps as an example; when you teach them how to use a hammer, you don't teach them how to smelt and forge iron to build the hammer. I guess a better book to start from would be Ross Anderson's "Security Engineering".
I guess the single most important fact that should be passed on is that you need to Define Your Threat Model. To know what is against you and what is not.
There are lots of people, even from the IT field, who seem to be unable to grasp this simple point. Then they say "If we add AES/TLS/[something], the system will be more secure."
If you can make the pupils understand how to think about threats, the rest is only icing on the cake.
This sort of thing isn't unusual for Nature. Nature seems to want to modernize its image and has been publishing a lot of computer science articles recently, provided they have some sort of physics or biology tie-in. Many of those articles wouldn't make it past peer review even in a third rate computer science journal, but because of Nature's reputation, they get a lot of press coverage. It's quite frustrating to people who have actually been working in those areas. I've been considering canceling my Nature subscription in protest.
I think it's the term 'chaos' that got this accepted.
Out of curiosity, as a journalist, what questions SHOULD I ask when I deal with a cryptography story? I don't feel like paying for the New Scientist article, but I am interested in how to learn what I should be looking for. (I mean, the way I'd generally handle this sort of story if it was assigned to me is read their paper -- as much as I could suss out -- ask the authors for quotes, get quotes from another scientist in their area, and then look for a prominent cryptography researcher for another point of view. I'd ask about strengths, weaknesses and chances of practical application). What specifically should I be looking for in stories like this? What are good tip-offs that something's smoke and mirrors for someone whose calculus class was just under a decade ago?
You mention New Scientist also having been hoodwinked as well as Nature. This touches on an area that makes my blood boil. When I was a child and teenager I regularly read New Scientist because my Dad bought it every week. Back then it contained a lot of scientific stuff much of which was over my head, but by reading it anyway I seemed to absorb a lot of science and developed a science brain. In recent years though, the mag has gone down market, as have many others, and one symptom of this is a failure to properly discharge its editorial duty to vet articles and submissions. Frequently it prints as short articles items which are simply corporate press releases. The new idea or technique described is usually full of holes apparent to anyone with a little knowledge of the field. Sometimes the proposed product or whatever defies some pretty unimpeachable basic law of nature or sometimes is just nonsense.
Occasionally the longer articles fall into this trap too and fail to see that the "new finding" is easily explained by what we know or that the proposal is snake oil. I have seen articles that would make anyone who had ever written a program that made a simple TCP/IP socket connection laugh out loud at their lack of understanding of how the internet works; I have also seen articles that purported to describe something that could not be explained by natural selection that could easily be explained by natural selection and was in fact a good illustration of it.
The area in which they have fallen into this trap most often is security and cryptography, often related to the internet. The one you highlighted is just one of dozens over the last few years. It is sad that Nature too was hoodwinked.
I have found that all the science magazines have gotten more sensationalist. And they certainly regularly screw up security stories -- the one area I know well. I don't have any confidence that security is the only area they screw up.
When in the 4th grade, I read a cryptography book targeted at kids my age. It was called "Alvin's Secret Code" (ISBN 0141300558) and covered basic transposition and substitution ciphers. Nothing we'd consider "real" crypto now, but it was enough to start poking at decoder rings and stuff, and years later when I first encountered rot13, I recognized and figured it out immediately.
I credit this book with sparking my interest, so that when Applied Cryptography came out, I devoured the first half. I'm still not well enough versed in math to do most of that stuff, but I enjoy the concepts and the security field in general.
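(Recognizing rot13 is even easier to reproduce today -- Python's standard library ships a rot13 codec, so the whole "decoder ring" is one call:)

```python
import codecs

# rot13 is a Caesar shift of 13; applying it twice returns the original text.
scrambled = codecs.encode("Alvin's Secret Code", "rot13")
print(scrambled)
print(codecs.encode(scrambled, "rot13"))  # back to the original
```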
"Alvin's Secret Code" was the first crypto book I read, too. I still remember techniques explained in that book.
In the 90's, much of my research was focused on breaking chaotic secure communication techniques, and I published a number of papers describing the security weaknesses of a number of techniques - the paper mentioned in New Scientist (PRL, Vol 83) was just the last in a string of papers. Interestingly enough, it directly addressed breaking laser-based chaotic secure communications. So, I was a bit shocked to hear about the Nature paper, since it seemed just a rehash of old stuff. I have since moved on to other research areas and have not looked at the data, so I cannot pass judgement on it, but it clearly ignored a lot of previously exposed weaknesses.
That said, I do believe that chaotic techniques could be quite interesting, but will not be useful until creating systems and breaking systems go hand in hand. Unfortunately, my NSF grant was not renewed when I proposed setting up such a system, and I pretty much gave up at that point and began looking at other areas.
Schneier.com is a personal website. Opinions expressed are not necessarily those of BT.