Schneier on Security
A blog covering security and security technology.
June 13, 2012
Teaching the Security Mindset
In 2008, I wrote about the security mindset and how difficult it is to teach. Two professors teaching a cyberwarfare class gave an exam where they expected their students to cheat:
Our variation of the Kobayashi Maru utilized a deliberately unfair exam -- write the first 100 digits of pi (3.14159...) from memory -- and took place in the pilot offering of a governmental cyber warfare course. The topic of the test itself was somewhat arbitrary; we only sought a scenario that would be too challenging to meet through traditional studying. By design, students were given little advance warning for the exam. Insurrection immediately followed. Why were we giving them such an unfair exam? What conceivable purpose would it serve? Now that we had their attention, we informed the class that we had no expectation that they would actually memorize the digits of pi, we expected them to cheat. How they chose to cheat was entirely up to the student. Collaborative cheating was also encouraged, but importantly, students would fail the exam if caught.
Students took diverse approaches to cheating, and of the 20 students in the course, none were caught. One student used his Mandarin Chinese skills to hide the answers. Another built a small PowerPoint presentation consisting of three slides (all black slide, digits of pi slide, all black slide). The idea being that the student could flip to the answer when the proctor wasn't looking and easily flip forwards or backward to a blank screen to hide the answer. Several students chose to hide answers on a slip of paper under the keyboards on their desks. One student hand wrote the answers on a blank sheet of paper (in advance) and simply turned it in, exploiting the fact that we didn't pass out a formal exam sheet. Another just memorized the first ten digits of pi and randomly filled in the rest, assuming the instructors would be too lazy to check every digit. His assumption was correct.
Read the whole paper. This is the conclusion:
Teach yourself and your students to cheat. We've always been taught to color inside the lines, stick to the rules, and never, ever, cheat. In seeking cyber security, we must drop that mindset. It is difficult to defeat a creative and determined adversary who must find only a single flaw among myriad defensive measures to be successful. We must not tie our hands, and our intellects, at the same time. If we truly wish to create the best possible information security professionals, being able to think like an adversary is an essential skill. Cheating exercises provide long term remembrance, teach students how to effectively evaluate a system, and motivate them to think imaginatively. Cheating will challenge students' assumptions about security and the trust models they envision. Some will find the process uncomfortable. That is OK and by design. For it is only by learning the thought processes of our adversaries that we can hope to unleash the creative thinking needed to build the best secure systems, become effective at red teaming and penetration testing, defend against attacks, and conduct ethical hacking activities.
Here's a Boing Boing post, including a video of a presentation about the exercise.
Posted on June 13, 2012 at 12:08 PM
Juxtapose this with the maxim "any person can invent a security system so clever that she or he can't think of how to break it."
What is most surprising to me is how simplistic most of the solutions were. I would have expected more sophisticated approaches, though obviously the students didn't have much time. I'm also surprised no one was caught. I wonder: was the proctor aware that all the students were expected to cheat?
I look forward to reading the study tonight after work. Thanks for the thought-provoking post (as usual).
So, since the point is to teach security, if I actually managed to learn 100 digits of pi (which is quite easy if you know mnemonic techniques), then am I cheating by not cheating?
I don't see how this teaches students the Security Mindset. I thought the Security Mindset is about designing things by yourself to be secure by thinking like a cracker, not simply trying to think like a cracker for cracking other people's designs.
@anonymous moose: Which is more effective, asking 20 people to list ways in which an exam can be made secure, or asking 20 people to find ways to cheat?
I am a recent Information Security & Assurance graduate and I think this exercise, as well as any of the others written about by these instructors, would have been incredibly useful, interesting, and exciting to be a part of.
However, I find that this practice would have to be carefully monitored, and I honestly don't see many universities allowing instructors to perform it. As intuitive as the overall message here is, the practice is in direct violation of virtually any university's policy.
@Oops: yes, I thought that too. But better still, really memorise 100 digits of pi, write them from memory in the exam, and afterwards, claim to have cheated – but never let on how. With any luck the examiners will conclude that your cheating method was the best of the lot, because not only did they not catch it at the time, they still can't figure out what it was!
I'm amused that I've read 2/3 of their recommended fictional reading. They seem to have good taste, so I should take a look at "Critical System Error".
Re: University prohibition against cheating. I'd posit that if your social engineering skills aren't up to finding a loophole in the university policies, you aren't qualified to teach the course in the first place.
This raises interesting religious and moral questions for those of us in the field. It seems similar to questions related to being a soldier: "Does my religion allow me to kill in war?"
A similar question could be raised here: "Do my religious beliefs about lying and cheating prevent me from being an effective infosec professional?"
Well, at least these are questions that come up in my head.
@Oops: That's how Naruto did it, IIRC.
Kids these days. When I was in college, all of my friends knew at least 50 digits of pi. What has Google done to us all?
"Cheating exercises provide long term remembrance, teach students how to effectively evaluate a system, and motivate them to think imaginatively."
This paper is the same as just about every other paper on "the pedagogy of X" that I've read -- and as a computer science professor, I've read a lot of them: sweeping claims (such as the ones above) about the authors' techniques, with not a shred of data to support those claims. It's all anecdotal and speculative.
To demonstrate that their approach really is effective, the authors should have done something like this: Take 20 students who had gone through this cheating exercise (the test group) and 20 students who had not (the control group). A year later, perhaps in a successor course, have the 40 students do an exercise where they would have to demonstrate the adversarial mindset, such as developing some secure system. Evaluate the actual security of the students' systems. See if there is any significant difference in security between the test group's systems and the control group's systems. If so, then and only then would the authors be justified in making the claims they did.
Have the authors done such a rigorous scientific study? No. Will they ever do this? I doubt it. If they did, would the results be significantly different between the two groups? Again, I doubt it.
Sure, the technique of "controlled cheating" exercises to teach the adversarial mindset is provocative, and makes for a titillating paper. But I don't see any evidence that the technique at all helps to achieve the goal.
I don't understand how the cheating part is so difficult when you're given a computer to assist.
Not only that but by the time the kids have gotten to college I would guess that they've already had experience either cheating or seeing someone cheat.
And those past experiences were far better than the scenario outlined in the paper. Simply because they probably had real consequences for being caught.
As for the "adversarial mindset", that's easy. That's taught in most sports. You look for weaknesses in the other team's defense and you try to exploit those weaknesses.
Moving that to computer security should be just as easy.
Teams A and B defend and attack simultaneously while still providing remote services to team C.
Anything short of threats of harm to people is allowed.
Fatal System Error by Joseph Menn
That is a useful skill for software testers even outside the realm of security testing.
When I was in high school Latin class, one student sat in front of the class bulletin board with clippings and stuff tacked to it. He put his cheat sheet on the bulletin board, and nobody noticed.
Why would a university policy against cheating be a problem? It's a little meta but clearly the test was to successfully demonstrate a way to cheat on an exam. It clearly wasn't to actually produce those digits since the guy who made up an answer passed.
However, I would have caught 'Oops' - it's not that much more effort to also check the last few digits. Even three digits would be enough to reduce the odds of a successful cheat to 1-in-1000.
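The arithmetic above is easy to check with a quick simulation (a hypothetical sketch; the digit string, function names, and trial count are my own, not from the paper): spot-checking k digit positions catches a random-fill cheat with probability 1 - 10^-k, so three checked digits leave about a 1-in-1000 chance of slipping through.

```python
import random

# First 30 decimal digits of pi, standing in for the grader's reference answer.
PI = "3.141592653589793238462643383279"

def cheat_answer(rng: random.Random) -> str:
    """The lazy cheat: correct first ten decimals, random digits after that."""
    head = PI[:12]  # "3." plus ten correct digits
    tail = "".join(rng.choice("0123456789") for _ in range(len(PI) - 12))
    return head + tail

def spot_check(answer: str, positions) -> bool:
    """Grader checks only the given character positions against the reference."""
    return all(answer[p] == PI[p] for p in positions)

rng = random.Random(1)
trials = 10_000
last_three = [len(PI) - 3, len(PI) - 2, len(PI) - 1]
passed = sum(spot_check(cheat_answer(rng), last_three) for _ in range(trials))
print(passed / trials)  # roughly 0.001: about 1 in 1000 random fills survive
```

Checking only the first ten digits, by contrast, lets this cheat through every single time, which is exactly what happened in the course.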
I have never cheated on any exam. I was never tempted. If the material was good, it was well worth learning it. If not, I avoided the exam (sometimes possible) or just got a bad grade. I find cheating is more effort and risk than it is worth. Studying the examiners, what their preferences are, what their exam style is, brought me into the top 5 students of 240 in my year when I did my Informatics Diploma (MSc). Some people even have described me as the most honest person they know and the hardest to corrupt. (I don't think that is a fair assessment. While I may be hard to corrupt, I will lie on occasion, when I see a good reason for it.)
On the other hand, I have absolutely no problem with the security mindset. There may be an element of not wanting to do it in real-time, but I believe I could cheat well under real-time conditions by now.
My point is, while this will work well for some students, it may fail for others. (Of course I may just be distant enough from the norm for it to not matter. Apparently "normal" people lie and cheat all the time, even without good reason or significant benefits...)
However, as with any true learning, the student has to discover his/her own style of learning and has to advance his/her way of approaching problems by themselves. The described exam is a bit more of a stunt and less of teaching. If it was necessary to create awareness in the students first, it may have some value. Otherwise I would advise activities where students design security measures and then break their own and those of some others.
@Jack: I find it difficult to design security systems that I cannot break. True, there are some details, like, for example, an input data check, that may actually not be breakable. For more complex things I find that because of the border conditions (there are always some), I make trade-offs and know well how to break them, but achieve prohibitive effort. There is also an aspect of time: I sometimes find bugs in my engineering by thinking about the problem again, anywhere from days to years later.
So, no, I think that maxim is only true for pretty bad designers. Of which there are a lot, admittedly.
However much I like the comparison with the Kobayashi Maru, I'm totally with Alan and Brandioch on this.
Cheating does not make up for lack of talent, knowledge or creativity, and doesn't turn a Salieri into a Mozart. As every professional in any line of work knows (whether he be an artist, an engineer or a humble pastry baker), the difference between great and mediocre is determined by thinking out of the box and enlarging your reference frame. Cheating has got nothing to do with it.
Years ago, the company I was then working for had won a contract with a government customer whereby the existing IBM AIX install base was replaced by Sun Solaris machines. Unfortunately, the IT manager was less than happy with this and was hell-bent on kicking everything Sun back out. As the salesreps had committed themselves that Solaris could do everything AIX did and more, at one point he had found that the native Solaris printing subsystem was unable to print to a series of plotters for which no printer drivers were available. Developing custom drivers was not an option, and none of the techies could find a workable solution. Managers and salesreps alike were desperately trying to find loopholes in contracts and T&C's to find a way out. Until one guy came up with the simple idea of replacing the native printing subsystem with CUPS, solving the then already seriously escalated issue in about half an hour. Now that's the difference between cheating (loopholes) and out-of-the-box thinking.
To put it simply: the thesis of these two professors IMHO is the wrong answer to the right question. But I'm less than surprised that this sort of idea is being launched in a context of "cyber war", which to a lot of folks today seems to be a new and emerging battleground where no rules or treaties apply. I find that a really scary idea. It's back to mustard gas and "take no prisoners".
Many sins can be concealed under the rubric of a training exercise.
As for University policy, it's not an actual "exam" -- it's a lab experiment using the classroom itself as the lab, with the consequences for failure being a lower grade.
3.14159265358979323846 (from memory)
Better yet, assign two lab exercises; one where half the students are proctors and half the students are students/cheaters, then swap a week later.
Rig the room for lots of covert video and audio, then present the finding(s).
I recall an incident at UC where a student was caught cheating red-handed in a large undergraduate lecture class. He boldly challenged the TA and asked, "Do you know who I am?"
The TA, puzzled, said "No."
So the student immediately buried his blue book in the pile of student blue books being turned in, and fled stage right. Cheating successful, failure to detect identity of cheater.
@Gweihir - at what German university did you study?
@Joseph R. Jones "What is most surprising to me is how simplistic most of the solutions were."
They had to actually use the solutions they came up with, so of course they favored the ones that succeeded with the least effort (and if their first few ideas were hard to implement, they kept thinking until they came up with something easier).
It occurs to me that while unethical people attack anything whose defeat is high profit/low risk, ethical people are attracted to finding creative attacks against 'interesting' (i.e. difficult) targets, as a sort of puzzle-solving game.
Thus ethical people who think about these things might overestimate the likelihood that a complex attack is useful or necessary.
How ironic. In a fast changing, complex, arcane field where skills acquisition is supremely important, you boil education down to a cute exam trick. As others have pointed out, there's more to security and security education than gimmicks. There is no algorithm for good security practice, much less any bumper sticker.
Or maybe ... just as the students were told not to take the exam on its face, maybe the post is itself a clever ruse, designed to hide a deeper truth ...
But I doubt it.
Here some more thoughts on the art of security and management generally: http://lockstep.com.au/blog/2010/12/21/...
@Alan Kaminsky: I agree with your larger point. The article is interesting to read, but there is nothing to prove that such an exercise taught them anything. (But it is a cool exercise and does no harm.)
However, the test you suggest would fail almost any part of any curriculum. A single few-hours-long part of the curriculum is rarely so influential. You can remove it and the end result will be largely the same.
Remove too many seemingly useless lessons and you will end up with a very watered-down curriculum. (I think this is how the dumbing down of schools happens.)
Waiting for Clive's post on this...
Possibly along the lines of reprogramming a TI calculator/ Timex watch to display all 100 digits of Pi :-)
@Andrew: The "do you know who I am" anecdote is a widespread "university legend" or just a joke which can be heard retold all around the globe in many versions and languages.
Yeah, he must have a very fulfilled life, given all the things he tried, made, and witnessed personally, always perfectly fitting the specific blog post. With all this knowledge and experience, I'm still wondering why he doesn't set up his own blog and become a known security expert in his own right.
When I have too much time on my hands and am really bored, I may read all his comments here and search them for contradictions, e.g. being at two distant locations at the very same time, to prove he made all this stuff up.
@Andrew and @Peter A.: The meme (appropriate designation...?) was used in a TV ad a couple of years ago. Should be on YouTube but I don't recall the brand... Non-effective advertising...
@Peter A.: It is a widespread legend, but that doesn't keep people from trying it and occasionally being successful. I've seen it work in a Math exam in the first semester (in this case he wanted to turn in his test after the time had passed).
I'd say that if 20 students were cheating, and most of them with something as simple as a sheet of paper under the keyboard, something's wrong with the proctoring.
And that no instructor actually checked the answer beyond ten digits? Um.
PS - Finally, the government teaching university students in cyberwarfare should be at least protested as much as having ROTC on campus. J.
Reminds me of Frank Abagnale and similar 'I'm a pilot/surgeon/police -- catch me if you can' people.
What mechanism draws the line before 'I'm a cybercrime security specialist' is reached?
Or indeed a 'university professor'?
In reading the comments I'm awestruck by the number of people who dismiss this test because it is not an all-encompassing course on how to be an info-sec pro.
The purpose of this test was to help the students gain perspective. To replace their paradigm. To expand their mindset.
And to, ugh, quote Apple "Think different".
@BJ, the post was grandly titled "teaching" the mindset, rather than something more modest like "illustrating". You go even further, suggesting the exercise will "replace their paradigm".
I think many of the comments are in the nature of warning against generalisations drawn from small samples and ill-designed experiments.
Surely the very circularity of prescribing cheating limits the usefulness of this sort of test if it were ever to become institutionalised.
In Farrar's "Eric", the students take it in turns to build a crib sheet for the next test. They just stick it to the front of the teacher's desk so they can all copy from it.
Doubles up as a group version of the prisoner's dilemma.
Actually, I am going to disagree with some here. Teaching a security mindset is overblown. Some people "get" it and some don't. I think you can teach it to some degree but many will never be skilled at it.
I found the test highlighted to be cute. I would have approached it from a historical standpoint. What did the U.S. do to get the Yorktown back up and turned around in 72 hours? Or, for Clive, I seem to remember Churchill and chaff. I would be curious to see what nationalities/personal traits may be extrapolated (I know it's too easily turned into junk science).
Some people are going to be inclined and even gifted in the security mindset. You can hone it and guide it, but still some will be horrible at it. I.e., I can tell you how to fix/do something, but you DO NOT want me wielding a power tool. Actually, I may wind up welding a power tool to the workbench. Microsoldering, yep; chainsaw, no...
Interesting exercise. I'd love to see an iterative process, where on the second attempt either a) it is explicitly stated nobody is allowed to use a cheat that had been previously used, or b) it's not explicitly stated, but the proctor is given a list of the first exercise's cheats and thus is on the lookout for those things.
Also, not sure if it's really spelled out in the full article (haven't read it yet), but it sounds like you could boil down the lesson into three points:
Understand the system
Understand the flaws *in* the system
Understand how to go *outside* the system
@Jurgen, thanks for the video link. I'm sure it's happened many times in many places - but in the specific incident I recall in the mid 90s at UC (forgive my vagueness), the pile of blue books was on the floor, the challenger was a TA and not a professor, and no fruit was harmed. Also, the student had cheated by copying from another student. The TA asked others if they recognized the student and received no answers.
I agree with you that thinking out of the box is an innate talent that is lacking for the task for most designers in infosec, because of the complexity of the challenge.
Guy Kawasaki said, "those on the first curve are unable to comprehend, let alone embrace the second curve."
If that is indeed the case - that people can't even "recognize" real innovation if it jumps the curve - then who has the ability to think out of the box and come up with one? One tends to start with what one already knows, and builds on it.
Hey, "A. N. Onymouse," interesting post. We know that you're really Clive, though.
As an engineer (retired) in a very large, internationally competitive field, I've had to encourage the same "cheat" mindset to design and build various machinery and to bypass someone else's patents. The result is no choice but to generate a creative or 'work around' solution. We as engineering managers want our team to come up with innovative ideas -- the same thing we want our students to do. Our perceived problem was administrators who would rather pay royalties than take risks.
"A.N. Onymouse" couldn't be Clive. All the words in the post were spelled correctly. :-)
@Fastes diplom student in karlsruhe: Karlsruhe as well. I might know you ;-)
Lol awesome. Some kids in my university used to cheat by mail-ordering 'IR contact lenses' from online gambling cheater sites and writing info with 'invisible ink' all over their arms, hands, legs and whatever else they could use. Nobody was ever caught; I bet they still use this method.
I'm amazed that with all these comments about security mindset vs cheating I have not seen it mentioned that a Cheater ONLY really needs to find ONE good way to cheat, however a good security person has to defeat EVERY conceivable cheating method. It is kinda asymmetric...
So showing people how easy it is to develop one workable cheat does not in any way show the breadth of workable cheats. A really good cheater with 5 or 6 different proven / perfected cheat methods would still be only an average security professional.
Interesting. I did a course on negotiation a while back, and one exercise was the red/blue game, and as all of the course attendees were from BT Systems Engineering we spent ages trying to find flaws/optimal strategy in the game to win it.
Though should you not really be teaching meta-game thinking?
Does this mean that OGAs should be looking for long-time role players familiar with min-maxing and power gaming? (Flutters eyelashes in a hopefully seductive way - Sir Harry, you know you need a replacement for that poor PFY that bought the farm last year.)
Wish I had kept the narked email I got from one of the founders of a well-known gaming company when I developed an interesting character by taking one aspect of their world to its logical conclusion (werewolf rights activist).
I did like my idea of werecats doing Lo-Lo from C-130s without parachutes, though - cats land on their feet, right?
A fascinating study, but I think that the real value of the lesson would have been during the debrief, not during the test itself. During the test you are only exposed to one way (or two ways) to cheat, but during the debrief you can learn 20+ ways to cheat - as @RobertT noted, one way to cheat is often not enough.
I wonder if I am the only one who would have already known the first 100 places? I would then have acted highly suspiciously, but not obviously using any particular method to cheat. When challenged, I would have refused point blank to reveal how I had cheated, unless given something substantial in return, such as a guaranteed 10-year tenure at the Uni. in a teaching position.
Sometimes the greatest "cheat" is pure dumb luck! For a similar reason I have learned how to correctly pronounce "Llanfairpwllgwyngyllgogerychwyrndrobwllllantysiliogogogoch" (The Welsh Village) just to be absolutely insufferable someday to someone who is boring me...
Surprised at how many commenters got hung up on the term "cheating" here. It's not cheating if you are told that the expectation is to find some other way to provide the answer than actually remembering the answer - so it is not in violation of anything at all. What it is teaching is adversarial thinking, which is a foreign concept/practice for most developers. Speaking from personal experience as an Architect, most developers are focused on how to make something work, and inherently think along lines of cooperative processes/behavior. After all, one cannot make something work if the pieces one uses are misbehaving. It is a natural, and productive, mindset. But security issues inherently live in a world of adversarial relationships, where people are specifically trying to misbehave in order to find a way to break something that is working. People who are good at QA have much more of this mindset, but such people are few and far between.
"The views in this article are the authors’ and don’t reflect the official policy or position of the United States Military Academy, the Department of the Army, the Department of the Navy, United States Cyber Command, the Department of Defense, or the United States Government."
More's the pity.
> So, since the point is to teach security,
> if I actually managed to learn 100 digits
That's what I would have done. It's the only really foolproof way to guarantee you won't get caught (and thus will pass). Okay, so they were obviously not really *trying* to catch anyone cheating -- but the students couldn't have been entirely sure of that in advance. Near the end of the movie _Catch Me If You Can_, the FBI agent dude asks the main character (who made a career out of various forms of forgery, fakery, and cheating) how he beat the bar exam. He responded that he basically just crammed and passed it. It's the easiest way.
A hundred digits isn't even hard. If you can't memorize a mere hundred digits and retain it long enough to pass one test, what are you doing in college? How do you plan on passing an Appreciation of Fine Arts final where you have to name the artist, era, and movement for each of several hundred famous works of art and music, if you can't memorize? You may be able to derive many of the DiffEQ formulae on the fly, but what are you going to do about all those dates you have to learn in Western Civ, the piles and piles of vocabulary and inflectional endings in your foreign language classes, et cetera ad infinitum ad nauseam ad bedlam? College involves a LOT of memorization. If you can't figure out how to memorize, your GPA is going to fall below 3.0, at which point all your scholarships will magically disappear and you will not be in college any more.
> I've seen it work in a Math
> exam in the first semester
Extra time might be just about the only practical way to cheat on a typical college-level math exam (other than perhaps getting somebody to take the exam for you).
That whole "show your work" thing is sheer genius. Having the answers gets you nowhere unless you know how to work the problems (which is, after all, what the exam is testing). If you copied off somebody else's paper, it would be extremely obvious unless you understand the material well enough to make significant adjustments to the details of how the problem is worked, but if you can do that you don't need to copy. Similarly, cheat sheets won't help you unless you understand the material too well to need them. Sneaking into the prof's office ahead of time and filching a copy of the exam won't help either, because you still have to understand the material well enough to work the problems. Basically, all the traditional cheating techniques fail to get you around the need to know how to do the math.
This doesn't mean you couldn't cheat in a math _course_, of course. If nothing else you could slip in and alter the teacher's record of your grades, and of course you can always collaborate with other students on your homework. (Another approach would be to forge a transcript from some other college and transfer the course in. I'd be very surprised if that's never been tried.) But cheating on a math _exam_ would be an extremely thorny problem. I'd bet money that the most common way of doing it involves getting somebody else to take the test in your name.
I remember a course lecturer telling me and the other 4 students in his group that he had set 5/60ths of the ten maths questions in the end of year exam, and much to our surprise, he advised we cheat.
"Supose we get caught?" someone asked.
"You didn't deserve to pass" was his answer.
I cheated in that exam and in every exam since.
I've never been caught because the fact I'd created a cheat sheet usually meant I remembered the cheat and didn't need to look at it.
Fortunately nobody ever asked me for 100 digits of Pi in an exam. I can only remember it to 7 decimals. Good old 7-figure log tables.
I should add that the lecturer above had memorised the log and trig tables to 3 decimals and would always beat our early electronic calculators, getting the answer in his head to 3 decimals, and often to 5.
The thing that upset me when I was a TA teaching computer science was not that students cheated, but that they cheated so poorly and expected us not to notice. I can't even pretend I don't see cheating when it's blatant cut and paste where you didn't even change variable names. That implies that they either didn't think we read the code they turned in, or thought we were so stupid we would never notice their misbehavior. It was really disheartening.
Every time I had to go through the "plagiarism is wrong and what were you thinking" speech all I could think was: come on guys, you're smarter than this. If you don't want to do the work at least do a less offensively poor job of cheating. :/
As a method to encourage thinking outside of the box, calling it "cheating" may help put you in the right mindset.
I recall from my Navy days an old grizzled Chief teaching us damage control, a real life-and-death skill. He said, "If you're not cheating, you're not trying."
Ignore your impression of what "the rules" are, and realize that anything is possible.
So what if I remember over 100 digits of pi.
Not with any mnemonic devices or any of that crap.
I just know it.
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.