The Importance of Security Engineering

In May, neuroscientist and popular author Sam Harris and I debated the issue of profiling Muslims at airport security. We each wrote essays, then went back and forth on the issue. I don’t recommend reading the entire discussion; we spent 14,000 words talking past each other. But what’s interesting is how our debate illustrates the differences between a security engineer and an intelligent layman. Harris was uninterested in the detailed analysis required to understand a security system and unwilling to accept that security engineering is a specialized discipline with a body of knowledge and relevant expertise. He trusted his intuition.

Many people have researched how intuition fails us in security: Paul Slovic and Bill Burns on risk perception, Daniel Kahneman on cognitive biases in general, Rick Wash on folk computer-security models. I’ve written about the psychology of security, and Daniel Gardner has written more. Basically, our intuitions are based on things like antiquated fight-or-flight models, and these increasingly fail in our technological world.

This problem isn’t unique to computer security, or even security in general. But this misperception about security matters now more than it ever has. We’re no longer asking people to make security choices only for themselves and their businesses; we need them to make security choices as a matter of public policy. And getting it wrong has increasingly bad consequences.

Computers and the Internet have collided with public policy. The entertainment industry wants to enforce copyright. Internet companies want to continue freely spying on users. Law enforcement wants its own laws imposed on the Internet: laws that make surveillance easier, prohibit anonymity, mandate the removal of objectionable images and texts, and require ISPs to retain data about their customers’ Internet activities. Militaries want laws regarding cyber weapons, laws enabling wholesale surveillance, and laws mandating an Internet kill switch. “Security” is now a catch-all excuse for all sorts of authoritarianism, as well as for boondoggles and corporate profiteering.

Cory Doctorow recently spoke about the coming war on general-purpose computing. I talked about it in terms of the entertainment industry and Jonathan Zittrain discussed it more generally, but Doctorow sees it as a much broader issue. Preventing people from copying digital files is only the first skirmish; just wait until the DEA wants to prevent chemical printers from making certain drugs, or the FBI wants to prevent 3D printers from making guns.

I’m not here to debate the merits of any of these policies, but instead to point out that people will debate them. Elected officials will be expected to understand security implications, both good and bad, and will make laws based on that understanding. And if they aren’t able to understand security engineering, or even accept that there is such a thing, the result will be ineffective and harmful policies.

So what do we do? We need to establish security engineering as a valid profession in the minds of the public and policy makers. This is less about certifications and (heaven forbid) licensing, and more about perception—and cultivating a security mindset. Amateurs produce amateur security, which costs more in dollars, time, liberty, and dignity while giving us less—or even no—security. We need everyone to know that.

We also need to engage with real-world security problems, and apply our expertise to the variety of technical and socio-technical systems that affect broader society. Everything involves computers, and almost everything involves the Internet. More and more, computer security is security.

Finally, and perhaps most importantly, we need to learn how to talk about security engineering to a non-technical audience. We need to convince policy makers to follow a logical approach instead of an emotional one—an approach that includes threat modeling, failure analysis, searching for unintended consequences, and everything else in an engineer’s approach to design. Powerful lobbying forces are attempting to force security policies on society, largely for non-security reasons, and sometimes in secret. We need to stand up for security.

A shorter version of this essay appeared in the September/October 2012 issue of IEEE Security & Privacy.

Posted on August 28, 2012 at 10:38 AM • 58 Comments


Mark Gent August 28, 2012 11:27 AM

It’s not just security that suffers from this lack of public understanding. The popular grasp of technology, and the associated media narrative, hasn’t really moved on since the end of the 19th century. Ordinary folks don’t even understand electricity or optics properly, so how are technologists expected to explain PKI etc?

Zombie John August 28, 2012 11:46 AM

Unfortunately, politics is embedded in our security. Politicians, long ago, realized that there was a lot of clout to be gained by scaring the crap out of people and then offering an easy solution. Politicians today will do and say nearly anything to maintain their power (mushroom clouds and dirty bombs). The “news” is no longer “The News” but a corporate profit center that thrives on fear. Our culture is not designed for rational conversation.

Matt Middleton August 28, 2012 11:50 AM

I agree that security engineering needs to be recognized as a valid profession. In order to do that, something akin to certification or licensing needs to happen. I’m not suggesting that the multiple-guess exam model of certification should be used; in order to reach legitimacy comparable to other kinds of engineers, some agreed-upon standards of training need to be generated, and need to stay out of the hands of vendors.

Hugh No August 28, 2012 11:53 AM

I read Mr. Harris’s essay and, based on what he wrote, I certainly wouldn’t consider him to be intelligent. There are far too many things that he believes just because, with no desire to do even the most fundamental fact checking. “Making stuff up” now substitutes for careful research, no doubt because it’s easier, and we all suffer as the entire country becomes less and less knowledgeable because of it.

Alter S. Reiss August 28, 2012 11:55 AM

I think that part of the disconnect here is that security means a bunch of different things; security in the sense of “this transaction is secure” is not the same thing as security in the sense of “Linus has a security blanket.”

Having a vault with lots of fancy locks may or may not make a bank more secure in the first sense, but it can also make it more secure in the second sense–people might feel happier about putting their money in a bank with a fancy vault, because they can be sure nobody will sneak in and steal it.

When it comes to screening Muslims on planes, you can explain how it doesn’t add to security in the first sense, but that doesn’t change the fact that for a lot of people, it would help in the second sense.

The real danger posed by someone who looks Muslim on a plane isn’t that he’s going to blow up the plane–the plane isn’t going to blow up, and if it were, that guy isn’t the likely reason. The real danger is that the guy sitting across the aisle from him is going to spend his whole flight worrying that the Muslim is going to blow up the plane, instead of enjoying the delightful meals and wide range of entertainment that the airline has provided.

Granted that the guy is an idiot, and also a racist, the fact remains that there are a lot of racist idiots out there, and Lord knows they make their voices heard.

Policies that improve security are often directly opposed to policies that improve “security”, and while I’d prefer the first to the second, I don’t think it’s wise to pretend the second isn’t there.

simpson August 28, 2012 11:57 AM

There is an enormous amount of “everyone is stupid except for me” in security. For example, who is the “amateur” you refer to? Is it someone who doesn’t get paid for what they do? Is it someone who hasn’t been doing it for very long? Is it someone who hasn’t taken a test?

There are a lot of assumptions being made in security and a lot of self-serving statements made by those who work for vendors selling some kind of security-related product. Bill Brenner commented in his blog about the RSA show that, among all the vendors, there were real guys who at least tried, and then there were the “pretenders.” I don’t know if he ever distinguished which vendor belonged in which category because it is intended to remain ambiguous, similar to your use of the term “amateurs.”

When I read statements like this I generally interpret them to mean ‘those other guys that didn’t check with me first.’

Lizzz August 28, 2012 12:13 PM

So we have CERT Secure Coding Standards to test to. Not usually a fan of multiple choice, but IMHO, if you can T/F these recommendations, you know a hell of a lot, and have certainly increased awareness.

Bill August 28, 2012 12:14 PM

What we need is to get the average person… not the geeks, or the pundits, but the average joe and jane, to see the opportunity in “security engineering” as a career path for their kids/grandkids heading to college. Appeal to getting those kids off the couch and into lucrative careers, and they’ll pay attention.

FP August 28, 2012 12:27 PM

Harris wants to racially profile Muslims not because he doesn’t grok security, but for the same reason Harris argues for torturing Muslims, nuclear-bombing Muslims, being at war with Islam, and calling Islam the world’s greatest threat: Harris is an irrational bigot. His arguments are tied to his emotional psyche and fears, and have nothing to do with intelligence or rationality.

JP August 28, 2012 1:00 PM

Looking at Security from a modern business point of view.

Business operations have migrated over the past 25 years into an environment where there is a procedure, process, or script for every activity. Anything outside the script is painful.

Businesses do this because everything is now measured on meeting targeted results for every activity. Treat all customers equally and push them through the process as quickly as possible. Consistent results are high quality results.

Security is here as well. Achieve targeted results. Anything above the target is waste. Security is now a regulated environment. The requirements are watered down to minimums that every business should be able to meet.

In the early phases and among the weak performers, this mandates improvement. The poor performers, however, stop improving at the target level. The high performers get cut because management sees it as waste. Very few businesses have a recognized risk level that justifies tight security.

In the end, we have mandated substandard security by mandating compliance with rules that cover a few basics but not real performance. Further, the performance standards get pushed up the supply chain. Security is not perceived as adding value and the specifications are minimal.


Kent August 28, 2012 1:13 PM

My fear is that the public isn’t interested in spending any time learning or understanding anything. I saw a series of books the other day, all entitled “30-Second ????”: 30-Second Psychology, 30-Second Economics… People think that these will make them an expert on a subject. Why has it taken us years to develop this “security expertise”? We should have read the 30-Second book… We have a long road ahead to convince people about security engineering, but all they really want is the 30-Second version.


TimH August 28, 2012 1:17 PM

Cars can’t be sold in production without passing crash tests. You can’t buy a handgun with a safety that was designed by “guys who at least tried”, to quote simpson.

Electrical equipment is tested to pass the finger probe test, so your kid can’t electrocute herself by sticking her pinky in it.

There are resistance grades for safes (time taken for a professional to open them), and recognition that the best safes simply delay the attacker.

But somehow there is no formal metric for consumer electronic locks, whether physical devices such as garage door openers, or software firewalls.

nobodyspecial August 28, 2012 1:27 PM

@timH – But those are relatively simple test metrics: the impact can’t exceed X g; you can’t put a finger bigger than Y mm into the socket.

For the tests to be useful, the electronic lock would have to certify “can’t be hacked in any way we haven’t thought of.” That would be like requiring a car to be tested against ever being able to have an accident.

There are lots of sets of computer security requirements (the Orange Book, C2, etc.), but they mostly restrict what certain levels are permitted to do – none of them certify that the system can’t be hacked due to a coding error or a hardware defect.

Even with a simple mechanical lock you can test that it needs a certain force to break – but how do you prove that there is nobody in the world with the skill to pick it?

boog August 28, 2012 1:38 PM


There is an enormous amount of “everyone is stupid except for me” in security. For example, who is the “amateur” you refer to? Is it someone who doesn’t get paid for what they do? Is it someone who hasn’t been doing it for very long? Is it someone who hasn’t taken a test?

am·a·teur (noun)

  1. Someone who is unqualified or insufficiently skillful: The entire thing was built by some amateurs with screwdrivers and plywood.

The meaning was pretty unambiguous, considering the context.

childish August 28, 2012 1:40 PM

@Alter S. Reiss – The notion that “secure in the second sense” is secure is exactly what needs to be changed in the public consciousness. Just like security by obscurity is a fallacy, so is the idea that one’s feeling of security actually makes them secure.

Spending money to provide the feeling of security gives us things like the TSA and the “virtual” border wall, which are extremely expensive and highly visible but not as effective as other, cheaper (or at least no more expensive), less visible options would be. But politicians spend the money there because the general public would rather see that they’re being “protected” than actually be protected without being able to see the protection. I’d wager security engineers would rather have the protection and be able to test its effectiveness, even (especially?) if it’s not constantly visible.

Figureitout August 28, 2012 2:00 PM

Many people have researched how intuition fails us in security
–And technology (made by Homo sapiens), or in this case security engineering, never does? I’m not jabbing you either, just asking.

What about like some people talked about with Apple’s iphone security, where you relax and use the device thinking it’s reasonably secure?

The biggest problem in my view, and the prominent posters on this site have talked much about it, is having so many processes and services and applications and capabilities running on devices that there is no chance in hell of keeping them secure.

The other day I got invited to a small owner’s box in a restaurant, and I kid you not: we were trying to find music and couldn’t, so we pushed the “Manager” button, and the manager’s computer came up on the four flat-screen TVs in the room, and I could see all the employees on the spreadsheet the manager was working on; basically it was like connecting an HDMI cable to his computer. Granted, it was the “owner’s box,” but why would you keep the capability for other customers? For the record, we turned it off rather quickly, out of respect.

When I make a stand for someone’s intuition (also depends on who that someone is), I’m not saying that should be the extent of security. I don’t trust others’ intuition just like ITsec engineers don’t trust systems they can’t verify.

our intuitions are based on things like antiquated fight-or-flight models
–They are also a product of thousands of years of evolution, how old is technology, not the physics, but the tech products?

Computers and the Internet have collided with public policy.
–Which is why, even though I see your point that you want people to only discuss security on your blog, you can make a security argument with pretty much anything; and people need to debate and influence tech policy, otherwise it looks like what it is now.

Elected officials will be expected to understand security implications
–Don’t waste your time. You won’t breach the egos.

We also need to engage with real-world security problems
–Yes, please do. Also, our brains can be equated with computers (that formed and coded themselves) and can be “hacked” or “hypnotized”. I’ve tried to make people wary of that and yes I’ve done it to some unsuspecting people; just to test, I gained nothing but insight.

we need to learn how to talk about security engineering to a non-technical audience
–Isn’t there some quote about people not really knowing something if they can’t explain it somewhat simply or in a different way?

simpson August 28, 2012 2:58 PM

@boog – You can pick and choose the definition if you like, but that’s exactly why it remains ambiguous. How is the person unqualified or insufficiently skillful? If it’s based on results, are the results good if they never produce anything bad or should they at least once actually design and build something good? Enterprise IT has earned a reputation by now of saying “No” to everything. So, why not shut the whole company down. Then there would never be a data breach. Does that make them skilled?

All these explanations are like saying ‘the reason it’s raining is because water is falling from the sky.’ EVERYONE is unskilled until they prove otherwise, but if anything they first produce is dismissed out of hand then obviously, it has nothing to do with skill. Somewhat tangentially, read Scott Aaronson’s “Eight Signs A Claimed P≠NP Proof Is Wrong” and his warning not to judge a submission in accordance with the person’s credentials – that there are too many cases in the history of mathematics in which doing so led to big mistakes. Security is mathematical in many ways.

If all these comments about the security of devices and systems are true, then how in the world can it not be true about the people behind them? I hear that anyone can design a system that they themselves cannot defeat, and no system is perfect, etc. So, how much more difficult is it to specify what constitutes skilled (non-amateur) in a person? I attended a conference at the University of Va. Someone who knows a lot more than me stated that the biggest problem in security is that it cannot be measured. Are you saying that the skill of a person that designs secure systems can actually be measured? It’s ridiculous.

The problem is even bigger than that, since so many systems designed and built by “experts” have been penetrated. Is it possible that even trying to lay claim to expertise sets the stage for failure? Maybe the way a person thinks is far better than what they know. How can you measure that?

boog August 28, 2012 4:17 PM


You can pick and choose the definition if you like…

Thank you, I’ll rightly choose the only definition that makes any logical sense in that sentence of Bruce’s essay.

How is the person unqualified or insufficiently skillful?

By definition. You see, when Bruce used the word “amateur” he didn’t mean “people who are qualified and sufficiently skillful.” If so, he would not have used the word “amateur” because that wouldn’t make any sense.

But I’m actually not sure with whom or what your beef is. You seem to be arguing over how to measure a person’s skill level or determine who’s an amateur and who isn’t. That’s not really discussed in the essay (the essay just says that “amateurs produce amateur [products]”, which is pretty fair), so I’m not sure what your point is. Can you please clearly and concisely state what your position is and how it contradicts what Bruce was saying?

John Schilling August 28, 2012 4:42 PM

Unfortunately, the lead paragraph of this post makes the next seven read as “People should shut up and listen to us because we are experts and this is important enough that only expert opinion should be considered”.

I hope that wasn’t the intent, and I am pretty sure that if it was it would not lead to good security. So, basically, junk the seven paragraphs in the middle of this post, keeping the first and the last.

Because the Schneier/Harris “debate,” really more of two people talking past one another, is a prime example of why this community needs to do a better job of communicating with the non-technical audience. But the why is easy; it’s the how that’s going to be tricky.

Travis Jensen August 28, 2012 5:43 PM

But how exactly do we accomplish that? A security engineer will talk about how it is impossible to prevent all attacks. That’s not what John Q Public wants to hear, so it will immediately discredit the engineer.

sean b August 28, 2012 7:27 PM

When you order a pizza, do you think about how many hands have touched the dough? No. You just want it to be hot.

Some professions just aren’t suitable for public discourse about their methods and disciplines. I think security is one of them.

People don’t care. They just want it to work without their ssn being stolen.

Oliver Jones August 28, 2012 7:27 PM

I wonder about the merit of certification for security engineering. I think “heaven forbid” might be an unhelpful attitude. Other branches of engineering have “registered professional engineer” certificates. So, of course, do other trades such as medicine, nursing, and accounting.

Each of those trades has to enforce various practices that seem paradoxical to untrained intuition. The practice of vaccination is an obvious example. It was just as controversial and useful to demagogues when it was new as statistical risk-assessment is now.

I’m not sure the discipline of security engineering is different. It’s just younger and less developed. If it were a certified profession it could have standards of practice that cut through at least some of the political BS.

Clive Robinson August 28, 2012 8:34 PM

This is one of those “oh dear” moments…

The expression “security engineer” is complete nonsense, and is used to try and provide a “professional” “look and feel” to a “snake oil” market. That is, the only “professional” aspect of the market is that people get paid, but then so do con artists, confidence tricksters and all manner of other ne’er-do-wells.


Well, for a start, split the term into its two parts, “security” and “engineer,” and ask yourself what the meaning of each word is in concrete, not ephemeral or abstract, terms.

So what is “security”… OK, stop waving your arms: the simple answer is that there are so many possible definitions it’s both “all things to all men” and “nothing to every man,” and is thus very much devoid of tangible meaning. As some readers of this blog are aware, there are languages which have no equivalent of the English word “security”; the nearest they have is the slightly more tangible equivalent of “safety.”

So what is an “engineer”… Well, the usual definition revolves around distinguishing them from “artisans” or “craftsmen” by the fact that they use “science,” not “gut feelings,” fancy or whims. And by science in engineering terms we generally mean “hard science,” where the results are consistent and quantifiable, thus measurable by all people in the same way each and every time.

The simple fact is that security is a nebulous word that prompts “arm waving,” “you know it when you see it” type responses when you push people on what it means. This is because it is in reality a word without substantive meaning that all can agree upon, and this is a real problem. Because without agreed meaning it cannot be characterised, which in turn means you cannot measure it in a meaningful way, which in turn means it’s not amenable to hard science, so it cannot be “engineered” in the traditional sense of the word.

Thus you also cannot decide if what someone is espousing is sense or nonsense, in either their or your own terms of reference, because you cannot characterise it and come up with meaningful methods of test: you have no meaningful measures with which to work or, importantly, with which to record outcomes as meaningful observations from which meaningful deductions can be drawn.

Security is thus devoid not just of meaning but of science, and thus it cannot be engineered; so the expression “security engineer” is likewise devoid of meaning, and is just another name for something that is itself devoid of meaning.

Another name is “charlatan,” and this has a much more widely accepted meaning:

Charlatan Noun: A person falsely claiming to have a special knowledge or skill; a fraud

Would anyone care to comment how to differentiate “security engineer” from “charlatan”?

Because if “we” as “security practitioners” cannot do it meaningfully and in an easy-to-explain way, then how do we expect the rest of humanity to differentiate the two?

Or how do we stop other ne’er-do-wells, such as politicians and those looking for appropriations from them, from taking advantage of the term to their benefit, not ours?

@ boog,

As a point of interest, the main difference between an “amateur” and a “professional” in the past has been whether you got paid to use those abilities/skills or not, usually as a primary or significant source of income. It was not a measure of skill or proficiency at them.

At one time this distinction used to be quite important, because if a sportsman was paid he was considered to have become, or “turned,” professional, and was then not allowed to compete in the Olympics or many other sporting events.

If you care to look up both amateur and professional, you find,

Amateur Noun: A person who engages in a pursuit, esp. a sport, on an unpaid basis

Professional Noun: A person who engages in a profession.


Profession Noun: A paid occupation, esp. one that involves prolonged training and a formal qualification.

Oh and we used to use other terms to differentiate those who were paid or not whilst learning, one of which is,

Apprentice Noun: A person who is learning a trade from a skilled employer having agreed to work for a fixed period at low wages.

I’ll let you look up the appropriate antonyms.

Preston Tollinger August 28, 2012 9:31 PM

The problem is not so easily solved by even having a formal, recognised profession. I had the experience of sitting on the Traffic Commission of a small California town in the south bay. The goal of the volunteer commission was to advise the city council on traffic related matters. The city, of course, employed several traffic engineers and outside consultants but the goal here was to help filter ideas via citizens views as well. Unsurprisingly this kind of position generally attracted engineering types or at least those who had that mindset. This was good because traffic is a field much like security, where the ‘obvious’ right answer is often not right. In general, we worked well with the traffic engineers because we recognised their expertise. We could come to accept that lowering speeds didn’t necessarily make the roads safer, that painting a crosswalk could actually increase the chance of an accident and that even if there was an accident on a specific corner, it doesn’t mean it is the highest priority item to fix right now.

The traffic commission was eventually disbanded because it was useless. None of our recommendations, even backed by the traffic engineers and city staff, made it past the city council unscathed. People could not get past their intuition or the ‘obvious’ correct answer even when presented by a trained and certified traffic engineer. Residents would argue passionately, city council would agree and intuition would win.

This is clearly just one example, but it feels representative of the problem in all the professions where a layperson can feel like they can do as well as the professional. We never found a way around this problem in our city, at least for traffic, and I fear the same problem will occur even with a certified security engineer advising them.

Anonymous10 August 28, 2012 10:06 PM

I don’t believe there is such a thing as security engineering, at least in the context of their discussion. Engineers and hard scientists are distinct from other professions in that A) they can prove mathematically that a system will work, provided that its subsystem tolerances are within some defined range, and B) they conduct controlled experiments that are repeatable, such that all engineers conducting the experiment get the same result every time. Bruce’s racial profiling arguments fail both tests for being engineering.

S Taylor August 28, 2012 10:07 PM

People are the only security problem. From securing a server to knowing how to detect social engineering, people fail to deliver good quality, fail to cooperate, create bureaucracies instead of fixing issues and so forth. This thread is right on the mark and several of the comments I have read make me believe that others think the same.

Wael August 28, 2012 11:56 PM

@ Clive Robinson

Well for a start split the term into it’s to parts “security” and “engineer” and ask yourself what the meaning of each word is in concrete not ephemeral or abstract terms.

Well, “security engineering” is not unique in that respect. Sometimes splitting a two-word phrase into its individual components yields results that don’t make sense. Another example is “random variable” in probability: it’s neither random, nor is it a variable.

AC2 August 29, 2012 12:35 AM

Bruce, I can appreciate your sentiment about learning how to talk about security engineering to a non-technical audience.

In a democracy this is a necessary prerequisite to any kind of push back against the forces lobbying for ‘security’ under their own agenda. Because unlike some areas like particle physics or neurosurgery or microprocessor design, ‘security’ is a societal endeavour, where everyone participates willingly/unwillingly, knowingly/unknowingly. Certifications for security engineers aren’t going to get us anywhere.

But I do not expect much chance of success in this endeavour. As several posters have pointed out, most people don’t care and most of those that do think that their ‘instinct’ is good enough to make correct decisions. And because there usually isn’t a feedback loop that affects decision makers if their decisions are wrong the bias will continue. If and when the decision makers are confronted with the failure of their decisions the standard response is ‘this is a new attack not seen before’/ ‘we will learn from this setback’ etc.

Re your second point that we need to convince policy makers to follow a logical approach instead of an emotional one… Well they are usually taking a logical approach already, although one based on the monetary benefits to themselves rather than benefits to the group they make decisions for.

AC2 August 29, 2012 12:39 AM

I fear people like Bruce and others will continue to be ignored by people and policymakers alike irrespective of the value of their points, a bit like the economics Nobel winner Krugman…

Justin August 29, 2012 2:39 AM

Well, I don’t think it ever was about security. That was never the goal of TSA airport screening. It always was about offering teddy-bear comfort to people who are emotionally scarred for life from seeing images of planes crashing into buildings on TV. It’s all for show. It would cause a public outcry if a 4-year-old white girl were searched, or if a Middle Eastern man in a turban were allowed to board without the “full treatment” (never mind that people will use their children to hide contraband).

This Sam Harris is quite the character. How ironic that he started some organization supposedly “… to encourage critical thinking and wise public policy … with the purpose of eroding the influence of dogmatism, superstition and bigotry in the world”! Unfortunately his own fear of death seems to be such that it prevents him from thinking critically about life, not to mention his rather irrational fear of others’ beliefs in Paradise.

Jack Jack August 29, 2012 7:08 AM

Competent security engineering is important, but this discipline will do nothing to change the minds of fear-driven know-nothings like Harris, who continues to bury his head and spew nonsense:

many readers who took my side in the debate—including those who have worked in airport security, U.S. Customs, the FBI, Delta Force, fraud detection, and other areas where real-time threat assessments must be made. I also received unequivocal support from Saudis, Pakistanis, Indians, and others who are regularly profiled. …

no one does airline security better than the Israelis (even Schneier admits this). But, as I pointed out, and Schneier agreed, the Israelis profile (in every sense of the term—racially, ethnically, behaviorally, by nationality and religion, etc.). In the end, Schneier’s argument came down to a claim about limited resources: He argued that we are too poor (and, perhaps, too stupid) to effectively copy the Israeli approach. That may be true. But pleading poverty and ineptitude is very different from proving that profiling doesn’t work, or that it is unethical, or that the link between the tenets of Islam and jihadist violence isn’t causal.
Schneier conceded that the most secure system would use a combination of profiling and randomness. He simply argued that profiling for the purpose of airline security is too expensive and impractical. But I am not being vilified because I advocated something expensive and impractical. I am being vilified because my critics believe that I support a policy that is shockingly unethical, well known to be ineffective, and the product of near-total confusion about the causes of terrorism.

At least the blowback to Harris’s racially motivated approach is measurable. [Hi MR.]

phred14 August 29, 2012 8:45 AM

There are other mindsets, and sometimes those other mindsets can be sensible.

There is a lot of discussion here about the “security mindset”, and I’ll agree that here and in these areas the “security mindset” is viable and valuable. But there are other mindsets, and in other areas those mindsets can be viable and valuable, as well. What can be interesting here is where those other mindsets intersect with the security mindset.

I’m a chip designer, with over 30 years experience. Especially since getting into CMOS, every chip I’ve ever worked on has had some sort of “back-door”. It’s also called “debug-ability”, and it’s saved our hides countless times. We’ve had new chip designs arrive DOA, and used our back-doors to bring them to life, coming up with workarounds to do thorough characterization, and refine our design for the next pass.

Problem is, you never know when you’re done until you are, so it’s not like there is ever a real opportunity to remove the back-door. You’re done when the design is declared good, or at least good enough. At that point, you don’t touch it any more, unless you absolutely have to. (meaning that it wasn’t really done, before)

So the back-door remains forever in the design. This is where it intersects with the security mindset. Usually our back-doors use a different interface, which isn’t brought out to the normal bonding pads. But there are more formal back-doors, used more formally for test purposes, such as JTAG and scan paths. Some of these ARE available in the final product.

I’ve started seeing stuff recently about chip-level security, detecting counterfeit chips, and even one reference about using JTAG as an illicit back-door. It’s worth remembering that those back-doors are always there, and very likely much simpler to crack into than cracking your way into the front door.

Jay August 29, 2012 9:28 AM

Not sure about Doctorow’s idea of a war on general-purpose computing. I think the war on general-purpose IO is scarier. And more losable.

HDCP might suck as a cryptosystem, but it does successfully restrict video I/O to the approved list of manufacturers. And high-speed buses and parts only available in BGA form add their own restrictions. Extrapolating, if your Arduino or built-from-discrete-logic computer can’t ever talk to a Blu-ray drive or chemical synthesizer, what do They care what computations it can run?

boog August 29, 2012 10:24 AM

@Clive Robinson

As a point of interest the main difference between an “amateur” and a “proffesional” in the past has been if you got paid to use those abilities/skills or not… Not as a measure of skill or proficiency at them.

While contrasting the word “amateur” with the word “professional” might seem to demote my suggested definition in terms of significance, I would hardly call the point, as you say, interesting. Nevertheless, I will stubbornly insist that the definition I suggested still makes more sense in the context of the essay.

Feel free to disagree with me if you like, but I refuse to debate the intended meaning of “amateur” any further, as continuing to argue over what Bruce meant by one word in one sentence in one paragraph of an entire essay is essentially putting words in his mouth, and is almost as useless an exercise as analyzing the meaning of the term “security engineer”.

stvs August 29, 2012 11:10 AM

Doctorow’s article explains that Delta was simply following Harris’s recommendation to “tolerate, advocate, and even practice ethnic profiling”:

It turns out that Delta has a pattern of removing brown people from its airplanes when its pilots and passengers evince thinly veiled (or obvious) racist fears, too.

Maybe Doctorow could design a Muslim profiling t-shirt too.

Wynn August 29, 2012 2:59 PM

@JP: Quite possibly the most succinct encapsulation of the state of our industry today I have seen.

The fissures of infinite depth in this Mandelbrot set of CIA/P problems we deal with are obscured by the key performance indicators used in business, which fail to articulate the complex interdependencies in that analogy. The business value lies in achieving the real goal of business, which is not to be “secure”, but merely to avoid “gross negligence” and achieve “reasonable practices”. These are the institutional and social pressures that apply to decision makers.

This is the generic standard minimum, beyond which the business case evaporates except in specialized business processes. However, the security of those business processes cannot be modularly assembled without underpinning baselines, which establish an effectiveness ceiling for the dependent processes.

Building from the foundation up with independent and exclusive (but redundant) functions is a very difficult business case to make. When describing it in the abstract to executives, the inevitable question is “isn’t this the same as ….” to which the answer is “yes but…” and the argument is lost. Ops execs are trained to root out duplication and eliminate it to preserve business margins or budget.

Only mandatory controls established by a vertical segment, together with a healthy incident-response and containment strategy, are going to be reasonably effective. Again, this standard of reasonableness will seek to eliminate a state of “gross negligence”, the stuff to which financial liabilities stick. This is why the larger risk-management arena is an art with, perhaps, an engineering subdiscipline.


JP August 29, 2012 8:18 PM


Thank you.

I would counter that stringent mandatory controls are one approach, but may not be the most effective one. Business will fight these controls with political power, watering them down or delaying them forever while working within vertical industry segments.

An alternative approach is to expand the business risk of not having effective security. Business has the usual four options for dealing with the added risk: accept it and believe they are adequately protected, transfer it to a third party, avoid it by exiting the business, or minimize it with better security.

Accepting is the initial approach for smaller businesses, as they can’t afford the alternatives. Few businesses are willing to sell off to avoid the risk. Transferring can be very expensive and is often ineffective. Minimizing risk with better design and resources is the approach for larger businesses.

How to place the additional liability on the business is its own political question, and unlikely to happen soon. When done, it will have holes. However, the threat of a $10MM business loss is a far greater incentive than paying a $100K fine for failing to follow stricter regulations. Further, it can be much more visible to customers and to the stock market, and that risk drives reaction by businesses.
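The incentive arithmetic above can be sketched in a few lines. Every number here is an illustrative assumption taken from or suggested by the comment, not a real breach statistic:

```python
# Hypothetical expected-cost comparison; all figures are illustrative.
p_breach = 0.05            # assumed annual probability of a major breach
breach_loss = 10_000_000   # the "$10MM business loss" from the comment
fine = 100_000             # the "$100K fine" for non-compliance

expected_breach_cost = p_breach * breach_loss

# Even at a modest breach probability, the expected liability (500,000)
# dwarfs the flat regulatory fine (100,000), so liability exposure is
# the stronger incentive to invest in security.
print(expected_breach_cost > fine)
```

The point of the sketch is that a liability regime scales with the size of the loss, while a fine is a fixed, bounded cost that a business can simply price in.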


Muhammad Naveed Khurshid August 29, 2012 8:33 PM

Three months ago, I watched a video broadcast on Russia’s RT: Confession: Infiltrator who spied on Muslims reveals FBI techniques.

Think about what the FBI and intelligence agencies have done to Muslims… Imagine if somebody did this to Christians, Jews, atheists, agnostics, Buddhists, or people of other faiths.

What a shame. What sort of impression have they left on Muslims around the world? American members of Congress talk about freedom of expression, freedom of speech, freedom to practice religion; in fact, they just talk about “FREEDOM”. I don’t know whether they really understand the meaning of “FREEDOM”. They should be questioned about it. And what about Mr. Obama, a Juris Doctor and graduate of America’s elite Harvard University? It is no surprise that he attended Harvard Law School; in fact, it shows the real standard of Harvard. Whatever he is doing, he is representing his alma mater. He is the President, and he gives the impression that he is unaware of what the FBI is doing to Muslims. To me, it is impossible that he has no knowledge of those dirty practices of the FBI… As a matter of fact, he did not stop the FBI from carrying them out.

It is not that Russia has adopted a propaganda campaign against the American FBI; in fact, they uncovered the reality. I don’t know what will happen to that informant, but for Muslims he is a hero…

Now let me talk about the real security issue faced by the FBI. There is a security threat that more former FBI informants will speak out like him… This is in fact a property of hiring humans: at the end of the day, FBI agents are humans like us, which is a hidden vulnerability…

Does the FBI really ask its new employees to sign a bond that they will not disclose any secrets or tactics? If it does, then I am sorry to say the bond is worthless, because former agents are violating it. Where is Mr. Obama now, the Juris Doctor and Harvard Law School graduate? Where are the law and the courts now? :):):) A former FBI informant who was protected under the National Security Act has become a threat to his own nation…

It reminds me of a song: Linkin Park – Points of Authority. Read the lyrics and understand their meaning. BTW, this song is dedicated to all Americans…

Clive Robinson August 30, 2012 12:22 AM

@ phred14,

It’s worth remembering that those back-doors are always there, and very likely much simpler to crack into than cracking your way into the front door.

I’ve worked on many forms of electronics in my career, and I cannot think of any system that did not have “test points”. Even “ultra secure crypto kit” has “pads and points” that can be probed.

Sometimes they are innocent and required, sometimes not, and sometimes they are stupid mistakes. For instance, a piece of top-line cipher equipment used for diplomatic-level traffic and using “stream encryption” had a health/status LED on the front panel. It was driven directly from the stream generator. Being an LED, its response time was considerably faster than the maximum bit rate. The result was that a suitable photodiode/phototransistor, with a very simple circuit, could reproduce the key stream…

And that’s not the worst of it: designing secure equipment is a hard job, much, much harder than designing a non-secure product.

The reason is that not only does a new secure design have to do what it is supposed to do as its primary function (as with all designs), it also has to go the extra thousand miles to ensure that it ONLY does what it is supposed to do, and nothing else that would make it insecure, both in terms of what it emits and what it is susceptible to.

The skills required to do this are significant and usually well beyond most design engineers without substantial extra knowledge and experience that is difficult to acquire (and no, “TEMPEST Certified” does not cut it sufficiently well these days, but it’s a start).

Anonymous10 August 30, 2012 12:41 AM

I think Bruce definitely won the debate with Harris. However, security analysis, especially in the public domain, only gets you so far and can lead to almost completely opposite policy decisions. Bruce’s bottom-line recommendation is that we basically go back to pre-9/11 security procedures and that anything more is security theater. He bases his case on several unproven assertions: A) that forcing terrorists to change tactics is useless, when in fact security could be forcing them to use less effective tactics;
B) that terrorists don’t care what they blow up (he must have some inside source on how terrorists choose their targets);
C) that it’s impossible to predict which tactic and target terrorists will try next, when I thought that was part of the purpose of our law-enforcement and intelligence agencies;
D) that any screening system with less than 100% effectiveness is useless because terrorists will try every day until they succeed (again, he must have some inside source on terrorists’ tactics and procedures not available to the rest of us).

Danny Moules August 30, 2012 5:56 AM

“Another example is “random variable” in probability. It’s neither random, nor is it a variable.”

@Wael My (limited) understanding is that a ‘random variable’ is a dependent variable with a function, either real or not. Generally speaking, in variable nomenclature, you refer to a variable by a semantic description of its contents. If the contents are randomly determined, then you could call it “randomly-determined variable” but “random variable” is as good as; in the same way you don’t refer to an “integer variable” as an “integer-typed variable” (unless you hate your free time). It’s just short-hand. Maybe ‘fuzzy variable with a function’ would work better :p
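One way to see why “random variable” is such an awkward name: it is really a deterministic function on outcomes, and the randomness lives in the experiment that selects the outcome. A minimal sketch (the die-and-payoff mapping is my own hypothetical example, not anything from the thread):

```python
import random

# Outcome space of the experiment: the faces of a fair die.
outcomes = [1, 2, 3, 4, 5, 6]

def X(outcome):
    """The 'random variable': a deterministic mapping from outcomes
    to real numbers (here, the payoff of a hypothetical bet)."""
    return 1.0 if outcome >= 5 else -0.5

# Chance enters only when the experiment is run, not in X itself:
sample = X(random.choice(outcomes))

# Expectation under the uniform distribution over outcomes:
E = sum(X(o) for o in outcomes) / len(outcomes)
print(E)  # 0.0 for this particular payoff mapping
```

So the name describes neither the object (a function) nor its behaviour (fixed once defined); “randomly-evaluated function” would arguably be closer, which is the shorthand point made above.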

phred14 August 30, 2012 11:33 AM


This is why I called it a “mindset” issue. When I put my back-door in, my mindset is, “How am I going to debug this chip, no matter how dead the published interface may appear to be?” That goes hand-in-hand with the test/characterization guy’s mindset, “How am I going to find ANY defects in this chip, no matter how much I may have to cheat to do so?” Actually it’s a bit worse than that, because the “test/characterization” mindset is really kind of like the “security” mindset, in that he’s thinking, “How am I going to prove that this design is really broken and needs fixing?”

I have some “security mindset” on a hobby basis, but I can’t let that get in the way of getting to a working design. I once put a “test mode combination lock” on a design, but that was meant to prevent customers from accidentally invoking our characterization hooks. It only had maybe a dozen bits of key, and wasn’t meant to stop a determined attacker.

Nor is there ever a design pass to remove the back-doors – aka test/characterization hooks. Many of them are used during the course of ordinary manufacturing test. Besides that, we never know we’re done until the test/characterization people tell us we are, and at that point we don’t touch the design. There is also that ever-present fear that something will turn up later, and you will need to have every tool at your disposal to find the root cause.

It’s a battle of the mindsets, and as much as this site is dedicated to the security mindset, it’s necessary to understand that such “intentionally insecure mindsets” have their appropriate roles. There is not enough work on reconciling and coping with this little fact.

Nancy Drew August 30, 2012 3:56 PM

am·a·teur (noun)

  1. A 15-year old kid who uses a 4-penny nail to beat the billion-dollar “security” system built by “professionals”.

Clive Robinson August 30, 2012 4:29 PM

@ phred14,

It’s a battle of the mindsets…

Oh, how I wish it were not. But mankind has had a long existence with “dual use” technology, from the stick that became a club to help subdue prey for food, which also kept other members of the tribe, or rival tribes, at bay or removed them. Few technologies or weapons are either just bad or just good; it’s how we use them, and how others use them, that dictates that.

Part of the security mindset is to look at things differently: you see a newspaper and you think “how can that be used to hurt me/them”, not because you actually wish to cause hurt or be hurt, but because it’s always good to have an awareness of such things and be prepared.

Part of the design mindset is to look at existing designs differently: you see a car and you think “how can that go wrong”, not because you actually wish to cause hurt, but because it helps you think about how to make things better and safer.

As such, the mindset is like flipping a coin: heads, you are looking to solve problems; tails, you are looking to create them; the rest of the process is the same. It’s been said that a hunter only becomes good when he can think and feel like the prey; it’s a question of adopting the mindset to improve your knowledge and understanding and be better at what you do. Good or bad, the process is fairly agnostic to intent.

As you say, the “test harness” stays in for many reasons, but there is one we tend not to consider or think about, and that is “complexity management evolves the design towards the management mechanism”.

As systems become more complex they become increasingly difficult to test; the usual solution is to compartmentalise the design, in effect breaking it into smaller, more manageable chunks and putting a well-controlled interface between chunks.

That is, in the simplest case, complexity can be viewed as the number of relationships possible between the parts, which is normally given as 0.5(n^2 - n). Thus, to beat the 0.5(n^2 - n) problem, the trick is to reduce the number of parts n in any given area; as the dominant factor is n^2, great gains can be made from small changes. This reduction of relationships is usually achieved by adding the choke point of a controlled interface.

It just so happens that when you do this you get other benefits, which is one reason pipelining improves performance. By reducing complexity in some areas, at the small cost of adding more complexity in the form of controlled interfaces, you get the benefit of faster throughput in each stage and therefore greater throughput overall, at the expense of a longer overall delay and an actual increase in complexity, in a “bounded” way.
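The 0.5(n^2 - n) arithmetic above can be made concrete with a quick sketch. The part counts (40 parts, 4 chunks, 3 interfaces) are arbitrary illustrative numbers:

```python
def relationships(n):
    """Possible pairwise relationships among n parts: 0.5 * (n**2 - n)."""
    return n * (n - 1) // 2

# One monolithic design of 40 interacting parts:
monolithic = relationships(40)        # 780 relationships

# The same 40 parts split into 4 chunks of 10, joined in a pipeline
# by 3 controlled interfaces between adjacent stages:
chunks = 4 * relationships(10)        # 4 * 45 = 180
interfaces = 3
partitioned = chunks + interfaces     # 183

# Compartmentalisation cuts the relationship count by roughly a factor
# of n, at the cost of the extra (bounded) interface complexity.
print(monolithic, partitioned)
```

The n^2 dominance is what makes the trade worthwhile: halving the size of each compartment roughly quarters its internal complexity, while each added interface costs only a constant.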

Fairly soon, however, the design becomes reliant on the extra complexity of the controlled interfaces, and the design is in effect trapped into using them.

It’s the same with test harnesses: when you put them in, they become a “standard part” and you design to the interface they provide. At some point the test harness becomes an integral part of the design, and the design then evolves around the test harness’s requirements…

At that point, taking the test harness out becomes not just difficult but problematic, and in respect of desired functionality, removing it gains no benefit; it actually costs. So the incentive is to leave it there.

Now let’s look at it from an attacker’s point of view…

In cryptanalysis you are initially presented with an unknown problem; thus you first have to “know the system”. It is known that any regularity in the design aids getting to “know the system”. And further, it is known that standard message formats provide “cribs” that give “probable plaintext” by which individual messages can be tested.

If you think about it, cryptanalysis is a specific example of a generalised approach to attacking systems. Getting to “know the system” in cracking/pentesting is usually called “enumeration”. And while they might not be called “cribs”, the same applies to anything with a standard form.

Thus we appear to have a problem: “that which makes the design process easier also makes the attacker’s task easier”… But does it?

The answer is actually NO: with appropriate knowledge and skill you can get the benefits without the disadvantages. But this knowledge is very hard to come by, and very few design engineers currently possess it or the accompanying skills, or for that matter want to possess them, as currently “there is little or no reward for having them”; their time can be more profitably spent learning other skills that actually pay dividends.

As such it’s a self-perpetuating problem. Resolving it requires employers to put a premium on those anti-attacker skills, but that is never likely to happen until customers demand products that require those skills, and in turn that will only happen when customers know it is possible to have such products and know they will get significant benefit from them.

boog August 30, 2012 5:29 PM

@Nancy Drew

am·a·teur (noun)

  1. A 15-year old kid who uses a 4-penny nail to beat the billion-dollar “security” system built by “professionals”.

  2. People who think the quality of their security solutions is reflected by the number of digits they put on the price tag; also, the people who believe them.

RobertT August 30, 2012 9:05 PM

I’ve hacked a few competitors chips by simply putting myself in the place of the Design Manager / Lead engineer, and asked myself the question,
What test/characterization/debug interfaces would I want, if this were my chip?
Most chip-industry outsiders don’t understand how much of a gamble a modern SOC with 60M transistors really is. So there is no way you would ever leave out testability and debug hooks, regardless of what lies you need to tell the final customer.
BTW if you did leave out these “back-doors” chances are you would never be able to bring the product to market, so it would not matter how secure the chip was.

Steve September 1, 2012 8:34 PM

I’ve read one of Dr Harris’s books, The End of Faith, and my general impression is that he is a borderline fanatic when it comes to his dislike of Islam.

While I’m an atheist myself, I find his views on Islam to be repugnant in the extreme, going far beyond simply disbelieving and verging on a call for eradication. If you substituted “Jew” for “Muslim” in his writing he would be shunned as a bigot.

Wael September 1, 2012 11:34 PM

@ Danny Moules

understanding is that a ‘random variable’ is a dependent variable with

The mathematical definition of a “random variable” is a mapping from a set of outcomes to the real numbers. So I would say, in your style, that a random variable has both a dependent and an independent variable. My applied-math PhD friend is visiting me this weekend, and I ran this by him…

Clive Robinson September 2, 2012 5:50 AM

@ Danny, Wael,

Can you both indicate which “flavour” of “random variable” you are talking about? It has different meanings to different people at different times; sometimes they prefix the term with another word to indicate the flavour they are using, and sometimes it’s clear from the context in which the term is being used (neither case holds here).

Very loosely, all flavours of random variable are “elements” whose “state” on any particular experiment/try is arrived at purely by chance. The outcome of the state over many experiments/tries has a probability distribution over a set or range of states.

The reason for not using “values” above is that random variables need not be numeric in nature; they could be physical states, or elements of spoken or written language, etc. These non-numeric elements are usually amenable to enumeration in some way, even if only by an indexed list and associated probabilities.

As I’ve indicated in the past, it is very, very important not to confuse the number of states with the probability each state has.

For instance, a 747 has four engines, and the state of each engine could be the binary choice of “fully functional” or “not fully functional” [1], in which case, as there are four engines, the total number of states for the 747 in this discussion is 2^4, or 16 discrete states. However, each of these states has a probability of occurring. This way of looking at the 747 treats it as a “discrete random variable”.
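The 16-state view can be enumerated directly. The per-engine probability below is an assumed illustrative figure, not a real reliability number:

```python
from itertools import product

p_ok = 0.999  # assumed probability that one engine is "fully functional"

# Each engine is binary, so four engines give 2**4 = 16 joint states.
states = list(product(["fully functional", "not fully functional"], repeat=4))

def prob(state):
    """Probability of one joint state, assuming independent engines."""
    p = 1.0
    for engine in state:
        p *= p_ok if engine == "fully functional" else (1 - p_ok)
    return p

print(len(states))  # 16
# The 16 state probabilities together cover all outcomes:
print(round(sum(prob(s) for s in states), 9))
```

Note how the state count (16, fixed by the structure) and the probabilities (set by p_ok) are entirely separate quantities, which is exactly the confusion warned against above.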

Now you could decide that, since an engine that is not fully functional may still be usable, the binary choice of engine state is inappropriate, and instead assume each engine has a range of states from fully functional down to fully not functional [1]. The state of each engine then lies on a curve, and is thus a “continuous random variable”.

Now the problem with the “continuous” view is that our 747 has a very complex state: a complex product of the continuous states of the four engines, which might be directly comparable individually but which, due to their positions on the wing, have different probabilities in the final state of the 747.

Back when the 747 was designed, we did not have the computing power to model the complex state based on the continuous random variables attributable to each engine, so the initial design process would have used the 16-state model of the discrete binary view of the engines.

However, as the design process progressed, the 16-state model would have been augmented with multiple states for each engine, selected via a lookup table indexed not by the number of a state but by the probability of the state. And fairly quickly the number of states would grow so large that other (Monte Carlo) methods would be used…

Now, the function that maps the probability to a state has a “distribution”, and this is sometimes used as part of the name of the element; there appears to be only limited agreement on how to do this (I’ve heard people say “a normal continuous random variable” and “a continuous random variable of normal distribution” and mean the same thing, and sometimes not…).

Now there is a problem: a continuous random variable’s function can quite easily be (and with physical objects often is) discontinuous in nature (the simple example being a load on a chain: as the load is increased, the chain starts to stretch until it breaks). The simple nomenclature for “random variables” then becomes difficult at best; you could say “it jumps the rails”.

But… this has real-world knock-on effects. Not all discontinuous functions need have catastrophic real-world effects, so the system used to model them needs to allow the region around the change to be dealt with at increased sensitivity. Thus you need to be careful you don’t “run out of bits”.

But there are further problems with real-world mapping, namely “hysteresis” and “lag”. The normal assumption with “random variables” is that they are “memoryless”: it does not matter in which direction you traverse the probability curve, or how fast; the mapping remains the same… This is almost completely at variance with the real world, and for various reasons both engineers and mathematicians “pretend” these effects don’t exist by limiting the scope of the model in some way… Maths says an infinitely thin beam can be infinitely stiff, but reality has very different ideas, and we get oddities.

Both hysteresis and lag give rise to frequency dependencies, and thus oscillatory conditions, in the real world, which is why engineers have “stability criteria for operation”, or filter the inputs/feedback in some way so you get “unconditional stability in operation”…

Then you have to remember that physical objects do have “memory”: a beam will bend under load and, if the load is not excessive, return to its original state. Above a certain load, however, you exceed the “plastic limit” and the beam does not return to its original state. Your “random variable” then becomes a “random active variable”, and you can see this in “Catastrophe Theory”, which encompasses such terms as “tipping point”, “avalanche effect”, “domino effect”, “snowball effect”, “butterfly effect”, etc. [2].

But while complex mathematical models can take some of this into account, it becomes problematic at best. So other theories have arisen to deal with it (to a limited extent). One such is “chaos theory”; in pop culture it is normally talked of as being the result of the butterfly effect, or high sensitivity to input conditions. However, there are many systems with high input sensitivity that are not chaotic in behaviour. Thus other conditions are required, one of which is “topological mixing”, which uses “strange attractors” or Julia-set repulsors. However, this in itself is insufficient to guarantee chaos. You can view those sloped nail boards at funfairs as a field of static repulsors into which you roll your ball, but you usually get a 50/50 chance of movement left or right, so the output is in effect a normal distribution: the ball is in effect memoryless and is just a random variable, not a random active variable. Real chaos requires either or both of the “particle” and the “attractors”/“repulsors” to have memory and change actively; in effect we can never see it, because it also requires the lack of external influence. For instance, Brownian motion would be an ideal candidate were it not for the external effect of gravity causing the more active, and thus less dense, areas of liquid to be less attracted towards the gravitational source…

Anyway, Brownian motion reminds me, via Douglas Adams, that my breakfast cup of tea is cooling, and thus improbability is reducing to the normality that requires it to be drunk or microwaved 😉

So to recap: “random variable” is often used in a way that makes its meaning almost incomprehensible outside a given context. At its simplest it means that “chance” makes foreknowledge of the outcome of the probability function on any given try/experiment unknowable, just as you would expect with a perfect coin or die.

[1] It is important to note the difference between “not fully functional” and “fully not functional” in the two examples as this small difference in wording has significant effects on the number and type of states to be considered.

[2] It is important to remember that these terms also have different meanings to different people. For instance, cryptography has borrowed “avalanche effect” from engineering, and information theory has borrowed “entropy” from thermodynamics. In each case the implications of the terms, especially around edge cases, are very different.

Wael September 2, 2012 12:59 PM

@ Clive Robinson, @ Danny Moules

It wasn’t my intention to dwell on a non-security-related definition. I was only giving an example relating to Clive Robinson’s analysis:

Well for a start split the term into its two parts “security” and “engineer”

where decomposing a two-word expression into its components may not produce the expected meaning. I originally thought of “eggplant”, “pineapple”, “grapefruit”, “American Indian”, but I chose something different. Anyway, I was going to drop the subject until my friend came over. We had a cup of tea together (the mints did not look good, Clive, so we didn’t add them). Some Middle Easterners put mint in their tea (Egypt, Morocco, Tunis); where my friend comes from (another Middle Eastern country: Syria, Jordan, Palestine), they often add sage to their tea, but I’m not too keen on that flavor. Try adding fresh cardamom to your tea next time; it’s good for your heart, and tastes good, too…

Anyway, I talked to him about three subjects relating to this blog:

  • What a random variable means.
  • This is an example where Wikipedia is wrong in “wording” the first few paragraphs. Flavor is the formal mathematical definition:
    Weisstein, Eric W. “Random Variable.” From MathWorld–A Wolfram Web Resource.
    Random Variable
    A random variable is a measurable function from a probability space into a measurable space known as the state space (Doob 1996). Papoulis (1984, p. 88) gives the slightly different definition of a random variable X as a real function whose domain is the probability space and such that:
    1. The set {X ≤ x} is an event for any real number x.
    2. The probability of the events {X = +∞} and {X = −∞} equals zero.
    The abbreviation “r.v.” is sometimes used to denote a random variable.

    Incidentally, my friend’s perception of what “Security Engineer” means is the following:
    “A person whose job is to make sure breaking a system is a difficult task for the opponent”. Keep in mind that he is not a “security person”…

  • The second topic we discussed was related to @ Bruce Schneier’s post:

    “Don’t you know you’re always supposed to change doors when given the opportunity.”

    I told him I understand the theory and the justification, but in my view it’s incorrect. It was a long discussion, but we both agreed it makes no difference whether you change doors or not. It is still justifiable with conditional-probability reasoning, though in our view that is incorrect (unless there is a long previous history sample and the problem is treated as a Markov chain) – details left out 😉

  • The third subject we discussed was: Is it better to change the password, or leave it static?
    I was basically asking him to look at that problem from a probabilistic point of view, assign probabilities, and calculate expectations. I was too lazy to do that. He was also reluctant, and said the best answer is: “Do what works for you”…
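On the door-changing question in the second item: whatever one makes of the disagreement, the standard setup is cheap to simulate. The sketch below is my own, not anything from the discussion, and it assumes the classical rules (the host always opens a non-chosen, non-winning door); under those rules, switching wins about 2/3 of the time.

```python
import random

def monty_hall(trials, switch, seed=1):
    """Win rate for the three-door game under the standard host rules."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        prize = rng.randrange(3)   # door hiding the prize
        pick = rng.randrange(3)    # contestant's first choice
        # Host opens a door that is neither the pick nor the prize.
        opened = next(d for d in range(3) if d != pick and d != prize)
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == prize)
    return wins / trials

print(monty_hall(100_000, switch=True))   # ≈ 0.667
print(monty_hall(100_000, switch=False))  # ≈ 0.333
```

If the host's behavior differs from these rules (e.g. he opens a door at random and might reveal the prize), the 2/3 result no longer holds, which is exactly why the assumptions matter.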
Anonymous10 September 3, 2012 12:53 AM

I’m also skeptical that being an expert in one aspect of security, say cryptography, means that you’re an expert in overall computer security. To take that a step further, I’m skeptical that being a computer security expert makes you an expert in computer security policy. Bruce is a genius in cryptography and an expert in computer security, but some of his views on computer security policy tend toward the ridiculous (having cyber-war treaties enforced with verification schemes where a country has to attempt to prove it doesn’t have any hidden flash or hard drives that might stockpile viruses).

Clive Robinson September 3, 2012 7:36 AM

@ Anonymous10,

I’m also skeptical that being an expert in one aspect of security, say cryptography, means that you’re an expert in overall computer security.

The way you have worded it, the answer most people would expect is no.

However, in any field of endeavor you have two extremes of expert, “The Generalist” and “The Specialist”; somewhere in between is where you will find the likes of “Renaissance Man” and polymaths.

So there is no reason why a generalist should not also be a specialist in certain subjects within the field of endeavor. Nor is there any reason why the skill set of a specialist in one domain is not transferable to other domains, or to the entirety of the field of endeavor.

So you should really be talking about the skill sets and outlook of individuals, and showing where a skill set or outlook is not transferable.

Oh, and it should be noted that whilst the lower levels of university education are designed to give a broad foundation, the higher-level “research” qualifications encourage “specialism” not “generalism”, which might account for why we have few true generalists at the higher levels.

Another consequence of this is that generalists tend to “break new ground” whilst specialists refine the quality of a particular area of endeavor and in effect improve the methods. Also, as has been pointed out to me, generalists trend towards “experimental” and specialists towards “theoretical” research.

Wzrd1 September 4, 2012 1:45 AM

@Clive Robinson,

I’d consider it slightly differently: a generalist considers the whole of a system of security, while the specialist may well ALSO be a generalist (as you said) AND perform specialist functions.
The generalist will also, by nature of being a generalist, call in specialists to achieve the level of security demanded of the organization.

Now, when addressing a target audience, one must adapt one’s address.
When speaking to end users, “security speak” isn’t a good idea. It alienates them, it confuses them, and the confusion tends to lose the message. In short, they ignore what confuses them.
So, when I address end users (I specialize in systems and network security and generalize in physical security of the NOC/server room), I’ll speak in terms of “not letting evil spirits into the network”, a colorful phrase that gets attention by being ridiculous, especially coming from a professional. I then move into the harms caused by simple errors of procedure, such as the proverbial thumb drive plugged into a networked system (I lived through one such debacle in 2008, but my installation was unimpacted). The distraction of a non-professional term tends to reinforce the lesson.
When dealing with the middle-management level, I’ll employ a bit of the “evil spirits” again, as it reinforces the message, with more detail and specifics.
When dealing with upper management, I rarely use the “evil spirits”, unless I encounter a genuine lack of knowledge; then I step it down a bit to get the target audience to comprehend what is going on and why. But I’ll also include metrics in detail, as requested (I insist on two-way meetings, with questions interrupting sessions within reason), including monetary ones.
I’ve addressed some corporate audiences, with good feedback. I’ve addressed far more governmental audiences, due to my career path.
Originally, I lived “at the sharp tip of the sword”. After I retired, I went into NA/SA admin positions, but was always security-vigilant. Later, I graduated into information security (in the DoD, it’s Information Assurance), as I was essentially doing that job already, but getting paid less.
I’ve always had the ethic that if I’m doing a job, I should ALWAYS be the expert. Steep learning curves are trivial to me; indeed, I rather love them. I end up “the Shell answer man” in rather short order. Learning is expected of an expert; an expert only learns more as he progresses. Otherwise, said expert isn’t an expert, but has stagnated.
To this very day, when I walk into a corporate office or branch, I evaluate everything, from the approach through the parking lot to the entry. Then the entry itself and making entry. Then the various doors, marked or hinted at/suggested/discussed. For the average worker will happily discuss ANYTHING unless trained NOT to discuss certain items.
It all comes down to teaching concepts to a target audience. It then comes down to REACHING the target audience.
Those who don’t know about security overall, which is the majority of the populace, won’t have a clue. So you need to educate them.
From the CEO/CIO/COO to the end user, ALL are a point of failure.

Lyndon September 5, 2012 1:49 AM

How strange; it’s as if Sam doesn’t believe that changing the rules will result in a change in criminal behaviour to attempt exploitation of the rules…

tz September 5, 2012 8:42 PM

Magicians and thieves (professional ones).

And others. Some things are skills that need a unique mindset. Amateurs, even very smart ones, fail because they ask smart but wrong questions.

But people want certification in silver-bullet manufacture, not thought-wars. They feel safe based on inconvenience or cost. Worse, they fear those who shatter the illusions.

Peter Gerdes September 15, 2012 5:42 PM

Doesn’t your very discussion with Harris prove that this is likely impossible?

Harris is far smarter than the average layperson, better trained and more inclined to accept conclusions from abstract academic arguments that contradict common sense, and above all willing to devote a large amount of time and effort to personally arguing with someone he knows to be a world-respected security expert. If you can’t convince someone under those conditions, what hope do you have of convincing the general populace, who will only give the issue a few minutes’ thought, will hear the issue in soundbite form mangled by the media, and are much more resistant to overturning their preconceptions based on abstract analysis?

As you keep emphasizing on this very blog, our intuitive analysis of probability is very bad. Having had to teach probability to college kids, I can assure you that we are unlikely (prior to the singularity) to ever reach a point where most adults have a sophisticated grasp of probability (even the ones who learn it mostly forget) and are willing to trust it over their intuitions.
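One concrete illustration of that intuition gap is the base-rate effect; the function and all numbers below are my own invented example, not anything from the thread. A test that is “99% accurate” for a rare condition still produces mostly false positives, which reliably surprises people reasoning by intuition.

```python
from fractions import Fraction

def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' rule: Pr[condition | positive test result]."""
    true_pos = prior * sensitivity
    false_pos = (1 - prior) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# Invented numbers: a "99% accurate" test against a 1-in-10,000 base rate.
p = posterior(Fraction(1, 10_000), Fraction(99, 100), Fraction(1, 100))
print(p)         # 1/102: under 1%, despite the "99% accurate" test
print(float(p))  # ≈ 0.0098
```

The arithmetic is trivial, but the conclusion (a positive result is still almost certainly a false alarm) is exactly the kind of thing intuition gets wrong.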

Unfortunately, terrorism is too big an attention draw to expect people to just overlook the matter and leave it to experts.

This seems like a nice dream, but an unlikely one.
