Schneier on Security
A blog covering security and security technology.
June 27, 2005
Interview with Marcus Ranum
There's some good stuff in this interview.
There's enough blame for everyone.
Blame the users who don't secure their systems and applications.
Blame the vendors who write and distribute insecure shovel-ware.
Blame the sleazebags who make their living infecting innocent people with spyware, or sending spam.
Blame Microsoft for producing an operating system that is bloated and has an ineffective permissions model and poor default configurations.
Blame the IT managers who overrule their security practitioners' advice and put their systems at risk in the interest of convenience. Etc.
Truly, the only people who deserve a complete helping of blame are the hackers. Let's not forget that they're the ones doing this to us. They're the ones who are annoying an entire planet. They're the ones who are costing us billions of dollars a year to secure our systems against them. They're the ones who place their desire for fun ahead of everyone on earth's desire for peace and [the] right to privacy.
Posted on June 27, 2005 at 1:14 PM
• 54 Comments
Didn't Marcus also write stealth.c? If I remember correctly, it was widely used by hackers to create rootkits and hide themselves once they broke into a computer.
"They're the ones who are costing us billions of dollars a year to secure our systems against them."
Does this mean that if hacking weren't a widespread activity, the systems wouldn't be secured? That truly sounds like a security joke...
I disagree that it is appropriate to "blame the user". This has long been an "out" for the community in both open and proprietary systems.
But these systems are far too complex for almost all users to understand. In fact, the level of thought I have to put into securing systems is substantial, and I'm a security researcher whose background is in systems.
What is really needed is a better understanding of how to build secure systems (not just to prevent buffer overflows).
Yes, it's possible through lots of effort to build a (relatively) secure system, but why should it be so hard?
"Truly, the only people who deserve a complete helping of blame are the hackers."
I often have the same thought, except substitute "dishonest people" for "hackers", and apply it to all security, not just online.
In other words, if all people were honest, our need for security would be greatly diminished. I don't think it would disappear altogether though.
Dishonesty is something that humans must live with. It's part of us, as much as honesty and the need for the systems we're talking about. So statements like the one Mr. Ranum made are utterly ridiculous and serve no purpose, in my opinion...
"But these systems are far too complex for almost all users to understand."
Blame marketing for pretending otherwise.
People can blame whoever they like, but it's everybody's fault.
Blaming Microsoft for our nation's poor security trade-offs is like suing McDonald's for making you fat.
On pg 3 of the interview (right above the last question), Ranum says
"And the results show: 80% of corporate desktops are infected with spyware, 15% of them are infected with keystroke loggers."
Where did he get that stat from? It seems wildly skewed to me.
wasn't marcus part of the l33t underground at one time???
"I disagree that it is appropriate to "blame the user". This has long been an "out" for the community in both open and proprietary systems."
I have lost track of the number of times I have had the "It hurts when I do this!" "Well don't do that!" conversation with 'users' that would not listen to advice about security.
It isn't their fault that their computer is, by default, insecure. It *is* their fault when they bypass or ignore the warnings given to them by their virus/spyware/etc software, and *still* run that virus/malware.
'Users' who refuse to be educated, *do* share some of the blame.
"It isn't their fault that their computer is, by default, insecure."
So it isn't their fault for buying a computer that is insecure by default, then? I have yet to get any viruses or spyware on my PowerBook...
"Blaming Microsoft for our nation's poor security trade-offs is like suing McDonald's for making you fat."
This would be true if the only two options for eating were either McDonalds or growing your own food....
Blaming Microsoft (or any vendor) for not making a secure product is like blaming a dog for not being a cat.
It's not Microsoft's job to make a secure operating system, it's their job to _make_ _money_.
If they think that making a secure OS will give them the best ROI then that's what they'll do. If they think something else will give a better ROI then they'll do something else.
It's our job as consumers to try to convince vendors that security is in THEIR best interest.
What exactly are you getting at? There are many operating systems, and many types of food. I don't see how it is a false dichotomy.
We hackers make everything better. Without us, terrorists would have it soooo easy. Even with us, bypassing security (even going from external to internal) can be easier than it should be. Passive attacks are on the rise within the security industry; they're being done not only by some of the most skilled hackers, but those of corporations, governments, and nameless individuals. There is nothing that can be done to stop us. Just realize that without us, the products you use would suck.
There is no such thing as a completely secure system. Every platform is secure, but only to a certain extent. Think about it for a moment: is your phone line secure?
Do you receive secure TV programming?
Do you listen to a secure radio? How secure is that ATM down the street?
People who write software are the same; sometimes they spend endless nights in some office working to finish something. Sometimes they cut corners, sometimes flaws are found. Every system has a flaw; it just takes time and effort to locate it.
Most of the modern devices were not designed with security in mind. They are designed to fulfill their other primary purpose, whatever that may be.
In a perfectly secure world, no one would know anyone, no one would trust anyone.
Have a nice day.
Las Vegas, Nevada
"What exactly are you getting at? There are many operating systems, and many types of food. I don't see how it is a false dichotomy."
I think he is making a point about the Microsoft monopoly...which is a valid point.
"Blaming Microsoft (or any vendor) for not making a secure product is like blaming a dog for not being a cat.
"It's not Microsoft's job to make a secure operating system, it's their job to _make_ _money_.
"If they think that making a secure OS will give them the best ROI then that's what they'll do. If they think something else will give a better ROI then they'll do something else.
"It's our job as consumers to try to convince vendors that security is in THEIR best interest."
This is very true (modulo the monopoly arguments). Microsoft is not a charity, and they should not be expected to behave as one. The problem is the market incentives. Fix that, and Microsoft will turn around so fast you'll be amazed.
"We hackers make everything better. Without us, terrorists would have it soooo easy. Even with us, bypassing security (even going from external to internal) can be easier than it should be. Passive attacks are on the rise within the security industry; they're being done not only by some of the most skilled hackers, but those of corporations, governments, and nameless individuals. There is nothing that can be done to stop us. Just realize that without us, the products you use would suck."
I agree. The overall situation is more complicated than this, but this is certainly true. Without full disclosure, computer security would be much worse than it is today.
"Blaming Microsoft (or any vendor) for not making a secure product is like blaming a dog for not being a cat."
I disagree. Blaming Microsoft for not making a secure product is like blaming Ford for not making a secure car. As Bruce himself has said many times, Ford cars are safer than even a decade ago because of their fear of litigation. Make a faulty product, go to jail (or pay billions in damages). Microsoft has never been held liable for its software faults; when it is, it will really be interested in making good software. So far, it only seems interested in secure software because of the other fear: competition. Linux, Mac ... are the destination of many users fed up with bluescreens. Unfortunately, not many people have the guts to do it.
Blame the user? Shall we also blame the driver when the car's design causes the steering column to impale him in a 15MPH crash?
Users should not have to take extraordinary measures to secure the systems for which they have paid every cent that flows up the value chain.
In fact, it should take quite a significant effort for a user to render an off-the-shelf system INsecure.
I totally agree with Jon Solworth's post. Security for the end user should be simple and obvious, like home alarm systems. The security control can be complex internally, but its interface with the end user must be as simple as possible.
Today, users must have a security expert's knowledge to be successful with the security controls in their systems.
@Blame the user? Shall we also blame the driver when the car's design causes the steering column to impale him in a 15MPH crash?
If the car got one or two stars in its crash testing but the driver bought it anyway because it looked cool, then we absolutely should blame the consumer. Same thing if they buy an SUV that rolls over and gets them killed because they wanted a car that was "more masculine" than a minivan.
We live in a capitalist society where every dollar is a vote. If in a democracy people get the leaders they deserve, then in capitalism people get the products they deserve.
Well, I really disagree with the "Truly, the only people who deserve a complete helping of blame are the hackers" part.
It is the same as saying the problem with insecure ciphers is the cryptanalysts who break them.
There will always be a threat, be it hackers or terrorists or organized crime or foreign governments or industrial rivals. If the "hackers" did not generate as much publicity about security vulnerabilities, most vendors wouldn't work on hardening their systems.
It is false to compare the statement "blame Microsoft for security problems" to "blame McDonalds for being fat" or "blame GM for being impaled with a steering column." I can live without McDonalds and GM cars, because I have comparable and easily available options.
However, Microsoft is a True Monopoly in the Pure sense of the word. Most business people I know are forced to use Windows for compatibility reasons. Let me reiterate: we don't have a choice.
Microsoft enjoys the benefits of being a Pure Monopoly, so Microsoft should also bear the responsibility for its design decisions. The default security configuration is poor, and most users don't have the technical skills to change it. Microsoft should bear the responsibility for this. Microsoft Internet Explorer is wide open to exploitation by spyware; in fact, even professionals have difficulty locking it down. Microsoft should bear the responsibility for this too.
I tend to think that the full disclosure vs. exploit debate is pretty complicated. If all people did was disclose problems, but never exploit them, it would arguably be easier for everyone, because we would not have to worry about people doing bad things.
On the other side, because people do bad things sometimes, it is important that the good guys have the knowledge and motivation to fix it.
However, the argument that hackers make software more secure is questionable at times. In terms of "theoretical" security, where you count the actual number of vulnerabilities in a program, I'd agree that they help get things fixed. However, those people are often the same people who package up exploits for the average Joe, or even for terrorists to use for their own purposes. So, the existence of a vulnerability is not the only issue to think about here. There is a whole question about the ability to exploit it. A vulnerability that nobody knows about cannot be intentionally exploited (only accidentally or by dumb luck).
My point is, hackers both help us and hurt us. Inasmuch as they help get problems fixed by their various tactics, they help us. But on the issue of packaging exploits, and giving the average Mr. Bad Guy the tools to do bad things, they are actually hurting us. I tend to think that good guys usually have no use whatsoever for actual exploit code except when testing their security. And to test their security, they do not need a full-fledged rootkit or Back Orifice. It is more likely those exploits fall into the hands of people who would use them to wreak havoc.
"We hackers make everything better."
No, actually, you don't. If you really wanted to make everything better, you'd work FOR the companies whose products you're performing "quality assurance testing" on. This line is a dead giveaway as to why you're actually hacking: it's ego.
Simply working for the companies wouldn't make the products better. History has shown that in conflicts between "time-to-market" and "low costs" versus security, security loses almost every time (unless the holes are too blatant).
As for the theory-vs.-tools debate, theoretical vulnerabilities do not make companies fix them; only actual threats do.
There is a misconception in all this hacker blaming, and it is that if they didn't write the tools the vulnerabilities wouldn't be exploited. I believe they would be exploited in a much more stealthy and dangerous way. And companies wouldn't fix their products as promptly as they do today (which is already far from ideal).
A false sense of security is much worse than no sense of security, because in the latter we actually feel the need to act in order to improve security. At least, knowing that there are hackers in the wild, we can stay alert and actively try to protect ourselves.
I believe we, as security professionals, do share some of the blame.
People still have this belief that "all I need is a firewall and some anti-virus software and then my computer will be unhackable". Well, this is certainly not true, and it is our job to tell the world about it. How could the layman possibly know whether something is secure or not?
Marketing teams (now these guys are the real root of all evil!) keep telling us "hey, buy this and you'll be secure!", "buy that and no hacker will ever be able to attack you", "use this newly-developed copy-protection snake oil and your music won't ever turn into MP3s", and none of us - actually the only ones who know the terrifying truth - are telling the world "hey, don't believe it, it's bullsh*t, it doesn't work, they're only after your money".
I like to think of us as Jedis, fighting against those who fell to the dark side of the force (hackers, crackers, marketeers, whatever). It is our "holy mission from God" to educate people so they won't believe this kind of blahblahblah anymore, so they will know how secure they _really_ are.
But we are failing. Those in the dark side are actually doing a better job at this. I must agree that "hackers make everything better."
No, I didn't write stealth.c. Don't post stuff like that in a public forum "Anonymous Coward" unless you're willing to stand by your words and make an effort to know what you're talking about.
My name is in a very old piece of code called "cloak.c" which was written by Michael Baldwin when I was an undergrad. I did a lot of code clean-up and reformatting because I was learning C at the time. I put my name in the copyright and have often regretted doing so. :(
That's the closest I've ever come to the "dark side" - pretty lame, huh?
I agree with you for the most part that a false sense of security is worse than none. I even agree on the unfortunate point that if there is no actual threat, companies are less likely to try to fix things.
However, I feel that your idea that the threats would otherwise be more stealthy and dangerous is a bit of an overreaching assumption. We have no data to support or oppose that point. So, I tend to take the simpler route of looking at what has happened, and drawing conclusions based on that. That keeps us more or less within reality.
Personally, I'd include the people who would do that stealthy and dangerous work among the hackers. After all, some hackers already fit that profile. What's more, we are seeing more and more plain criminals pick up hacking tools and use them to commit their crimes.
The reality of our world seems to be that first, someone develops a technology that has the potential to hurt computer systems. Then, the criminals pick it up. So, it is fairly safe to say that most of those criminals aren't creating their own tools. Therefore, the idea that the threat would otherwise be more dangerous seems to fail in the face of reality, because we have the problem of where those tools would come from.
Tools to break into a computer system and control it do not come out of thin air. Someone has to develop them first. Often, it is easier to develop those tools if you aren't working alone, which seems to be an impediment to doing that type of work in secret. Not to mention that people are generally good at catching things that are out of the ordinary. Hacking is already out of the ordinary; hackers are the real threat we see out there. In their absence, maybe there would be something else worse. Or maybe what's out there would be even easier to combat, since people working in secret have their own set of issues to face when making their programs stealthy. Hackers already did the work for them (rootkits and Back Orifice can hide themselves).
This does not, however, take away from the idea that if there is no threat, vulnerabilities won't be fixed. I'm more or less talking about how practical it would be for the people who want to exploit them for bad things to do so.
"However, Microsoft is a True Monopoly in the Pure sense of the word. Most business people I know are forced to use Windows for compatibility reasons. Let me reiterate: we don't have a choice."
I keep hearing this but I don't believe it. There are very few applications that don't have either a port or an equivalent on OS X/Linux. I use a PowerBook as my main computer, and in the year and a half since I've owned it I have yet to find a single job that is impossible to do on a Mac. For example, Bruce's Password Safe software doesn't have an OS X version, but it isn't needed, because the ability is built right into the OS (just put a txt file in an encrypted disk image). The only major program I know of that doesn't have a Mac version is AutoCAD, and very few professions use it.
I could understand if there were a few engineers in a company who had to use XP because of some obscure program made by a third party company. Why that justifies putting the secretaries, executives, middle management, marketing, HR department, etc. on XP is simply beyond me.
Having read the interview, I am left with the impression Marcus should change his name to,
Unfortunately the whole interview was a bit light on ways to move forward.
He is correct, though, about allowing only that which is "known to be good" and "casting out the bad"; it's not a new idea - various Gods have been telling their followers this for the odd thousand years or so ;)
I think you will find it was the "Lemon Laws", not market pressure, that made the US motor car industry sit up and take notice.
The history of corporate endeavor shows that "lip service" is what the "free market" gives to consumer concerns, and only sensible laws with teeth turn lip service into action.
Here is the file everyone's talking about.
BTW the relevant comment is:
* Marcus J. Ranum - 1983 - complete re-write and munging
* added more options, and all kinds of evil - including the
* ability to vanish from wtmp and acct as well as utmp. Added more
* error checking and useful command syntax. Now you can attribute
* all *YOUR* CPU usage to others when playing hack !!!
"Ford cars are safer than even a decade ago because of their fear of litigation. Make a faulty product, go to jail (or pay billions in damages). Microsoft has never been held liable for its software faults"
If I devise a brick & window way of stealing the contents of your Ford's glovebox, I don't think you can blame solely Ford, nor does it sound particularly like Ford's problem that you parked your car in my neighbourhood.
So here's what I honestly don't quite understand: brick-breakable windows on cars -- are they a liability, a bug, or a security trade-off?
And how can a company making, say, an OS be held to a level of accountability that has to cover my bricks...
This debate about MS's culpability regarding Windows vulnerabilities could go on all day if everyone keeps arguing via analogies. The problem is that it's not as straight-forward as "MS is responsible for everything!" vs. "Customers are responsible for everything!" One or the other can be true depending on what vulnerability you're talking about.
"I keep hearing this but I don't believe it. There are very few applications that don't have either a port or an equivalent on OS X/Linux."
And you would be wrong.
Certainly the basic office productivity apps like mail, WP, spreadsheet are all available on other OSes. But apparently you have never worked in the IT department of a large company, for you would know that the average enterprise uses many -- in some cases hundreds -- of 3rd-party applications.
As just one example, a few years ago on a contract I worked at a large hospital that was upgrading their desktop OS and also preparing for Y2K. They had specialized 3rd-party apps for scheduling nurses, tracking medical records, providing up-to-the-minute patient info to switchboard operators, coding medical procedures for insurance billing, etc. Lots of complicated stuff, none of it written in-house, and none of it available on the Mac, let alone for Linux.
On the other side, there's little incentive for the typical vendor of such applications to provide Mac or Linux versions. With 99+% of their customer base running Windoze, there's no return on the extra engineering required to produce executables for other platforms.
>I disagree that it is appropriate to "blame the user".
I think that users need to accept some responsibility (hence blame) for their actions (or inactions). For example, CTOs that go out and invest big $$$ in "mission critical systems" without even THINKING whether security is a consideration - we've all seen it happen. They can't claim they're being deceived by Evil Vendors(tm) when in fact they had a responsibility to at least read the directions.
A case study recently came across firewall-wizards, in this vein. Paul Robertson had a friend of a friend who had unsecured home wireless -- and didn't even REALIZE he had wireless until the cops kicked in his door. Of course, he bought a cablemodem with a built-in WAP and didn't read the directions and never got to the part about configuring security, etc. Now - the question we need to ask ourselves as a society - does the user of a dangerous technology have a responsibility to himself and others to understand its use? I think the answer is a strong "yes." That's why we're required to pass a test before we operate motor vehicles, etc. Should we have a test before you can be on the Internet? Might not be a bad idea. Paul's friend - by making it work but not making it safe - gets his own little slice of the blame.
Like I say, there's plenty enough blame to go around. Let's not rush to excuse anyone.
>Dishonesty is something that humans
>must live with.
Why? Shoot me for being an idealist...
>It's part of us, as much as honesty
>and the need for the systems we're
>talking about. So statements like the
>one Mr. Ranum made are utterly ridiculous
>and serve no purpose, in my opinion...
I'll absolutely grant you that it's a ridiculous statement. So - why did I make it?
The reason is simple: we security practitioners have been calculating the cost of computer security wrong all along.
Call it "hacking" "cracking" whatever - the fact is that expenditures on security are a massive up-front financial drain as well as a back-end cost. If you spend a ton of $$ on security and DON'T get hacked, you're certainly better off then the poor S.O.B. who gets bit-raped -- but you've still spent a ton of $$. That's money that could have been spent on something better.
Of *COURSE* we have no choice in the matter. Because there will always be bad guys and sociopaths. But whenever we, as a society, talk about the "cost of hackers" or "cost of Internet security" we're off by a factor of ? - because we mostly look at the back-end costs not the massive across-the-board financial drain that these creative a**holes place upon us.
If we can encourage society to recognize that, like any other criminal behavior, it's a larger $$-cost than they realize, that helps put us on the front-end of the problem instead of in a reactive mode. Maybe that kind of thinking can encourage us to attempt societal change - don't teach pre-techno-literate kids that "hackers are cool techno-groovy dudes" - teach them that being part of a shared network is a massive responsibility and with responsibility should come a sense of shared duty to a larger community.
Hey, see why I'm so bent out of shape about security? I've *FELT* the waste and *SEEN* the damage. It's not "cute" to me and never has been. Broaden your view and look at the big picture and you just might wake up and think "wow - these as&holes are the friction in the gears of human technological advancement. Let's lube 'em out."
>Having read the interview I am left with
>the impression Marcus should change
>his name to,
OK. Done. That help? ;)
>Unfortunately the whole interview was a
>bit light on ways to move forward.
Since I'm not a politician, I tried to respond to the questions I was asked and not go too far off into wild tangents. :) But I disagree, actually. There's plenty in there that represents ideas for going forward. They're just not ideas you're willing to accept, apparently. :)
Here's a good summary of my main recommendation for how to improve security going forward:
STOP DOING SO MUCH DUMB STUFF
How's that? Cheap, cost-effective, and it requires zero additional investment in infrastructure or technology.
No, I am not joking. Now, that's someone's cue to chime in with "but, our users need their 3-D shared virtual reality dancing pigs environment with executable file sharing!" You should be telling us how to secure THAT - not telling us high-level useless advice.
If you don't understand this, I cannot help you.
Oh - one more thing:
Why is it that whenever someone points out that there's a problem, there's always some bonehead in the peanut gallery who says, "SO WISEGUY, TELL US HOW TO FIX IT!? HUH?!"
Y'know what? It's not my problem. I don't HAVE a computer security problem. I didn't create your problems, I'm not part of your problems. My responsibility to get you to stop beating your head against a wall begins and ends at the point where I tell you "hey, moron, you're gonna knock your little brain out if you keep that up."
Enjoy your headache, it's not my problem.
>Blame the user? Shall we also blame the
>driver when the car's design causes the
>steering column to impale him in a 15MPH crash?
This is Bruce's blog, and he speaks better and more clearly about the process of assigning risk and indemnity than just about anyone I know of. So it's kind of funny for this discussion to be metaphorically taking place under his nose.
Anyhow... If you buy an item and are told it's safe and it's not, the person who told you it's safe was wrong. If you buy an item you know is dangerous and use it improperly, you're in the wrong. If you buy an item you should have known was dangerous and use it improperly, you're ignorant and still somewhat wrong.
Society handles these issues in a lot of ways outside of computer security. For example (I live in mining country) if a vehicle inspector certifies a shale truck's brake lines and they fail the day after they were certified - he's liable. The shale truck drivers effectively shifted the liability by having a 3rd party attest that they had done their part. Etc. This kind of paradigm is beginning to play itself out in computer security, too. "We kept our data encrypted" is about to become the "get out of jail free" card for having your customer database pillaged.
The point is - there's enough blame to go around. If you're playing with anything that could potentially be dangerous (and computers fall into that category) you need to know what you're doing. My guess is that in the future computers will come with a big red tag on them - like the one that came on my chainsaw - that says "IMPROPER USE OF THIS MAY RESULT IN MUTILATION OR DEATH." It'll be pathetic but maybe you'll have to sign a 15-page liability release when you get your next Dell. That's what's gonna happen if the lawyers get their fangs much deeper into this industry.
>Well, I really disagree with the "Truly,
>the only people who deserve a complete
>helping of blame are the hackers" part.
>It is the same as saying the problem
>with insecure ciphers is the
>cryptanalysts who break them
Analogies suck; you can make so many silly analogies, and they all fall apart if you stretch them far enough.
I could just as easily respond with "That's like saying that Airport Security would be much worse if we didn't have helpful terrorists motivating us to improve it!"
The question is whether you're doing harm or not. Society needs to step back and look through the marketing and hype and decide what actions are beneficial and what actions are not. My guess is that we've kind of already decided that spammers suck. And we're pretty good on the fact that virus-writers are twisted weirdos, right? Don't you see where this is going? Eventually people (as a whole) are going to get fed up with this hacking crap and it's going to become the purview of an underground minority again. It's just costing us all too much for it to remain "cute" very much longer.
Re: the cryptanalysts - the question is what they're doing with the information. If they're publishing their results knowing that they'll prevent bad algorithms from going into widespread use - they're helping. If they're publishing their results knowing their results will be used tomorrow to put soldiers' lives in danger, or millions of people's privacy in jeopardy - they're not helping. We all understand the dynamics of the disclosure debate; I just want people to acknowledge the presence of a moral dimension to it as well. Jeeze, the atomic bomb scientists were smart enough to figure that out (after the genie had left the building) - what's it going to take for computer security practitioners to get their heads out of the sand?
Marcus with regard to spending money on security and why we do it...
The answer is usually "herd mentality", not reality.
If you look at physical security there are a number of reasons,
1, Quantifiable risk
2, Third party incentives (insurance discounts etc)
3, Marketing FUD.
Within the Quantifiable risks there are three main aspects,
1, Actuarial figures (the recorded past)
2, Trend indicators (the predicted future)
3, Weakest link / softest target suppositions
Of these, the first is statistical in nature, and as with all statistics the outcome depends on the sample group you pick (e.g. a downtown ghetto tenement -v- an uptown apartment block with a doorkeeper); the rest is purely subjective and therefore fair game for marketing / cost-accounting FUD.
Likewise trend indicators tend to be even more questionable, how accurate is your crystal ball?
As for the weakest link / softest target suppositions,
"As everybody else on the block has steel-lined doors and triple seven-lever mortise locks, so should we; otherwise the perps are going to go for the easiest target: US"
If anything, there is more evidence that a house owner who cleans their doorstep and polishes their brass door number is considerably more likely to get burgled than one who does not.
If you analyse the costs involved -v- the real risks for physical security, the costs are almost always not justified by the level of risk.
The question is how well the computer security marketplace mirrors the physical security marketplace and all its income-generating FUD.
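The cost-vs-risk comparison above can be sketched with the standard annualized loss expectancy (ALE) calculation. All the figures below are purely hypothetical, chosen only to illustrate how a control can fail to pay for itself:

```python
def ale(single_loss_expectancy: float, annual_rate_of_occurrence: float) -> float:
    """Annualized loss expectancy: expected loss per incident
    times expected incidents per year."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical burglary scenario: $10,000 loss per break-in,
# expected once every 20 years (ARO = 0.05).
ale_without_control = ale(10_000, 0.05)   # $500/year

# A control (steel-lined door, better locks) costing $400/year
# that halves the occurrence rate.
ale_with_control = ale(10_000, 0.025)     # $250/year
annual_control_cost = 400

# Net benefit: risk reduction minus control cost. A negative
# number means the spend is not justified by the quantified risk.
net_benefit = (ale_without_control - ale_with_control) - annual_control_cost
print(net_benefit)  # -150.0
```

Note that the whole calculation stands or falls on the ARO estimate, which is exactly the actuarial/crystal-ball problem described above.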
There's a saying in German: "Gelegenheit schafft Diebe", roughly "Opportunity makes thieves". In the REAL world, people break into houses with open windows and doors if nobody is at home. In the REAL world, crackers break things on weakly secured boxes. In a DREAM world, all people are nice to each other and crime and violence do not exist. Now go figure.
We have a similar saying in English:
"Thieves love opportunity."
"Why is it that whenever someone points out that there's a problem, there's always some bonehead in the peanut gallery who says, 'SO WISEGUY, TELL US HOW TO FIX IT!? HUH?!'"
The book "Waltzing with Bears" by DeMarco and Lister has some very interesting points regarding this phenomenon. I have also noted over the years that the American business/political culture tends to have a very dangerous bias towards the "cowboy can-do" characters.
The book summarizes this attitude as "it's okay to be wrong but not okay to be uncertain" (pg 42). As a related tangent to this point, the new biography of Clinton called "The Survivor" also makes for interesting reading on this subject as it discusses how a leader can address significant risks, including how to collect accurate information and how to find and choose a "best path".
The bottom line is that managers who require staff to have solutions for every concern actually create disincentives for reporting significant (dangerous) risks in a timely fashion. Some chilling examples of this include the NASA shuttle disasters, which (according to DeMarco and Lister) were caused in large part by a business culture that insisted a proposed solution must be included with every problem/concern.
Kudos to you, Marcus, for speaking out regarding the uncertainties in information security today and trying to stimulate others to help find solutions.
Thanks for the reference, I now have a copy on order. Sounds like a good read!
The NASA shuttle disasters are classic examples of what happens when there is a "reality dysfunction" (a great term and the title of a very cool Sci-fi series) between management and the managed. You could probably also cite the Soviet economy as another good example. I worked for a company once where the CEO would not take "no" for an answer - which got difficult when he demanded the impossible. [On the topic of NASA disasters, Feynman's minority report is required reading if you're interested in the process of organizational dysfunction]
Thanks for the pointer!
I've ordered a copy of "Waltzing with Bears" too.
Ten years ago, or even five years ago, this may have been true:
"They're the ones who place their desire for fun ahead of everyone on earth's desire for peace and [the] right to privacy."
There is real money to be made today in breaking into systems. The crimes perpetrated by hackers (carding, identity theft, DDoS extortion, etc.) are real crimes. It's not just about fun any more.
Congratulations. Everyone posting here has managed to repeat the same catch phrases you've been spewing out for years. Here's a new one you can blatantly abuse:
At least hackers are attempting to break stuff, and Beavis... breaking stuff is cool. Computer security guys just repeat all the crap they steal from hackers. Repeating stuff is not cool... so, sit your fat big-4 consulting ass down. Also, if it weren't for hackers, none of you dipshits would be getting overpaid to do remedial work.
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.