Schneier on Security
A blog covering security and security technology.
August 22, 2006
I've met users, and they're not fluent in security. They might be fluent in spreadsheets, eBay, or sending jokes over e-mail, but they're not technologists, let alone security people. Of course, they're making all sorts of security mistakes. I too have tried educating users, and I agree that it's largely futile.
Part of the problem is generational. We've seen this with all sorts of technologies: electricity, telephones, microwave ovens, VCRs, video games. Older generations approach newfangled technologies with trepidation, distrust and confusion, while the children who grew up with them understand them intuitively.
But while the don't-get-it generation will die off eventually, we won't suddenly enter an era of unprecedented computer security. Technology moves too fast these days; there's no time for any generation to become fluent in anything.
Earlier this year, researchers ran an experiment in London's financial district. Someone stood on a street corner and handed out CDs, saying they were a "special Valentine's Day promotion." Many people, some working at sensitive bank workstations, ran the program on the CDs on their work computers. The program was benign -- all it did was alert some computer on the Internet that it was running -- but it could just as easily have been malicious. The researchers concluded that users don't care about security. That's simply not true. Users care about security -- they just don't understand it.
I don't see a failure of education; I see a failure of technology. It shouldn't have been possible for those users to run that CD, or for a random program stuffed into a banking computer to "phone home" across the Internet.
The real problem is that computers don't work well. The industry has convinced everyone that people need a computer to survive, and at the same time it's made computers so complicated that only an expert can maintain them.
If I try to repair my home heating system, I'm likely to break all sorts of safety rules. I have no experience in that sort of thing, and honestly, there's no point in trying to educate me. But the heating system works fine without my having to learn anything about it. I know how to set my thermostat and to call a professional if anything goes wrong.
Punishment isn't something you do instead of education; it's a form of education -- a very primal form of education best suited to children and animals (and experts aren't so sure about children). I say we stop punishing people for failures of technology, and demand that computer companies market secure hardware and software.
This originally appeared in the April 2006 issue of Information Security Magazine, as the second part of a point/counterpoint with Marcus Ranum. You can read Marcus's essay here, if you are a subscriber. (Subscriptions are free to "qualified" people.)
EDITED TO ADD (9/11): Here's Marcus's half.
Posted on August 22, 2006 at 12:35 PM
"The real problem is that computers don't work well. The industry has convinced everyone that people need a computer to survive, and at the same time it's made computers so complicated that only an expert can maintain them."
Though good luck pointing this out at DEFCON, or wherever the next Avian Flu Of The Internet is going to be announced.
Obviously, bank computers should be carefully guarded. But they are not personal computers, and the fact that users were able to run random programs points to a problem with the bank's IT management.
In general, there are countering pressures. A truly secure system would not allow any other programs to be installed, but part of the allure of the home computer is that it can be taught new tricks with new software; otherwise it would just be one more boring appliance, like a DVD player. And there's the rub. There are thousands of benign programs out there, some by the big guys, many more by small outfits or hobbyists, and some risk is associated with each (we take a similar risk with our bodies when buying hot dogs from a street vendor). Each time you add something new, you take a risk. There is a balance, and it is different for different circumstances.
"....primal form of education best suited to children and animals (and experts aren't so sure about children)...."
Not animals either, please. Well, squid may be excluded.
We educate users about cars. We train them, test them, re-test them, and limit or even suspend their driving privileges if their driving habits endanger others.
But Aunt Millie gets to plug her shiny new computer into her shiny new cable modem without ever taking a course or demonstrating basic proficiency in network computing. As a result of her ignorance, her box may get enlisted in some DDOS-launching, spam-spewing botnet, participating in malware distribution and expensive extortion schemes, and all it costs her is "my internet seems a little slow today!"
Why shouldn't Aunt Millie have to take a test? Why should her negligence cost her nothing?
@Carlos: "Why shouldn't Aunt Millie have to take a test? Why should her negligence cost her nothing?"
The risks associated with bad driving don't compare to the risks associated with Aunt Millie's "negligence". If we also factor in the administrative burden of a testing program, we'll find that it's really not worth it--especially if we have cost-effective technical mitigations that can account for Aunt Millie's "negligence".
Forgive me, however, if you were being deliberately facetious with the comment.
Quincunx is doing it again! Nail him!
@ Carlo Graziani:
One answer that comes immediately to mind is that users are NOT required to be trained and tested before using a home heating system (as per Bruce's analogy), so your point is only valid if we decide that a computer is more like a car rather than a home heating system.
I happen to agree with you, that a computer is like a car. It's something that affords great convenience to the user, but careless use endangers not just the user but others as well. I also agree with your idea, that mandatory training and testing should be done for anyone who wants to enjoy the privilege of using a computer. The problem is, how do we transition to that? People are very much used to being able to just go buy a computer, plug it in, and start surfing the web. It's going to be tough to convince people to undergo any mandatory training/testing. It's going to be even tougher to sell to the industry, since in all likelihood this is going to reduce the number of sales (Aunt Millie will probably decide, screw it, the shiny new-fangled computer is not worth the hassle). The government will probably not put any regulations in place unless something REALLY drastic happens. And how would you enforce such a rule?
"....Why shouldn't Aunt Millie have to take a test? Why should her negligence cost her nothing?"
Are you seriously even thinking that we should have a licensing test for computer use? Also be careful with your wording.
Here is a common definition of negligence: Failure to exercise the degree of care considered reasonable under the circumstances, resulting in an unintended injury to another party.
Is Aunt Millie really being negligent by not taking a computer course and who gets to determine basic proficiency in network computing? You? The government? Frankly I don't trust either option mentioned to tell me whether I have that proficiency or not.
I don't see a failure of education; I see a failure of technology. It shouldn't have been possible for those users to run that CD, or for a random program stuffed into a banking computer to "phone home" across the Internet.
I would like to disagree. This is not a failure of technology.
Computers -- in the sense of a hardware+software system -- are extremely tweakable and malleable. There are no technological problems in forbidding a computer to run programs from CDs or to establish outgoing 'net connections. You just have to configure them properly.
What you are describing is a failure of the bank's IT department to properly set up and control the machines it's responsible for. It was a choice, presumably made by professionals, to leave the machines in a state where they can run arbitrary programs from a CD and talk to the 'net. It's rather trivial to prevent both.
You might argue that the users won't stand for completely locked machines and that the IT department's hands are tied. Well, that may be so, but in that case I struggle to see a technological solution to the problem. It's possible to mitigate the risks in many ways, but most of these ways come from established policy and procedures, not from the capabilities (or lack thereof) of some piece of hardware or software.
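The commenter's claim that both protections are trivial can be illustrated with a toy policy check. Everything below is hypothetical -- the mount-point prefixes, the host allowlist, and the function names are invented for illustration, not drawn from any real bank's configuration:

```python
# Toy sketch of a locked-down workstation policy, assuming two rules:
# no execution from removable media, and outbound connections only to
# an administrator-approved set of internal hosts. All names are invented.

# Path prefixes standing in for removable-media mount points.
REMOVABLE_PREFIXES = ("/media/", "/mnt/cdrom/")

# Hosts the workstation is permitted to talk to (hypothetical).
ALLOWED_HOSTS = {"core-banking.internal", "intranet.internal"}

def may_execute(path: str) -> bool:
    """Refuse to launch anything that lives on removable media."""
    return not any(path.startswith(p) for p in REMOVABLE_PREFIXES)

def may_connect(host: str) -> bool:
    """Default-deny: only allowlisted internal hosts are reachable."""
    return host in ALLOWED_HOSTS

if __name__ == "__main__":
    print(may_execute("/media/cdrom/valentine_promo.exe"))  # False
    print(may_connect("attacker-controlled.example.com"))   # False
```

Real deployments would enforce these rules in the OS (mount options, firewall policy) rather than in application code; the point is only that both checks are a few lines of default-deny logic.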
"What you are describing is a failure of the bank's IT department to properly set up and control the machines it's responsible for. It was a choice, presumably made by professionals, to leave the machines in a state where they can run arbitrary programs from a CD and talk to the 'net. It's rather trivial to prevent both."
I see it differently. I see it as a failure of the computer industry, forcing the IT department to be responsible for this.
Yes, we can live in a world where the IT departments are forced to deal with all the problems of lousy security after the fact, but it's not the most efficient way to operate. And it leaves all the individuals, who don't have the expertise of an IT department, completely screwed.
I feel that the problem isn't that computers are too complicated, it's that they are too approachable. Most consumers believe that their limited proficiency with a word processor qualifies them as computer experts. (I am a facility manager posting on a site frequented by some of the top security minds of our time. I must be a security expert too, right?)
As someone who does open up heating systems, I can assure you that there is no internal security system which keeps out those with just enough knowledge to be dangerous. Most people know that they are incapable of fixing the furnace, so they call an expert. It is the consummate approachability of computers which makes them so vulnerable to the "attacks" of the masses, the non-expert consumers.
The furnace/thermostat analogy was particularly apt. There is a disconnect between the two linked systems. Bruce, like most people, understands that the disconnect exists, despite the links. Most of us computer users don't understand that there is a disconnect between the very front end of our computer experiences and the "under the hood" portions which expose us to the threats. Thus, we stumble along blindly, oblivious to the mess we create for ourselves.
To rely on a technological solution to this problem is building another level of brittleness into the solution. I hate to say it, but this is one area where education is the best solution. Not to make the user more like an expert, but to make them more aware of their limits.
I am using the term "negligent" advisedly, Brian. Aunt Millie can easily do some simple things to exercise basic network hygiene, which would reduce her exposure. Not to zero, but considerably. She could even hire someone to carry out these steps for her, if necessary.
My view is, Aunt Millie has to see some cost for poor network practice. Relieving her of all responsibility for the considerable damage her networked computer can do if mismanaged guarantees that the problem cannot be solved completely, ever. It doesn't have to be the government that imposes an insecurity cost on her -- testing and licensing is just one model. Liability is another.
Here's yet another model: suppose she had to put up a $100 bond to her ISP for that cable modem, and suppose her ISP monitored her traffic for easily-detected evidence of cooptation by malware -- suddenly sending thousands of e-mails per hour, say, or sending tons of HTTP connection requests to an IP address known to be under DDOS attack.
Then the ISP could disconnect her, inform her that her bond is forfeit, and that she should post another one to get re-connected, and that if she doesn't want that one forfeited as well she better clean up her computer and her
I bet she'd make a more careful selection of her next computer and OS if that happened to her even once.
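The two detection signals proposed above -- a sudden flood of outbound e-mail, and HTTP requests aimed at a known DDOS victim -- are simple enough to sketch. The threshold, the blocklist address, and the data shapes below are assumptions for illustration, not a description of any real ISP's monitoring:

```python
# Hypothetical bot-detection heuristic in the spirit of the bond proposal:
# flag a subscriber whose last hour of traffic shows either signal.

MAIL_PER_HOUR_LIMIT = 200             # invented cutoff for a home user
KNOWN_DDOS_TARGETS = {"203.0.113.9"}  # example IP from a shared blocklist

def flag_subscribers(mail_counts, http_targets):
    """mail_counts: {subscriber: e-mails sent this hour};
    http_targets: {subscriber: set of IPs contacted this hour}.
    Returns the set of subscribers whose traffic looks bot-like."""
    flagged = {s for s, n in mail_counts.items() if n > MAIL_PER_HOUR_LIMIT}
    for s, targets in http_targets.items():
        if targets & KNOWN_DDOS_TARGETS:  # any contact with a DDOS victim
            flagged.add(s)
    return flagged

if __name__ == "__main__":
    mail = {"millie": 5000, "bob": 12}
    http = {"bob": {"198.51.100.7"}, "eve": {"203.0.113.9"}}
    print(sorted(flag_subscribers(mail, http)))  # ['eve', 'millie']
```

A real system would need far more care about false positives (mailing-list users, legitimate traffic spikes), which is exactly why the forfeit-the-bond consequence is controversial.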
The best argument against user education is that it doesn't work.
There are plenty of reasonably well-educated security folks who read this blog. And I'll bet most of those people still download software off the internet from sites they've never visited before, or click on links they receive from a friend in e-mail. Those are both risky behaviors, totally inappropriate if you are using a machine that has access to anything sensitive.
Yet we still do it. Maybe those researchers were right. Even security people don't care that much about security.
Voting machines are an interesting example of security's inability to focus on UI. The whole "butterfly ballot" debacle was about a poorly designed UI on the paper ballot. But rather than try to fix it by bringing in usability experts, the porkbarrels-that-be decided technology was the only cure.
"I don't see a failure of education; I see a failure of technology. [...] The real problem is that computers don't work well. [...] it's made computers so complicated that only an expert can maintain them."
A computer is a complicated piece of hardware, and so is the operating system running on it. I have a critical dislike for discussions that call for "dumbing down" the computer/OS in order to make it safe or user friendly. Dumbing down the PC will put severe restraints on what can be accomplished with a computer, and I find that counterproductive to say the least.
I would propose building "restricted" computers that would work more like an appliance. In the case you brought up, a bank workstation, a full-fledged PC is probably overkill for many positions. If a user employed by a bank needs 3-4 simple apps in his daily work, then there is no reason why he/she should sit with a modern PC running a modern operating system. And there is absolutely no reason why the entire PC user base should be limited or restricted because of these users.
If a large user base just wants to use mail and surf the web, why can't we build computer-like appliances so that Aunt Millie can safely plug in and do just that? It would be possible to do for a fraction of the price you would pay for a PC.
So yes, this is a failure of technology, but I disagree with you on exactly what that failure is.
I see it differently. I see it as a failure of the computer industry, forcing the IT department to be responsible for this.
The security requirements of people and organizations are both complex and vary widely. I can't really envisage a one-size-fits-all solution from the computer industry.
Besides, I'm not quite clear on what kind of solution it can be. If you just want locked-down machines, well, you can go and buy a thin-client system without local drives or storage. And you can't sell locked-down machines to the general population, because they will rightly complain that they can't do what they want to do -- which, yes, includes things like running malware that promises them a shiny or opening executable attachments from their cousin Mabel.
There have been attempts to sell dumbed-down computers, typically under names like "e-mail appliance." They have all failed miserably.
It seems to me that the main issue with expecting the industry to deliver a silver bullet is that there is no one "security problem". Instead, there is a rather complex landscape of needs, risks, restrictions, and trade-offs that are all diverse enough to make a standard solution unworkable. The real problem is not with the implementation, the real problem is with understanding your specific needs, risks, and trade-offs, and for that you still need clueful people, not an off-the-shelf product which the salesman assures you will make you secure...
What will actually happen is that Aunt Millie will find another ISP with less draconian policies. I daresay you won't find one person in a million willing to sign up for an ISP that fines their users $100 every time their virus protection software fails to detect a threat.
I think Bruce has already pointed this out multiple times but there is simply next to no commercial demand for real security. People want CDs to "just play" when you put them in, without having to mount them, scan them, validate signatures, etc. Look at OpenBSD -- excellent security, completely free, and an installed base comprising less than a tenth of a percent of the world's PCs...
Back to the dumb terminal that runs applications on a system far away from your desk. If you want an application you make a request and it is installed. You want to bulk enter data, you provide a disc to the sys admin group who then loads it for you. You want data out, you ask the sys admin group for a disc.
Keeps things simple, you don't have the worry of rogue data making it onto individual computers and if configured properly, you should be able to run all the same applications you do now. With a central storage that can be managed by professionals.
Think, how many times do you really need to load/save data at your desk? Most people e-mail things around. (okay, I just realized there is a leak there for potential infections) but for the most part you don't need to have a drive of any kind on your computer (or USB ports)
I got a new camera. My photography skills didn't seem to improve. The new one is better than the old one. I'm still just a fair photographer. We're always getting tools that are better than we are. Search engines know more about us than we do and just keep gathering more information. It's all ad and marketing driven. The search engine tools are better than we are or at least more organized than our lives are. Google is so neat and tidy. Yahoo is cool as in !. Life is full of different messes and doing the dishes. I have to go wash some dishes! I'll try posting some dirty pictures of the dishes later.
You get a Ph.D and the dishes still wait. You can't find the time. Use a search engine.
Generally I shy away from anything Microsoft and Windows and recommend people install/use Unix/Linux but FWIW here you go:
"NIST is pleased to announce the release of draft Special Publication 800-69, Guidance for Securing Microsoft Windows XP Home Edition: A NIST Security Configuration Checklist. SP 800-69 provides guidance to home users, such as telecommuting Federal employees, on improving the security of their home computers that run Windows XP Home Edition. Home computers face many threats from people wanting to cause mischief and disruption, commit fraud, and perform identity theft. The publication explains the need to use a combination of security protections, such as antivirus software, antispyware software, a personal firewall, limited user accounts, and automatic software updates, to secure a computer against threats and maintain its security. It also emphasizes the importance of performing regular backups to ensure that user data is available after an adverse event such as an attack against the computer, a hardware failure, or human error. The publication contains detailed step-by-step directions for securing Windows XP Home Edition computers that can be performed by experienced Windows XP Home Edition users.
NIST requests comments on NIST SP 800-69 by August 31, 2006. Please submit comments to email@example.com with "Comments SP800-69/XPHome" in
the subject line."
So let me get this straight, today the quote is "I see a failure of technology. It shouldn't have been possible for those users to run that CD"
But the old quote was
"If you think technology can solve your security problems, then you don't understand the problems and you don't understand the technology."
Yes, I am attempting lame humor.
"Part of the problem is generational....Older generations approach newfangled technologies with trepidation, distrust and confusion, while the children who grew up with them understand them intuitively."
The children don't understand the new technologies any more than the older generations. The children simply aren't afraid of it and possess a false sense of understanding. That tends to serve many of them reasonably well as the technology becomes mature, stable, and fault-tolerant. But please don't mistake that lack of fear and abundance of (over)confidence as "understanding."
To see this as a failure of the IT industry is rather like suggesting that Ford Motors failed in making trucks because trucks can be used inefficiently. Surely the reality is that this is a failure of management?
PCs are immensely flexible and powerful, but organizations have a tendency to dump them on the desk and hope that power will be used productively.
Running rogue applications is a problem, but it is not a problem with technology. PCs are meant to run new applications.
The issue here is that no one has considered which applications should be run to serve the business at hand.
The approach of blaming technology is rather like blaming AT&T because I misuse my phone to call my uncle in Australia from the office.
On the issue of making PCs harder to use, I would have to say that the problem is not that PCs are easy to use; it is that they *appear* easy to use while in actuality demanding a considerable degree of knowledge to use well.
Therein lies your, and our, problem.
Both comments are valid. Technology, which doesn't solve your security problems per the second quote, enabled that CD to be run. Hence, technology failed the user, per the first quote, since it aided in compromising security.
See technology is a bad thing. Not only is it useless in solving security problems, it creates them in the first place. Kind of like neocons?
The car analogy is actually quite useful, but I think people are missing the point. Indeed, Aunt Millie is required to be trained, tested and licensed in order to drive a car, but... she needs none of that in order to merely ride in a car. She just needs to know how to use a simple safety belt.
The point being that we in the industry have not gotten to the point where we can just tell users "belt yourself in, and enjoy the ride". That should be our goal.
@ K. Signal Eingang
"What will actually happen is that Aunt Millie will find another ISP with less draconian policies. I daresay you won't find one person in a million willing to sign up for an ISP that fines their users $100 every time their virus protection software fails to detect a threat."
What if all ISPs had to impose this on their users by (international) law? Then Aunt Millie has three choices:
1) Clean up her act
2) Stop using computers
3) Emigrate to the last country in the world where they don't have a law such as the one suggested. The problem for Aunt Millie is that this country was just kicked off the Internet...
Point well taken. On a related note, what if a unicorn flew out of a monkey's butt?
Something pretty drastic is going to have to happen before a universal, well-enforced, international treaty on malware takes shape. And I'm not sure I'd favor a policy that puts so much liability on users, rather than software and OS makers, ISPs, and for that matter the people creating the malware in the first place. Anybody - well, any Windows user - can get hit by a zero-day worm, even with the best defenses in place.
And, as an additional point to the one made by Kees, if instead of legislating "ISPs as Net Sentinels" we merely agree that ISPs are liable for monetary damages attending malware/spam/ddos/penetration attacks originating from their networks, they will take up the Net Sentinel role in a big hurry. Then Aunt Millie isn't going to be able to vote with her feet any more.
"Yes, we can live in a world where the IT departments are forced to deal with all the problems of lousy security after the fact, but it's not the most efficient way to operate. And it leaves all the individuals, who don't have the expertise of an IT department, completely screwed."
We not only can, but do live in precisely that world. I believe the world works this way because it IS more efficient than the alternatives.
I have the expertise of an IT department, but I don't have the free time or desire to spend a lot of time doing patches and upgrades on all of my systems. The risk is acceptable to me at the level of administration I'm willing to do for my home machines.
At work, I use the same idea of calculated negligence in maintaining a handful of servers. The reason is the same - the risk of something going wrong, combined with the impact isn't substantial enough to justify the time (expense in simple business terms) required to be more diligent.
I see the same situation all over the place - people make security decisions that are cost effective in their environment.
My first computer was an Atari 800. There weren't a lot of security issues with that platform. All of the storage was removable media (floppies or cassettes), so the amount of damage that could be done by a malicious program was limited. With little potential for harm, there was little need for an IT department to prevent the harm.
Networking changed all of that over the last decade. Network connections used to be rare and expensive, so they were limited to the dedicated, serious geeks. Always-on internet connections are pretty standard these days. In the old days, there was limited interest in what a segment of uber-geeks was doing. Now we have a never-ending barrage of newbies showing up, who are an irresistible draw for the scammers who prevent the clueless from discovering the horrors that come from having too much money.
There's no way to close the Pandora's box that network proliferation has opened. If everyone had to get their email through their ISP's web interface and do all of their web browsing and file transfers through their ISP's proxy server, the ISP would be in a position to notice abnormal usage patterns. But even if we did that, what incentive would anyone have to secure someone else's stuff?
When a business is approaching a software project, it has to choose the level of effort to put into the project. Given a spectrum containing options from "Can't work" to "Impeccable, provably correct solution", corresponding to a cost of $0 to $infinity, most companies will choose between "Barely workable kludge" and "Usable kludge". (Kludge being defined as a functional but inelegant solution in this context.) In my experience, the spectrum of kludges does not leave room for noncritical activities, such as improving security or defining requirements. However, activities like optimization may be included. A program running twice as fast is observable. Bad security is observable if your program is exploited, but failing to be exploited could be attributed to any number of reasons. Any item that doesn't make a contribution to the bottom line isn't going to be prioritized.
Ultimately, the only cost effective solution is to be responsible for your own security. If you delegate it to someone else, you'll end up paying substantially more for the same level of service.
Hmm, everyone's talking about education. What's missing from the equation, and from the analogy?
If the hired repairman incorrectly repairs the heating causing Bruce serious injury, what can Bruce do? Bruce can sue not only the repair company, but also the manufacturer for defects.
Can I sue Microsoft for the defects that caused Aunt Millie's computer to become infected and send thousands of spam messages to my system that I now have to deal with?
Hmmm, seems to me that Liability is missing from the equation.
I may be wrong, however.
How about a sticker on the computer like this one for the heating system:
Only a licensed and trained heating and cooling repair specialist may repair this heater; any and all repairs made by this repairman are done without warranty and liability. Do not play with fire, you may be burned. Touching the outside of the heating surface, especially metallic surfaces, may cause burns or other painful injuries. Breathing unburnt gas in a closed space may cause eye and throat irritation; opening a window is recommended. Leaving the window closed for more than 1 minute may cause loss of consciousness, and death may follow. Any and all similarities to a real license agreement are up to the reader.
In re: the NIST guidelines, I can't help but think that Aunt Millie will be put off her tea and biscuits by the advice that frames the download:
"Do not attempt to implement any of the settings in this guide without first testing."
So, the dear old girl should set up a test environment before rolling the NIST recommendations into production?
Computers have an amazing ability to do all sorts of things. We call them "programs" or "applications", but they are extensions of the PC's capability. Anticipating what may or may not be preferred functionality is impossible. No one knew 10 years ago that kids today would have several hundred gigabytes of music and video files stored on their PCs. No one knew that IM would become ubiquitous and some people would have thousands of "buddies".
If operating systems or ISP networks had been configured securely, a lot of this innovation wouldn't have been possible. It's the openness that allows iTunes, IM, and MySpace to flourish.
However, banks do NOT flourish on openness. There is absolutely no reason for a bank teller PC to be randomly sending its data out to the internet. Unfortunately, there is a general workplace culture that demands that users have access to the Internet. This culture almost always overrides the poor security folks' objections...until there's a breach.
I think what's missing is a specific statement of what the technology should be able to do in this case that it basically can't do.
People need to be able to run random software. They need to be able to put disks into machines and get data off them.
a) it should be easy to automatically sandbox untrusted applications so that they can run, but can't send data to the internet or otherwise cause damage.
b) it should be easy to automatically alert the security team (or rather, its database system) that an unknown program wants to be run, and then: i) record an image of the program; ii) compare it with known and accepted software; iii) choose whether to run it based on programmed criteria; and iv) give an appropriate message to the user.
The technological failure now is that you have a choice of either allowing or disallowing running executables. One choice puts your business at risk and the other choice blocks effective use of the computer you paid for. Neither is acceptable.
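The compare-with-known-software step in point (b) is essentially a hash lookup against an allowlist. A minimal sketch, assuming the security team maintains a database of approved program hashes -- the sample program contents and the function names here are invented for illustration:

```python
import hashlib

def sha256_hex(image: bytes) -> str:
    """Fingerprint of a program image, per point (b)(i)."""
    return hashlib.sha256(image).hexdigest()

# Stand-in for the security team's database of vetted software (b)(ii).
APPROVED = {sha256_hex(b"contents of the approved teller application")}

def decide(image: bytes) -> str:
    """(b)(iii): run if known-good, otherwise hold for review;
    (b)(iv) would then show the user an appropriate message."""
    return "run" if sha256_hex(image) in APPROVED else "quarantine"

if __name__ == "__main__":
    print(decide(b"contents of the approved teller application"))  # run
    print(decide(b"valentines_day_promo binary"))                  # quarantine
```

The default-deny choice here ("quarantine" rather than "ask the user") is the crux of the comment: it keeps the business usable for approved software while removing the allow-everything/block-everything dilemma.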
In re heater-repairmen and IT, I have on several occasions found code violations, including serious safety hazards, in work done.
I hire them because I don't have the time to do the work, not because I have any belief in their infallibility.
But I do inspect their work, rather than just trusting them.
Much as I don't trust SW vendors or IT.
My employer might, but that's their problem.
Bank Teller: Do you have an account with us?
BT: Would you want to open an account?
BT: Do you have an account at another bank?
BT: What bank would that be?
Me: I'd rather not say.
Especially in a bank with multiple strangers standing around listening.
Banks might not flourish from openness, but they sure can try to get you to be open. This was during the process of cashing a check, a process that requires your fingerprint on the document even when you have credentials stating who you are and a negotiable document that they must honor. Tellers are trained to learn as much as they can about you -- to basically interrogate you -- during a simple transaction. The money changers have been turned into data collectors, and the banks trade on information. You used to hand over a check with your photo ID and get cash. Now you get a sales pitch. A friend called AT&T to resolve something regarding a bill, and the rep went into a fast-talk sales pitch, which was a waste of time. The bigger the corporation, the more of your time they can afford to waste pushing services, pushing ads, whatever. The claim that innovation wouldn't take place if security were better is laughable. Security is why people adopt certain technologies and trash others.
Start a service, tell users the security sucks, and see how many people sign up and give you money. I don't use MySpace -- not because the security sucks, but because the quality of the content is so low that it has no value for me. Try getting people to pay for MySpace and you'll get an education in how low the quality is. Much of it isn't worth the cost of the power to support it, which gives you an idea of how much effort is put into securing the data: it's not worth securing. It's a culture of junk that flourishes. You don't need to trash MySpace; the users do it for free by the minute.
I'll add that if the entire MySpace site melted down and crashed for good, it would last for a news cycle and be forgotten in short order. It would be replaced within days by something better -- well, at least something newer, which in our culture is the same thing. If it were gSpace and had Google coolness, the kiddies would go nuts for it. You wouldn't need to make it secure, just functional in a dysfunctional-family kind of way.
"So, the dear old girl should set up a test environment before rolling the NIST recommendations into production?"
It states that it's a DRAFT [document] and that comments are welcome, so I'm thinking that, by the very mention of it being a draft with comments welcomed, it's not finished yet but soon will be. Perhaps you could comment and help them? :)
The behavior exhibited with the free CDs mentioned in the article is not isolated to security-related matters. For example, how many people continue to have unprotected sex after all of the massive public education campaigns that have been conducted? This isn't a problem isolated to any particular population; it exists in the USA just as it does in the Third World and sub-Saharan Africa. The basic problem is that humans don't have perfect impulse control. While the reward of inserting and running a CD isn't quite as primal and biologically rewarding as sex, the lure of flashy graphics and hot new music is too much for many people to resist.
Natural selection weeds out this sort of behavior in the biological world. Perhaps it takes infecting half of the corporate network and getting fired to sufficiently reinforce basic security behavior. When the population in general recognizes proper behavior and conforms to it, those who don't abide by those behavioral norms and put the larger population at risk tend to be shunned and isolated by the whole.
"A security flaw affecting the social networking site MySpace exposed more than 1 million of the site’s users to malicious programming..."
How many people using MySpace even know about this, let alone care? They are excited about ringtones or some other funny stuff. If you told them their data was about to be lost, they wouldn't care.
"Experts" shouldn't be so sure about animals, either. Sure, punishment works if you just want to see results (although a healthy dose of positive feedback will also be necessary for optimal results), but it's only really justifiable when you don't care about what you actually do to the animal involved. People should have better morals than that, especially these days...
MySpace: A Place for Dolts
"You see, when you sign up for MySpace, you instantly have your first friend. You're immediately best buddies with the most popular person on MySpace: Tom. Now, to understand the stupidity of this, you have to understand that this is a social networking mechanism; if I'm friends with John and John is friends with Sally, then Sally is syllogistically my friend, and if I visit her profile it will tell me just that: "Sally is in your extended network". But if EVERYONE is friends with Tom, then there might as well not be an extended network feature at all, and he is defeating the purpose of his time and his website. Basically what I'm saying is, Tom is a dumbshit."
The whole thing is at http://www.kuro5hin.org/story/2005/7/16/72023/...
Typo alert: "vary primal" in the article should be "very primal".
I agree that it's not so much that people don't care about security as that their ability to understand it is, in fact, lacking. As the post states, it is up to the professionals to make it simple for us, since they are the ones in that profession and therefore more likely to understand the problems we may run into as users. I do feel that security is becoming a greater issue and that tech companies are starting to respond to its necessity (as with the extensive security features in Vista). But at the same time, this is the information age: with the Internet, anybody is capable of learning about anything. So although it may be reasonable to rely on pros to come up with a comprehensive and complete product, that doesn't mean we should be lazy and not educate ourselves. Even if one program doesn't offer the best security features, there are always other products out there that can complement it. I can't help but feel that many of us lack the responsibility to go forth and seek out good information to make good decisions.
Actually, this is a perfect example of an industry with little regulation (the computer/networking industry), and here are the results: massive negligence. It's all about the externalities, which is something most 17-year-old I-just-read-Ayn-Rand laissez-faire types never talk about. I'm not sure most of them know what an externality is. And BTW, I was a free-market libertarian at that age too. I'm still one of the anti-authority libertarians, but experience with profit-at-any-cost employers has made me realize free markets are not a perfect system.
Famously, one professor had his account password cracked, and when the admins went to him and explained it wasn't strong enough, he replied "that's okay, I only use it for printing". Unfortunately, nobody is educating users that this kind of attitude costs other people time and effort.
I also agree that not everyone needs to be able to install software; my grandmother, for example. In NetBSD, you can limit what programs people can run via signed binaries, and you can limit what people can run in Linux via SELinux. I'm sure there are third-party tools for Windows that can do this too.
What someone really needs to make is a simple distro for home users, particularly seniors, and bundle it with hardware and Internet access, and have them use that. Game consoles are also viable (think xbox media center).
I've worked at companies with locked-down systems, but luckily I manage to impress the IT staff enough that they let me do things that they normally wouldn't allow. I've only had anti-virus save me once, when I thought a .scr was a screenshot, not a screen saver (executable). I think it's totally reasonable that an employer give employees some kind of test before they can do things like install software. I'd even be willing to put up a portion of my paycheck as collateral if my actions do cause some sort of problem. My current employer lets IT staff bring in whatever hardware or software they want, but they're quite literally the best of the best in their domains of expertise. With many employers/employees, this would be a disaster waiting to happen.
Some users are dolts, also being lured to fake MySpace sites that capture keystrokes -- including the same logins and passwords used to access corporate networks and sensitive databases.
It gets worse. The more serious danger for companies who hire dolts comes from how often that same login information is reused. "What's common practice with most users who are dolts is [to reuse] whatever passwords they use for one account for others as well -- such as banking, e-mail and IM accounts."
Another method is this: the best way I found to get into a girl's MySpace is to tell her best friend that she had sex with the friend's boyfriend and gave him an STD. It works because girls always give their best friend their password, and believe anything.
Maybe it will become one big STD forum for dolts. I don't know.
>I see it differently. I see it as a failure of the computer industry, forcing the IT department to be responsible for this.
I see it as a failure of the industries who rely on IT to demand more from the computer industry.
For example, if banks formed a BankIT consortium that specified what an acceptable banking PC both did and did not do, then banks could simply say "Provide this", and computer vendors would be happy to comply. Price, of course, would be negotiated.
If other industries who depended on IT and trusted the BankIT specification also said "Provide this", then vendors would have an even larger market for BankIT-compliant systems.
The problem is that the players keep thinking of it as an externality or a recurring cost, rather than as an investment with a return. Penny-wise, pound-foolish.
Recently on the Dilbert blog (www.dilbertblog.typepad.com) the posting 'My Computer is Rotting' got hundreds of responses from people who think their machine freezing up is to be expected, and that the reliable fix is to reformat the hard drive and reinstall the OS.
They would have no idea if their machine was hijacked by a botnet, spamming the universe or even selling kiddie porn.
Apparently, most users never think about security at all. Or, if they do, they don't know what to think. Almost all their 'information' is actually advertising copy, even when presented as a news article, so if they do worry about security, their answer is to pay for a heavily advertised plug-and-play service to clean up all the problems for them.
Who's going to pay for your secure computer?
The car analogy is a bad one. Aunt Millie can kill one or more people and cause thousands of dollars in property damage by being negligent with her car - how many people can she kill by being "negligent" with her computer? And before anyone argues that her "negligence" will cause thousands of dollars in property damage, the damage is due to the conscious ill intent and effort of someone else.
Here's a better analogy: if Aunt Millie leaves her front door unlocked when she goes out, and someone walks in and makes a bunch of prank phone calls, or cons someone out of thousands of dollars using her phone, is it her fault? The problem is, the front door lock makers (the computer and OS makers, in this analogy) have made such complex and cantankerous locks that Aunt Millie needs the regular assistance of a locksmith just to be able to lock her door.
what's with all the analogies? drop the analogies and try solving some real problems.
A lot of commentators get so caught up with analogising things that they don't actually bother to _do_ anything.
Your column comes mighty close to stating that the problem is the availability of general purpose computers, Mr. Schneier.
Damn, it's scary knowing that banks still have CD drives on their workstations!
"But the heating system works fine without my having to learn anything about it."
Ok, fair point, but if you treat your computer like a heater then you might find the same "fine" results.
Power it on, let it pump heat into the room, power it off.
On the other hand, if you expect your computer to perform something a little more complex, what would "works fine" and "reasonably secure" look like?
With the recent discovery of 2,000 year-old computers, there must be some better examples to pick from:
>I see it differently. I see it as a failure of the computer industry, forcing the IT department to be responsible for this.
Not only the computer industry but general management and internal politics. I know well the IT guys at my firm.
They would like to have more secure settings and to do more security work. But they are restrained by the fact that people make a fuss each time they try to enforce better security measures -- especially people high up on the management ladder.
So basically it comes down to a power struggle. Users don't want to be restrained and the IT staff can only go so far as the internal politics will let them go. Politics again rule the day.
It's unfair to heap all the wrongs on the IT staff.
Same thing applies to general users. If an ISP started to ensure real security policies, users would scream and go elsewhere.
It's the old usability vs security tradeoff. As most people are unaware of the dangers of the Internet, they do not see the need for real security.
Concerning education, it is certainly not the ideal solution, but I still think it has a place, especially in the business world.
Best of both worlds is good security policing and security awareness. IT security is better if it is multi-layered. If one level fails (non-secure settings allowing CDs to be installed) then maybe the education level will help (the user informed of the consequences will think twice about installing the CD).
But maybe a more effective way would be to integrate security awareness into the annual job review. This would give workers an incentive to really enhance their security skills and practices. Good incentives are everything.
More cynically, an example also works wonders. Experience shows that people tend to be less careless when one guy gets fired for being sloppy in security.
This leaves the general public, of course. Generally, people suffer most from their own security mistakes (lost data, identity theft, etc.). Concerning damage caused by their computers being used in botnets, maybe it would help to make people liable for the damage caused by others using their computers? Not sure, but maybe.
All in all, I agree education isn't THE solution, but I don't share Bruce's total pessimism about it; it should be part of the solution.
As for asking the computer industry to make secure computers, I agree there should be more insistence on this, and more incentives through liability, but it can't be the only solution either.
Software and hardware will never be 100% reliable from a security standpoint. There is always a glitch somewhere; even IBM mainframe source code has errors. Maybe we can get more secure computers and that will help, but just as education isn't everything, secure computers aren't everything.
Security can't rely on just one measure, it should be multi-layered. If one measure isn't good enough, you still can hope the other will hold. In the end, all may still go wrong but better chances with multiple layers than with single layers.
The real underlying problem with security is so-called "intellectual property" (which is seldom intellectual and is definitely not property, but rather a temporary monopoly).
This continuing idiocy creates false "economies of scale", allowing companies like Microsoft to play customer lock-in games and thus creating a monoculture of mediocrity.
If you can't "own" software, there's no longer an incentive to deliver crap fast, nor is there any incentive to be deliberately incompatible and obscure.
My €0,02: I agree with your view regarding the need to increase the pressure on the software and hardware companies. They must take responsibility for their products' flaws.
But this isn't enough. There will always be issues that won't be solved just by increasing the general quality of the products, e.g. phishing, because they're of a different nature. And for these we still need to put the money on user education.
This said, what I think we should do is try to figure out which are the *key* security ideas we should be talking about when we address the community. Because there are just so many awareness campaigns that are so dull and, to some point, so naive, that won't be of any use. I've seen a few that try to shoot at everything at the same time and, in my opinion, they fail because people don't get the point and just become confused.
I think everyone seems to be missing Bruce's point: this _is_ a technology problem. The hardware and OS aren't _designed_ with security in mind.
In the example given, there is no reason for the CD driver/OS to allow programs to be run from the CD transparently just because there is an AUTORUN.EXE program on the CD. Someone, probably from a marketing background, decided this would be a Good Thing to go into the OS without realising the implications. Probably nobody even thought about the problems.
A two-level solution could have been implemented quite easily, where normal data was always readable while write/execute access required some extra authorisation step by default. That second step could then be policy-controlled. As it is, users may need access to CDs to read data, but by default everything else is automatically enabled at the same time.
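Lacking that design, the transparent-execution behaviour can at least be switched off by policy. On Windows, the commonly cited `NoDriveTypeAutoRun` registry value controls autorun per drive type; a fragment along these lines (0xFF sets the disable bit for every drive type) turns it off entirely:

```
Windows Registry Editor Version 5.00

; 0xFF sets the "disable autorun" bit for every drive type,
; so AUTORUN.EXE on a CD is no longer launched transparently.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoDriveTypeAutoRun"=dword:000000ff
```

Of course, this only restores the read-only default; it doesn't provide the policy-controlled second step for deliberate execution.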
A computer has a very important difference with a heating system: it is a machine which is designed to be *generic*. The point of it is that it can do whatever you can program it to. With the firewall analogy, it's "default to allow". If you change this, computers might become much less useful. You'd turn them into things like games consoles (yes, I know they are computers, and this is my point too).
Maybe we need two kinds of computers ? The usual generic Turing machine that you can program, and another, locked, that only runs what, er... Microsoft wants ? Hmm, that does ring a bell...
I agree with some of the more recent posts that, as usual, a multi-layered approach is required to solve a problem as wide-spread and complex as this one. But there are (a combination of) things the IT industry can do to make the user experience more secure for everyone. For one thing, they can make computer OSes out-of-the-box much more secure.
Things like: Firewalls up, already configured to block traffic from known malicious sites; pop-up blockers on browsers default to being on; warnings built-in on things like images/links/attachment in e-mails and installation of applications (look at the OS X practice of requiring a user password before allowing an app to be installed...nowhere near fool-proof, but certainly a good start).
I like the idea of sandboxing new apps. Maybe there could be a 'trial period' for new apps built into the OS, so that any installation process automatically places the new app in a sandbox.
Also, on most computers, packet-sniffing/monitoring apps are third-party software. And in most cases, you have to know quite a bit about networking to be able to use them. What would be really nice is if new apps were 'sandboxed' and all outgoing/incoming network connections requested by them would be automatically blocked and a message was presented to the user, in understandable English (or whatever language), telling them what's going on and what the potential risks are. Most firewall software and network traffic monitors all use cryptic technospeak that makes them impractical for Aunt Millie to use. And even on the more 'secure' OSes (such as OSX), network connections can be made without your knowledge unless you install a third-party monitor.
Basically, computer OSes should return control to the user, and they can also be set up to 'educate as they play'. Of course, this assumes that the OS creators actually see some sort of financial interest in doing this!
Technical details of a blocked process can always be provided via a 'details' button/window rather than burdening (and scaring) a newbie user with IP addresses and cryptic messages of doom (Be careful! This network connection could bring a horde of viruses charging onto your computer like teenagers to spring break!).
But just seeing how often connections are made/requested on a modern PC is an education in itself for most users. You can always give longterm permission to any process to always have access if you so choose (by checking a checkbox, and then typing in your password).
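The plain-language connection prompt described above can be sketched in a few lines. Everything here is invented for illustration -- the application names, hosts, and allowlist -- and a real implementation would live in the OS, not in the application:

```python
import socket

# Invented per-application permissions; in a real OS the table would be
# filled in by the user answering plain-language prompts, not hard-coded.
GRANTED = {("mail-client", "smtp.example.com")}

def ask_user(app, host):
    """Stand-in for the plain-language dialog: has the user approved this?"""
    return (app, host) in GRANTED

def guarded_connect(app, host, port):
    """Refuse any outgoing connection the user hasn't approved."""
    if not ask_user(app, host):
        raise PermissionError(
            f"{app} wants to talk to {host}:{port} -- blocked until you approve it.")
    return socket.create_connection((host, port))
```

The 'details' button from the comment above would simply expose the raw (app, host, port) triple that this sketch hides behind the friendly message.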
I'd like to read what Mr. Ranum has to say, but unfortunately non-US and non-Canadian people are "not qualified people"...
Paid subscriptions are offered for those who reside outside the U.S. and Canada. For more information, please "click here".
You answered "No" to the question:
I am in the US or Canada and would like to receive a FREE subscription to Information Security magazine.
If this is incorrect, please hit your BACK button to return to the Information Security Magazine subscription application and correct your entry......
The problem with the idea of punishment is that people won't know what they're being punished for. Behavioral modification works best when there is a clear link between cause and effect. But if my identity is stolen, did it happen because I didn't shred something I should have? I clicked on the wrong link and downloaded a keylogger? I gave my information on a phishing site? My bank lost its backup tapes? If my computer becomes slow and buggy, is it strange incompatibilities, a zero-day worm, or something from MySpace? If I do something stupid, I do not immediately get a red flashing light that says "Don't do that!" I find out about it later, if at all.
"Quincunx is doing it again! Nail him!"
When you can't attack the argument, attack the messenger.
"Damn it's scary knowing that banks still have cd drives on there work stations!"
Damn, it's scary that it was probably running Windows instead of something else.
Let's face it: the combination of not being able to sue MS for its externalities and the giant Windows monoculture has led to this. Bruce is right to that extent -- it is a tech problem, but one that won't be solved until click-through EULAs are eliminated and MS can, in fact, be held liable when a hole in the OS or in IE causes worms and all sorts of damage.
I mean, I remember the hype about Mitnick and Mafiaboy being responsible for "billions" in damage, but no one blamed the insecure OS and applications that let them do it.
I think my Engineering Ethics prof said it best: "It is the job of engineers [and other professionals, such as doctors] to serve the 'Public Health'"--where that last term is defined broadly (from this particular point of view buildings need to be safe to serve the Public Health). "If an engineer fails to uphold standards and act in the best interest of the Public Health he becomes like any other, non-professional, worker." I may not be quoting him exactly, but I think y'all can get the point. Common, sane, and rational standards are required (however one may choose to enforce them is beyond this venue) in all technological fields.
A failure of humanity was required for the failure of technology to happen.
OK, a challenge: if you had two (2) hours of classroom lecture time to teach first- or second-year undergraduate students in computer science something about security, what would you teach? What would you expect to accomplish by teaching that? What if the audience was all undergrads, not just CS majors? This is the problem that we as instructors face: the curriculum is already over-full, so we can't do any topic real justice. We therefore have to pick and choose and prioritize, but we don't really know that much more than the droid on the street. Advice from the cognoscenti would be welcome, so long as the two-hour limit (which forces *you* to prioritize) is taken seriously.
Great challenge Greg. I'd say teach about privacy. Privacy applies to security and applies across disciplines. Most people also value privacy, so they take the security lessons personally. With no privacy, there is no security.
I agree that applications and operating systems are a large part of the problem. The question remains: what should be done about it?
At a minimum, we'd need:
1) Sandboxed applications. Applications that you run off the CD or whatever shouldn't be able to do anything with permanent effects. Hard problem: how to increase their capabilities (to open files that the user specifies, for instance) without havoc? This probably requires a substantial design change in filesystem technology. Applications that the user actually installs should be limited, but coming up with a model that protects the user while still allowing them to download silly widgets is really hard. Similarly, systems must draw a distinction between applications and documents, the way UNIX-like command line systems do.
2) Authentication has to be built into the operating system. It should be hard to spoof password dialogs; perhaps they should have a system/user-dependent background or border.
3) Passwords should be sent to websites and other remote services using something like SRP (http://srp.stanford.edu), so that users don't have to worry about certificates when logging into websites. As a less disruptive alternative, it should be easier to save certificates.
4) With the possible exception of certain privileged, built-in applications (drivers, accessibility, ...), programs should not be able to capture screenshots, draw directly to the screen when not in full-screen mode, capture user input, or inject input into other applications (shatter attack). In particular, commands should not be able to come from anybody but the user. Balancing this with remote access (SSH, Remote Desktop) will be interesting.
5) Program configuration files should be readable only by that program and approved configuration editors. For instance, it should be impossible for other programs to read (or possibly even address) a user's SSH public keys or known hosts. It should also be difficult for programs other than the shell to set shell aliases. Obviously, this impacts usability.
6) Automated checking tools and safer languages should make it harder to write stupid security bugs (buffer overflows, integer overflows, command/sql injection, double frees).
7) Administrative applications should be able to get around some, but not all of these restrictions.
What else do we need to secure the future? What will the impact on usability be? And how many decades will we need to bring it about?
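Of the seven items above, number 6 is the one today's tools can already demonstrate. A minimal sketch of the command/SQL-injection point, using Python's sqlite3 (the table, names, and payload are made up for illustration):

```python
import sqlite3

# Toy database; names and roles are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_role_unsafe(name):
    # Bug: attacker-controlled input is spliced straight into the SQL.
    return conn.execute(
        "SELECT role FROM users WHERE name = '%s'" % name).fetchall()

def find_role_safe(name):
    # Parameterized query: the driver treats the input as data, not SQL.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"          # classic injection string
```

The unsafe version returns every row when fed the payload; the parameterized version returns nothing. An automated checker that flags string-formatted queries would catch the first function mechanically.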
Spending two hours on the introduction to either "Beyond Fear" or "Secrets and Lies" would be a worthwhile effort, for either CS majors or general undergrads. (Damn, I sound like I'm brown-nosing.)
If I was teaching undergrad CS types about computer security, I would probably spend 30 minutes or so talking about the concepts behind various attack types, in an effort to get one point across: "Attackers Cheat."
I'd spend the next 90 minutes on how you build systems that survive when attackers find a new way to cheat. Concepts like assessing risk, defense in depth, and default deny policies.
Hopefully those two hours will be enough to get people interested, and they'll spend more time on their own.
I would just like to throw out some ideas and open them up for discussion etc, and see what you think !
Malware restriction possibilities thru IbIZ
These ideas/suggestions relate, ideally, to a fresh install on a brand-new, 100% totally malware-free computer, or one that can be verified as such -- so you start off with a clean slate on which to build a more secure computer.
How about placing the Operating System ("OS"), for one, in an Isolated but Interactive Zone ("IbIZ") -- such as VMware or the like, and/or sandbox-type software -- on the storage medium. Any other trusted applications/software could also be placed in there, and/or outside it in "normal" space on the storage medium, and/or in their own IbIZ. Or sections of the application/software could be broken down further into smaller chunks and placed in their own IbIZ and/or "normal" space.
Other files/folders/software etc could be subject to any or all of the above.
Also any or all of the above need not be stored on the same storage medium/s. These could be split/shared over more than one.
As well as the above, the OS need not be one complete package. It could be split into component parts distributed into other areas such as the above, called on only when required. Independent OSes could also operate, or be called on to engage, as and when required. For example, a media player doesn't need to be constantly loaded or running: it could work in its own dedicated space, with its own OS. As media players aren't normally expected to make full use of a large-scale OS, that dedicated OS could be much reduced in size, yet sufficient to do everything it would need to do. Indeed, a full OS as such may not even be required to run such an application, or other things.
The idea being to isolate Malware from infecting as many places as possible, including core components.
Other ideas could also be implemented and integrated along with any or all of this, including full/partial disk encryption, full/partial software encryption, authentication, and hardware integration, with or without accompanying software.
Obviously some parts of such a System would need to communicate securely with other parts, sometimes more than others, and exactly how this is accomplished i leave to others.
I'm not saying it will be a walk in the park, or even a jog, to accomplish -- more like a marathon, possibly, and then some, lol -- but I feel at least some, if not all, of these suggestions are worth consideration!
Many of the things you mention have already been solved, in Java's various execution models. For example, look at the JNLP specification, the Java security policy files, and so on. This doesn't mean Java is perfect, but it certainly does provide an existence proof that the major problems are solvable, practical, and deployable.
@Tim: Java can solve 6 and part of 1, with a reasonable performance penalty and a less reasonable usability penalty. Perhaps better JITs and the latest windowing toolkit will reduce these problems. There's still the problem in #1 of setting the policy correctly, and escalating it conservatively if the application requires more privileges.
Much as I dislike Java, it can certainly be part of the solution here; I just think that a much bigger part of the solution will be at the operating system level.
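For concreteness, the Java security policy files mentioned above really do express this kind of restriction declaratively. An illustrative `java.policy` fragment (the codeBase path is invented) that confines code from an untrusted location to one directory, with no network permissions at all:

```
// Illustrative java.policy fragment; the codeBase path is invented.
// Code loaded from /untrusted/apps may touch /tmp but nothing else,
// and gets no network permissions at all.
grant codeBase "file:/untrusted/apps/-" {
    permission java.io.FilePermission "/tmp/-", "read,write";
};
```

This is exactly the "sandbox by default, escalate by policy" model item 1 asks for -- just confined to one runtime rather than the whole OS.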
Yes it must be a lonely place... isn't it? :)
I think, in addition, that the digital world is too big for anyone to understand it completely. I, you, we might comprehend more than basic users, but we could still be ignorant of other security issues on different levels and systems. And who notices those? I learn more every day, and still there are many ways to breach security measures.
Take webserver security as example.
There is a webserver which runs 3,000 virtual accounts. All of those accounts have an admin login combo, and such servers host thousands of accounts with personal and corporate information.
What will happen if someone obtains the master login to the root server?
Is it a case of: one key will rule them all?
There must be a basic level of security understanding, like:
1.) Never attach foreign media to your computer.
2.) Just don't open unknown emails and attachments.
3.) Always LOG OFF from your PC while away.
If users don't do it, the sysadmins will -- blocking everything at the firewall, doing it for them. But then don't complain afterwards about all those "restrictions". ;)
The analogy to a driver's license is very funny, because at one time you could drive on the public highways without such a license. It's not that licenses are evil, but hasn't anybody noticed that more people die in auto crashes than in just about any other voluntary activity, despite all those licenses? Who's the tyrant who believes that if the government takes a few hours of your day and some of your money every few years, we'll somehow be safe?
While I see the problems you're raising, I don't see you proposing much in the way of workable solutions.
You say ``It shouldn't have been possible for those users to run that CD, or for a random program stuffed into a banking computer to "phone home" across the Internet.''
Are you proposing a restricted environment, where only authorized software can establish network connections? Who will do the authorization? Won't such a system be impractical, possibly even costing more than the less secure system? Will it even give real security benefits, or will buffer overflows and the like still be able to subvert approved software, giving worms and trojans (e.g. JPEGs that exploit vulnerabilities) the same control over computers they have today? Perhaps you could restrict what programs can do through some systrace-like facility, but is that feasible in practice?
I have, in fact, been advocating a system where only approved software can be installed and run on systems. The idea is that organizations can certify software as being trustworthy (so that users don't have to make these decisions for themselves), and that your system will be configured to trust some of these organizations. The system would refuse to let you run any software that didn't carry a signature from a trusted organization. Since it would be hard for malware authors to get their software certified by a trusted organization (organizations that did certify malware would not be trusted), it would be hard for malware to be effective.
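A minimal sketch of that proposal, with hypothetical organization names and digests: each trusted organization publishes identifiers for software it has certified, and the system refuses to run anything not on a trusted list. (A real design would use public-key signatures rather than a shared hash list; this toy uses SHA-256 digests only to make the allow/deny logic concrete.)

```python
import hashlib

# Hypothetical certifier and digest set: software a trusted organization
# has vouched for. Nothing outside these sets is allowed to run.
TRUSTED_CERTIFICATIONS = {
    "example-certifier": {
        hashlib.sha256(b"approved-binary-v1").hexdigest(),
    },
}

def may_run(binary: bytes) -> bool:
    """Allow execution only if some trusted organization certified it."""
    digest = hashlib.sha256(binary).hexdigest()
    return any(digest in certified
               for certified in TRUSTED_CERTIFICATIONS.values())

print(may_run(b"approved-binary-v1"))  # True
print(may_run(b"valentines-day-cd"))   # False
```

Under this model, the Valentine's Day CD program would simply fail to launch, since no trusted organization ever certified it.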
``The real problem is that computers don't work well. The industry has convinced everyone that people need a computer to survive, and at the same time it's made computers so complicated that only an expert can maintain them.''
I disagree. It's true that maintaining a machine running Windows is a great hassle: you need to keep up with patches, virus definitions, adware definitions, etc., and there's no unified interface for updating all software, much less one that alerts you when updates are available and lets you install them with a single click (note: this interface exists for the _system_, but the bulk of applications aren't integrated with it). However, on Ubuntu there is one place to go for installing and updating all software on the system; it alerts you when updates are available, and installing them is easy. Now, if Ubuntu only had some more proactive security measures (buffer-overflow protection, etc.), it would be even better... but it's certainly not difficult to _maintain_.
``Punishment isn't something you do instead of education; it's a form of education -- a very primal form of education best suited to children and animals (and experts aren't so sure about children).''
I'm not so sure about animals, either (besides, I hold the view that humans are animals, too). Do people train their dogs to obey commands by punishing the dog as long as it doesn't do the right thing? It would take a long time that way, and I don't want to know how the dog would feel about it. Instead, we reward the dog when it does what we want it to do. Similarly, angry words don't seem to prevent children from being naughty, but offering them activities that both you and they like seems to work very well.
The one case in which I think punishment is a good idea is where it provides an incentive to do the desirable thing. One proposal I've heard and liked that applies this to computer security is holding people and institutions financially accountable for damages that can be traced back to their systems, regardless of whether those systems were where the attack originated or were merely compromised and used by the attacker. The cost of this, or the cost of an insurance policy to protect against the liability, would decrease with better security -- hence, there would be a financial incentive to get better security, even if you don't notice the malware and backdoors that may be on your system. Don't want to invest the time? Fine, but you pay the price. Don't want to pay the price? Fine, but then you take care of the security.
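The incentive above can be put in numbers. With entirely hypothetical figures (the probabilities and damages are assumptions, not data), the expected liability is just the probability of your systems causing a breach times the damages assessed:

```python
# Hypothetical figures to illustrate the liability incentive:
# expected liability = P(breach traced to you) * damages assessed.
def expected_liability(p_breach: float, damages: float) -> float:
    return p_breach * damages

DAMAGES = 500_000.0  # assumed damages traceable to a compromised system
careless = expected_liability(0.20, DAMAGES)  # no security investment
careful = expected_liability(0.02, DAMAGES)   # after hardening
print(careless, careful)  # 100000.0 10000.0
```

If hardening costs less than the difference between those two figures (or between the corresponding insurance premiums), a purely self-interested operator invests in security.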
What you are saying is that computers must be intelligent, a lot more intelligent than people are, and, in a way, independent. A computer can't guess your mind! Computers are made to execute things, like running a program from the user's CD, if the user wants them to. And no one likes restrictions (if I let my computer do whatever it wants, rather than what I want, then why do I need it?!).
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.