Schneier on Security
A blog covering security and security technology.
February 20, 2006
Proof that Employees Don't Care About Security
Does anyone think that this experiment would turn out any differently?
An experiment carried out within London's square mile has revealed that employees in some of the City's best known financial services companies don't care about basic security policy.
CDs were handed out to commuters as they entered the City by employees of IT skills specialist The Training Camp and recipients were told the disks contained a special Valentine's Day promotion.
However, the CDs contained nothing more than code which informed The Training Camp how many of the recipients had tried to open the CD. Among those who were duped were employees of a major retail bank and two global insurers.
The CD packaging even contained a clear warning about installing third-party software and acting in breach of company acceptable-use policies -- but that didn't deter many individuals who showed little regard for the security of their PC and their company.
This was a benign stunt, but it could have been much more serious. A CD-ROM carried into the office and run on a computer bypasses the company's network security systems. You could easily imagine a criminal ring using this technique to deliver a malicious program into a corporate network -- and it would work.
But concluding that employees don't care about security is a bit naive. Employees care about security; they just don't understand it. Computer and network security is complicated and confusing, and unless you're technologically inclined, you're just not going to have an intuitive feel for what's appropriate and what's a security risk. Even worse, technology changes quickly, and any security intuition an employee has is likely to be out of date within a short time.
Education is one way to deal with this, but education has its limitations. I'm sure these banks had security awareness campaigns; they just didn't stick. Punishment is another form of education, and my guess is that it would be more effective. If the banks fired everyone who fell for the CD-ROM-on-the-street trick, you can be sure that no one would ever do that again. (At least, until everyone forgot.) That won't ever happen, though, because the morale effects would be huge.
Rather than blaming this kind of behavior on the users, we would be better served by focusing on the technology. Why does the average computer user at a bank need the ability to install software from a CD-ROM? Why doesn't the computer block that action, or at least inform the IT department? Computers need to be secure regardless of who's sitting in front of them, irrespective of what they do.
If I go downstairs and try to repair the heating system in my home, I'm likely to break all sorts of safety rules -- and probably the system and myself in the process. I have no experience in that sort of thing, and honestly, there's no point trying to educate me. But my home heating system works fine without my having to learn anything about it. I know how to set my thermostat, and to call a professional if something goes wrong.
Computers need to work more like that.
Posted on February 20, 2006 at 8:11 AM
• 90 Comments
I would guess that a criminal ring trying to launch an attack like this would do so with promotional USB keys. A friend of mine who worked for a major investment bank was given quite a number of them at various seminars and conferences. Given how much more useful they are than a CD with Valentine's Day software of questionable value, I am guessing a lot more people would plug them into a lot more systems.
Personally, I'd probably just plug it into someone else's computer with a good virus scanner on it first, then use it all over the place. If whatever malware installed didn't show up on a first check (maybe cloaked like the Sony rootkit?) it could spread very far indeed.
I'm afraid that I can't disagree more with Bruce's statement that "Employees care about security". No they don't. They regard IT security as a nuisance, at best, and an obstructive, costly adjunct to Internal Audit at worst. If they can find a way of ignoring or avoiding IT Security, then they will use it. It's a continuous uphill struggle to make them pay any attention whatsoever. Security has to be ubiquitous and non-optional.
"Computers need to be secure regardless of who's sitting in front of them, irrespective of what they do."
This is certainly not true today and I don't think this is even a reasonable goal. The most secure systems I run across are the least user friendly, and I do think those are somewhat mutually exclusive qualities. Home users would likely choose user friendly over secure. I think this is also true of most companies, as these decisions are made by people who have a business background and compare the risk with the cost of mitigating the risk.
If I have physical access to a computer, it's not secure. (Yes, there are a statistically insignificant number of exceptions to this, but the focus here is average users on most computers.) As a sysadmin, I need to know how to get into machines without passwords or retrieve data from machines I can't boot. Anything that prevents a bad guy with physical access (even if proxied through a user) from doing something bad is likely to also prevent me from being able to recover a system.
Signed executables would go a long way to addressing this problem. Organizations would be able to configure access only for the set of keys that correspond to the software that is approved. Home users should be able to disable key checks if they're more interested in ease of use than security. This level of control is not built into trusted computing platforms. The philosophy is that if you want to use the OS, you need to play by its rules. Unfortunately, one size never fits all.
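The commenter's idea can be approximated even without a full signing infrastructure: an allowlist of cryptographic hashes of approved executables, checked before anything is allowed to run. A minimal sketch in Python — the file names and the allowlist contents here are made-up examples, and a real deployment would enforce this in the OS, not in a script:

```python
import hashlib

# Hypothetical allowlist: SHA-256 digests of executables IT has approved.
APPROVED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: str) -> str:
    """Hash a file in chunks so large binaries don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def is_approved(path: str) -> bool:
    """Return True only if the file's digest is on the allowlist."""
    return sha256_of_file(path) in APPROVED_HASHES
```

Unlike true code signing, a hash allowlist must be updated for every new software version, but it illustrates the same policy: approval is a property checked by the machine, not a judgment left to the user.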
Companies don't seem to have much stomach for counter-espionage, which is what you're suggesting with the firing of folks who fell for the scam. I'm not sure why companies don't do it. Maybe the stakes aren't high enough. It would seem to be the only way to know whether your employees are vulnerable.
Most companies lack the stomach for it because most companies worth performing espionage against have unions. Those organizations tend to oppose mass firings for any reason, even if it will keep the business afloat.
Have to agree with you there. We recently put in a content-filtering proxy server to track what people do when & where, and it would appear that many of our employees don't even care about work :P They care more about weather, news, sports, and porn.... and not in that order!
Perhaps an approach more likely to succeed is to educate the *security personnel* that security that detracts from business value is a nuisance or more - and that effective security doesn't get in the way of doing the main job, whatever that happens to be.
Education is only part of making this work; a closed, non-transparent approach means that most users only see security when it gets in their way. Transparent security would be seen by the employees as adding to their business value, not detracting from it.
The CD is a good example - most organizations would have a rule prohibiting running of "unapproved software". A more transparent approach would be to simply change the language to prohibit "un-evaluated software". It's not the *approval* that makes software ok to run -- it's the risk assessment and evaluation. Changing the language helps employees recognize this, and make better security decisions on their own.
"Computers need to be secure regardless of who's sitting in front of them, irrespective of what they do."
That's a totally bogus claim, and while it may work nicely as a sound-bite to make people think that you're going to save them from themselves, it's a completely ridiculous assertion.
Consider that I've typed in a document, and now I wish to delete it. What is "secure" for the computer to do? Ignore my command, just in case the key-strokes were issued by a virus or a co-worker that has walked by? Or accept my command, just in case the requirement is that we completely remove any record of that document from our system (let's say it was private information that we accidentally acquired, and which by law we must not hold on to).
Computer operation is full of ambivalent issues such as these, because the jobs we need computers to do fall into such polar opposites - we use storage for retention, but we also need to be assured of data destruction; how do you combine the two securely?
Bruce: excellent point. The desktop PC with its disk operating system is an unnecessarily high risk in most offices.
Sometimes an employee doesn't care. Sometimes the employee cares but doesn't understand. Quite often, though, the problem is that a caring, understanding employee views himself or herself as a special case; it's that other guy over there in Marketing who downloads viruses, not me. Yes, that 5MB flash animation of the Easter Bunny that Jane in Training forwarded to the entire office was a nuisance that she should have known better than to send, but the one I just sent out featuring the soaring eagle singing "God Bless the USA" is a deeply inspirational piece that all good patriotic Americans will surely appreciate...
As far as employees caring about security - I don't think you can make sweeping statements one way or the other. Education and documentation are critical to any security program - and also hardware lockdowns where feasible (no CD-ROMs on bank user workstations, for example). But any security program is useless if the staff haven't bought into it. They will ALWAYS find ways around it (especially those ultimate evildoers - programmers and DBAs). I think what might have worked in the experiment mentioned is not to fire anyone, but just pick maybe a subset of the folks who loaded the CDs, and bring them in as a group to be "questioned" by some outside authority. Have this outside authority strongly drive home the point of personal responsibility - i.e. would you want to be a suspect in an investigation in the case of a security breach? Then ask them how THEY think the problem should be addressed (that's where you get the buy-in) and encourage them to talk with coworkers about what happened. This is kind of the reverse of the carrot/stick approach - my mother used to put it best (44 years teaching in the public school system) - FIRST you have to GET THEIR ATTENTION. I just see too many security programs fail because staff get the message "Do it because I said so!" like they were little children. This attitude is arrogant and paternalistic, i.e. it goes in one ear and out the other.
When I was managing an IT department at a manufacturer, I had to deal with a large number of barely-literate users who thought that offers for icons, cursors and email stationery were just fantastic.
It took about two years of using a combination of sweeping the network and publicly shaming users (with good natured but pointed teasing), who installed such things and who set off email viruses, before the general tone was set in the office of not opening every attachment and not clicking on every offer of free emoticons.
It's not enough to publish policies. Were I in that company's IT department, I would personally visit some of the people who tried to install the CD and quite loudly let them know just how dangerous their actions were. When done with a smile, peer pressure is far more effective than frustrated attempts to lock down users' workstations.
The flaw, Bruce, in saying that computers need to be secure irrespective of the user is that the wide variety of users and their needs makes it virtually impossible to broadly lock down computers sufficiently to protect them, and a maintenance nightmare to tailor lockdown to individuals or groups.
What we're waiting for is the generation of users who grew up with computers and have an intuitive understanding of how they work to become the majority in the workforce so that security policies aren't viewed as so much extra paperwork.
Why does the average employee workstation in this kind of environment have the capacity to access removable media at all? It's a recipe not only for people putting in things they shouldn't (like "Valentine's Day" discs), but also taking out things they shouldn't ("iPodYanking" or whatever the current buzzword is for when your files go out the door in a USB-connected device in the employee's pocket).
When I were a lad, we used VT-320 terminals, and we liked it.
Idle question, which isn't addressed in the linked article: did the CD check to see where it had been opened? In other words, did it count someone who read the warnings, took the CD home, and opened it on their own computer as not caring about the company's security? Because the warnings cited might convince someone to do exactly that--oh, cool, a promotion, it says I shouldn't open it at work, I'll take it home and see if I'm getting free chocolate.
Still not a brilliant move--some of us care at least as much about the security of our own machines as those of our employers, because our own machines have our own data and no IT department to fix the problems. But not a threat to corporate security.
This is quite a lot of preparation for a criminal to go through... burning (hundreds?) of CDs... standing out in the cold... And what about all those London security cameras? I think this is far-fetched. Sounds like the beginning of the plot to "Ocean's Thirteen" or the next hacker movie. But would the end justify the means? If the criminals walked away with a million dollars (er... pounds) by transferring money away from Victim Company X, I'd say it was genius. But honestly, that would mean this company had bigger security issues. That would be quite a leap from CD-in-drive to stealing millions. We'd all still suspect it was an inside job. If the result was just another virus being let loose, wow. Wouldn't that be original. :)
"However, the CDs contained nothing more than code which informed The Training Camp how many of the recipients had tried to open the CD."
I read this as the program connected to some site to register that it was run. If they did something like open a web page, the access would show up in their logs and they would be able to differentiate between an IP address in a company's block vs an ISP's.
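If the disc did phone home over the web, distinguishing a corporate hit from a home one is just a matter of checking the source address against the target companies' published netblocks. A sketch of that classification step in Python — the netblocks below are documentation-only example ranges, not real company allocations:

```python
import ipaddress

# Hypothetical netblocks belonging to the targeted firms (example ranges only).
CORPORATE_BLOCKS = [
    ipaddress.ip_network("198.51.100.0/24"),  # e.g. the retail bank
    ipaddress.ip_network("203.0.113.0/24"),   # e.g. one of the insurers
]

def classify_source(addr: str) -> str:
    """Label a phone-home hit as coming from a corporate network or elsewhere
    (e.g. a home ISP connection)."""
    ip = ipaddress.ip_address(addr)
    for net in CORPORATE_BLOCKS:
        if ip in net:
            return "corporate"
    return "other"
```

This is exactly the log analysis the commenter describes: the web server records the IP of every access, and a lookup like this separates office machines from home ones.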
"What we're waiting for is the generation of users who grew up with computers and have an intuitive understanding of how they work [...]"
Uhm... I think this generation is currently vanishing. It was the generation who grew up with 8 bit computers and the start of the PC and the Internet. Today's users are faced with black boxes, becoming even worse with DRM sealed machines, ubiquitous computing, and such... How many of them have ever viewed a file with a hex viewer? How many of them do actually know what hexadecimal is?
I'm speaking about percentages, not about total numbers, of course.
On my new shiny computer, purchased from Dell and pre-installed with MS Windows XP Professional, a CD-ROM placed in the drive seems to run a program on that drive without me asking it to.
Thus I am deprived of the ability to virus-check the CDROM prior to it starting to execute, for example, an installation program.
I think this option should be turned off on every computer on delivery.
I would like to turn it off, but have so far not found how to.
On another computer of mine, there was an option (under a previous version of MS Windows) to turn off this option. But it did not work.
Any help would be appreciated, as would other views on whether such functionality is a security hazard etc.
I have to agree with the usefulness of glass teletypes over full-capacity workstations in basic security terms, but only if you have a) an os/environment that lets you centralize the functions of your business, b) a sensible administrator who hands out fine-grained permissions only as necessary to perform duties, and isn't afraid to revoke them immediately on completion.
However, I also have to complain vociferously about the idea of turning computers, as an entire class of things, into useless, limited-capacity boxes for the safety of the company. Most of the world wants a computer because it can do anything you tell it to. You must limit yourself in terms of spheres here. Bruce, you want to lock down computers exactly because of what they are: general purpose, reprogrammable calculating devices. Which means that you don't want computers on the desktops in your cases. You want remote access over lockable protocols with fine-grained permissions.
Someone mentioned we used VT terminals and we liked it. I really think that is a viable solution. It doesn't have to be VT terminals but unless they need heavy display performance (and Windows Vista will), for most 2D applications the protocols are there to do this remotely. Have a central bank of blades and run the applications remote with a very thin client. If you need to take a file home then email it. Corporations can look at the email and at least know what is going where. For most office work that is all that is needed.
VT-320? Luxury! We had to fight for the "good terminal" (an ADM 3A)...
Firing/shaming is a good idea. One problem some firms see with the former course is that if word gets out that they had to fire somewhat high-ranking people for doing extremely foolish things, they will suffer reputational losses. Of course, when that promo CD isn't just a test, they'll suffer even larger losses...
C Smith had a very sensible comment. If possible, security should be invisible to employees most of the time.
From a business perspective it's usually not a good decision to fire half of your staff or make your systems difficult to use, all in the name of security.
Also, it might be more relevant to ask how many employees are breaking rules and policies and why they do it, instead of asking who is doing it. For instance, in many organizations you just have to use your own USB stick to get your work done (maybe because you don't have network access in a conference room, and your company hasn't provided you with any other secure removable media), and therefore many people do it, although it's forbidden in some IT security guidelines that nobody has ever read (except the persons who wrote them; this seems to be very common...).
Rules should be adapted to a secure, yet productive way of working, otherwise people won't follow them. In the "personal USB stick" case one possible solution for a company is to buy sticks that employees are allowed to use, while a wrong one is to punish or fire people who are using the sticks for doing their jobs.
And really: people are not interested in corporate security, most of the time they have much more important (or more amusing) things to think about.
Interesting experiment, but it is simple to fix. Using group policy (or locally), just turn off auto-run for the CDROM drive on your desktop computers. Simple. I do this for all networks I admin and have yet to run into problems. As a humorous sidenote, you will quickly find out who used their work computers as CD music players. Another best practice is to setup all your desktop users as LUA, which by itself, prevents almost all malware from installing/running.
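For reference, the setting this commenter describes corresponds to a single registry value, whether pushed by group policy or set locally. A .reg fragment like the following disables autorun for every drive type — the value is a bitmask of drive types to suppress, and 0xFF sets every bit:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoDriveTypeAutoRun"=dword:000000ff
```

Putting this under HKEY_LOCAL_MACHINE applies it machine-wide; the same value under HKEY_CURRENT_USER would restrict only one user.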
In a way security professionals seem to disagree about what security should be like:
Some think that security works when it makes things more difficult. Making it harder to do regular work is the price to pay for making it harder to do bad things. In this approach, a lot of security responsibility is put on all employees, not just security people. Employees should check each others' ID badges in corridors. They should clean their tables of papers when they leave the office. And they should constantly be suspicious about their communication channels and be aware of malicious software on the Internet.
Others think that security works when it makes things easier. Security should be handled by security professionals. Physical security with guards and access control lets employees focus on their work inside the company's premises instead of watching over their shoulder for bad guys. Personnel security lets employees trust their co-workers and keep their drawers unlocked. IT security lets employees communicate and use their computers without worries about encryption and malware protection, and so on.
Obviously neither of these models works on its own, and good security professionals know that. While the latter presents a good goal (employees can make money for the company without worrying about security), the former one is often needed in practice. Still, like with many other things, if security is everybody's business, it's nobody's business -- plus that it's an unnecessary resource-hog. An organization usually needs both the professionals and a healthy security culture.
"Computer and network security is complicated and confusing, and unless you're technologically inclined, you're just not going to have an intuitive feel for what's appropriate and what's a security risk."
Everyone has heard the phrase, "Don't take candy from strangers." When we become adults and "candy" can be redefined as "CDs" or "free gadget" or "promotional gimmick" does this simple rule of thumb not apply anymore? Even a child should know better than to blindly trust any gift that is handed to them by a complete stranger. Why are the people in this story suddenly excused from that very simple responsibility?
It's true that much of IT security is not intuitive (the proper use of encryption technology to secure private correspondence, for instance, can be downright baroque) but I'm getting really tired of people applying this same excuse to every single little failure of security that pops up. This particular instance is a clear case of user negligence and irresponsibility, and if there's a way to track these CDs to the people who used them rather than just the network they were run on, I sincerely hope every one of them is reprimanded.
There comes a point when you just have to expect a little bit more from people. This is an example of one of those times.
The answer to your example is neither. The act of deleting a file should require some form of confirmation that cannot be performed programmatically. This may be as simple as having a "Delete" that marks the file for deletion but doesn't remove it, and having a separate mechanism for removing "deleted" files from the system.
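That two-phase scheme — mark now, purge later through a separate deliberate action — is easy to sketch. A minimal Python version, assuming a simple per-directory trash-folder convention (the `.trash` name is an arbitrary choice for illustration):

```python
import os
import shutil
import time

TRASH_DIR = ".trash"  # hypothetical holding area next to the deleted file

def soft_delete(path: str) -> str:
    """Phase 1: 'Delete' moves the file aside instead of destroying it.
    Returns the path it can be recovered from."""
    trash = os.path.join(os.path.dirname(path) or ".", TRASH_DIR)
    os.makedirs(trash, exist_ok=True)
    # Timestamp prefix avoids collisions between same-named files.
    dest = os.path.join(trash, f"{int(time.time())}-{os.path.basename(path)}")
    shutil.move(path, dest)
    return dest

def purge(trash_path: str) -> None:
    """Phase 2: a separate, explicit mechanism actually removes the data."""
    os.remove(trash_path)
```

The point of the split is exactly the one made above: malware that issues a "delete" gets only phase 1, while the irreversible step remains a distinct operation that policy can gate however it likes.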
"Do you want to empty your wastebasket now?"
When I were a lad we used Volker Craig VC4404 terminals and hated 'em. But they made the vt220s seem like wonderful pieces of newer technology.
@Secure: "Uhm... I think this generation is currently vanishing. It was the generation who grew up with 8 bit computers and the start of the PC and the Internet."
You misunderstood my meaning. You're thinking of "intuitively understand the inner mechanics", when what I mean is "intuitively understand the system as a whole". My 5 year old nephew understands how to put a CD in the tray and start his Bob the Builder game. When he's 20, he won't have trouble understanding that executing an emailed attachment is probably a bad idea. In the office I mentioned, the worst offenders for setting off viruses were the old women serving as admin assistants to the sales force: They learned computers late in life as black boxes used by rote steps, but never grokked them as a tool or a functional system, so never connected "virus" with "attached file" (actual quote: "why would someone email me a virus?"). Trying to explain to these personnel that you shouldn't open attachments freely was a lost cause.
The whole problem with computer security is, as Bruce observed, that large parts of it are these abstract intangibles. The generation growing up with computers accepts the system as a functional whole and learns the innards as necessary, quite easily.
For a criminal enterprise, this kind of attack might be a longshot. Of course, every major city has a financial district, and if you targeted workers in that district, your odds of successfully installing a keylogger or some form of backdoor software on a computer in a bank go up a bit. When you consider how cheap CDs are and how easy it is to stand on a corner during lunch time, the investment isn't really that big, probably cheaper than buying enough lotto tickets to significantly improve your odds of winning the jackpot.
But what about someone who just wants to sow a little chaos? It still sounds far-fetched to most people, but terrorists really might be interested in something like this. A virus isn't "original", but that wouldn't make it any less effective at shutting down a couple of banks given the right planning. Even if it was just "another script kiddie vandal" behind it all, that wouldn't make it any less damaging or costly for those organizations affected by it. You can't write off a risk just because the perpetrator is some stupid kid.
Why on earth would a bank connect PCs with CDRs and USB ports to its network in the first place? Simply: for the convenience of the administrators, not the users. Computers can be set up through network pushes in many environments. Having CDRs and USB ports (and floppy drives) is irresponsible in high-security environments.
Also: You guys had terminals? Lucky.
"You misunderstood my meaning."
I think I've understood you quite well. Question is: Can you understand a system as a whole when you don't understand the inner mechanics, intuitively or anyway else? It is all about abstractions, and when the abstractions break you have a problem - your intuition won't help you anymore.
"When he's 20, he won't have trouble understanding that executing an emailed attachment is probably a bad idea."
Executing an attachment is already an abstraction. Do you mean executable programs? Are you aware that a JPG is "executed" in a similar manner in which a program is executed by the CPU and that it could trigger a bug in the JPG viewer?
Will your nephew know the difference between code and data, and the fact that the difference is only marginal when it comes to working with it? Will he know that a safe file format today may be dangerous tomorrow? Will he know that even viewing the email (without clicking on the attachment) could trigger a bug in the mail application, just like visiting the wrong web page with the browser can infect your system? Will he know that even receiving the mail (without viewing it or even knowing that it is there) could trigger another bug?
As already stated above, the computer is a general purpose machine. Absolute security is only possible by filling it up with concrete and throwing it into the ocean, together with the user. This machine won't be a security threat anymore (well, except for the fish).
Virtual machines and sandboxes may be a solution. But even those are not really secure. Have you ever tried to read an old 720k diskette in a VMware workstation 4.0.5 build-6030 session? It will trigger a missing BIOS function, immediately stopping the complete session with an error message. Don't know if it still is in current versions. Maybe it could even be triggered programmatically, opening a path to the host system...
@Bruce and @Huge
I agree with you both. @Huge - users don't care about "Computer Security" in the abstract. They see it as a stupid imposition. @Bruce - users care about computer security in the concrete. They mostly don't want to cause harm to the company they work for, and they certainly don't want dangerous computer security people stopping them getting data to their computers. Their problem is that they don't relate this in any way to the people who interfere with their computer use.
To those who say: "stop them using the CD drive" or say "turn off autorun" (which amounts to the same thing for some normal users): You are part of the problem. The CD drive isn't put there for no reason. When security people stop other people doing what they need to do (which includes getting data on CDs from customers / partners / etc.) then other people stop believing in those security measures.
What is needed here is more intelligence. When someone puts a CD in the drive, start a script. Ask some questions: Do you accept us scanning this CD and recording a copy for company records? Actually record the CD on local disk and an SHA256 checksum on the server. Allow applications to be run and installed, but only in a security context which allows access to files in a specific directory and not to the network. Allow files to be opened, but limit their application context in the same way. Automatically kick off an application assessment process. Inform the user, with gratuitous violence, if needed, if they did something wrong.
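The "record a copy and a SHA256 checksum on the server" step of this proposal is straightforward to sketch. A Python fragment that walks a mounted disc and produces a per-file digest manifest ready to be logged — the mount point is a placeholder, and a real hook would be triggered by the OS's media-insertion event:

```python
import hashlib
import os

def media_manifest(mount_point: str) -> dict[str, str]:
    """Walk an inserted disc and return {relative path: SHA-256 hex digest},
    suitable for recording in company logs before any file is opened."""
    manifest = {}
    for root, _dirs, files in os.walk(mount_point):
        for name in files:
            full = os.path.join(root, name)
            h = hashlib.sha256()
            with open(full, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            manifest[os.path.relpath(full, mount_point)] = h.hexdigest()
    return manifest
```

With the manifest on the server, an incident response team can later answer "which machines saw this exact CD?" by matching digests, which is the audit trail the comment is asking for.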
As long as our security measures visibly stop or largely slow down real needed work our security lectures will pass through people's heads like tau neutrinos through a bus shelter.
@ Matthew and Chris
We didn't even have zeroes. We had to do everything with just the ones. And we were happy to have them!
Thanks to winnot, who wrote "You can disable CD autorun and fix other stupid windows behaviors with tweakui". I will look into that, as a solution to my minor problem, and see whether that fix requires me to execute .exe or .com files, and (if so) whether the site itself is one that I can assess (with reasonable effort) as a sufficiently trusted source.
On the more major point, why are operating systems from reputable suppliers shipped with, as standard, gaping security holes for the average user?
I also recollect, well over long enough ago for comment by security gurus such as Bruce, the promulgation by Microsoft of the .doc file format (with inbuilt macros as virus carriers), in preference to the less risky .rtf. Now, I don't recollect seeing a single comment or mainstream posting in support of my view, nor one refuting it on grounds of technical accuracy (and we all run the risk of technically inaccurate postings).
Where are we on these issues?
I might well be wrong; if so post that it is so.
If I am right, and the mainstream computer industry provides, prefers and even preferentially promulgates practices that increase the spread of malware or the risk thereof, surely I am not the only guy who NOTICES AND CARES.
@Secure: "I think I've understood you quite well. Question is: Can you understand a system as a whole when you don't understand the inner mechanics, intuitively or anyway else?"
As you work with a black box, you can develop a conceptual model of the internals that's far from perfect, but allows you to infer certain behaviours or internal logic. That conceptual model makes it easier to explain security needs and to grasp the necessity or usefulness of best or at least better practices.
What I'm identifying is a generation of computer users who are cargo cult users, who came to computers late in life and have no conceptual model of the inner workings of a computer. They proceed by rote memorization of necessary steps, and are incapable of inferring anything about software--they simply know whether or not pressing the right sequence of buttons opens the correct window or not. This group represents a unique security threat that will, thankfully, diminish with time because the generation after me is growing up with computers everywhere and develop the necessary conceptual model in childhood when it's easiest.
Terminals? Oy, I learned to touch-type on a Terminet.
"As you work with a black box, you can develop a conceptual model of the internals that's far from perfect, but allows you to infer certain behaviours or internal logic."
Agreed. I've thought too deep into perfection. A basic understanding to handle leaky abstractions is important when you develop for the system. As a mere user a conceptual model is good enough most of the time.
"They proceed by rote memorization of necessary steps, and are incapable of inferring anything about software[...]"
Umm... yes, I have such a customer. He is not even able to memorize the necessary steps. Completely computer illiterate, so we have to write down the more complicated things he uses only once a week instead of on a daily basis - even when they consist of just one step: "To start the Outlook export, press the button in the toolbar labeled 'Start Export'". He forgets even this.
Computers need to be secure regardless of who's sitting in front of them, irrespective of what they do.
Unlike most commentators, I agree with Schneier. And although at first it seems that security and usability are mutually exclusive, they can't be (by definition, for example). And securing the PC while maintaining most of its usability is exactly what Windows is trying to do.
Alun Jones asked how a secure application should act if a user wanted to delete his document. It should delete it while keeping a backup (so you could always recover it), which is exactly what WinFS is said to do (you can actually recover any version of it). Of course, these backups should be removed after some time to free up disk space.
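The delete-but-keep-a-backup behaviour described above (WinFS's actual mechanism aside) can be sketched in a few lines. Everything here is hypothetical illustration - the `.trash` directory name and the 30-day retention window are made up, not anything WinFS-specific:

```python
import os
import shutil
import time

BACKUP_DIR = ".trash"  # hypothetical backup location, not a WinFS path


def delete_with_backup(path, backup_dir=BACKUP_DIR):
    """'Delete' a file by moving a timestamped copy into a backup area."""
    os.makedirs(backup_dir, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    backup = os.path.join(backup_dir, f"{os.path.basename(path)}.{stamp}")
    shutil.move(path, backup)  # file vanishes from its original location...
    return backup              # ...but remains recoverable here


def purge_old_backups(backup_dir=BACKUP_DIR, max_age_days=30):
    """Free disk space by discarding backups past their retention window."""
    cutoff = time.time() - max_age_days * 86400
    for name in os.listdir(backup_dir):
        full = os.path.join(backup_dir, name)
        if os.path.getmtime(full) < cutoff:
            os.remove(full)
```

The user sees a normal delete; recovery stays possible until the retention job runs.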
Justin said: "The flaw, Bruce, in saying that computers need to be secure irrespective of the user is that the wide variety of users and their needs makes it virtually impossible to broadly lock down computers sufficiently to protect them, and a maintenance nightmare to tailor lockdown to individuals or groups."
I say that taking the time and effort to properly set up different user accounts can save a lot of time and effort on later maintenance (fewer issues caused by user ignorance or user requirements). Most often it pays off in terms of time and effort. However, you should never do that, because after setting up the accounts your role in security/IT support might become unnoticeable to your superiors, and they might choose to replace you with someone cheaper.
There are many solutions available for securing computers. Using virtual machines (with IDS and other monitoring and detection software) on corporate workstations is becoming more and more common (and it is a very good way to monitor the intranet for suspicious activity and to protect computers from real damage). Activating DEP (for all programs) on Windows systems is almost a must when securing them. Internet traffic is already being monitored, filtered and analysed almost everywhere.
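For what it's worth, on the Windows systems of that era (XP SP2 / Server 2003), system-wide DEP was switched on via a boot.ini flag. A sketch of the relevant line - the ARC path shown is just an example for a typical single-disk install, not taken from any particular machine:

```ini
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /noexecute=AlwaysOn
```

AlwaysOn applies DEP to every program with no exceptions; OptOut is the gentler setting that still covers everything not explicitly excluded.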
The weakest link is almost always the human - especially when talking about confidentiality (he/she can always pass the data to outsiders - even when no IT is present). All we can do is limit the person's access and/or log his/her use of confidential information (as a means of detecting leaks). Almost everything else can be enforced by proper choice and configuration of applications (including hardware and OS).
Note I used the word "almost" a lot. By doing that I express the lack of absolute security - it just does not exist. We can't build a completely secure system, but we can choose the best available configuration for our needs.
Also the black box theory seems to be correct. And that is a good thing - more work for us = more value to us ;).
I have to agree with Mike et al. When you're responsible for a hundred or more desktops and have no staff, locking things down tight creates more work than fixing the occasional messed up PC. I know of departments where only the sys admins can install any software at all. It takes literally weeks to get something installed because they are so backlogged.
Now in a banking environment (or any place where security is paramount -- read: $$$), there need to be tighter controls. But for most situations, a tightly locked-down PC is a pain. Keep it patched and run a decent a/v program, monitor the network for abnormal behavior, and continue to drill the basics of security into your users' heads, and you can run a fairly open PC.
I very much doubt that every single person in a business environment actually needs access to ANY input devices, ports, etc.! Even the mouse and keyboard sockets can be out of reach if the PC is in its own sealed compartment inside the desk, for example.
Software is readily available to block ALL of the above, and/or restrict access to whatever is required and whoever requires it, and some apps are even free. Make sure it's password protected, though.
The people who do need access should be properly trained and sign an agreement that they are solely responsible for that PC, with all the implications that entails, including naming, shaming and firing.
Some employees don't even need internet access, so get that locked out too for them. For those that do, then it's very easy to secure IE, or use a different browser like Opera or FF.
These days you need not only very good AV, but AT + AS + AW + ARK + HIPS/IDS etc also.
The way last year ended and this one kicked off proves that the nasties are on the rise, getting more devious and potentially harmful, just like the perpetrators.
I wouldn't be at all surprised to hear of something very similar happening again, but this time for real.
@SpannerITWks: "The people who do need access should be properly trained and sign an agreement that they are solely responsible for that PC, with all the implications that entails, including naming, shaming and firing."
There's a useful comparison to be made here with firearms. Police officers are issued sidearms, and their use of that sidearm is subject to a lot of regulations, practice, training, and requirements. No one questions this because everyone accepts that guns are essentially dangerous things that require skilled handling to be used safely.
Economically, a computer can probably do more damage in the wrong place at the wrong time with the wrong malware on it, yet companies hire finance clerks with only their word that they completed the deVries Business Tools course.
Your very incisive post is in some ways its own rebuttal. Your questions, which were I think meant as rhetorical to indicate that answers are impossible, are in fact the exact sort of questions that we should be asking ourselves and trying to come up with good answers to.
Various people in this thread have tried to answer them; I think there are problems with all their answers. The sort of answer that comes to my mind is a revision control system such that not everyone who was privileged to edit a document was privileged to delete all record of it.
I'm sure you can and should identify problems with that answer. And I'm sure no answer is perfect by any means. But we can definitely do better than we are at the moment, and we should. I think you are perhaps unwittingly doing a better job of the kind of thinking Bruce is trying to promote than anyone else here!
You definitely need a comment system that threads :-)
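The revision-control answer suggested above - where the privilege to edit a document never includes the privilege to erase all record of it - can be sketched as an append-only store in which "delete" is merely one more recorded revision (all names here are illustrative, not any real system's API):

```python
class RevisionStore:
    """Append-only document store: anyone may add a revision,
    but there is deliberately no operation that erases history."""

    def __init__(self):
        self._history = {}  # document name -> list of revisions

    def save(self, name, content, author):
        """Record a new revision; earlier revisions are untouched."""
        self._history.setdefault(name, []).append(
            {"author": author, "content": content})

    def latest(self, name):
        """The current content is simply the newest revision."""
        return self._history[name][-1]["content"]

    def delete(self, name, author):
        """'Deleting' is just another revision; prior versions survive,
        and the audit trail records who did it."""
        self.save(name, None, author)

    def history(self, name):
        """Full, unerasable record of every change."""
        return list(self._history.get(name, []))
```

Even a user who "deletes" the document leaves every earlier version, and their own name, in the record.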
You had ones?! You lucky, lucky bastard! All we had was an empty set!
re: access control
This is not a technical problem, it is a political problem in most cases.
One major impediment to making systems more secure is the userbase itself expecting things to "just work".
"Why can't I play my Celine Dion CD on my computer? I can at home!"
"Why can't I connect to my yahoo mail? I can at home!"
"Why can't the sales department run Instant Messenger? It would make it much easier for us to communicate..."
Etc... Note that in many instances, there can be legitimate reasons why people ask these questions. They want to get stuff done. IT staff can have a jaundiced view of users in that they can't be trusted.
Socially, people see a locked-down computer as a sign of incompetent technical staff. "If you can't make the machine work without turning off all of this stuff that *I* want to use, you're obviously no good at your job." "The IT department doesn't let you do anything with your computer."
All of the press releases and virus alerts to the contrary, most people have an expectation that whatever they can do on their home computer is safe, and therefore they should be allowed to do it at work as well.
It's also an ownership issue. Nobody says, "Why can't I run this on the company's computer" do they? Has ANY technical professional EVER heard anyone refer to a computer as company property? (Well, maybe those arguing the merits of various terminal types --> when you only had VT100 terminals, it was "THE computer"). Instead they say, "Why can't I run this on *my* computer?"
People have a belief that the computer on their desk is their computer. As long as that is the case, I don't see the culture of computer use to change. As long as their PC at work says, "Dell" on it and their computer at home says, "Dell" on it, the difference between the two is functionality removed to make your life more difficult :)
Back to the original thread -> I think Bruce is right in the 10,000 foot view. Corporate computers should be designed from the hardware on up as secure devices. They should not be "commercialized home computers", which is the unfortunate state of affairs in today's computer market.
There's nothing about standard PC hardware that's inherently insecure, except for the fact that it takes nothing more than a screwdriver to open the case. The blame for the sorry state of computer security can be placed entirely on the fact that operating systems and applications are designed with flashy new features in mind, not security.
Part of the reason that people expect to be able to do anything with their work computer is that there's no obvious cost associated with any action performed on a computer. People know better than to use their work telephone to make personal calls overseas or to use the work copier to make Relay for Life flyers because these activities have an obvious financial impact on the company. But it's a lot harder for people to see the security impact of running a program off a CD handed to them on the street, because hey, it'll just play a cute little animation, then it'll go away and nobody will ever know the difference, right?
I guess that's the upshot of the Sony rootkit debacle—maybe people will start to realize how much damage can be done by hidden code entering through the CD-ROM drive.
> There's nothing about standard PC hardware that's inherently insecure
I don't precisely disagree with this, but I don't precisely agree with it either.
Secure vs Insecure is a discussion that only makes sense in context, right?
So for many corporate environments, a USB port that is accessible to the user may be considered *insecure*. Just like a CD reader accessible to the user may be insecure. It depends on what your system needs to look like to be "secure".
But if you're buying 100 PCs from a manufacturer, can you get 100 PCs without a USB port? Can you get 100 PCs without an optical drive?
Sure, to some major extent this isn't just a hardware problem. If you can get 100 machines without an optical drive, you probably don't *want* to, because there are reasons to have an optical drive. But the reasons to have an optical drive are administrative, not technical (you need to install software that only comes on CDs, or you need to have a floppy drive so that you can flash firmware on some ROM, etc).
So yes, it's partially a software problem. The point remains that most commercial personal computers (hardware + OS) are designed from the bottom up as consumer end-products, not as business end-products.
@Huge: I'm afraid that I can't disagree more with Bruce's statement that "Employees care about security". No they don't. They regard IT security as a nuisance, at best, and an obstructive, costly adjunct to Internal Audit at worst....
Then you have not managed to convince them to see IT Sec. as a business enabler. Certainly not an easy task but to some degree possible. It needs a lot of work but it's basically a combination of security awareness with audit and enforcement of the security policy.
"Computers need to be secure regardless of who's sitting in front of them, irrespective of what they do."
The only way to achieve this is to setup a restricted environment where users can do only what they absolutely need to do.
If your users can copy or receive files outside the system or read/write/run anything outside the "absolutely required" set means it'll never happen.
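A restricted environment of that kind boils down to default deny: nothing runs unless it is on the approved list. As a toy illustration (the program names are hypothetical, and a real deployment would enforce this in the OS, not in a script):

```python
# Default-deny launch policy: anything not explicitly approved is refused.
APPROVED = {"winword.exe", "excel.exe", "outlook.exe"}  # hypothetical approved set


def may_launch(executable_name: str) -> bool:
    """Return True only for programs on the approved list."""
    return executable_name.lower() in APPROVED
```

The important property is the direction of the default: an unknown CD-ROM program fails the check automatically, with no decision required from the user.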
"The only way to achieve this is to setup a restricted environment where users can do only what they absolutely need to do."
Right. Kind of the way my VCR works.
Oh, and if you want to achieve that then look for something else than windows.
Windows is basically a single-user operating system, designed with an emphasis on convenience and features, not security.
For what it's worth, the office I used to work in had the kind of security described here - you couldn't load a program from a CD-ROM; only IT could do that.
Well, I needed to use, fairly often, a program that was regularly updated. Every three months the vendors would send me a CD-ROM. And every three months I had to submit a work order to IT for somebody to come down and use their over-ride to install the new version on my computer. And they would come ... eventually. But it was a huge hassle for them, and a far huger hassle for me.
The point being: if security requires that IT handle something that users would otherwise do themselves, then IT had damn well better be fast and responsive about it.
"Right. Kind of the way my VCR works."
Or software you see on common ATM machines.
A few months ago, there was an issue that held the attention of the public media for quite some time, about a very big private-investigation company which used a similar Trojan horse to steal sensitive information from very big companies. The Trojan got through via the autorun of a CD that claimed to include a presentation relevant to the companies. The Trojan horse generated some fake traffic to the Microsoft update site before it started to deliver sensitive information, in order to look natural.
In my view, the problem with employees is not that they don't care about security. It is that they don't understand security, and count too much on automated software to do the job, without understanding the real threats.
I'd venture to suggest that there are two problems - (a) users don't understand security in the main and (b) they don't care, because it's corporate data and not personal to them.
One approach to raising security awareness I have seen is to release a paper to employees on home computer security - i.e. installing anti-virus software and updating it, using anti-spyware software, the dangers of IM, etc. This sort of thing works quite well due to the increased reporting of malware threats in the media. In addition, it's personal to users: they are protecting their own system by following the guide, and helping to ensure they don't fall prey to common scams.
In time, questions about how the business 'does security' start to arise from the user base, prompted by the contents of the 'home computer security' paper, and awareness increases. Users then start to report instances of non-compliance, or issues they see themselves - i.e. "my computer's AV hasn't updated for a while - I could be at risk from viruses, better call IT." More eyes looking at security within an organisation increases resilience to security issues and helps to improve security posture.
Of course, it also helps to make security a part of everyone's job responsibilities - this should be included in every employee contract, and helps with enforcement.
I also quite like the approach of making an example of individuals who continually flout policy despite warnings - i.e. constantly visiting inappropriate web sites - this depends on the strength of senior management and HR - i.e. will they let you do this or not. Of course, this also needs a rigorously enforced disciplinary procedure too to work.
Of course, without the full support and backing of senior business management (not just IT management), most security awareness campaigns will be for nought. Senior business people need to understand the issues and the very real threats facing them and the company's data - grab the management, get the backing, and awareness will be raised across the whole business.
I forgot to write that I was speaking about an issue that happened in Israel.
Well, my windows usings friends have their security figured out.
They have air-gap security, with their PC on the outside.
They can run whatever CDs they like, download whatever they like from the internet, and their security is not compromised.
Too bad about the spam-bots.
"This is quite a lot of preparation for a criminal to go through..."
Not really; probably less than the planning for other crimes that might yield the same "pound" value.
After all, "Dolly-Birds" can be rented by the hour for promotional work, as can students for even less.
If you managed the paper trail correctly you could end up either completely in the clear or, worse (better from the criminal's perspective), putting the blame on another organisation altogether.
When you get to think this way you start thinking about DRM and audio CD's...
To disable the "Autorun" feature on MS OSes, you could do worse than look at the web pages of virus-checker vendors; some of them have help files indicating how to do it.
Use the regedit method (not their download file ;). There are other sites that give similar information.
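For readers who would rather skip the vendor pages: one widely documented regedit approach on Windows of that era is the NoDriveTypeAutoRun policy value, where the bitmask 0xFF disables autorun for all drive types. A .reg fragment along these lines (check your own system's documentation before applying):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoDriveTypeAutoRun"=dword:000000ff
```

With autorun off, a CD handed out on the street at least requires the recipient to launch its program deliberately.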
"I also have to complain vociferously about the idea of turning computers, as an entire class of things, into useless, limited-capacity boxes for the safety of the company"
Very few people need the level of functionality given by their desktop PC's to do their day to day work.
In fact one report indicated that there had been a steady decline in office productivity since the introduction of personal computers.
Another indicated for similar reasons that the most productive year for office workers was 1973...
The solution to both security and office productivity is to find what best fits the organisation's business/risk model and act appropriately.
Unfortunately, in the current mass-market climate of "Nobody was fired for buying..." and "Outsource to reduce costs...", just about everybody gets a PC on their desk with an OS that has been made "user friendly" by the manufacturer, and likewise "economy of scale" hardware with all the "bells and whistles" set up ready to run.
It then costs large amounts of time and money to close these systems down (even if you get the supplier to do it).
It is actually quite possible, using diskless clients and the like, to make a reasonably secure business environment that is still quite usable. However, it is not going to happen any day soon, as most businesses appear either not to understand risk, or to think it will hurt their "competitive edge" because their competitors don't do it...
If you think that's a little unkind to senior managers, look at the reports of the number of businesses that close due to data loss through not doing sensible backups...
While most people do not need the level of functionality provided by the average PC to do their job, the other side of the equation is much more complex. I think there are many of us who forget what it was like to know nothing about security. The average user does not feel compelled to learn about security either. To them, it's no different than if someone told us we needed to live our lives always thinking about a particular management style. I know how I would respond to such a recommendation. =)
If each PC had to be configured to give the user only access to the functions they had a justification to use, it would be an administrative nightmare. General purpose systems that exceed the needs of the individual users can be used by more people. That's much more agreeable to the business than trying to justify each application be written to provide the best security possible at merely 10 times the cost.
Another problem is that people resist change. Right now, there are a lot of people who do personal stuff on company time. How would you feel if your job suddenly said you can't view web sites or take personal calls while you're at work? As long as the individual does what they need to do, is that really a problem? Not everyone has a strong work ethic. This site is a self-selected group for which that may be a foreign concept. There are a lot of people who do the bare minimum necessary to not get fired. Taking away what they do with the 40% of the time they have nothing to do would just be seen as arbitrarily making it a less pleasant place to work. Many jobs are similar in nature to a cashier - if there are customers buying stuff, they should be doing their job. If there is no one waiting to pay, that person isn't doing the company a disservice by looking at a magazine while attending to the cash register.
For example, I work for a bank and am on this site from my work laptop. This isn't a requirement of my job, so it could be argued I should not have access to do this. However, since security is one of my responsibilities, it could also be argued that spending time thinking about security and looking at what happens in the outside world that may be of interest to the company is a good thing. Ultimately, I already resolved all of my emails from this morning and none of my systems have any problems, so there is no net harm to being here.
Businesses are concerned with costs and risks. A security risk that is unlikely or too expensive to mitigate against is going to continue to be an accepted security risk. Security is only appealing to companies when the cost of implementing it is small and the benefit is large. Rarely is that the case.
"If each PC had to be configured to give the user only access to the functions they had a justification to use, it would be an administrative nightmare."
Actually, it wouldn't. Most people need exactly the same computer functions: e-mail, web, and office suite, and not much else. And more and more, whatever specialized applications they might have to use are only available over the net via a browser.
I think you can deal with most of the non-engineer computers pretty easily.
"I'm afraid that I can't disagree more with Bruce's statement that 'Employees care about security.' No they don't. They regard IT security as a nuisance, at best, and an obstructive, costly adjunct to Internal Audit at worst."
I think you're confusing "security" with "IT security." Of course employees care about security. They don't distribute company secrets at streetcorners. They don't invite mobs of strangers into their offices. They care about security.
Of course, they care about getting their jobs done more. When there is a conflict between security and working efficiently, security will most likely lose. They'll take work home with them. They'll put secrets on a personal laptop. They'll hold the door open for another employee, even if that employee is a stranger. They'll give a copy of their office key to their secretary.
IT security is worse because employees don't understand the risks. Security is always a nuisance, but IT security is worse because it doesn't make sense.
I think these kind of distinctions are important. We'll never "solve" IT security if we don't understand the mindset of the people.
"The CD is a good example - most organizations would have a rule prohibiting running of "unapproved software". A more transparent approach would be to simply change the language to prohibit "un-evaluated software". It's not the *approval* that makes software ok to run -- it's the risk assessment and evaluation. Changing the language helps employees recognize this, and make better security decisions on their own."
Nonsense. It makes not the slightest difference what the standard says. If you can run software from a CD, people will.
umm.. yeah. Build a bank of remote access blades (i.e. terminal services, citrix etc) and then if you need to take a file home *e-mail* it?
I hope you weren't serious. If you have that level of infrastructure in place it would probably be better to use remote access to the facility and never have the file leave the facility directly.
"Actually, it wouldn't. Most people need exactly the same computer functions: e-mail, web, and office suite, and not much else. And more and more, whatever specialized applications they might have to use are only available over the net via a browser."
This is a base set of applications that should be available to all users. While those constitute 90% of what most users use, the other 10% are not irrelevant applications that can be dropped. In every old, established company I've been in, there have been custom applications which aren't used much, but must be used. There's a lot of resistance to change, and no business value in taking an existing application and making it available over the web. In my experience, proposals for improving the long term security of the organization at the cost of additional expense in the short term don't get seriously considered by business people. We don't have to like it, but the reality is that they're the ones who control the money.
"I think you can deal with most of the non-engineer computers pretty easily."
I believe the opposite to be the case. The engineers are the ones most likely to find an alternative solution if one of their applications goes away. If I have a web browser and SSH, I can ignore the rest of the applications on my laptop.
People like the administrative assistants are the least likely to know other ways to do their work and tend to work for people who make decisions about priorities. I'd hate to see security being used as an excuse for moving from an automated, but custom solution to a manual paper based one.
Something to keep in mind about the 'business decision' to buy 100 PCs of whatever kind from whichever vendor, is that the guy who made the decision may have thereby earned himself some 'free gifts' for his home and family.
The best salesmen are adept at finding who in any organization will make the decisions and then finding just how generous they have to be to make the sale.
Personal greed can trump any work ethic.
Thin client can easily take care of the physical access problem. However, it is impractical because IT budgets are made of desktop PC (and OS) turnover, and IT managers won't give that up. In addition, users want their laptops and screensavers, developers want to download and install at will, and managers want the latest in handheld synchronization.
> Well, I needed to use, fairly often, a program that was regularly updated.
> Every three months the vendors would send me a CD-ROM. And every three
> months I had to submit a work order to IT for somebody to come down and
> use their over-ride to install the new version on my computer.
> The point being: if security requires that IT handle something that users
> would otherwise do themselves, then IT had damn well better be fast and
> responsive about it.
Actually, you have this all backwards. The problem here is not your IT department, and it's not your level of access, it's your software vendor.
From an economic standpoint, your company has an investment in a piece of software that has a very high administrative overhead. The correct answer is to have the vendor fix the overhead, not try to eliminate the overhead by implementing bad security.
Your vendor is supplying you software in a ridiculous format (note, this is not uncommon in the software industry). This gets back to my earlier post -> those who design and deliver the "business computer" (hardware and software) design and deliver a product that is almost unsustainable from a maintenance standpoint.
They do this for lots of reasons. Computer manufacturers want their computers to be ubiquitous regardless of the end environment, so they put the same hardware (for the most part) in a personal computer intended for home use as they do in one intended for business use. Software vendors (for the most part) are more interested in protecting their product from illegal copying and use than they are in providing a commercial product that is easy to maintain in an enterprise environment. We use several commercial products at my job, and the hoops one needs to jump through to install and distribute this software to the end-users are a nightmare (license dongles, license servers, key codes that generate a license string that needs to be activated, etc., etc.)
Personally, if I was the head of an IT department for a Fortune 100 company that had a substantial IT budget (say in the billions of dollars), I would call a meeting with my local regional sales representatives for HP, Dell, Sun, Microsoft, Sony, Toshiba, etc., plunk them all down together in a meeting room and say, "Gentlemen, I'm tired of replacing hardware on a 3 year cycle because you people build a product designed to last only slightly longer than a light bulb. The computing needs of 99.99% of my users can be fulfilled by a machine that was built in 1995, if I could get a secure, supported operating system to run on a hardware platform that still had available parts. I don't need a computer designed for someone's house, I need a computer designed for my business. I want a hardware platform that is very modular and easily repaired, I want an operating system that is stripped down and designed for an absolute minimum of client-side overhead, and I want software packages that provide this set of functionality and this level of security. I want a business proposal in the next 18 months, or I'm taking my $22 billion a year and operating on a skeleton budget while I start a business computing company that *will* give me what I need."
Computer companies don't build business computers, period. For the life of me I don't understand why Sun (for example) has chosen to try to invade the "cheap PC" market when they're already locked out of it -> it would make more sense for them to start designing and building machines actually intended for business use. If they did, and walked into an IT manager's office for, say, Wal-mart and said, "We have a machine that will do what most of your users need and will last 8 years" they'd sell them like hotcakes.
> However, it is impractical because IT budgets are made of desktop PC (and
> OS) turnover, and IT managers won't give that up.
They won't give it up, because they don't have a choice right now.
> In addition, users want their laptops and screensavers, developers want
> to download and install at will, and managers want the latest in handheld
True, but they want these things because they are conditioned to believe that these are reasonable things to want. The cold fact is that most of them are *not* reasonable things to want, especially in an enterprise environment (and in most cases they aren't things that you actually need at all to get your job done).
It's possible that this is a case of a little bit of knowledge being a dangerous thing. I would suspect that users who know a bit about viruses and the risks of running software off a CD handed out on the street also know just enough to realise that their work PCs have anti-virus software on them, and are thus 'safe', and therefore think that running the software won't be a risk - but not quite enough knowledge to realise that it actually *is* a risk, for a variety of reasons.
Wasn't there a study recently which showed that people's behaviour on their work PCs was more risky than that on their home PCs? I think one of the reasons they quoted for this was because people felt that the work PCs were more able to cope with the risks, with the added benefit of the IT department being able to fix any problems that did occur.
Here's a similar post I wrote on the 18th, I linked to some other interesting examples... I had a lot of the same thoughts that were captured in your article and tried to give some suggestions to help folks who don't have any kind of IT support. I am sure I left some things out... but it's a start.
Exactly what is meant by users' "Understanding Computer Security?" I'm an IT professional and don't pretend to understand everything. There's just too much to know!
"I don't understand why Sun (for example) has chosen to try to invade the "cheap PC" market when they're already locked out of it -> it would make more sense for them to start designing and building machines actually intended for business use. "
Actually they did (diskless clients) and they were a fairly dismal failure, because they were regarded as being designed to fit in with the Sun OS strategy, not MS Windows. Which of course meant, to your average IT manager, "my users can't run their apps"....
The company I used to work for (I won't name it, but think "World's Largest Retailer") finally realized that issuing R/W CD-ROM drives wasn't a great idea after one of the VPs left the company for another job and took the code for several proprietary applications with him. According to friends still with the company, you're lucky to get a floppy drive these days.
OTOH, they rarely ever see a successful attack from the outside. Almost all of their problems come from employees being dumb about security.
I work for a large finance company and security is pretty good. The networks are tightly controlled but not locked down. The PCs don't have CD drives, but they do have USB ports.
Our best days for security were 8+ years ago, when the 15,000 desktop PCs and 1,000 servers all ran OS/2. I'm not saying it was a better OS. One of the reasons we had to drop it was because the vendor (IBM) dropped it. We had demands for Windows apps, but companies were providing good emulators that satisfied our needs for those. We ran proprietary financial calculation software that interfaced with the mainframe for database information.
The beauty of the solution was that it was "off mainstream" enough that it didn't warrant the attention of nefarious persons. Even though we had anti-virus software that got updated every two weeks, it rarely got any hits. The "audit" software we ran quarterly only showed "unapproved" software on the developers' PCs. (We had about 1,000 developers on staff at the time.)
There are better ways to run a company than using "off-the-shelf" hardware and software choices. Several commenters have alluded to that. But the bottom line is the bottom line. The person who makes the ultimate IT choices is the CTO or CIO and s/he is going to make the choice that has the best *financial* impact for the company. There are versions of Unix/Linux that are completely locked down. There are ways to totally lock down Windows. But those cost money. There is hardware that cannot be hacked. More money. Our latest hardware roll dropped floppies and CDs, primarily because of cost. Cutting $30 from the cost of each PC for 20,000 PCs makes someone's budget look pretty good.
It comes down to risk versus cost. If a company can accept the risk of having PCs with USB ports that employees can use, then they won't shut those down. I have friends in the US Military who cannot use the USB ports - they are hardware-disabled. Yeah, the IT guys break them. "snip" And fixing them is a Court Martial offense. (As mentioned, that's a bit draconian, but it all depends upon the company's security requirements.)
At my company, 80% of the PCs are locked down, but the primary driver was cost. Security was considered a side benefit. Being able to cut the cost of the PC, and a couple of head-counts from the support groups, were the driving factors in limiting these choices.
BTW, as a side note on how nasty security can get, we have one critical system that updates the production databases. It allows a DBA to directly manipulate the data on the mainframe - very dangerous and very ripe for abuse. In order to access this system, the DBA has to request a password and key that are randomly generated and only usable by that ID, on that PC, for 2 minutes. Every keystroke typed is logged. I feel sorry for the peeps who have to use this system, as they have to make between 50 and 100 changes per day. Here's the hammer-home point: the peeps who use this system are not direct-hire employees - they are contractors hired and managed by another company!
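The two-minute, single-ID, single-PC credential described above could be built along these lines. This is purely an illustrative sketch in Python, not the commenter's actual system; the HMAC construction, the shared secret, and all the names are assumptions:

```python
import hashlib
import hmac
import time

SECRET = b"server-side secret"  # hypothetical; held only by the credential server

def issue_token(user_id, pc_id, now=None, ttl=120):
    """Issue a credential bound to one ID, one PC, and a ~2-minute window."""
    now = int(time.time() if now is None else now)
    window = now // ttl  # this value changes every ttl seconds
    msg = "{}:{}:{}".format(user_id, pc_id, window).encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

def verify_token(token, user_id, pc_id, now=None, ttl=120):
    """Valid only for the same ID, PC, and time window it was issued in."""
    expected = issue_token(user_id, pc_id, now, ttl)
    return hmac.compare_digest(token, expected)
```

One caveat with this fixed-window approach: a token issued late in a window expires early, so a real system would more likely record the issue time server-side and compare against it on each use.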
We can talk about how security should be "this" and "that", but the bottom line is and always will be the bottom line. Those contractors are cheaper than employees.
My employer doesn't pay me enough for me to care. I'm as loyal to them as they are to me. Which isn't much.
I've been a fan of yours for many years, going back to when I toyed with writing a law review article on PGP, IDEA, and DES (and, then, "Blowfish") back in law school.
Your article on employees and security is great; however, I think there's another angle that needs to be explored. That angle is simply this: What obligation do the people who depend on their employees/users being secure have to actually, you know, make that a workable state of affairs?
I'll give you an example. I'm an attorney. In my state, we can send court filings electronically, straight to the courts. It's a great time-saver, but as you can imagine, unauthorized access to the "central" system that handles the electronic filing can be very problematic.
Recently the firm that handles this task for the state courts updated their password protocol. Now, user passwords have to be changed every so many days. The system rejects any password that you or anyone else in your office has ever used before, and imposes an alphanumeric requirement such that a valid password must contain letters and numbers, in both upper and lower case.
Sounds really secure, right?
The rotation of the passwords is so frequent, and the requirements for a "valid" password so stringent, that employees all over the place are unable to remember their passwords. What are they doing? Well, some of them are writing them down on yellow sticky-notes on their computer. Others are having Internet Explorer "remember" these passwords so that anyone who logs into their computer can just get "in" to the system. I've cracked the whip in my office about this, but I talk to fellow attorneys who are confronting the same problem in their own practices.
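The policy described above can be sketched in a few lines of Python. The mixed-case, digit, and no-reuse rules come from the comment; the 8-character minimum and the function name are assumptions added for illustration:

```python
import re

def password_ok(candidate, office_history, min_len=8):
    """A sketch of the court system's policy: mixed case, at least one
    digit, and no password anyone in the office has ever used before."""
    if len(candidate) < min_len:
        return False
    if candidate in office_history:  # rejected: used before by anyone
        return False
    # must contain a lower-case letter, an upper-case letter, and a digit
    for pattern in (r"[a-z]", r"[A-Z]", r"[0-9]"):
        if not re.search(pattern, candidate):
            return False
    return True
```

Easy to enforce mechanically - and, as the commenter observes, exactly the kind of rule set that produces passwords nobody can remember.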
There is such a thing as too strict of a security system.
Employees have a lot of their own personal passwords to remember, and in my view there's a happy medium that needs to be considered when securing employer information so that the protocols are useable and workable. A rigorous security regimen that is so unwieldy as to not actually be followed means nothing in terms of actually, you know, securing things.
For this reason, I think software like the Password Vault or similar needs to be a fundamental part of any well-organized office security system. That's what we've done (using a lame commercial package I'm trying to convince my colleagues to replace with more functionally secure software like yours).
Keep on writing!
Best, a Gonzo Lawyer.
you guys need to remember that you are cost center staff that are only there to make sure the trains run on time.
why does the CD/DVD drive need to work? Cause the sales, executive, and engineering staff fly around the world for internal and external meetings and need to be able to watch DVDs and listen to CDs to make their lives bearable. They need to have some music so they'll stay the 16 hours a day necessary to keep the firm in business.
Why do the non-l337 get to install software? Cause they have a 1-week project that will be sped up by 3 days by installing some abnormal software (web or shelf), and it can't wait the 4 weeks it takes for someone from IT to deign to come down and process the 35 pages of forms and 15 approval signatures.
IT needs to get its elitist ideas about itself thoroughly cleaned out. You are not elite. You are slightly better educated security guards with bad attitudes. Complaining about luser issues, when the lusers are the ones who do the real work and are profit centers, is beyond ignorant.
And I say this as someone who spent 3 years running IT projects and internal product development, has a comp eng degree, and is now on the sales/exec side.
I'm a self defense instructor who has worked with victims of violent crime for the past 14 years. One of the things I find interesting is how the concept of security has changed.
Take the title of this post. "Proof that Employees Don't Care About Security" brings very clear images to my mind of workers accepting suspicious packages from delivery people in unmarked vans, people losing company-issued IDs but not bothering to tell building security, and employees helpfully holding the door open for unknown people to gain entrance to secure facilities.
But no, I'm wrong. It's only about data security.
I think one of the reasons for this (to me) odd shift in attitudes is that guys like myself do our jobs too well. Just a few decades ago a scuffle in the parking lot of a bar between two loudmouths was pretty common, now such occurrences are very rare. When the probability of serious physical injury due to an assault declines, and even the possibility of getting punched in the face becomes remote, all of a sudden having some spyware on your laptop assumes tragic proportions.
Somebody took a free CD and popped it into the computer, trusting that it would be fine because they had a safe feeling about it... Everyone else here is talking about security risks and who would fall for this?
I see AOL's business model.
And exactly why is this any of your business? I am personally quite happy to leave it up to the corporations in question whether this is an important issue for them. I am sick and tired of websites insisting on my entering a password for stuff that is totally unimportant and doesn't need any level of security at all, and I really can't stand it (to the point of refusing to enter a website) when some jerk tells me that my choice of password doesn't meet his lofty standards. If you truly want to improve security, try being friendly and cooperative rather than pompous.
I'd have a bit more respect for the IT department, if the last three times they snuck into my office to update my software, I hadn't had to spend a day manually restoring all my customizations in order to get back up to full productivity.
Look, I don't understand why some of the major firms don't underwrite an effort of this sort: Start seeding the net with *harmless* malware. Emoticons that make your computer embarrass you in front of your coworkers. Trojans that tell IT their security sucks. Virii that scroll "don't run unapproved software, dipshit!" across the bottom of the screen. Imagine if a billion dollars a year was spent scanning network addresses, *and simply telling people who had security holes that they were there!*
Think of it as the computer equivalent of that stinky stuff they add to natural gas, so that people know there's a leak BEFORE there's an explosion.
I went over to "the Training Camp's" web site, read their press room, Googled furiously, and was generally unable to find any more information than was already in the Silicon Press article. Other sources did cover it, but all obviously based on the same press release. Just one additional snippet was found (in Silicon Press' editorial), which mentioned that one CD was executed 14 hours later in Rome. This is a pity, because they could quite easily have given some hard figures which would have been interesting and worthwhile. However, this was not a sociological experiment but a publicity stunt intended to promote the services offered by TTC, which has a new course along the lines of IT security training for the whole office. Since it is a publicity stunt, unless TTC decides to release their raw figures I think it is fair to presume that the number of CDs executed from corporate IPs is no more than those actually reported, to wit, 3. The number of CDs distributed is completely unknown -- lower limit, 3, upper limit, however many people work in London's financial district. However, the stunt would look pretty silly unless they gave away at least a few dozen.
On the basis of that I think the lessons to be learned can be summarised as:
* contrary to TTC's point, the great majority of employees may have accepted the CD out of politeness but in fact did NOT run it on their employers' network.
* However, no matter how much you educate the users, there's always one (or three) who will go and do the darnedest things, so if your security is based on "crunchy on the outside, chewy in the middle", you lose.
Additionally, since one of the offenders was in Rome later on the same day as he/she was in London, in the middle of the week, it is a pretty fair chance that he/she was a senior executive, and is not going to be fired for ignoring corporate computer policy. In fact if it had been a malicious trojan, it is rather more likely that the IT human who set up the boss' laptop would be getting the boot.
The other interesting thing is the way responses on this blog have polarised. There is the "users are stupid and the source of endless trouble, we must lock up PCs so they can only run MS Office" camp, and the "what is it with these security nazis?" countercamp. To some degree I actually fall between the two stools, er, camps, but I am steadily drifting towards the latter view. Once I was in IT support and experienced the frustration of user stupidity, now I am in a software engineering section which falls outside our IT department, and instead I experience the frustration, and hubris, of IT stupidity. The difference is that frustrated users generally do understand that IT security serves a useful purpose, whereas IT departments seem to continually forget that, with a few rare exceptions such as software companies, IT is not a core function of the business, but exists purely to assist other employees in doing their jobs more efficiently. You are there to serve them. If you are not helping them do their work, or God forbid if you are actually setting up roadblocks, then there is no excuse for having you at all.
I'm not kidding here; studies on productivity gain from IT investments (admittedly controversial) have found that in some industries IT investment has delivered almost zero return and in at least one it appears to be actually negative. (There are many industries where that is not the case, but the big winners are mainly industries which spend a lot of R&D effort customising IT support to workers' needs.)

A very big part of the problem is the whole obsession with SOEs. SOEs are an incredibly stupid idea. They make life easier for the IT help desk while eliminating all possibility of the PC creating a really large productivity gain to that particular worker's tasks. (It's like giving your delivery drivers a truck and then forbidding them to take it outside the carpark because it's easier to do repairs on site.)

Someone else commented that maybe 90% of the stuff on each desktop needs commonality but the remaining 10% that differs is quite important. I would go further than that. Of the 90% of commonality that we often get, a great deal of it is trite and wouldn't significantly affect productivity if we had to go without it. For example, an office suite, typically costing just under a thousand bucks per desktop, is usually the first item stuffed into the SOE. But the fact is that most users rarely need a word processor, even fewer have a need for presentation software, and unless you have a very small office, local databases should be banned altogether. And Outlook is a dangerous piece of crap. That leaves just spreadsheets, which actually are quite useful. Yet SOE after SOE insists on pushing out MS Office Pro to everyone in the company because that's the done thing, and then tries to punish people who install software they actually need.
Conversely however there are some universally useful services which generally are NOT provided! For example, with the vast amounts of data generated by modern information systems, search is a critically important function; yet we see users installing unauthorised things like Google Desktop because the built in search facility is so lousy. Ability to efficiently search for data shouldn't be a minor add-on, it is a core function far more critical than a word processor.

Another example: every computer user should have access to a secure, transparent backup system. Sure, most sites provide a backup system, but it is rarely secure and almost never transparent. Or rather, it's transparent on the way in, but let Joe User just try getting some old version back out and see what hoops he has to jump through. The result is that the savvy users -- the ones who care about company IT security -- end up having their own parallel backup system on DVD or USB drive (you know, the ones you're trying to disable), so as to avoid having to deal with the recriminations, delays and byzantine bureaucracy of the IT department.
Oh, and do users care about their employer's computer security? Some don't, mostly those who feel embittered by some form of corporate misgovernance or numbed by endless hyperbole, but in my experience yes, the great majority do care. Additionally, most users also DO understand security threats in general; what they have difficulty with is keeping up with the continuously evolving spectrum of threats. In this they are poorly served by a press which revels in hyperbole and puts out a dozen "internet meltdown" scare stories for every informative sidebar on phishing.
One of the latest of these pieces of hyperbole has even been echoed here, the ludicrous claim that USB-mounted removable media pose a grave security threat to companies. The reality is that these drives can be a big productivity boost to knowledge workers while the supposed threats are classic movie plot threats. Removable media hasn't been a significant vector for malware spread for more than a decade, for the simple reason that the generation time is more than a million times slower. And if you *really* feel that you may have a glamorous corporate spy siphoning off gigabytes of valuable company research (as against an occasional printout), the first place I'd look is the IT department. As for losing it at the pub: just put a TrueCrypt partition on it -- unless the IT department has stopped you from installing it...
Not: Computers need to work more like that.
But: Computers at the workplace need to work more like that.
Your first is not only unsustainable, but all citizens should rightly be angered by such a proposal.
The modification makes your proposal both workable and palatable, and should enjoy the support of all. It makes your otherwise outstanding argument complete.
As to the tendency of security types to this type of un-modified thinking (myself included), let me just add "thank god for hippies".
If you have a general-purpose machine you don't have a secure machine, and vice versa. Users value the general-purpose features of a computer. Therefore, users hate and despise security.
I've been in this business a long time. We had perfect security - the computer was in the computer room and only the programmers and operators could enter. There were no terminals. Think punched cards. Of course, it didn't do much for the average office worker. They got computer generated reports.
Then we had the mainframe. It had security, because you could only run the applications you were granted access to. Even if you had a terminal, you couldn't just plug it in anywhere, at least not in the IBM SNA world, because every terminal had to be set up on the network by the central site.
The people who did this were all well paid experts, and a bottleneck to getting things done quickly.
Then came the departmental mini computers to free people from the constraints of corporate IS. Then the PC and knowledge workers had the tools to be more productive. Have any of you ever waited days to have the "word processing department" type a business letter?
Without the PC, despite all its flaws, we'd still be stuck in the 60's and 70's. The trick is going to be getting a secure OS that still provides some flexibility to the user. Think some sort of DRM, where the data is useless outside the corporate network. Or an OS that protects itself and the corporate data, but still allows limited privileges for user installed software.
Feh. And IT folks will always be desperately lonely, pale nerds who can't actually keep the computers running and crash the server every day or two with their great "updates"-- but somehow find 10 hours a day to cry about non-existent threats to security.
Tell us something we don't know.
Your response has only made the point that sweeping generalizations tend to be inaccurate.
The fact is that most employees, IMO, *do* care about security; they merely don't understand it. This leads to employees who break security policies because they can't imagine that they are causing a security problem -- they care about security, but there is a disconnect between that and following policies and standards set by IT security.
Besides, the point is that it only takes *one* employee to cause a breach.
As to "non-existent threats" - the fact that people *did* accept a CD from an untrusted source, and then installed unapproved software from it that performed malware-like actions, demonstrates the reality of the threat. This isn't a theoretical threat; it *actually happened*.
Though it is beyond me why the vast majority of users in an organization need access to removable media... if they do, it's likely the result of short-sighted IT.
I am computer literate and still feel overwhelmed by the amount of information about the newest security flaws. I think most users have insufficient understanding of IT security, and therefore make mistakes. On some occasions they may simply be ignorant. In my former company we used dumb terminals as clients, based on a Unix OS. It was the safest infrastructure I ever used.
Recently I received a new issue of "Maximum PC" in the mail. This month's theme is how to rip anything from CD/DVD and store it, transcode it, squirt it onto our iPods and mp3 players, etc.
One of the interesting points they had was that Windows' "Auto Run" feature is a risk: it will automatically execute programs on the CD or DVD when it is inserted into the drive. Why is this bad? Because companies like Sony can secretly install their DRM software on your PC without your knowledge or consent (said software of course interferes with your ability to rip the music.)
I would hazard a guess that the above experiment leveraged this to determine if their CD found its way into a computer.
The article goes on to describe how to disable Auto Run for all users.
I would assert that disabling Auto Run is a reasonable precaution for anyone, not just for those who wish to snarf music. But I don't believe this will protect against "double-clicking" on the CD drive's icon, which will have the same effect as Auto Run.
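For reference, the usual way to turn AutoRun off on the Windows versions of the day is the `NoDriveTypeAutoRun` policy value; setting it to 0xFF disables AutoRun for every drive type. A registry fragment along these lines (applied here per-user; a machine-wide equivalent exists under HKEY_LOCAL_MACHINE) is a sketch, not an endorsement of any particular magazine's recipe:

```
Windows Registry Editor Version 5.00

; Disable AutoRun for all drive types (0xFF) for the current user.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoDriveTypeAutoRun"=dword:000000ff
```

As the commenter notes, this only stops the automatic execution on insert; it does nothing about a user deliberately double-clicking the disc.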
The employees in that story sound as quality as NORAD!
Why there was NOT a "stand down" order:
explaining the "failure" of the Trillion Dollar Air Force to defend its headquarters
One of the first anomalies that many people noticed immediately after 9/11 was the inexplicable non-reaction of the military air defense system to the hijackings.
It has been standard operating procedures for decades to immediately intercept off course planes that do not respond to communications from air traffic controllers. When the Air Force "scrambles" a fighter plane to intercept, they usually reach the plane in question in minutes. The Air Force plane will then fly next to the non-responsive plane, and rock their wings -- a way to say "follow me" to a nearby airport (if the plane merely has lost its radio equipment). If the intercepted plane refuses to respond, there is a graduated series of actions the Air Force can use -- firing tracer bullets in front of the plane, even shooting it down if it is a threat. This is analogous to police pulling motorists over for having their lights out - every driver in the US knows that when a police car behind them turns on their siren, they are supposed to pull over, just like every pilot knows that when an Air Force fighter plane pulls beside them, they are supposed to follow their orders, too. If the light bulb has merely burned out, the motorist will get a warning, but the police have a graduated series of responses they can employ if the driver is not merely having a mechanical problem (ie. they have just robbed a bank and are driving with the lights off to avoid being seen).
The airspace over the northeastern US is among the busiest on the planet. It is home to the nation's political, military and financial headquarters, the largest population concentrations, and key strategic facilities. A jumbo jet in this area suddenly changing direction and altitude, and refusing to respond to air traffic controllers would be as dangerous as a truck on a busy rush-hour freeway driving the wrong way at full speed. When planes go off course in this busy environment, instant reactions make the difference between life and death -- which is why NORAD (North American Air Defense) practices these kinds of scenarios, and instantly scrambles fighters when there is any hint of a problem.
For critics of the official story of 9/11, the smokiest of the smoking guns is the "failure" of NORAD to intercept the planes. Even if one ignores the abundant evidence that allied intelligence services in other countries provided warnings that the attacks were about to happen, the information from the "insider trading" just before 9/11 that indicated which airline companies would be used, and other clues that clearly show complete official foreknowledge -- there is still enormous evidence that does not fit the official paradigm of "incompetence responding to a surprise attack."
In BSD and Linux, you can just not give non-root users the ability to mount anything - CDs, USB drives, whatever. But then the responsibility would fall on the administrators to make backups. If your employees' hard drives fail, you can't yell at them for not making backups if you didn't let them use CDs or USB drives.
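Concretely, this is the default in /etc/fstab: unless an entry carries the `user` (or `users`) option, only root may mount it. A hypothetical excerpt (the device names and mount points are examples):

```
# /etc/fstab (excerpt) -- no 'user' option, so only root can mount
# these; ordinary accounts cannot mount a CD or USB stick they bring in.
/dev/cdrom  /media/cdrom  iso9660  ro,noauto            0  0
/dev/sda1   /media/usb    vfat     noauto,noexec,nodev  0  0
```

The `noexec` and `nodev` options add a further layer for the cases where root does mount the device.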
Of course, Linux and BSD aren't particularly vulnerable to viruses in the first place.
Of course, they could all be remote terminals that do little more than connect to a central *nix server that everyone has an account on. (Saves the trouble of worrying about physical security.) This makes backups easier for the admin - a RAID array if they have the money, plus regular tape backups, rather than having to go from machine to machine to make individual backups.
Very efficient use of resources, but a lot of productivity lost if that one server goes down, unless the company can afford a backup server.
Unfortunately, if one user monopolises computer resources, it will slow down for everyone. A good administrator should be able to set limits for different users, but this probably isn't a good solution for heavy programmers who have to do lots of resource-intensive compiling, or graphics developers who have to do lots of resource-intensive rendering.
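On a *nix server those per-user limits are normally configured in limits.conf or a PAM policy rather than in code, but the underlying mechanism is setrlimit. As an illustrative sketch (the numbers and function name are arbitrary), a process capping its own CPU time looks like this in Python:

```python
import resource

def cap_cpu_seconds(seconds):
    """Lower this process's soft CPU-time limit, leaving the hard limit
    alone so a privileged admin could still raise it later."""
    soft, hard = resource.getrlimit(resource.RLIMIT_CPU)
    if hard != resource.RLIM_INFINITY:
        seconds = min(seconds, hard)  # soft limit may not exceed the hard limit
    resource.setrlimit(resource.RLIMIT_CPU, (seconds, hard))
    return seconds
```

When the soft limit is reached the kernel sends SIGXCPU, so a runaway compile or render gets stopped instead of starving everyone else on the shared server.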
Schneier.com is a personal website. Opinions expressed are not necessarily those of BT.