Schneier on Security
A blog covering security and security technology.
December 8, 2005
Monocultures and Operating Systems
Dan Geer on monocultures and operating systems.
Posted on December 8, 2005 at 3:11 PM
That is because there is only ONE Microsoft operating system: Windows.
Here is a list of a few of the Microsoft OSes: http://www.microsoft.com/globaldev/reference/...
That doesn't even scratch the surface when you bring in Service Packs and Patches.
Monoculture sounds like something made up by someone who doesn't understand the real world of system implementation.
Incredible potential infection statistics, if even half accurate. One begins to wonder whether sacrificing backward compatibility is really any sacrifice at all!
AG: You're unclear on your point. Has Microsoft released lots of operating systems? Sure. Does it matter? Not really. DOS and Win16 are dead. Windows 95 and its descendants are on their way out. These days it's all Windows NT derivatives on the desktop and server (at least if you're a Microsoft shop), and those versions share lots and lots of code with each other. And of course, inside a single large corporation a monoculture is very likely: IT has an incentive to run one, because it's easier to maintain. But when every single system in your company is running a particular revision of the same OS, it's much more likely that a single vulnerability will take the whole thing down.
Agreed that there have been many versions of Windows over the years, just as with OSes and software in general. Yet consider that companies, and entire industries as a whole, tend to flock to the same platform and software configurations.
Many run the current version of Windows, with other companies lagging, but most often on one version or another. When SP2 came out, companies usually updated en masse after testing. I wouldn't want to argue that no one uses Windows 98 anymore (because I've seen it). But I doubt that NT 3.5/4 Workstation, Windows 95, or even Windows ME are still used frequently in businesses today in the US (beyond which I would not presume to be able to say) with payrolls larger than 25 or 50 employees.
Within industries, such as healthcare, there are specific software applications that end up mandating a platform standard. MS Office (itself a source of trouble in this equation with regard to viruses, etc.) has also had this effect when new versions come out that no longer work well on previous versions of Windows. Indeed, Microsoft has had its role to play, based on sound business reasoning on its part (and to its benefit, of course), in standardizing an enterprise en masse (FTE licensing; big pushes to use platform "XYZ" due to free/cheap software deals on "ABC").
The effect is not specific to Microsoft, of course. They are, though, clearly the most visible and most numerous example, as well as the most targeted.
Does anyone else find it interesting that Mr. Geer doesn't know how to use scientific notation, but is trying to convince others using statistics?
30,000 = 3 * 10^4 != 30 * 10^4
30,000 * 100 = 3 * 10^6 != 30 * 10^6
I understand this might be an honest mistake, but orders of magnitude are something to pay attention to.
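For what it's worth, the corrected arithmetic above can be checked mechanically. A minimal sketch in Python, using only the thread's own (admittedly arbitrary) figures:

```python
# Sanity-check the scientific notation discussed above.
# 30,000 is 3 * 10^4; writing it as 30 * 10^4 overstates it tenfold.
assert 30_000 == 3 * 10**4
assert 30_000 != 30 * 10**4       # 30 * 10^4 is actually 300,000

# Multiplying by 100 shifts the exponent by two, not the mantissa:
assert 30_000 * 100 == 3 * 10**6
assert 30_000 * 100 != 30 * 10**6

print("notation checks pass")
```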
I guess if you're assuming arbitrary numbers for argument's sake, being accurate with those arbitrary numbers is fairly arbitrary, too.
By my calculation, the actual ratio is more like 3/200, or 1.5%, which seems way too low to me, given how many machines I have found malware, viruses, rootkits, or keyloggers on, and some of those were running anti-virus software (not kept up to date, of course).
The article is still interesting...
Chris: it's reasonable to assume that Mr. Geer's original manuscript used "3.0 * 10^4" and "3.0 * 10^6", and that the decimal point was lost during publication due to typographical error.
I wonder what the numbers are like for the "monoculture" of open-source LAMP web application servers running on x86? BIND DNS servers? Mail servers? Cisco routers?
A huge swath of our critical infrastructure is not desktops.
Also, I always wondered if code obfuscation tools could be used to combat this monoculture to some degree. If every machine were running Windows, but each with binaries containing differently randomized branch addresses, would it help? For the low-hanging fruit of buffer-and-stack attacks, I think it would. Application-level vulnerabilities... maybe not.
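As a toy illustration of that idea (this is not real address-space randomization, and every number below is made up for the sketch): an exploit that hard-codes one address succeeds on every machine in a monoculture, but only on the small fraction of randomized machines whose layout happens to match the guess.

```python
import random

# Toy model: each machine places a sensitive routine at some load
# offset; the exploit hard-codes one offset and succeeds only where
# the guess matches. All figures are illustrative, not measurements.
random.seed(0)

ADDRESS_SPACE = 4096      # hypothetical number of possible layouts
N_MACHINES = 100_000

def infected_count(layouts, exploit_addr):
    """Count machines whose layout matches the exploit's guess."""
    return sum(1 for addr in layouts if addr == exploit_addr)

# Monoculture: every machine ships with the identical binary layout.
mono = [0x400] * N_MACHINES
# Diversified: each machine gets a random layout at install time.
diverse = [random.randrange(ADDRESS_SPACE) for _ in range(N_MACHINES)]

print(infected_count(mono, 0x400))     # every machine falls
print(infected_count(diverse, 0x400))  # roughly N_MACHINES / ADDRESS_SPACE
```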
I've seen worse manglings of scientific notation by publishers who didn't understand it - the worst being a change from "10^-25" to "10-25".
@Brandt & M. Skala
Could be. It seems odd, though, that they would also drop the decimal from the percentage, and that he would go on to say that 1.5% seems high. I would venture to guess that the 5-10% range is frighteningly close to accurate. I guess it isn't a big deal either way; the general point is the same...
Part of any good security plan is how to mitigate and repair the damage AFTER the security is broken. Having the same computer platform all over the globe is not a whole lot better than using the same key for every lock in your office building. Once broken, everything is broken.
The same would apply to a world where Macs or x86/Linux were as dominant as Windows is today. A monoculture of hardware platforms has similar problems. The vulnerability of monocultures is not limited to malware; it extends to supply, distribution, and quality-control problems. As Adam Smith noted, competition keeps the marketplace healthy.
The term "monoculture", BTW, is derived from biology. A population of genetically similar organisms is called a monoculture, and, not surprisingly, such populations are much more prone to catastrophic pandemics.
I'm sure you are asked a lot, but just out of curiosity, being a security guru, do you use Windows?
If not, do you use OSX, linux, or xyz?
(I'm guessing Mac, since I see you have a book on it.) Is your choice partly for security reasons, or something else entirely?
If so (Windows that is), then by what means do you protect your computers?
Ballpark, how much time do you spend on such tasks?
Your insight is greatly appreciated.
All the more reason to move everything to open standards.
When Microsoft and Apple and Red Hat can all produce and consume all the same files interchangeably, then they will be competing on other fronts, including security among others. Unbundling of file format from OS is vital.
Smaller, faster, leaner niche systems can then arise and compete.
The biological analogy of species diversity as a guarantor of a healthy ecology I think is appropriate for countering diseases.
(BTW, my bias? I keep my machines Microsoft-free.)
It seems to me that "don't have a soft chewy centre" is a better lesson from this than "don't have a monoculture."
E.g., suppose you decide that infection rates are such that you don't want more than 100 boxen of one OS on a single network, and you require 1,000 boxen for your enterprise. Which is easier?
(1) Find and support 10+ different OSs
(2) Use one OS, but split into 10 sub-nets with firewalls etc. between them.
Guess which scales better.
I don't like the assumptions that Geer makes at the bottom of page 2. He is implicitly assuming that the probability that each desktop is infected is independent, allowing him to estimate the probability that there is no group infection as (1-x)^y. This isn't true; in a monoculture, infections are highly correlated among machines. You cannot assume that the probability that each machine is infected is independent, and so you can't multiply together the probabilities. This is especially true in corporate environments where system administrators install the same versions of the same software with the same patches on every machine and then connect them to the same network.
I don't disagree with his conclusion (homogeneous environments are bad), but his argument is flawed.
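The objection can be made concrete with a small sketch. The values of x and y below are arbitrary illustrative numbers (not Geer's figures); the point is how far apart the two modeling assumptions land.

```python
# Geer's (1 - x)^y formula assumes each machine is infected
# independently; in a monoculture, infections are highly correlated.
# x and y are arbitrary illustrative values.
x = 0.015   # per-machine infection probability
y = 1000    # machines in the enterprise

# Independent machines: probability that NO machine is infected.
p_clean_independent = (1 - x) ** y

# Perfectly correlated machines (one shared vulnerability): either
# the exploit exists and everything falls, or nothing does.
p_clean_correlated = 1 - x

print(f"independent: {p_clean_independent:.2e}")  # vanishingly small
print(f"correlated:  {p_clean_correlated:.3f}")   # 0.985
```

Reality sits somewhere between the two extremes, but a monoculture pushes it toward the correlated case, which is exactly why multiplying per-machine probabilities is misleading.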
I wonder why Bruce would want to promote it. There is plenty of "real" data to base such analysis on, yet this kind of result doesn't show up in real life. Geer seems to be after one thing, monoculture, but it's not Microsoft's doing!
Is the problem he cites one of a monoculture in code base, or the fact that everyone gets the exact same build? It seems like Microsoft could eliminate a lot of the smash-the-stack type worms if it gave everybody a freshly compiled version of Windows, each with a different layout of code in memory. Of course, there would be extra cost associated with this - patching would be more difficult, as the patches would need to be semantic patches instead of just binary blobs, the compiler would have to be rock-solid with very few bugs, and there is time and power associated with recompilation. But it does seem like a way of eliminating buffer-overflow worms without eliminating a market monopoly.
Winn Schwartau published on his blog last summer (http://securityawareness.blogspot.com/2005_09_01_securityawareness_archive.html) his experience of switching from Windows to a Mac. It is very interesting from both a user and a security point of view.
Alan and Says not Pat:
As a software implementer, I am saying:
FACT: No company I have worked for has ever had only 1 or 2 operating systems running.
When I worked at MS I personally ran 3 different OSes at my desk.
To security experts, programmers, etc., I'm sure "monoculture" sounds like a magic bullet, but monoculture just does not apply in the real world.
I work at a rather large company. For 95% of the machines here, we are facing monoculture in two flavours - server and desktop. It does exist. When you drop to companies with under 100 employees but strict licensing agreements, you find that monoculture actually goes up, and the upgrades and patching go down.
It really doesn't matter if a few machines have slightly different configurations when they run more or less the same OS that gets patched in more or less the same way. Those slight differences amount to little more than anomalies, leaving what is a statistically significant monoculture.
I guess there is an exception to every rule.
But in the last 5 environments I worked in (which add up to slightly over 100,000 nodes) there have been at least three desktop OSes (XP, W2k, and Mac) and three server OSes (W2k, W2k3, and Linux) in each environment.
Once again, Marcus Ranum has already commented on this topic.
The Myth of Monoculture
Monoculture Hype Alert!
This is Marcus' quick comment on the following "news" story.
NSF Grants Two Universities $750,000 to Study Computer Monocultures (25 November 2003)
With the help of a $750,000 National Science Foundation grant, Carnegie Mellon University and the University of New Mexico will study computer "monocultures" and the benefits of diverse computing environments. "The researchers intend to create an application that could generate diversity in key aspects of software programs, thus making the same vulnerability less effective as a means of attack against the population as a whole."
$750,000 to sit around and whine about Microsoft? How do I get a gig like that?!
As Bruce has stated previously, he and Marcus don't always agree. I bet it would be fun and educational to buy a round of beers for them the next time they are in the same bar.
"As Bruce has stated previously, he and Marcus don't always agree. I bet it would be fun and educational to buy a round of beers for them the next time they are in the same bar."
Buying drinks for computer-security experts is always a good idea.
He raises a good point regarding Microsoft's purchase of Connectix. Leveraging a VPX wrapper around legacy apps could be a very attractive way of plugging the holes.
But it is hard to take anyone with that kind of muttonchops seriously... ;)
Re: the "variations" of Windows
"Monoculture" has to be defined in context. That context is exploitable vulnerability.
If W95 and W98 have the same exploitable vulnerability, then they comprise a monoculture.
The same goes for all other situations. If they share a vulnerability, then they are susceptible to the same attack. And it's susceptibility to a single attack that even makes this worth studying.
For example, consider Java as a monoculture platform. Assume that all versions of Java have a vulnerability, on all platforms. If an exploit is coded to JDK 1.1, then the exploit can run on 1.1 through 1.5, but not on 1.0.2. If coded to the 1.5 class-file format, then it will be rejected by earlier JVMs before it can possibly run. But the "monoculture" in all cases would be defined by the version of Java and the coding version, not the OS, nor the CPU architecture.
At my work over the last 7 or so years, I haven't wandered far from a KVM-switch setup or virtualization software to run xyz on top of abc. Many of us posting to this site see a breadth of OSes daily in our respective areas. That doesn't mean that the sales, customer service, finance, HR, or administrative/executive departments, etcetera, see anything near the same environment in their respective areas.
The vast majority of systems in an enterprise tend to run a single platform. And no, that isn't all bad, given that standardizing on a platform speeds up the application of updates and upgrades, improves recovery/re-install time, etc. What Geer is arguing is the inevitable result: updates don't get applied fast enough, and even Ghost multicast re-imaging can't always outrun worms.
Which brings me to…
Very, very, very good point. Use subnet firewalls underneath border firewalls, with host-based firewalls on each machine that are configured and locked by a central group (no exceptions allowed by the user). I'd add that going further, using configurations/policies to control user access to hardware like USB storage, along with restricted software-installation permissions, is ideal also. I'd go even further if I could and add control at the filesystem level, over where executables can reside and where user-run programs have access to add/delete/modify files on the system. A mention that AG made about the real world stops me from suggesting more :-) but it could nevertheless be done today with certain platforms and their underlying filesystems (guess which ones I'm referring to :-). In the real world, though, better, more distributed use of firewalls is definitely possible, and should be used more (including prevention of user exceptions at the host level).
The real problem is that it's impossible (in real life) to write secure code in C/C++. You get very fast code with an enormous number of bugs.
I doubt Microsoft will ever come up with anything new. They've simply invested too much in their current codebase.
To fix things, there has to be a new programming language that will force coding style and syntax to avoid common program errors; even better, a means to verify correctness. There also has to be a new, innovative company with a clear vision to deploy that programming language in an actual operating system product. Then there has to be support from other software vendors for that platform.
The problem is all the big players have invested too much in their Windows C/C++ codebases already, so changing the situation won't be easy.
>>The real problem is it's impossible (in real life) to write secure code with C/C++. You get very fast code with enormous amount of bugs.
>>I doubt Microsoft will ever come up with anything new. They've simply invested too much on their current codebase.
Ever hear of C#? It addresses some of the more prevalent issues you were referring to in C/C++. I think your comment is about 10 years out of date.
"Buying drinks for computer-security experts is always a good idea."
How does getting experts drunk improve security? :-)
YP: Windows is more broken than ever today. How exactly did C# help? To me it looks more like a crossbreed between C++ and Java, without much thought on how it would actually improve the quality of code.
@Ari "How exactly did C# help?"
Most compellingly, running inside a VM, and the addition of runtime checks.
Read: Buffer overflow protection for "free".
More interestingly, the forced higher level of abstraction and (anecdotal) higher productivity levels lead to better/more accurate designs and subsequent implementations.
"To me it looks more like a crossbreed between C++ and java without much thought on how it would actually improve quality of code."
Then you obviously haven't looked very closely. The language designers examined the features of (V)C++ and Java (what worked, what didn't), then combined those features into a hybrid that retains many of the benefits of both while keeping few of the weaknesses of either. The similarity to C++ and Java means very rapid cross-skilling for existing programmers in either field. I think that's very well done, in the sense that nearly everyone wins.
On the Operating System monoculture argument, I don't think that diversity for the sake of diversity is such a great idea...
For a company, unity in operating environment means that a central response group can realistically be responsible for the health of that environment. The more variability there is in the environment, the higher the cost of maintaining all parts of the environment at peak health.
Filias put it best in that regard: the security of the network is more than just the individual security of its nodes. In the absence of extra network security, having a multiculture in your enterprise minimises the chance of a single random or targeted virus/worm taking out your entire network, but you are simultaneously maximising the chance that a single random virus/worm will take out at least part of your network.
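That trade-off is easy to put in numbers. Assume k independent platforms, each hit by some worm with probability p over a given period (p = 0.10 below is an arbitrary illustrative figure, not data from the thread):

```python
# Diversity trade-off: more platforms lowers the odds of losing
# EVERYTHING at once, but raises the odds of losing SOMETHING.
# p = 0.10 is an arbitrary illustrative per-platform probability.
p = 0.10
for k in (1, 2, 5, 10):
    p_part_down = 1 - (1 - p) ** k   # at least one platform falls
    p_all_down = p ** k              # every platform falls at once
    print(k, round(p_part_down, 3), p_all_down)
```

With k = 1 (monoculture) the two probabilities coincide; as k grows, partial loss becomes near-certain while total loss becomes negligible, which is exactly the trade-off described above.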
Probably my biggest concern in the idea that "monoculture is bad" is that no argument really seems to address any alternatives. What if we get everyone on to linux? Wait, but that's another monoculture... but it's ok, because there are heaps of "flavours" of linux, and the configuration is often different.
Hang on, hasn't that already been said about Windows? "But windows is insecure out of the box", I hear you say? Sure it is, but so was RedHat linux for a very long time. And that's a question of default security, not monoculture.
Ideally, each person should evaluate their own personal requirements for an operating system (for personal use), and choose accordingly. Thing is, that is effectively what is happening: people want easy to use, and they want what the other guy has, because they know it's going to be easier to get support.
I agree with Ranum that the problem isn't monoculture. The problem (AFAICT) is how secure the choices are, when weighed up with what consumers want/demand.
Check out Coyotos/BitC (http://www.coyotos.org/). It's the next research project building on KeyKOS and EROS. EROS has also forked into CapROS in an effort to get something useful now (http://www.capros.org/)
@Ari and Troy
And what is the CLR (or the Java VM, for that matter) written in, if not C/C++? So how did C# (or Java, for that matter) help eliminate flaws exposed to the hostile outside world?
Regardless of the language of implementation of the VM runtime, all code executing inside the VM is subject to runtime checks, preventing (or at least making much more difficult) buffer overflow and related attacks.
Besides, the sheer volume of applications running on top of the CLR/VM means a high volume of testing in realistic situations, and so a bug in the VM found by one application can be distributed at low cost to every application using a compatible VM.
That last should read "...a bug in the VM found by one application can *have a fix* distributed at low cost..."
The bug has obviously already been distributed to everyone, for free :-)
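A minimal illustration of the runtime-check point, with Python standing in for any managed runtime (the CLR and JVM perform the analogous bounds check on every array access):

```python
# In a managed runtime, an out-of-range write becomes a catchable
# error instead of silently corrupting adjacent memory, as an
# unchecked C buffer overflow would.
buf = [0] * 8  # an 8-element buffer

def checked_write(buffer, index, value):
    """Attempt a write; report whether the runtime allowed it."""
    try:
        buffer[index] = value
        return "ok"
    except IndexError:
        return "rejected by runtime check"

print(checked_write(buf, 3, 0xFF))    # ok
print(checked_write(buf, 100, 0xFF))  # rejected by runtime check
```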
The point I was trying to make is that either an OS or a VM is a fairly complex piece of software, in fact much more complex than most of the applications executed on top of it. Hence, I fail to see how the language used to develop those applications could significantly improve the security landscape.
"So if I will replace word "VM" in your argument with words "Windows XP" it should still hold, right?" - Gajin
A segmentation fault in user-space used to crash the machine, or produce very strange (non-deterministic) behaviour... the same segmentation fault in Windows XP (or others) will usually just terminate the application. If that's not an improvement, I don't know what is...
"Point I was trying to make is that either OS or VM is fairly complex piece of the software"
That's not necessarily true. Pretty much all non-trivial OSes are complex beasts (an OS may be built on a kernel with a simple concept, but the implementation of drivers and interfaces normally loses any simplicity), whereas a VM could be as simple as one with three operations: clear number, increment number, display number. Obviously such a VM would be crazily difficult to compromise... as long as operations are chosen which scale well, the security of the VM can be deterministic.
This isn't quite the case in practice; the number of bugs found and fixed (and unfixable...) in each release of a particular VM (Java, Perl, CLR) is sobering... but the point remains that, despite being vulnerable to any bugs in the JVM, a Java application is almost always going to be more stable/secure (bug-free?) than the same application written from the ground up in C++.
Which brings us neatly back to the "monoculture is bad" debate (in contrast to "windows is bad", which is a much less contested issue).
"...as long as operations are chosen which scale well, the security of the VM can be deterministic."
I am sure it is deterministic ;) I don't see how "deterministic security" equals to something good, though...
"... but a VM could be as simple as one with three operations: clear number, increment number, display number."
Yes, as long as you do not expect it to load and execute any programs or do anything else useful. Otherwise, supporting services have to be available from the VM as well, including, but not limited to: allocating memory for a program to execute, reading the program into that memory, starting the program from a specific address, etc. And, oh by the way, if the number we calculate and display to the user is a randomly generated PIN code, it had better be *random* too, and no, your choice of language is of little help here -- "...one can write Fortran in any language." (Sorry, could not resist ;).
As far as the comparable complexity of VMs and OSes goes, I would highly recommend taking a look at the freely available Microsoft CLR runtime (AKA Rotor) or Novell's (AKA Mono), and then comparing it to, say, the FreeBSD core.
"... a Java application is almost always going to be more stable/secure (bug free?) than the same application written from the ground up in C++."
I fail to see how this follows from anything you said before, but it could just be me being dense, sorry...
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.