Schneier on Security
A blog covering security and security technology.
December 1, 2010
In 2003, a group of security experts -- myself included -- published a paper saying that 1) software monocultures are dangerous and 2) Microsoft, being the largest creator of monocultures out there, is the most dangerous. Marcus Ranum responded with an essay that basically said we were full of it. Now, eight years later, Marcus and I thought it would be interesting to revisit the debate.
The basic problem with a monoculture is that it's all vulnerable to the same attack. The Irish Potato Famine of 1845–9 is perhaps the most famous monoculture-related disaster. The Irish planted only one variety of potato, and the genetically identical potatoes succumbed to a rot caused by Phytophthora infestans. Compare that with the diversity of potatoes traditionally grown in South America, each one adapted to the particular soil and climate of its home, and you can see the security value in heterogeneity.
Similar risks exist in networked computer systems. If everyone is using the same operating system or the same applications software or the same networking protocol, and a security vulnerability is discovered in that OS or software or protocol, a single exploit can affect everyone. This is the problem of large-scale Internet worms: many have affected millions of computers on the Internet.
If our networking environment weren't homogeneous, a single worm couldn't do so much damage. We'd be more like South America's potato crop than Ireland's. Conclusion: monoculture is bad; embrace diversity or die along with everyone else.
This analysis makes sense as far as it goes, but suffers from three basic flaws. The first is the assumption that our IT monoculture is as simple as the potato's. When the particularly virulent Storm worm hit, it affected only 1–10 million of its billion-plus possible victims. Why? Because some computers were running updated antivirus software, or were within locked-down networks, or whatever. Two computers might be running the same OS or applications software, but they'll be inside different networks with different firewalls and IDSs and router policies, they'll have different antivirus programs and different patch levels and different configurations, and they'll be in different parts of the Internet connected to different servers running different services. As Marcus pointed out back in 2003, they'll be a little bit different themselves. That's one of the reasons large-scale Internet worms don't infect everyone -- as well as the network's ability to quickly develop and deploy patches, new antivirus signatures, new IPS signatures, and so on.
The second flaw in the monoculture analysis is that it downplays the cost of diversity. Sure, it would be great if a corporate IT department ran half Windows and half Linux, or half Apache and half Microsoft IIS, but doing so would require more expertise and cost more money. It wouldn't cost twice the expertise and money -- there is some overlap -- but there are significant economies of scale that result from everyone using the same software and configuration. A single operating system locked down by experts is far more secure than two operating systems configured by sysadmins who aren't so expert. Sometimes, as Mark Twain said: "Put all your eggs in one basket, and then guard that basket!"
The third flaw is that you can only get a limited amount of diversity by using two operating systems, or routers from three vendors. South American potato diversity comes from hundreds of different varieties. Genetic diversity comes from millions of different genomes. In monoculture terms, two is little better than one. Even worse, since a network's security is primarily the minimum of the security of its components, a diverse network is less secure because it is vulnerable to attacks against any of its heterogeneous components.
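The trade-off in these two flaws can be made concrete with a toy probability model (my own illustration, not from the essay): assume each platform independently has some chance of carrying a critical flaw, and that an attacker who can exploit any one platform gets into the network.

```python
def p_network_compromised(component_vulns):
    """Chance that at least one component type carries a critical flaw,
    assuming independent flaws and that any vulnerable component
    gives an attacker a way into the network."""
    p_all_safe = 1.0
    for p in component_vulns:
        p_all_safe *= (1.0 - p)
    return 1.0 - p_all_safe

# Monoculture: one platform with a 10% chance of a critical flaw.
mono = p_network_compromised([0.10])           # 0.10

# "Diverse" network: two platforms, each with a 10% chance of a flaw;
# the attacker only needs a hole in either one.
diverse = p_network_compromised([0.10, 0.10])  # 0.19 -- worse, not better

# The flip side (the potato point): the chance that *everything*
# falls at once is higher in the monoculture.
mono_total_loss = 0.10
diverse_total_loss = 0.10 * 0.10               # 0.01
```

Under these admittedly crude assumptions, diversity raises the odds of some breach while lowering the odds of losing everything at once -- which is exactly the trade-off between heterogeneity and Twain's guarded basket.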
Some monoculture is necessary in computer networks. As long as we have to talk to each other, we're all going to have to use TCP/IP, HTML, PDF, and all sorts of other standards and protocols that guarantee interoperability. Yes, there will be different implementations of the same protocol -- and this is a good thing -- but that won't protect you completely. You can't be too different from everyone else on the Internet, because if you were, you couldn't be on the Internet.
Species basically have two options for propagating their genes: the lobster strategy and the avian strategy. Lobsters lay 5,000 to 40,000 eggs at a time, and essentially ignore them. Only a minuscule percentage of the hatchlings live to be four weeks old, but that's sufficient to ensure gene propagation; from every 50,000 eggs, an average of two lobsters is expected to survive to legal size. Conversely, birds produce only a few eggs at a time, then spend a lot of effort ensuring that most of the hatchlings survive. In ecology, this is known as r/K selection theory. In either case, each of those offspring varies slightly genetically, so if a new threat arises, some of them will be more likely to survive. But even so, extinctions happen regularly on our planet; neither strategy is foolproof.
Our IT infrastructure is a lot more like a bird than a lobster. Yes, monoculture is dangerous and diversity is important. But investing time and effort in ensuring our current infrastructure's survival is even more important.
This essay was originally published in Information Security, and is the first half of a point/counterpoint with Marcus Ranum. You can read his response there as well.
EDITED TO ADD (12/13): Commentary.
Posted on December 1, 2010 at 5:55 AM
"Marcus Ranum responded with an essay that basically said we were full of it. Now, eight years later, Marcus and I thought it would be interesting to revisit the debate The basic problem with"
Such love between the two of you ;)
Personally, I'm all in favour of diversity, as nature tends to be.
It is not of necessity any more secure, but it encourages wider thought and brings light to bear on problems from different directions, highlighting different facets of the problem.
@bruce: "you can only get a limited amount of diversity by using two operating systems, or routers from three vendors"
This is a key point. In fact the whole biological metaphor on computer systems is overdrawn.
"Even worse, since a network's security is primarily the minimum of the security of its components, a diverse network is less secure"
If I may mix mechanical metaphor with your biological ones, it's like early aeroplanes, where two-engine planes were more dangerous than one-engine: you needed both engines to stay in the air, but each engine in the set was as likely to fail as in the one-engine models.
Like brak'lul: good in theory, but it also means there's twice as much that can go wrong.
Doubling, at a minimum:
- configurations to maintain
- security tests to perform
- technologies to patch
- skill sets to hire for (and getting individuals competent in both drives prices up)
- incompatibilities between security models
The test of your monoculture critique is happening right now, isn't it?
Google is just Google, and cloud service providers are not likely to be implementing control SC-29, Heterogeneity.
Or perhaps the key to the risk of homogeneous environments is massive redundancy.
Nature does that too. Reasoning from your analogy: in areas where diversity is very thin, like the polar regions, the numbers of individuals within a species population are quite large.
Oh, monocropping wasn't the only factor in the famine. It happened at a time when Ireland was a net exporter of food.
It wasn't that they grew only one crop.
It was the only crop the sharecroppers had for personal food.
They grew many crops and livestock that were all bundled up by the Ascendency and English landlords and sold overseas. Had the English occupation been anything but indifferent to Irish suffering they could have provided them with additional foodstuffs until new potato crops were harvested.
The politics and economics of colony and Empire are what caused the famine.
Monoculture is an age-old biological problem. A large monoculture tends to give you uniform, high-quality results, with low costs and little variability. This is usually considered good. It's all very well except when it fails catastrophically from rare -- sometimes hard-to-predict -- events.
If you have diversity you tend to have higher costs, and variable quality/yield. However the system is more robust and does tend to respond better to rare shocks.
There is always a tendency toward monoculture, as it tends to deliver better yield and more short-term profit. It's just prone to going wrong in surprising ways...
Software monoculture is just the same as biological monoculture, in the end it's all a numbers game, and we all know how bad we are at dealing with rare events...
Moreover, in genetics we have errors (aka randomness) during the copy process...maybe we _need_ errors in computer science also, to serve a better level of security...
On the second flaw you name: there may be economies of scale in the monoculture system, with costs concentrated and experts focused on a single standard, but that still confirms the real underlying problem of the negative externalities associated with the monoculture -- specifically, vendor lock-in.
One example to play off of is the Green Revolution in India. Turning to expertly controlled GMO seeding for high yields of monocultural crops (acres and acres of corn, cotton, or soya) was marketed to the Indian people as economies of scale: increased food security at lower operational costs. But it has had the exact opposite effect on the nation. Farmers are now being driven into debt by Monsanto. Why? Not only did vendor lock-in take place, but what we could call "security flaws" in a crop like Bt cotton failed nearly all farmers, leaving many with no crop and contributing to a mass epidemic of farmer suicides. The prices become more outrageous as the farmers lack the financial backing to afford the upgrades. This extreme obviously won't happen on the software side, but the same model of clustering the best food-security minds in one vendor resulted in a scenario where flaws spread throughout the entire user base (the farmers).
An argument against what I've said, though, is "well, that's a problem of proprietary and strict license controls; what if the monoculture was Linux or Apache?" This is true; I made it sound like a Monsanto can only be a big bad Microsoft, but the realization is that the monoculture model thrives on the capital needed to fund the security experts. Therefore it's more likely that a Microsoft will have an edge in economies of scale, but only until there is a better model for funding Linux security (universities, grants, big distro names, etc.).
@Bruce: "A single operating system locked down by experts is far more secure than two operating systems configured by sysadmins who aren't so expert."
Why does the use of two operating systems warrant only the use of sysadmins and the use of one operating system get the advantage of an expert? Don't you think that this argument is a bit unfair in the fact that you don't think it is possible for a single expert to be well versed in two different operating systems?
A long, long time ago I heard about a "natural" experiment. Some European village had a religious conflict that split the population into roughly two equal groups. One group emigrated to found a new village in a tropical colony (think Mennonites in South America).
Initially, they prospered. Then all of a sudden they were struck with an infectious disease. I think it was typhus, or something like that. Half the population died.
Investigators went to the colony and took blood samples. These were compared with those of the population in the original village. It showed a few blood types common in the original village were completely absent from the colony post-epidemic.
The conclusions:
1. It was blood type that determined the lethality of the disease.
2. A single prevalent blood type would have meant an all-or-none survival rate.
This is the type of variation you were talking about -- not one vs. two OSes.
Back to software. First, Microsoft agrees with the "monoculture is bad" argument. They have said repeatedly that Linux/OSX are less malware-prone because they are rare -- which is a monoculture-based argument.
Second, the variation is NOT only at the vendor level, but at the code level: what binaries are actually available. If every computer in the world uses the same binary (or source) for its SSL connections, that is a monoculture. And that binary is a target for exploits.
Pitting MS Windows and OSX versus Linux in a "diversity" match is apples versus oranges.
There are only a handful of binaries for Windows and OSX (or MS Office and Safari). However, the number of different Linux kernel and library binaries is unknown, but large. The code bases are also diverse for many of the applications on top. Computer configurations are diverse too.
We know from past security warnings that even deep bugs in Linux that went back for a long time were not exposed in all distributions or all configurations.
The correct level of analysis would be the distribution, i.e., Windows XP/Vista/7 SPx versus OSX 10.5.x/10.6.x, versus RedHat/Fedora X, Debian/Ubuntu X, SuSE X, ... all using different kernel, library, and application versions.
In these distributions, code bases must be compared. That is, diff a RedHat vs Ubuntu distribution, and diff Windows Vista vs 7 or sp1 vs sp2 (the proprietary ones in binary, of course).
As long as the people who deploy such solutions know what they're doing, diversity for its own sake is thoroughly overplayed. The avian approach works best because it's the most logical in this scenario.
In the case of something like a targeted DoS, there's not much one could do via diversity with respect to hardware, and deploying multiple protocols is barely feasible given the cost of writing applications for each protocol and maintaining them.
"Don't you think that this argument is a bit unfair in the fact that you don't think it is possible for a single expert to be well versed in two different operating systems?"
I know that they can be. I know people who are experts on multiple platforms.
But those few people are outnumbered by the hordes who are not even experts on one system. Yet they are holding jobs where they control the configuration / patching / etc of those systems.
From Bruce's example of the Storm worm. Those machines that were infected ... I'm willing to bet that the majority were NOT admin'ed by people also tasked with supporting Apache. They were admin'ed by admins who simply did not perform basic admin functions.
And that won't change no matter which platform you deploy.
I think that the analogy still makes some sense if you apply it to the _same scale_.
The biological case -- "Ireland farmed one species of potato; said potatoes were irrevocably compromised; Ireland starved" -- cannot be compared to a single corporate IT infrastructure.
Imagine the (admittedly non-realistic) case "Ireland relies solely on the Microsoft ecosystem; said ecosystem gets massively infected by a critically damaging virus; Ireland's economy grinds to a screeching halt." That is the proper analogy. If half of the companies in the country had their IT infrastructure based on another system, then half of the country would still run, just as half of the potatoes would be palatable.
At the other end of the scale, a single corporate IT infrastructure with heterogeneous ecosystem would be more like a single farmer cropping multiple species of potatoes. It can indeed get costly as at this scale managing the peculiarities of each species (soil, maturity...) could become complicated. Yet it can still be quite worth it.
Real life example: when McAfee AV false-positived and quarantined some kernel32.sys or whatever, turning our Windows machines into useless boxes, we could still save the day thanks to our few Linux and Mac machines.
@winter "every computer in the world uses the same binary (or source) for it's SSL connections, that is a mono-culture"
Excellent point. There are technology surfaces that cut across OS heterogeneity, reducing its safe, diverse complexity.
Think OpenSSL, or open software in general. This is not a religious "open software is insecure" argument but one based on cost: when managers and techs say 'open', they think 'free'.
First off I have no respect for Ranum. He is an idiot and the only person I know who has argued that patching shouldn't be done and we should all move to software that doesn't suck (namely OpenBSD and Apache). His justification was that we all patch and patch and patch and still get infected.
Second, any large corporation or government agency running a large set of Linux desktops-- even Linux with a VM containing Windows for Outlook-- enters the microculture. Their machines are hard to get into anyway; now you have Linux boxes and you have to get in "somehow" and then your target is what, other Linux-running corporations? It shifts you off the radar. Sure, you're a juicy target; but you're a small, hard-to-hit target and aiming at those will neuter the worm's general effectiveness.
I would love to see a sweeping setup at a lot of companies to get everyone who just needs to access web applications, Outlook, and office apps onto Linux with OpenOffice or LibreOffice or whatever it's called now. I don't believe the central management is good enough on Ubuntu, though; I'm not sure about RHEL, but I think it has better central patch control and A/D integration.
It's a nice theory and all, but the reality is that Linux/MacOSX/FreeBSD/OpenBSD, with Ubuntu/MacOSX/Ubuntu-like FreeBSD/Ubuntu-like OpenBSD systems on top of those kernels, all running a variety of different desktop services (DBus vs. competitors, different init systems, etc.), is going to be a malware developer's nightmare. It's already a regular developer's nightmare. More than that, though, with the giant Windows monoculture outside, any corporation or government organization that holes up under the $2 umbrella of alternate OSes is going to find out real quick that it actually protects you from a large amount of the rain.
@BF Skinner: "There are technology surfaces that cut across OS heterogeneity, reducing its safe, diverse complexity."
Nope -- bad point, which is also the problem with Schneiers point here. He's trying to sound "reasonable" -- but it's all nonsense.
Using the same protocol isn't a monoculture -- that's a communication protocol. All plants use DNA -- that's not a "monoculture"!
I don't see how one can even begin to support the idea that "two isn't much better than one" -- it's vastly better, incredibly different.
I don't see how you can complain about administrative costs, as if each company should support 100 varieties -- that's nonsense. One farmer grows one or two varieties of potatoes -- it's the ecosystem that's not monocultural, not a particular plot of land.
The whole thing is evidence and theory free -- it's a short ramble by Schneier without really discussing the issues.
I don't expect anything better from a short essay in an online mag -- it's not a serious article, so to speak -- but we shouldn't take it as anything more than Schneier trying to make a few bucks for working a couple of hours with the first thing that comes off the top of his head.
I see a twist in the rarity-of-Linux/Mac-vs.-Windows-infection portion of the monoculture arguments. They say Windows is attacked more because there are more copies of it -- typically true, I'd agree... you get more "bang for your buck" by infecting a larger base. I would argue, however, that Windows is also the most poorly maintained, even though it's the favored target. Out of the box it is more insecure than other OS's, which only adds to its appeal for attack... easiest to attack + largest install base is a win-win; it's the path of least resistance.
Flaws abound in all OS's, and most vendors fix issues found in them quickly these days. The trouble is that people don't like change and stick with aged OS's and patch levels for various reasons. The incentive to improve an OS's security is there for corporations, but the incentives to patch, firewall, etc. are not there for ma & pa. Hence XP SP2 (and later OS's and SP's) turning the firewall on by default: a measure that was non-intrusive to most, added a lot of security against a single vector (the intertubes), but only shifted the exploit targets slightly.
M$'s latest OS's have learned from many mistakes, and they are more secure out of the box; however, the less secure and more attractive targets are still on the net, because the incentive to upgrade still isn't there. People know Win7 is better for security than XP, just as smokers know there are health risks associated with their habit. They don't "feel" that they are going to be affected by them. When they are, hindsight is 20/20.
I also missed what the point and counterpoint are... both seem to be saying the same thing. I do have to (begrudgingly) agree with Mr. Ranum that the analogies don't work for computers, as there are so many different ways to configure a network/computer. A single potato strain is not XP and another Linux; computers have too many configuration variations and software variances to even be considered one "type of plant".
And again, Marcus said it: when you're the leader in market share, innovation falls by the wayside until some upstart/rival begins to upset that balance. Look at FF vs. IE vs. Chrome vs. Safari... IE did nothing until FF came along with market-share gains. Then Chrome came along and forced FF to do even more. Good competition is needed, but I think monoculture is the wrong analogy overall. Stagnation is the true enemy.
I am surprised that no one has mentioned Stuxnet, which at its heart targets a monoculture of Siemens SCADA systems.
Bruce, seriously you need to put in some time in IT working as a sysadmin.
In a mixed Unix-Windows environment, one side is going to get done well and the other isn't, and it's going to be the side not done well that gets compromised.
The costs are probably at least triple, not less than double due to economies of scale.
A big problem is that I've never met a single person who could keep up with both technologies and really do a competent job (I have a difficult enough task finding people who do a competent job at one or the other).
It is vastly better to have as much uniformity as possible, to a fault, in order to leverage expertise with a single platform and make it secure.
One of the reasons why I like KVM as a virtualization technology is that it runs on top of vanilla RedHat and I know how to lock down RHEL extremely tight. From the vendor perspective both VMware and Citrix feel that they will produce a more secure product by shedding the RHEL-like layer under them and eliminating most of the upstream distro security holes -- but I already have to deal with the upstream distro security holes anyway, so that's a sunk cost to me -- and I'd prefer to have something that looks identical, not something that looks different that I need to spend more time learning how to secure.
Seriously, I've been doing this stuff for decades at the level of 30,000 servers (previous job) and 2,000 servers (current job), doing all the OS-level security, SOX, PCI-DSS, etc., and I couldn't disagree with you more about monoculture. Currently I've been involved in a project to spin off about 1,000 servers into a new company, and the opportunity to build on a consistent RHEL5 platform makes it substantially more secure than the environment the software is coming out of.
@kangaroo: "Using the same protocol isn't a monoculture -- that's a communication protocol. All plants use DNA -- that's not a "monoculture"!"
There are two assumptions you're making that poke holes in your argument.
The first assumption is that the multiple implementations of the protocol were developed independently, and thus should have different security problems. Given code sharing in open source systems and the history of even proprietary OSes 'borrowing' code (the original Windows network stack was built from the BSD code; the original IE was a re-branded Spyglass Mosaic, which derived from NCSA Mosaic), the idea that the implementations are different isn't a safe one.
The second assumption is that the security flaw isn't built into the protocol such that all conforming implementations will have it.
In addition to the comments about the 'cost to the bad guys' of targeting a heterogeneous environment, I think another important point to make here is that 'stuff happens'.
One thing I've learned about security is that issues/problems/incidents inevitably occur. It is your organization's (or Ireland's) ability to flexibly and rapidly respond to an incident when it occurs that defines long-term success. In the case of a monoculture, you can reduce operational costs by having a single platform to manage, but you reduce your operational flexibility to adapt to an incident when it occurs, and you increase the likelihood of a catastrophic, broad event.
I think what is more important in this case is not that a given organization has a monoculture (although that is bad for the organization in terms of incident response) but that as an industry we don't. Having a diversity of operating systems, technologies, etc. gives us long-term flexibility of response to emerging security issues and also proliferates successful strategies (genes).
Does anybody remember that efficiency and robustness are negatively correlated?
If you have made your company ultra-efficient, then it will collapse at the smallest "issue".
This is a hilarious post. Much of the discussion is about how a particular metaphor fits a real-world system (e.g., "the Internet is like X"). What's even worse is to go from metaphor to practical application: "since the Internet is like X, things that work for X should apply to the Internet."
I say leave the metaphorical analysis to philosophers and literary critics. At some point, we have to say that the Internet is the Internet and we really don't need a metaphor to tell us how to operate it.
Both monoculture and diversity have their pros and cons. Diversity, at its best, can be a key element in driving creativity, innovation, choice and competition. Monoculture, at its worst, can lead to lock-ins, complacency, monopolism, single source dependency and the likes. Pretty much like in real life, the option to go with has to be examined on a case by case basis with a thorough risk analysis in function of available budget and goals to be achieved.
@Lamont - You've got it backwards... you are agreeing with Bruce.
A little point of history about the Irish potato famine.
Nobody ever seems to ask "What did all these tenant farmers used to eat before potatoes?" and "Why did they switch over to potatoes?"
The two answers are wheat, and they were forced to by "efficiency".
Put simply, the amount of land required to grow wheat to feed a family of tenant farmers was about four times that needed for potatoes, so the landowners reduced the land available to the tenant farmers accordingly.
What is not generally talked about is that even prior to the blight and famine, the tenants were dying of starvation, because the land they used had no crop rotation and was thus "worked out" after a few seasons, and the potato yield per area of land was dropping quickly anyway.
So ultimately the deaths were not due to just the monoculture and a blight. The blight got going because the crop was no longer resilient, the soil being nutritionally poor; likewise the tenant farmers were in a nutritionally poor state and lacked any resilience to disease themselves, and thus succumbed remarkably quickly.
If it had not been the blight causing a famine, the tenant farmers would probably have been wiped out in a couple of years when a new strain of influenza etc. came round.
The real cause (and some of them were my ancestors) was the Protestant landowners (many being Scots, not English) pushing, by way of what we would call an "efficiency drive" these days, the mainly Catholic population into a position where they had no resilience, irrespective of any hybrid vigour there might have been.
A study of British history shows that the English were fairly good at instigating "ethnic cleansing" by driving both Scots and Irish off their lands, not directly but through "appointed agents" such as the chieftains in Scotland. The likes of "Clive of India" and "Cecil Rhodes" carried this policy out for personal gain in India and Africa respectively. It is what Rudyard Kipling was referring to when urging the Americans to "take up the white man's burden" in his now infamous poem ( http://en.wikipedia.org/wiki/... )
In many ways this ideology continues to this day. For instance,
"I say leave the metaphorical analysis to philosophers and literary critics. At some point, we have to say that the Internet is the Internet and we really don't need a metaphor to tell us how to operate it."
I agree. Particularly in this instance where there are multiple aspects (theoretical vs practical). And where the analogy doesn't provide any additional insight into the processes.
Let's not forget bugs vs. malice.
Bugs tend to affect similar systems (the Morris worm), but have differing, or no, effect on some systems of that type (e.g. Blaster).
Malice is much more cunning, and can be intentionally devised to attack all forms, through either generic or specific attack vectors.
As the ultimate root gene is the various RFCs and protocols, this is the most effective attack point (ASN.1 vulnerabilities). Yet though a single attack vector (an SNMP overflow) can result in a breach of security, leveraging that breach to gain control, or even to execute any code, usually requires a very specific payload (e.g. Blaster). The Morris worm was probably the more effective bypass of this problem, as the systems it infected all had a shared environment (/bin/sh); it could also compile code, as the shared environment had a shared, standard C compiler.
In short neither system in itself is sufficient, as each type of system has significant inherent vulnerabilities, and not fully effective counter controls.
Interestingly, no 'pure' versions of either exist in commonality, as even a 'MS shop' has 10 different versions of MS builds, and likewise of any Solaris/Linux/*BSD shop.
It's probably a bit like cockroaches -- they are amazing creatures, with an incredible capacity for resilience. But they are not the ultimate creature, and have many other downsides apart from 'security'. They will never dominate lions, nor will lions ever dominate them.
The existence of a monoculture depends on the defined level of abstraction. I use several operating systems on one model family of processors. In one sense, that is a small amount of diversity. In another, it's a monoculture.
Also, as others have alluded, a few different products does not constitute diversity in the same sense as biological diversity. If I married my twin sister, you could still quantify more biodiversity than the diversity present when using two different operating systems within the same installation.
Every time I hear the term "monoculture," I think of reaching for fluconazole.
Great article! I actually wrote a post a couple of months ago on the same question -- whether there's more to monoculture vs. diversity than we usually think -- and I ended up making the same three points. Nice to hear other people thinking the same way.
In theory, if everyone ran their own operating system and changed passwords often, no hacker get very far. That much is true.
But there is also practice to consider. I think most can agree that Windows, for example, is insecure in many more ways than Linux -- that is, it stays unpatched for longer. Hell, we've often seen years pass before Microsoft even acknowledges a security problem. In that case, a Linux-only network may stay more secure in the long run, even though the few problems that do hit it have wider consequences.
The real problem is laziness. You make the point that it costs more to support two different OSes. The fact is, it shouldn't cost any more. The only reason it does is that we have a culture of people who over-specialize.
Cross-training is the real answer to that issue. My background is nuclear, electrical, electronics, mechanical, and equipment engineering, along with computer science and information systems. I admin both Windows and Linux systems. I program in ASM, C, C++, C#, Java, JS, HTML, PHP, Perl, Pascal, Fortran, Python, or about anything else I've happened to need at some time. Not to mention PLCs like Allen-Bradley, Gould, Modicon, Siemens...
The problem is most people are just too lazy to learn more than one thing.
That said, most people I know who admin Linux systems also know how to admin Windows properly. The reverse, however, can't be said.
There we go. You hit it before I did: implementation and binary. Even if only the binaries were varied, we'd have a much lower infection rate. Code generation, different compilers, different implementation languages, instruction-stream encryption, etc. all produce different binaries from the same low-level software design. If you add in source derivatives, which are prevalent in FOSS, this creates more diversity. Guess what, guys? Much of this diversity is essentially free. It's transparent and seamless. The best of these have this property: on the surface, everything looks the same; beneath, things vary. Will someone explain why that costs so much more for IT admins, especially if they operate their networks with tools that run on multiple OSes with the same interfaces?
Excellent point. Unless the flaw is in the protocol itself, numerous implementations limit the damage. As with any protocol or standard, encouraging the development and use of a diverse array of implementations reduces risk. It also pays to have two or three that can be plugged in, with a software configuration line to say which to use. So, if we heard about a vulnerability that affects, say, OpenSSL but not Cryptlib, we could just swap out the component until the problem is fixed. Software developers who make these things modular allow for diverse implementations. I'm more for using systems that bake security and correctness in, though. Recent examples for web applications include the OCaml-based Mirage and the Java-based SIF.
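The swap-by-configuration idea above can be sketched in a few lines. This is a minimal illustration only, not real OpenSSL or Cryptlib bindings: the backend names and classes are hypothetical stand-ins built on Python's standard hashlib.

```python
import hashlib

class BackendA:
    """Hypothetical stand-in for one crypto implementation (e.g. OpenSSL)."""
    name = "backend-a"
    def digest(self, data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

class BackendB:
    """Hypothetical stand-in for an independent second implementation."""
    name = "backend-b"
    def digest(self, data: bytes) -> bytes:
        # A different code path, honoring the same algorithm contract.
        h = hashlib.new("sha256")
        h.update(data)
        return h.digest()

BACKENDS = {b.name: b for b in (BackendA(), BackendB())}

def get_backend(config_line: str):
    """A single configuration line selects which implementation runs."""
    return BACKENDS[config_line.strip()]

# If backend-a is hit by a vulnerability, change one config line:
active = get_backend("backend-b")
assert active.digest(b"hello") == hashlib.sha256(b"hello").digest()
```

Because both backends honor the same interface and algorithm contract, callers never change when the configuration line does.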
Although Bruce and others say a diversity of 2 isn't significant, I call BS on that one. I've seen plenty of systems survive 0-days because they were one of two popular platforms and the bad guys didn't have an attack on both. Just switch over and bam. I often recommend combos of software, supported by admin scripts etc. that ease the management burden. Good examples are Linux/NetBSD, Apache/Appweb, PHP/Quercus, Sun Java/Apache Harmony (not now, though), OpenSSL/cryptlib, etc. They are installed, maintained, kept patched, etc., automated as much as possible. To reduce cost, mainly do this on the highest-risk or highest-value systems. Also try to use POWER processors for web-facing or critical servers, because most automated worm attacks target x86. x86 also had some architectural attacks in past years that our systems were immune to, thanks to lower complexity. Diversity is very important. Even if it's just triple diversity, that increases the burden on the attackers. My main recommendation is to keep similar and consistent interfaces, but make things different underneath however possible. Each thing should also be made with security in mind: a whole bunch of insecure implementations is worse than one good one.
Good point. Laziness may be a factor. As I think of it, I wonder if that's the real problem though. You mention all these skills you have. Look at the average H.R. posting or resume for skilled Windows developers and you will see that they've learned a ton of technologies and skills. They are just focused vertically on popular brands with many integrated offerings, rather than horizontally across a bunch of lesser used brands. In our area, most companies are Microsoft shops. The more Microsoft technologies a candidate knows, the better their job security is. I think the people you referred to are just responding to market demand.
This brings me to the next point: are the markets demanding a monoculture? I would think the answer is yes. They want the efficiency and ROI that monoculture brings. Security is already a hard sell on the bottom line, so there's not much incentive for a resilient, heterogeneous environment on paper as they see it. Properly deployed, a redundancy of two or three doesn't add much to the cost. They can also gradually adopt extra diversity, spreading out training costs. Most companies don't think about this, though. So, they specialize in one platform, OS, brand, etc. and build everything beside and on top of it. Then, they demand workers that specialize in this. And the resulting monoculture is textbook economics.
"Even worse, since a network's security is primarily the minimum of the security of its components, a diverse network is less secure"
how is it worse?
min(Microsoft, Linux, ......) = Microsoft, no?
couldn't pass that one unsnarked.
@cm It's talking about networks where if any member of the network is compromised it lowers the effective security of other elements of the network that weren't themselves compromised.
I am reminded of something I learned early on. Breaking into a private network only requires a single entry point. If a private network has two gateways G1 and G2, each connecting the private and public networks, and they aren't identical, you are vulnerable to any breach of either G1 or G2, thus increasing your vulnerability. This is because G1 and G2 are in parallel, and both must survive all attacks for the private network to remain secure. This scenario is an argument for limited monocultures.
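The two-gateway point can be made concrete with a little probability arithmetic. The breach probabilities below are assumed purely for illustration.

```python
# Assumed per-period breach probabilities for each gateway.
p_g1 = 0.10
p_g2 = 0.10

# One gateway: the network falls only if that gateway falls.
p_one_gateway = p_g1

# Two different gateways in parallel: a breach of EITHER opens the
# network, so exposure rises -- both must survive every attack.
p_two_parallel = 1 - (1 - p_g1) * (1 - p_g2)

print(p_one_gateway)             # 0.1
print(round(p_two_parallel, 2))  # 0.19
```

With two independent, non-identical gateways, the chance of a breach nearly doubles, which is exactly the argument for limiting the number of distinct entry points.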
Getting back to today's topic. For the Irish, the monoculture was bad because nearly all potatoes were vulnerable to the same exploit. The Irish didn't care about individual potatoes, they cared about potatoes in bulk and this was a disaster for them.
Internet users mostly don't care about computers in bulk, they care about their PC and they care about the remote service they connect to. Home and office desktop PCs are close to a monoculture, servers aren't so much of a monoculture. The networking switches are also a lot more diverse than home & office PCs.
For any individual user their PC being destroyed is going to be something between an annoyance and a major financial disaster. Each individual person for whom it would be a big issue should have an incentive to migrate from the lowest common denominator operating system to the most secure available ... unfortunately if everyone did that we would have another near monoculture ... at least of the PCs worth "pwning".
Another option is to avoid popular OSes, but I can't see how that would work in bulk. It's really hard to see a workable solution for any but small minority groups. For myself I use a non-mainstream OS (Kubuntu) that is good-enough & have often said that if it ever becomes really popular I'd switch to Freebsd ... again, not an option for everyone.
@ Nick P,
"This brings me to the next point: are the markets demanding a monoculture? I would think the answer is yes. They want the efficiency and ROI that monoculture brings."
The key word here is "efficiency", not "monoculture", which was the point I was trying to make above about the Irish Potato Famine.
Because certain business gurus believe that "efficiency" is the way to go ("lean & mean", "Business Process Re-engineering", etc.), people believe "It is Good".
Put simply, it's not. If you think about what happens when you over-specialise as an organism (such as the much-touted sabre-toothed tiger), you might be much more efficient, but you become prey to your environment more and more, so that even the slightest change has a considerably greater chance of killing you off.
That is, beyond a certain point "efficiency" makes you less resilient, and you do not have the reserves to enable you to adapt and change to stay in tune with the environment. And, as with an orchestra, a single out-of-tune instrument brings down the rest.
As you and one or two others know from previous exchanges, I have a bit of a mantra about Efficiency -v- Security: in the "General Case", the more efficiency you have, the less security you have.
The classic example of this is "side channel" attacks. The more efficient a general-purpose system becomes, the more information it leaks.
And as has been seen recently, a combination of the three types of cache attack robs you of any hope of security, as what leaks the information is not the specific software implementation (due to monoculture) but the inherent "protocol" failings (due to efficiency).
And again, I have talked about "protocol" issues being more devastating and longer-lasting than other attack vectors. The prime example of this was Secure Sockets.
The point to realise from both of these is that it is the "protocol" that allows a cache attack to work: at almost the lowest level (hardware) because of the inherent "efficiency" of designs (loop unrolling / lookup tables in AES), and at the highest level (semantic protocol errors), again usually from trying to get "efficiency" into the design. In both cases the "efficiency" issue is independent of both the OS and the software implementation.
If you look at nature, usually the most enduring species are those that are not efficient. To be fleet of foot and faster than your prey so you can catch it, you need to be light in weight. This means you generally don't have much in the way of energy stores, short-term or long-term, or actual physical strength. As in the case of the cheetah, you might be able to outpace prey, but it needs to be something within your strength capability to bring down; other stronger but slower hunters can take any kill away from you, and as you have limited energy stores and a high metabolic rate for speed, you starve fairly quickly.
Now, before people try to leap down my throat over "Efficiency -v- Security": you can have both in a single design, BUT you have to have it for one specific task only, not for the "General Case". Nature has specific examples of high efficiency (photosynthesis), but, as in grasses, it is a small (but essential) part of an overall design which is generally inefficient.
And this is the key part of "efficient design" which most people miss: you only optimise certain small but essential parts of the design; the rest you design for resilience, not efficiency.
You see this in other areas of engineering, where the "nuts and bolts" are "efficient designs" but the overall structure is designed for resilience.
Which brings me back to "monocultures": there is nothing really wrong with them if they are also designed for resilience. And this is what we see with high-assurance systems. The individual parts are efficient but not resilient, yet when combined in the right way they give resilience.
High assurance is something borrowed from earlier engineering, from designing electronic systems (MTBF/MTTR trade-offs via the probability of failure in parallel/duplicate systems). You design the system with high efficiency but effectively low reliability in the component parts; you then employ gross inefficiency at the next layer up by having fail-over etc. Providing you get the balance right, your system keeps running irrespective of which individual components fail at any one time (MTBF), provided they can quickly be replaced (MTTR) before another component fails.
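A rough numerical sketch of that MTBF/MTTR balance (all figures assumed for illustration): individually unreliable parts, combined in a "wasteful" fail-over pair, yield high overall availability.

```python
# Assumed reliability figures for one component.
mtbf = 1000.0   # mean time between failures, in hours
mttr = 10.0     # mean time to repair, in hours

# Steady-state availability of a single component.
a_single = mtbf / (mtbf + mttr)

# A redundant fail-over pair is "inefficient" (half the hardware idles)
# but is only down when both components are down at once.
a_pair = 1 - (1 - a_single) ** 2

print(round(a_single, 4))  # 0.9901
print(round(a_pair, 6))    # 0.999902
```

The pair costs twice the hardware for the same work, which is the "gross inefficiency at the next layer up", yet downtime drops by roughly two orders of magnitude.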
To get this to work you require a reliable "framework", because at some point you will always have a "single point of failure" (often the backplane in the card cage, or the "frame" in medium-availability systems, which is where the "frame" in framework comes from, via 18th-century looms etc.).
Now, from the software perspective, an API or high-level protocol is the framework. And, as some will remember, I keep banging on about NIST concentrating on primitives (AES, and currently hashes), not frameworks (something CEPT / CENELEC and other European standards bodies do well, which is why GSM works worldwide, unlike certain other cell-phone protocols).
If, for instance, NIST had mandated a "framework" around their primitives, then replacing them when defects are discovered would be relatively trivial. But they did not, and this is going to haunt us for many years to come, as implementations built for "efficiency" will deeply embed the primitives in a completely non-replaceable way.
Think, for instance, of "smart meters", which (if the politicos and utilities get their stupid way) will be controlling your house beyond your control in the near future, and which have a design life of a minimum of 25 years. DES didn't really last more than ten years; imagine what would be happening now if 75% of US homes had smart meters using DES in a very simplistic (think effectively ECB) way.
That's effectively our future as long as we ignore frameworks with replaceable primitives, all because we want to "be efficient" and thus end up very brittle.
Efficiency is the curse of the libertarian "free market" viewpoint. The market is good in the "here and now", the "live for the moment", but that "here and now" has a direction, and it's a one-way road to hell if nobody sets the direction. Good regulation based on sound engineering experience gives us the direction of resilience and long-term stability for nearly nothing if "built in at stage zero"; it works the same way as Quality Assurance does.
And this is why "monocultures" fail: they have a direction that is not that of resilience, because they over-optimise on "efficiency" in the wrong way.
@ Bruce / Markus
If either of you have read this far, have a think on it and how it affects your respective views.
And please feel free to use the idea yourselves in any future Op-Eds etc.
I think this is one of the few times where I disagree, going just from a skim. I need to read both papers before I can really take a side in the debate.
I'll leave out my redundant comments on the correlation between living monocultures and software monocultures... but think about this:
Let's assume Microsoft had 99% market share of everything related to computer technology... and, well, with the stock market crash of '08, completely crashed and folded: thousands of people out of work, technology that was maybe close to finished gets tossed out the door, and existing technology is somehow bought and kept under lock and key by someone else... we'd essentially be OS-less.
..... maybe that would be good
I wrote on this subject, back in 2001, for an obscure little site named "Binary Freedom" which later had its DNS name snatched... so it is now on System Toolbox.
CyberDiversity deals with mono- and hetero-culture issues in genetics, technology, and even memetic culture.
@BF Skinner: Best analogy ever!
Totally agreed! Indeed, we could argue that the mobile smart-phone market has been protected from viruses because of the lack of a monoculture:
Perhaps not anymore!:
Mobile phone viruses in China affecting well over 1 million phones
@ Nick P
To a large extent I agree with you in they are responding to market demand.
When talking about monoculture and IT, I believe a more valid comparison is with the Xerox photocopier than with the potato.
A greater issue at stake than budgets and cost efficiencies is that the drive for innovation is missing in monoculture systems.
Being a virtual monopoly is a detriment because no one is really looking for a better solution when there is only one available. The Soviet Union had a virtual monopoly, and only excelled where it felt it was competing with the West (in the arms race, not the consumer race).
Xerox lost the photocopy market and was bought out by Fuji; the Soviet Union no longer exists. If recent history has anything to show, monoculture is bad on so many levels for the long term of anything. Diversity and intense competition are the key to survival -- they have been for hundreds of millions of years, and the digital age is no different.
To address the issue some have raised of whether a greater diversity of software presents an increased security risk:
The answer isn't a simple yes or no. It depends on the developers: their intent, skill level, development procedures, reporting system... and so on.
Looking at the open-source development community: they aren't motivated by greed or money, and won't rush a product to market just to meet a bottom line. Their reporting system for bugs and issues seems far more responsive than Microsoft's and other companies', in my book. I'm not saying proprietary is all bad -- it isn't; money can be a great motivator, even for good.
If you have a more diverse software ecosystem, it is unlikely that all systems can be affected by a single flaw; more likely, only a portion can be.
While it is impossible to claim that diversity guarantees a security hole won't affect the entire Internet, the greater the share held by any one software group, the greater the potential for a larger portion of the culture to be affected.
There is another advantage built into diversity. An attacker has to find a security hole, which means studying the software. The more diverse the software, the more they have to study to affect larger portions, and the more time that takes, the greater the chance the hole will be fixed before it is used.
I have to declare round 2 a draw because the purpose of the debate may have been overshadowed by an idea touched on but largely ignored.
Ranum refers to the "all eggs in one basket" scenario as the result of "market ebb and flow."
In fact, the computer field naturally gravitates towards fewer and fewer platforms; one platform being the potential result (of this "cyber-entropy.") Other platforms (e.g. Mac and Linux) exist because they appeal strongly to vertical markets and do not compete directly with Windows.
Any further efforts towards "multi-culturalism" (multiple-platforms) may quickly be undone or require too much effort to be worthwhile.
The problem with the "diversity in OS" theory is that 2 is not really diverse by any standard, statistical or biological. If the computing market were split evenly in 2, then we would immediately see a rise in cross platform malware. 10 is not really diverse either and frankly I don't see the market splitting into 10 equal OS segments any time soon.
I'm not suggesting we all run out and buy Windows. I'm suggesting that the real theory behind the "diversity" argument is actually "security through obscurity" which is not a good security policy after all.
Bruce--while I'd agree the monoculture argument is often overblown, I think you're now overly dismissive of some parts of it.
The Windows monoculture isn't just Windows. It's Windows, running on x86, with IE, Windows Media Player, and a host of other apps installed.
Contrast this with Linux. While a large majority of Linux machines run on x86, many do not (my house is a Linux monoculture, but three hardware architectures). In fact, I'd argue the primary reason so many Linux machines run on x86 is just bleedover from the x86 monoculture, and this would not be the case in a more competitive ecosystem. Although it isn't always necessary to know the target's architecture, it certainly helps.
Then there's application diversity. Even when Windows users don't use IE and Windows Media Player, a malware author can still count on their existence when writing their code. What do they target on Linux? Firefox? Chrome? Opera? Totem? Kaffeine? The same "fragmentation" that so many decry about the Linux desktop is just diversity by another name.
An ecosystem consisting of half Windows and half Linux isn't just two monocultures side by side. It's one monoculture and several related but not identical cultures.
@Staudenmaier "the computer field naturally gravitates towards fewer and fewer platforms; one platform being the potential result (of this "cyber-entropy.") Other platforms (e.g. Mac and Linux) exist because they appeal strongly to vertical markets and do not compete directly with Windows."
While a certain level of commoditization is occurring, what you've said is only true of the desktop. The important machines are the server farms. And Microsoft doesn't rule the server market; Unix does.
A lot of the problem lies with issues outside the technical. We are technologically capable of creating operating systems, applications, and protocols that are FAR more bulletproof than what we have now. In fact, we already have some. But the cost of these sturdier products, in both production and sales, is much higher than the current fare on the market. Producing these safer products would cut deeply into the profit margins of the companies that make and sell them. Technology, especially computing, is a case of "good enough". We _can_ make virtually bug-free systems, but the cost in time, resources, and ca$h is too high. So we make products that are good enough and fix the bugs on the back end with patches and add-ons, like antivirus software and such. It is just not economical to do it any other way.
There is another issue when dealing with security in the IT world. There's an old saying: secure, fast, and user-friendly -- pick two. If this could be overcome and we could have all three, the overall security of the computing world would increase tremendously. But, again, the cost factor is too high. Not enough profit.
Now, I'm not against profit, mind you. I am just saying that there are external, non-technical issues that have a large impact on this discussion. Do I know how to solve these issues? No. I wish I did, though. Both for the technical reasons and the fact that it would be extremely profitable for me. :-)
Nice points. But that old saying shouldn't be tossed around too much. You know the one: "secure, fast, and user-friendly: pick two." Many medium-to-high-robustness products offer all three. Examples include the CapDesk desktop, Green Hills' INTEGRITY Workstation, and Pidgin with OTR (best in usability for secure chat, imho). So, one often has to make trade-offs, but it is possible to get all three. It just takes good engineering and a little bit of luck. ;)
This blog post just got mailed to CodeProject users by their editor. Might be seeing some new faces soon.
@ Nick P,
"This blog post just got mailed to CodeProject users by their editor. Might be seeing some new faces soon."
Maybe we should put in links to our earlier chats; it might just enliven some grey cells ;)
What do you think they would make of "Prisons-V-Castles"
That's a good idea. I've recently been trying to gather mine together. As for prisons vs. castles, I think their heads would explode trying to figure out how to do it in Win32 and .NET. (They are a Microsoft tech site.)
No one has said that avoiding a monoculture means every company has to use both Linux and Windows. As long as some choose to run Linux, some choose Windows, and some choose Mac OS X, whenever disaster strikes, (hopefully) a large enough portion of our society will keep working for us to survive the initial period. And that's what matters.
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.