Schneier on Security
A blog covering security and security technology.
October 21, 2009
Ballmer Blames the Failure of Windows Vista on Security
According to the Telegraph:
Mr Ballmer said: "We got some uneven reception when [Vista] first launched in large part because we made some design decisions to improve security at the expense of compatibility. I don't think from a word-of-mouth perspective we ever recovered from that."
Vista's failure, and Ballmer's faulting security, is a bit of "be careful what you wish for." Vista (codenamed "Longhorn" during its development) was always intended to be a more secure operating system. Following the security disasters of 2000 and 2001 that befell Windows 98 and 2000, Microsoft shut down all software development and launched the Trustworthy Computing Initiative, which advocated secure coding practices. Microsoft retrained thousands of programmers to eliminate common security problems such as buffer overflows. The immediate result was a retooling of Windows XP to make it more secure for its 2002 launch. The long-term goal, though, was to make Vista the most secure operating system in Microsoft's history.
What made XP and Vista more secure? Eliminating (or reducing) buffer overflow errors helped. But what really made a difference was shutting off services by default. Many of the vulnerabilities exploited in Windows 98, NT, and 2000 were actually a result of unused services that were active by default. Microsoft's own vulnerability tracking shows that Vista has far fewer reported vulnerabilities than any of its predecessors. Unfortunately, a Vista that was locked down out of the box was less palatable to users.
Security obstacles weren't the only ills Vista suffered. A huge memory footprint, incompatible graphics requirements, slow responsiveness, and a general sense that it was already behind competing Mac and Linux OSes in functionality and features all made Vista land with a thud. In my humble opinion, the security gains in Vista were worth many of the trade-offs; it was the other technical requirements and incompatible applications that doomed this operating system.
There was also the problem of Vista's endless security warnings. The problem is that they were almost always false alarms, and there were no adverse effects of ignoring them. So users did, which means they ended up being nothing but an annoyance.
Security warnings are often a way for the developer to avoid making a decision. "We don't know what to do here, so we'll put up a warning and ask the user." But unless the users have the information and the expertise to make the decision, they're not going to be able to. We need user interfaces that only put up warnings when it matters.
I never upgraded to Vista. I'm hoping Windows 7 is worth upgrading to. We'll see.
EDITED TO ADD (10/22): Another opinion.
Posted on October 21, 2009 at 7:46 AM
There are fewer warnings in Win7, but the usability isn't as good as XP (imho). I've been using it daily since the open beta and I still have trouble finding locations I used to know cold. The compatibility is there (programs from XP/Win2k work well on Win7), but so far I'm not a big fan of its administration interfaces. Windows is still bloated and perhaps maintains too much backward compatibility. I commented on this topic in an interview with the Mac Observer: http://www.macobserver.com/article/2005/07/...
I've been using Win7 for several months, and am extremely happy with it. I have it loaded onto machines designed for XP, and, while they were paperweights with Vista, they're very happy and functional now.
Of course, I work for Windows, so I'm really biased.
The two major problems reported with Vista were bad drivers and UAC. Bad drivers were solved over time, but the UAC knock lasted. Vista was probably the first time (Windows) users were not logged in as system admin. If you use any OS as system admin you will be out of line with best security practice and occasionally inconvenienced. I really have no problem with any OS, once you get to know its default behavior. The ultimate responsibility is with the operator, and that is the essence of the problem.
Well, it's not as bad as Vista. It's not worth paying for the upgrade, but at least it's an acceptable alternative to XP when purchasing new systems.
You never upgraded to Vista even though improved security was one of the key elements of the release? Seems like that "security" thingy is about as high on your list of things to consider for purchase decisions as it is for the general public, Bruce.
I think I will get around to installing Win7 right around the time Windows 2012 ships.
The problems I have with Vista are numerous. The slowness, the memory footprint, and such are all part of it. The biggest problem I had, and continue to have, is that the interface changed in strange and interesting ways that weren't even close to intuitive and had no ties to the older versions of the OS. Things were frequently made needlessly complex, and overall it is quite difficult to figure out how to adjust things. The control panel was completely revamped, albeit with the "old interface" still provided. I don't know why the "look" needs to change at every version of the OS, although this seems to be the only way that many folks out there understand whether they are running "Vista" or not.
From a networking perspective, the interface for setting up networks is frustrating as all get out. Another annoyance was that "My Documents" and the overall layout of the filesystem changed; it seems now to be more Mac-like, but I was familiar with the old layout, so more change for no real gain. I could continue, but my fingers are getting tired.
W7 - well we'll just have to see.
I think Vista had many failings.
On the security front Vista's UAC feature was a kludge to fix the administrator account problem in Windows. In Windows 7, UAC is just a kludge reworked to appease Vista users who hated UAC.
Microsoft is stuck with a long legacy of Windows programmers writing software that requires admin privileges and a huge security-clueless / security-indifferent user-base that is used to running as admin.
It would be easy for Microsoft if their admin problem could be solved with a simple technical fix but that's not the case. At this point it is as much a cultural problem as a technical one. Changing attitudes and practices is often slow and painful.
Warning: There isn't a smooth path to migrate directly from XP to Win7.
I suspect that this migration omission is Ballmer's way of exacting retribution on all the Vista critics -- lose things in transition or pay the price (time and $) for a double transition (XP-Vista-Win7).
I don't know about anyone else, but the negative security news about Vista that was the biggest turn-off for me was DRM. From what I heard, Vista makes your computer do a lot of work whose sole purpose is to keep you from doing things that the media industry doesn't want you to do. This contributed to the bloated RAM and processor needs and to some of the hardware compatibility woes. Besides, I just don't like the idea that my computer is working hard to keep me from doing stuff, not for my benefit but for someone else's.
I don't know if Windows 7 will be any different on this score.
@Casey: "Vista was probably the first time (Windows) users were not logged in as system admin."
I agree with your point on security, but with UAC I think a user is still logged in as admin. It's a kludge. The user logs in as admin but also gets a standard user token. Everything runs with the standard user token by default, and the user is prompted when admin privileges are required. A lot of users will just click through without a second thought (if they haven't turned UAC off or turned down the level of warnings already), so I'm not sure UAC makes much difference. Worse yet, there are certain circumstances that allow auto-elevation of privileges. If you were a malware writer, which would you prefer: a Windows user running as admin with UAC, or a Windows user running with a standard user account?
One way of looking at UAC is that it isn't about the end-user. Rather, it is about forcing programmers to write software that doesn't require admin privileges, so that Microsoft can eventually abandon the whole UAC feature and make proper standard user accounts the default.
My only criticism of Win7 is that when I plug/unplug power the screen goes off for a second or two - this didn't happen with XP or Vista.
On the plus side battery life does seem better.
Other than that the experience is just underwhelming.
The lack of an easy upgrade path from XP to Windows 7 is what pains me most.
Vista is a dog, and I struggled with it on one of my laptops for eight months or so until moving to Windows 7 because I heard it was faster. The upgrade made a huge difference, so I'd *like* to roll it out throughout the organization... but I've got 380+ machines here, most of which are currently running XP. With no easy upgrade path provided, that's an enormous investment of IT team time. It simply isn't going to happen, I'm afraid. It's not a lack of will to improve security; it's simple logistics and lack of cash in the current economy.
So, it'll have to be a slow adoption as and when machines are replaced. I expect to be supporting XP users for several years to come.
You are right, but I meant that Windows users were not used to having anything less than unrestricted admin rights. UAC+admin is not the same as standard user.
If you follow the two-man rule, no user should be allowed to make the decision to alter a system alone. That is a huge inconvenience. Even the SA should not be able to do this. The problem is that the organization that has only 1 person (home use, self-employed, etc) will just make a call on the fly.
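The two-man rule described above can be sketched in code (a minimal illustration of my own, not from any real access-control product): a proposed system change stays pending until a second, distinct user approves it.

```python
# Illustrative sketch of a two-man rule for system changes.
# All class and method names here are hypothetical.

class TwoManRule:
    def __init__(self):
        self.pending = {}  # change_id -> user who requested it

    def request(self, user, change_id):
        """First user proposes a change; nothing is applied yet."""
        self.pending[change_id] = user

    def approve(self, user, change_id):
        """A *different* user must sign off before the change applies."""
        requester = self.pending.get(change_id)
        if requester is None:
            raise ValueError("no such pending change")
        if user == requester:
            raise PermissionError("requester cannot approve their own change")
        del self.pending[change_id]
        return True  # the change may now be applied

rule = TwoManRule()
rule.request("alice", "disable-firewall")
blocked = False
try:
    rule.approve("alice", "disable-firewall")  # same person: refused
except PermissionError:
    blocked = True
approved = rule.approve("bob", "disable-firewall")  # second person: allowed
```

The inconvenience the commenter mentions is visible even in this toy: a one-person shop has no "bob," so every change either stalls or the rule gets bypassed.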
I thought Vista failed because of all the sucking it did. UAC was just another UI design decision to suck, not some sort of architectural issue that involved security/compatibility tradeoffs.
Sadly, Vista introduced a security notion that barely made the news, and I have yet to hear about whether it's still in W7.
Microsoft introduced the security model of integrity levels (IL), similar to the Biba integrity model, within the user memory space. This can be summed up as saying your IE process is not as important as your Outlook process and should not be allowed to access it directly (through IPC or OLE, or whatever) without asking permission first.
This should have been a touting point for MS: with all the malware spreading through the browser, they should have done a lot more work to make that feature known and recognized.
E.g., lowering the IL for Firefox was a world of pain.
I also wonder how badly the elegance of Apple's UAC equivalent (and their brutal ads) damaged Vista.
@ Isaac, am on the same page as you.
I don't feel comfortable with an OS forcing DRM issues on a user at the expense of speed and memory footprint.
I'm interested how that new OS is going to fare in this regard.
I'm under the impression the new OS tossed a lot of the intuitive, screen-saving menus out the window, so I'm not sure I want to bother.
Will wait and see.
This has been an issue for a long time, not just in Windows but in technology in general. I did a graphical analysis of this very thing in my master's project in 2001.
Basically, the gist of my theory was an expansion of Mayfield's paradox, which states: getting everyone onto a system costs an infinite amount of money, and keeping everyone off costs an infinite amount of money, but the costs between the two extremes are comparatively lower.
My theory simply graphed that with too much convenience you fail due to the costs of poor security (too many exploit it), and with too much security you fail due to the costs of poor usability (too few want to use it).
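That U-shaped cost curve can be sketched numerically. This is a toy model of my own; the specific cost functions are invented for illustration, not taken from the original project.

```python
# Toy model of the security/usability cost trade-off:
# exploit losses fall as security tightens, usability losses rise.

def exploit_cost(s):
    """Losses from attacks: fall as the security level s tightens."""
    return 100.0 / (s + 1.0)

def usability_cost(s):
    """Losses from driven-away users: rise as s tightens."""
    return 2.0 * s

def total_cost(s):
    return exploit_cost(s) + usability_cost(s)

# Sweep security levels from "wide open" (0) to "locked down" (20).
levels = [i / 10.0 for i in range(201)]
best = min(levels, key=total_cost)
# The cheapest operating point sits strictly between the two extremes.
```

With these particular functions the minimum lands in the interior of the range: total cost is high at both ends, exactly the "crushed by exploits or dismissed by dissatisfied users" pattern described above.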
Seems Microsoft goes from one end of the extreme to the other, either being crushed by exploits or dismissed by dissatisfied users. They need to accept that they can't keep everyone happy and make everything secure at the same time. They need a reasonable balance between how much security to sacrifice for usability and how much usability to sacrifice for security.
One can dream, right?
casey: If you follow the two-man rule, no user should be allowed to make the decision to alter a system alone.
Oh, give me a break. That may be appropriate for very large networks -- but then, very large networks are inherently insecure because of the number of interfaces. You're more secure if you break up your system into small firewalled systems while giving the local users more authority, rather than creating a more centralized system.
Just look at biology. After 4 billion years, what's life's solution? Break everything up into cells -- each one is locally controlled with no IT department micromanaging every local decision, but the ill-effects of a bad decision are generally localized, with an immune system and suicide codes to keep local errors from propagating.
Basically on a terminal or small server -- security should be there to keep external threats from propagating in, and giving the local admin a pause before making a serious change, not as a check against the local administrator's actions. The Soviet approach is just asking for a complete collapse of the system.
I once worked for a multinational that used the Soviet approach -- they wouldn't give me a root password to my Sun box. Well, that created several problems: when the crappy CDE of the era locked up, I had to wait an hour for an IT guy to make it down to my cube, or I could pull the plug, which eventually would lead to hard-disk corruption and several hours of IT work. Guess which one my manager preferred?
Then, since they had their transnational network completely open (since each box was locally secured), my desktop managed to shut down their entire network. Why, you ask? Well, they had misconfigured it to act as a router; its ARP and routing tables got overfull, and it ended up gnarling up routers across the planet. Of course, I had told them that several times before, but they could only afford to pay little attention to any particular desktop box.
Soviet-style thinking. It's stupid and insecure. It's much better to have a large number of small points of failure with local effects than fewer points of failure with larger effects, no matter how much redundancy and how many checks you put in. Throw the theory away: security and robustness of complex systems isn't a theoretical subject, unlike cryptography per se; it can only be approached through local application and orthogonality.
Agreed. I'm not sure how much of an issue running as standard user versus Admin + UAC will be on W7. Hopefully more software is now capable of running in standard mode so that it's not the pain it certainly was a few years ago (see Admin Rights Hall of Shame: http://www.threatcode.com/admin_rights.htm). At home there is no way my kids are running as anything but standard users (what they do now on XP--a real pain except they are on a dual boot machine and they prefer openSUSE). And I plan to use a standard user account myself most of the time on W7 just because the attack surface has to be significantly smaller on a pure standard user account than on an admin+UAC account. For most people, most of the time I have to think that admin+UAC violates the principle of least privilege.
A couple of interesting blog posts on the UAC issue:
Russinovich: Malware will thrive, even with Vista's UAC
Defeating UAC with a two-stage malware attack
When the developers of your favorite software stop supporting XP (at Microsoft's "suggestion"?) and require Vista/Win7, you won't have any choice about upgrading to Windows 7 if you want to upgrade your applications.
I've been using Windows 7 64-bit RTM for a couple of months now, and I've been very pleased with it.
Vista was annoying with the security messages, and it didn't install well on various systems. Windows 7 seems to have cut out the majority of the annoyances, has installed with ease on numerous systems, and has the single biggest advantage XP didn't offer me:
Running as user.
I know you CAN run as a user in XP, but numerous programs don't install or run correctly even if I use the 'run as administrator' option. 7 (and probably Vista; I never really stayed with it for long) does this great. I've not had to log off of my user account for any program I've installed to date.
The only minor annoyance I've had so far is that some programs automatically put a shortcut on the desktop. I use the quick launch bar and start menu for everything and keep the desktop cleared off. Well, a good portion of the shortcuts seem to require administrative access to remove.
Otherwise the system runs great, doesn't crash, and all my programs up to this point have installed without a hitch.
On my desk I have a Vista workstation, an XP netbook, and an Ubuntu workstation. Even when I am not working on the Vista box, it is constantly hammering the hard drive. Also, videos and music play better on my Atom netbook than on my Vista dual-core PC. On the Vista box, MP3s are slow to start playing, tend to stutter near the end, and any player uses massive amounts of CPU, RAM, and hard disk resources.
For us it wasn't security; we've had users locked down, so UAC wasn't the dealbreaker.
Our issue was the bloat. Vista was seriously bloated, and the hardware requirements were obscene. Maybe 20% of our machines could run it, and slowly at that. XP was still viable, so it made no sense (for us, anyway) to upgrade.
A recent survey said that 88% of corporate machines could run Windows 7. I think that's about what I'm seeing, and by the time we're ready to roll out Windows 7, all our machines should be capable of running it. Which is a big difference from the position we were in with Vista.
@xpuser: "I don't feel comfortable with a OS forcing DRM issues on a user at the expense of speed and memory footprint."
Ditto. Bogging down a user's machine with tattleware is a questionable move.
Microsoft has unfortunately gotten a bad rap regarding UAC, mostly because of bad third-party applications that it has no control over.
I have been running Windows Vista on several computers and have had no problems with extra UAC prompts. I also admin several small business networks where all users (Windows XP Pro) are set up as "limited" users (i.e. basic non-admin rights).
The first step to running without admin rights (or to ensure UAC doesn't prompt) is to identify all the crappy third party software that thinks it needs admin rights (when it really doesn't), only because the 3rd party software developers were simply lazy or just plain clueless.
Case in point: how many YEARS did it take the (lazy? clueless?) Intuit developers to figure out that their _bookkeeping_ program didn't really require admin access to the OS (i.e. admin rights to the Program Files and Windows directories, or admin rights to the registry root)? As I recall, it was v9 when they finally figured it out and fixed their bugs.
> You never upgraded to Vista even though improved security was one of the key elements of the release? Seems like that "security" thingy is about as high on your list of things to consider for purchase decisions as it is for the general public, Bruce.
Vista had security improvements, but I've seen nothing so far to indicate it's definitely better than a hardened XP system. Just because MS makes a few security-oriented efforts isn't cause enough to jump on their latest bandwagon.
Remember that time and experience are the best indicators of security in a product. Anyone for whom security is a big issue will continue to use a system they have worked with and feel is secure over a new system that *claims* to be more secure. I stayed with XP, as did many others who did not have to upgrade, and I trust its security no less than Vista's.
I think that the compatibility shims should be installable/uninstallable. If you don't need them, take the bloat out. By continuing to include them, the OS continues to grow and nothing really forces forward progress. Clearly identifying what needs to be upgraded can help users consciously decide whether it is time to upgrade or discard an app. It would also give additional transition to enterprise customers who may not want a whole XP VM.
I, too, am worried about DRM in Windows 7. I think a lot of people have avoided Vista over negative publicity concerning its DRM functions, but publicity for Windows 7 has been mostly positive (due, I think, to the "anything is better than Vista" phenomenon). I'm afraid that, even if it turns out Windows 7 has the same or worse DRM features as Vista, few people will avoid Windows 7 due to lack of media focus on the negative aspects of Windows 7 and due to the fact that it isn't Vista.
@BadRap: "The first step to running without admin rights (or to ensure UAC doesn't prompt) is to identify all the crappy third party software that thinks it needs admin rights (when it really doesn't), only because the 3rd party software developers were simply lazy or just plain clueless."
Agreed but Microsoft wrote some of those apps and facilitated the culture of crappy-admin requiring software.
What always amazed me is that a lot of anti-virus software wouldn't run properly without admin rights until fairly recently. How clueless is that? Pay us money so we can make you less secure! eWeek did an experiment a few years ago. They set up a couple of XP SP2 boxes, one running as admin and one as user and then they surfed unsavory parts of the net (I guess you needed to go looking for trouble then...). Results here: http://nonadmin.editme.com/WhyNonAdmin
I, too, mainly heard about slowness, bloat, and driver issues via word-of-mouth. Hardly a peep about security warnings.
I've heard good enough things about Windows 7 that I decided to wait to get a netbook until after October 22nd. (My home PC is sticking with XP for now, though.)
@Petrea: "I, too, mainly heard about slowness, bloat, and driver issues via word-of-mouth. Hardly a peep about security warnings."
Windows 7 More Popular Than Harry Potter
By Jared Newman - Wed Oct 21, 2009
A day before Microsoft releases Windows 7, Amazon UK said the operating system has become the best-selling pre-order product of all time, according to TG Daily. The retailer didn't provide any numbers.
Bloatware for me. I don't care about the security warnings but then I'm not my mother either, and those things would scare the boojums out of her.
What I hate are the endless look-and-feel redesigns that are pointless obfuscations of OS features to which we are already accustomed (just display my list of applications already, wouldja?), the stupid side-bar with photo slideshow and clock that must be disposed of in order to reclaim screen real-estate, the default folder display options being set to "you're too stupid to see this," and the manufacturer's bloatware that you must research and disable should you dare buy a PC with the OS pre-installed.
As for the Administrator alerts, I made it a lot easier and simply put no password on the Administrator account. What, you say that undermines security? Exactly my point...
"I think that the compatibility shims should be installable/uninstallable. If you don't need them, take the bloat out."
100% agreement ... but there's MORE!
Do NOT install them by default. FORCE the user to install them (and only the ones needed) IF that user tries to install a crappy app.
For home users, this might not help much.
For corporate users, this will clear so many problems.
If you cannot eliminate the problem 100%, you can still reduce the problem.
UAC suffers from a simple conceptual problem.
Microsoft assumes that the end-user knows the right answers to the questions that UAC asks.
I'd argue that the typical end-user response to a UAC prompt would be one of these two:
1: Just do what I flipping told you to, punk! Why the f*** are you asking me?
2: Wtf? How the hell would I know if program xyz should be doing this? I don't want my PC to not work, so I'd better allow it.
In other words, UAC is 99% annoyance, 1% almost-good idea.
I actually like Vista, except for some compatibility issues from having a 64-bit system. The UI is actually much better than XP's... after turning off all of the security false alarms, that is. I think Ballmer's assessment is spot-on.
Oh yeah, the DRM issues are a thing too. I had to install a "fix" for that, which is not something I like having to do with software I own. The DRM was causing another program to crash, one that I was not using to illegally copy anything. Not happy with that at all. But still, the UI is better than XP's. *shrug*
Well, it appears that Vista was the unstable pre-pre-alpha version of Windows 7. And people paid to have it. Gosh.
I think there needs to be some common sense used in security questions.
I use a car analogy. My car warns me with a little light when I don't have my seat belt on. I can decide whether I want the added protection of the seat belt, it doesn't just strap it on automatically.
Contrast that with an airbag. I don't get prompted with "Your airbag is about to deploy. Continue? OK or Cancel." The darn thing just goes off for my own good.
Technology, though a different animal, should work that way. There needs to be a bit more thought about what is configured a certain way by default, and it needs to be easier for the average user to configure other things.
Bruce - you're not using Linux as your desktop? I find it hard to believe you're running Windows... !
The car analogy extends well.
Certain cars do have automatic seat-belt strapping in the door frames. Such automation prevents the anchoring of child safety seats. It was a choice-limiting, therefore customer-limiting feature. The decision was made at the point of purchase, since it couldn't be made inside the product.
Computers have more regret potential, because the average buyer only learns the meaning of choice denial by use. Hence, Bruce is right on target about the careful decisions required to decide what to default (i.e., pre-trade-off) and what to leave to the user's cognition.
This seems parallel to Chris Crawford's usable-opportunity cognitive event metric for video games.
@Peter: "Such automation prevents the anchoring of child safety seats. It was a choice-limiting, therefore customer-limiting feature."
Exactly. As with cars, computers are often driven by usability and the pleasantness of the usage experience, at least initially.
The security problems become apparent, and are exacerbated, later: exploits are reported, the bad PR comes with them, and computers are bogged down or destroyed through the exploits. Customers quit coming, or bail.
Microsoft needs to accept that bringing security to a high enough level means some customers won't be happy and won't come, but it also needs to accept that if it wants any users, the product needs to function smoothly enough that some security incidents are just going to happen.
To quote a proverb: if you try not to lose anything, you end up losing everything. Or, he who tries to please everyone ends up pleasing no one.
@Linux User: As a general rule, you get the OS that runs the apps you want to use (or have to use). That's why I went with Linux, but not everybody shares my app preference. It's very likely that Bruce needs to run Windows apps natively, and therefore needs at least one Windows box.
@ Linux User,
"Bruce - you're not using Linux as your desktop? I find it hard to believe you're running Windows... !"
First off why should anybody run any OS in particular?
The answer, of course, is that it provides either the best support for the work they are doing, or required compatibility with other users.
It is this latter aspect that usually dictates what happens in the modern "office"-based world of business.
However when you get out of the office even as far as the wiring closet the former is usually true.
I run over 15 different operating systems at any one time (some I wrote myself for hardware I have developed).
Amongst these are MS OSes from Win 3.11 upwards, DOS 4 upwards, NT 3.51, etc.
Why? Because I still have to support them on what is effectively "embedded" kit.
I even have to support some Apple II UCSD p-System machines as well, simply because they are connected to machines that have a very high dollar value and very, very long ROI times (and the original company has long since disappeared into the mists of time).
Oddly, although I like it, I tend not to use "desktop" Linux that much, as it has little to offer me in that respect (CLI mode on a serial line is a different story altogether, but hey, I like headless mode).
However, like many embedded developers, I tend to use BSD, simply due to simpler licensing issues.
It is always "horses for courses," and a racing horse won't get the beer delivered whereas a Shire will...
Don't hold your breath for W7 being any better. I was at an MS 7 event held in Slovenia. A techie there told me "I had to load windows 7 on 32 netbooks this morning. Fifteen of them wouldn’t work. I had to take them all apart and replace cables and stuff, then put them back together and reinstall 7.”
Then, at a press conference, when asked about some of windows 7’s new security features, MS chief security analyst Gibson quipped: “I’d demonstrate for you, but we don’t have two hours for windows to boot up.”
Full story here: http://bobarno.com/thiefhunters/2009/06/slovenia/
> I suspect that this migration omission is Ballmer's way of exacting retribution on all the Vista critics -- lose things in transition or pay the price (time and $) for a double transition (XP-Vista-Win7).
Then he's nuts. I have no intention of migrating to Win7, either.
@ Linux User,
> Bruce - you're not using Linux as your desktop? I find it hard to believe you're running Windows... !
His explanation is that his company (Counterpane, before he sold it) used Windows for its business, so he used it too, for simplicity. I guess that was because he could use the configurations and research his company did, and integrate with any MS-based network tools they set up.
What's Ballmer smoking? For my personal use, the dealbreakers were the obnoxious DRM and unacceptable restrictions on VM deployment. For business use, it was usability and resource usage. It sounds like they've addressed the business objections, but I haven't heard much about DRM. I'm sure not spending money on it until I know more. I'm perfectly happy using Windows XP, isolated, for the few things for which I need Windows, until I retire (which sadly is a long way off).
Stop telling me what I can't do with my OS and I'll reconsider.
Since Vista sucked up so much of my computer's resources, I switched to a Linux distro called Ubuntu. I'm able to get my mail, browse the Internet, write mail, and moreover it doesn't get any viruses. If MS can make an OS that will not get infected by decade-old viruses, I may come back.
By the way, I'm sure anti-virus companies are trying to write viruses for the Linux platform. I personally saw one of them distributing a virus, called Sigalit, in 1992. CPAV, the standard of the time, was distributing it. People with BIOS protection noticed.
I already upgraded from Vista Ultimate to Ubuntu and there is no way in hell I will be downgrading again. Security is a very small part of what is wrong with Windows for me. Windows is built with the purpose of being friendly for the average computer user, it was not built with IT literate people in mind. I wouldn't recommend Windows to IT professionals in the same way I wouldn't recommend Ubuntu to my mum (despite how much she liked the look of Desktop Cube and Gnome-Do).
I don't get all this whining about Vista. I run Vista 64 and "love it".
Admittedly, I migrated from win2K, in March (so many of the earlier Vista problems had been sorted) this year on a home built high end PC. It's worked flawlessly for me, nice enough interface and the UAC is just about right - I only get asked for admin rights for tasks I'd usually su on a Unix box.
i knew it. it's customers who don't want it to be secure.
such responsible design decisions and then they get unevenly received. bleh.
If Ballmer really believes that was the reason Vista pancaked, then his market research is so poor that 7 is a guaranteed dead duck, too.
The number one reason is that for most people, and nearly all organisations, the process of changing operating systems is an absolute PITA and a substantial cost as well. Sure, there's a tiny minority of people who run out and immediately start beta testing for Redmond[1], but most of us will only change OSes -- "upgrade" in Redmond marketese -- if there is a compelling advantage to doing so (or a serious disadvantage in not doing so.)
And Vista didn't have one. It was following a tough act: giving credit where it is due, XP was a massive improvement in MS's line of desktop OSes, and fixed the great majority of the complaints that had adhered to earlier versions. It was vastly more stable, much more secure (or at least, can easily be configured to be much more secure), easier to install hardware, easier to maintain, fixed a lot of "dll hell" problems, AND is surprisingly efficient for a MS OS.
So with XP, the average Joe finally had an MS OS that didn't constantly piss him off. He had no *reason* to "upgrade", he was happy with XP!
Think about it: pay actual real money for an OS when you already have a perfectly good one; go through the sweat and pain of installing it (hopefully without losing any data or killing your favourite screensaver), then go through the additional sweat and pain of learning a new, pointlessly altered user interface, and you get: what, exactly?
From Vista, you got DRM, reduced performance, extra levels of "Undo" in MS Paint, and Stacks. Everything else is either invisible to the user, pointless frippery, or trivial. Now, I'll admit that Stacks are a good idea; they just ain't 1% of good enough to justify Vista.
So where does this leave Windows 7? As I read through the new features list, I see a whole lot of junk, frippery, and nonsense. Improved handwriting recognition!?! Puh-leeese!
Performance on independent benchmarks is better than Vista, but still worse than XP on most tests. The new security features added in Vista have been largely crippled (presumably until the next big worm epidemic.) The user interface has been changed yet again, but unlike Vista this is no longer optional (business users: start adding up those re-training dollars!)
The one feature where it really shone was boot-up time. Unfortunately for 7, that is no longer all that important: thanks to APM and the stability of XP, people who care about boot times already have Stand-by mode, which is even faster.
In short, just as with Vista, Windows 7 simply offers no good reason to go through the pain, suffering and actual financial cost of obtaining it. I predict that few businesses or home users will adopt it until they are blackmailed into it by MS ceasing support for XP (currently scheduled for 8 April, 2014.) In the meantime, it is no longer painful to install Linux on a laptop ...
[1] In the Windows world, this means "installing the new OS before the first Service Pack has been released."
The problem with Vista is that it was a STUPID OS: it asked your permission to do normal things at least twice -- even 3 times.
Drivers were a problem. MSFT, in their esteemed arrogance, refused to support old programs on 64 bits (OneNote integration with the browser, etc.)
But the main reason Vista turned out to be a disaster is that it forgot people don't use an OS! They use applications, and Vista almost always got in the way.
Windows 7 is somewhat improved in this regard -- it bothers you far less -- and the real reason Vista failed: Steve Ballmer.
I thought Vista was a great OS. I don't get people's sentiment about UAC - especially on a security blog. It prompts you if a program requires administrator access; that's what it SHOULD do. People who will just click through to get what they want will also just enter their password to get what they want, and it's a bad idea to have users enter a password all the time because it increases the likelihood that it will get spoofed and stolen, and most users use the same password everywhere. Default UAC in Win 7 is a little weakened, but it's a simple matter to select the more secure settings. But really, anyone who cares about security should be running as a standard user, with Ctrl-Alt-Del required to elevate so the password can't be spoofed and stolen.
I had no major problems with Vista in 2 years; it ran 24/7 and was ultra-stable. XP couldn't dream of being that stable, and there is a lot of anti-exploit technology in Windows Vista, so much in fact that unpatched Vista boxes have lower malware rates than PATCHED XP boxes. I'd call that a major success, even if Vista got held back by negative and ignorant PR.
On UAC it is interesting to note that Microsoft hired Crispin Cowan, a Linux security expert, last year to run its UAC team. Hah, well there's a smart move! Read Cowan's blog for a no-BS account of what UAC is all about. See in particular: "UAC: Desert Topping, or Floor Wax?"
"Vista/UAC says “no” far too often precisely because the idea of running lots of software without Administrator privileges is new to the Windows community, and so a lot of applications are using excessive privilege that they don’t need. We’re making a concerted effort to reduce the number of unnecessary UAC prompts in the future by improving the middleware and applications software to avoid performing privileged operations as much as possible. Making it possible for everyone to run as Standard User is the real long term security value."
Also see his PDC talk from last October:
UAC is a "transitional technology" (his words) to wean Windows users and Windows programmers off what they've been used to doing.
Was it really that hard to put a "sudo" option into Vista? Yes, thanks for the reminder that I don't have admin rights at the command line - but instead of hitting the up arrow, the Home key, and typing "sudo " at the front of the command, I have to click Start, type "Command Prompt", right-click on it, and choose "Run as Administrator". There are also a lot of places where Microsoft didn't put a useful little "Unlock" button to elevate, like Ubuntu does in most places - you had to close the dialog and right-click to open it back up with admin rights.
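For what it's worth, a sudo-like wrapper is not hard to sketch. The following Python is a rough illustration, not a real tool: `elevated_command` and `run_elevated` are hypothetical names I made up for this sketch. The Windows branch uses the real Win32 `ShellExecuteW` call with the "runas" verb, which is what pops the UAC elevation dialog.

```python
import os
import subprocess

def elevated_command(argv):
    # On Unix, elevation is just a matter of prefixing the command
    # with sudo; the list form keeps arguments safely separated.
    return ["sudo"] + list(argv)

def run_elevated(argv):
    """Run argv with admin rights, in a platform-appropriate way."""
    if os.name == "nt":
        # Windows has no sudo; the "runas" verb on ShellExecuteW pops
        # the UAC consent/credential dialog before launching the program.
        import ctypes
        params = subprocess.list2cmdline(argv[1:])
        ctypes.windll.shell32.ShellExecuteW(
            None, "runas", argv[0], params, None, 1)
    else:
        subprocess.run(elevated_command(argv), check=True)
```

Dropped into a small script on the PATH, something like this would have given Vista users the "up arrow, prepend sudo" workflow the commenter is asking for, with the same UAC prompt Vista already shows.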
It wasn't "security" - it was the way it was implemented. Hell, I prefer Ubuntu requiring me to type my password over Vista's "Continue" button, just because it always ASKS automatically when admin is needed - even though the Continue button is more convenient WHEN IT APPEARS.
The performance is really what kills it though. The only reason I can think of for it being so slow is the DRM processes that constantly check that the media-playing channels are "secure". There's really no other excuse for the huge hit. My Core 2 Duo machine should not have ANY trouble playing low-quality videos.
@Powerpoint of Cowan's PDC talk
He nails it when he says programmers want to develop quickly. Programmers don't get grief, at least initially, when things run with too much authority, because no one really knows it; it's hidden. Whereas if a program lacks the authority for even a minor function, it is noticed.
The same could be said for users, administrators, and most definitely management. Improperly configured security can be in place for years without complaint; functions that don't work are immediately apparent.
This is also largely why external contractors run so many programs as Admin... they get fewer support calls after they hand the product over.
You can drive a car with a malfunctioning airbag or seat belt for years and be completely happy with the car. Yet you notice pretty quickly when the brakes are not working properly.
Security engineers' involvement in testing on Windows, as mentioned in the OP, would go a long way in reducing security problems. However, a change in the mentality of users, who want to do what they want to do when they want to do it without problems, and who don't think much about security, won't happen overnight.
There are several reason why we at Microsoft do not support an in place upgrade from Windows XP to Windows 7. We realized at the start of this project that the “upgrade” from XP would not be an experience we think would yield the best results. There are simply too many changes in how PCs have been configured (applets, hardware support, driver model, etc.) that having all of that support carry forth to Windows 7 would not be nearly as high quality as a clean install. This is something many of you know and already practice. We do provide support for moving files and settings and will prompt at setup time, but applications will need to be reinstalled. We know that for a set of customers this tradeoff seems less than perfect, but we think the upfront time is well worth it.
For additional assistance with the migration of Windows XP to Windows 7, please go here: http://tinyurl.com/mhbep4
Microsoft Windows Client Team
But maybe the whole point of the "design decisions" was to generate a certain amount of bad mouthing.
Customer annoyance with UAC allowed Microsoft to put the screws to third-party vendors to force them to clean up their code. UAC is an EPA-style Superfund program for the Windows ecosystem.
Check this out:
"As the ecosystem has updated their software, far fewer applications are requiring admin privileges. Customer Experience Improvement Program data from August 2008 indicates the number of applications and tasks generating a prompt has declined from 775,312 to 168,149 [since Aug 2007]."
UAC may have been a painful design decision but probably less painful than going either cold turkey or maintaining the status quo (at least in the long run).
Hey Bruce, delete yet another posting and blare around...
Vista's security warnings are NOT false alarms, not as such. 99% of the time they tell you an application is doing something it ought not to do. (The other 1% of the time you're doing an administrative function on purpose, in which case you should expect to be prompted for confirmation; gksudo prompts Linux users for a password in those situations. And Vista's UAC asks for an administrative password if things are set up correctly, i.e., if the user is running out of a non-administrative "limited" user account.)
The reason you got so MANY gratuitous UAC warnings when Vista was new was that most of the Windows applications people use were originally developed for Windows 98 and ported to Windows XP as an afterthought. If you try running out of a limited user account on XP, which *should* be standard practice, you frequently run into problems. Even security-conscious companies like Symantec (and even Microsoft itself) wrote software that can't gracefully handle the situation wherein the user who logs in most of the time is not the system administrator. (Just one example: LiveUpdate gets stuck every few updates and won't go any further until someone logs in as Administrator or, on Vista, enters an admin password into UAC.)
What Microsoft needs to do about this is stick to its guns. Application developers *will* fall into line eventually, because apps that don't generate a lot of gratuitous UAC warnings are more convenient for users than ones that do. You don't see Unix applications generating gksudo prompts all the time, because developers who write for that platform have always assumed the user won't have admin privileges. So they don't do things like store the user's settings in a system directory (just one example of the kind of stunt Windows applications often pull). The security crackdown in Vista was initially quite painful for users, but it was a necessary transition. There's no other way to get application developers to toe the line.
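The "store user settings in a system directory" stunt mentioned above is a classic cause of gratuitous elevation prompts. Here is a minimal sketch in Python of the well-behaved alternative (the function names and the `settings.json` filename are illustrative, not from any particular application): settings go under the per-user profile, which is writable without admin rights, so no UAC or gksudo prompt ever fires.

```python
import json
import os

def settings_path(app_name):
    # Per-user settings directory: %APPDATA% on Windows, ~/.config elsewhere.
    # Writing here never needs elevation, unlike Program Files or /etc.
    base = os.environ.get("APPDATA") or os.path.join(
        os.path.expanduser("~"), ".config")
    directory = os.path.join(base, app_name)
    os.makedirs(directory, exist_ok=True)
    return os.path.join(directory, "settings.json")

def save_settings(app_name, settings):
    with open(settings_path(app_name), "w") as f:
        json.dump(settings, f)

def load_settings(app_name):
    try:
        with open(settings_path(app_name)) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}  # first run: no settings saved yet
```

An application written this way runs identically for limited users and administrators, which is exactly the property the UAC crackdown was meant to push developers toward.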
Actually, Vista doesn't go far enough, because you can still run as an "Administrative" user, and although UAC still prompts you, it doesn't ask for a password. It should. Every time. Because then the "easy" way to get rid of the dialog, and the thing most users will do, is to click "no" (which is what they SHOULD do if they don't understand the reason for the prompt), and then and only then application developers will fix their junk.
The impression from Cowan is that the goal is to eventually do away with "Administrator Approval Mode", i.e., logging in with both user and admin tokens and using elevation prompts, but this depends on more developers fixing their code. Maybe in W8 the install will set users up with standard user accounts by default.
@Jonadab: Most UAC warnings are, from the user's point of view, superfluous. As a user, it doesn't really matter if an application I wanted does something I want it to do in an unapproved way. It's only useful when something bad for the user is going on, and by that time the user is conditioned to allow everything, if the user hasn't actually disabled the prompts.
Nor is it going to be easy or straightforward to get all applications changed. New ones are more likely to be written correctly, true, but old ones aren't likely to be rewritten. In some cases, the source code is lost or there's nobody around who can change it. In many cases, the company that produced it won't be around, or will have discontinued the product, or will charge for a version that doesn't work any better from the user's point of view (except for fewer UAC prompts) and creates compatibility issues.
It has some value on the whole because it does put pressure on application developers to test with normal user accounts. On the other hand, this was already happening with business software, as many businesses limit privileges on most user accounts.
It has no particular value to any specific user. Most prompts will be for things that are in fact harmless, and the average user won't be able to pick out the cases where it's warning of a real problem. The only legitimate way a user has to deal with constant UAC prompts is to pay additional money for software that the user already has.
UAC may be the answer, but in that case I have no idea what the question is.
UAC warnings probably annoyed many people, but I have not heard anyone mention them as the main reason (after all, you can always turn them off); incompatibility with existing applications, changes in the user interface, and general sluggishness were far more common complaints. If you use Windows XP, you usually do not need more than 512MB, so many computers did not have more at the time Vista was released, but for normal work with Vista 1GB is the absolute minimum (or even more if your video card uses main memory). And it is not just that Vista requires more memory; it also needs faster access to it. Also, many computers did not have powerful enough video cards at that time. Now hardware has caught up, many applications have been corrected, and Microsoft made some improvements where there was low-hanging fruit. So Windows 7 will be received far better, but in essence, Windows 7 is the same Vista with some minor improvements.
For software developers, having to do things in a different manner may or may not be an easy change. Of course, requiring a change in development techniques could benefit users. When it comes to the Windows platform (or at least certain versions of the Windows OS), and efforts (or lack thereof) on preserving backwards compatibility for software, the following writing may be of interest:
How Microsoft Lost the API War - Joel on Software
http://www.joelonsoftware.com/articles/... (the notable information starts with the "The Two Forces at Microsoft" section)
I had a laptop which came with Vista. I tried it for about 2 weeks, but due to constant screaming frustration I quickly dumped it and switched back to XP (note to evil MS lawyer types - I have far more licenses than PCs; when I discard a PC I keep the license, so ever since I gave up building boxes from scratch I have been accumulating licenses, since each new box comes with one). But I digress - this was billed as a "home entertainment laptop", perfect for my needs because I was using it to transcribe home videos on 8mm analog videotape to DVDs.
Unfortunately this meant I had no proof of copyright, and the built in DRM constantly trashed my videos. Plus there was no "vid cap" software per se, rather it depended on Media Player to do it and MP -WILL NOT- let you just record ad hoc from the RCA jacks; it INSISTS that you provide a start time and stop time and channel number (none of which exists, other than "when I press the start key you dumbass" as a start time) which makes it a cast-iron bitch to start recording at the right place.
Fortunately I was able to install my pre-existing vid cap software with the new hardware once I put XP on it; freeing up the extra ~1GB of RAM which all this sadistic, antichrist-like, al-Qaeda-inspired DRM had been gobbling up was simply an added bonus. I suspect if you could extract out and measure the wasteful DRM overhead from Vista, the ratios would be similar to that of drink:ice in fast food restaurants...
I suspect XP will be my last MS operating system because the idea of an OS consuming 70% of the resources I paid for in order to thwart my lawful pursuits is anathema to me.
@Issac and others.
Yes, the built-in DRM was a big one for me, and many others, even though I run Linux mostly. Why?
Because it's also the "driver problem". MS now requires drivers to be signed by them, and hardware for multimedia to eat encrypted data streams, so you can't write a screen scraper and put it in the driver stack like you used to be able to do if you bought the expensive MSDN subscription.
The result is all the hardware manufacturers had to redesign things and add the capability to accept and decrypt multimedia data. This means that hardware for my Linux machines also costs more and uses more power, even though I don't use those "features". Yet another MS "tax".
Of course, we can thank our congress critters for the best laws the **AA's money can buy -- MS could have been sued for "aiding and abetting" or some such had they not come up with a brute-force solution that looked like a best effort.
UAC prompts never bothered me, but I understand they are a weak substitute for the Linux privilege control systems and filesystems. They wouldn't dare be caught copying the more advanced open source stuff on that, as that might open their code to scrutiny of what else they've stolen. Remember, they started as dumpster divers with stolen code.
Their model is simply totally out of date -- hard on the outside (in their dreams) and soft and chewy on the inside, from the days when no one thought anyone would try to break such a neato thing as a computer. They are still "gee whiz, I can add this feature" based on old tech that is fundamentally insecure, and they can't ditch backwards compatibility for more than one reason -- they don't have the talent to produce a new Word, say, from scratch if they had to.
Spoken as someone who made millions "fixing" the problems MS has generated -- I know Windows from the drivers up; I had to write some, in fact. Now that I'm retired, I run Linux, except for a couple of XPs in VirtualBox to support some dumb vendors of scientific gear who won't support Linux or even let me do it.
Thanks for the memories, Bill, and the bucks, but bye-bye for now. Your stuff just isn't worth it even if it's free.
Nice spin by Ballmer trying to blame Vista's failure on security but I don't buy it. The big issue was that Vista was a solution without a problem and indeed introduced its own problems. More gunk piled onto the OS that didn't need to be there, overloading systems designed for an XP or 2000 load.
I almost never log onto my W2K systems as admin. Running as a normal user is just fine for nearly every application I've ever used -- and I reject the ones that demand that I run as root.
Windows 7 suffers from the same problem as Office 2007 and Vista. Too much fluff and eye candy that really, seriously gets in the way of usability. For instance, where was the customer outcry that led Microsoft to remove the Quick Launch bar from the task bar? The new task bar, showing every open document, makes things harder despite the one good thing in Vista: showing a snapshot of the window so you can select the right instance.
Vista and Windows 7 are both great examples of moving the cheese but the complaints are legitimate. Change for change's sake. That's all.
Moving user data and some UI settings from XP to Windows 7 may be helpful, but it is far from the most time-consuming task.
Just consider a developer system with different programming tools like Visual Studio and Visual SourceSafe tweaked up to the highest level, highly adapted Office programs, Outlook with different accounts in an Exchange environment, CorelDraw with configured docking windows, settings, layouts etc., VMware Workstation running a bunch of virtual machines for debugging purposes, network printers with adapted driver settings, a lot of different browsers needed e.g. for web design, many small but helpful tools, registry tweaks, etc., etc.
My start menu programs folder has 3 columns filled up on a 19 inch display with high resolution.
Reinstalling all that stuff and then reconfiguring it for your needs is the real pain and challenge. I simply do not have the time to do that, so I will stay with XP.
For me the deal breaker was DRM. It gives me the impression that the OS is working against me. Being a "local IT guru" for friends and family, I've told everyone who asked that I keep using XP and will not support nor give advice on Vista or 7; so far, 10 Vista installations have been reverted to XP because of that. Also, in the software piracy world, you still see lots of "all in one" and "unattended edition" XP packages, while almost all Vista and 7 packages are just the OS, unchanged. That means the pirates distributing Vista/7 are the dumb ones. The more seasoned are massively gravitating towards Linux: that can only spell disaster for Microsoft in the long run, since those people are forced to leave Office behind. Those are the people who, like me, get friends, family, and all the small businesses (or not so small) to use the IT they know. And the greatest economic advantage of MS is that people know their products, so your employer does not need to spend time and money training you... when that is lost, and it is being lost very fast, MS will be at a disadvantage and that will be the end of it. IBM lost its incredibly dominant position because it let people stop using and knowing its products. It is happening again, this time to MS.
In fact, security has never been the real problem. There are security problems because lots of people are using its products (and paying MS lots of money).
What killed Vista in my book was the DRM and the constant need to reactivate every 6 months when the damn security certificate expired. Crippling the OS was the major killer.
The second major killer, and the reason we prohibited it from our network, was that they moved things all around in Vista, and unless you jumped through hoops, access to a machine over the network was user status only. It made it damn near impossible to migrate someone from XP to Vista, and to maintain them over the network.
I would bet that Windows 7 has the same problems.
"Vista's endless security warnings [snip] being nothing but an annoyance"
I think that's being rather dismissive of the problem - users become habituated into confirming that the programs can be run, thus completely undermining the intention of the functionality.
The formula for a successful operating system is so simple. Every generation needs to become easier and more intuitive, more efficient and thus faster, and of course more secure. Admittedly, prettier looks, cool applications, and a general sense of "more fun" are nice to have too.
If every iteration of Windows got closer to those goals, Steve Ballmer could quit making excuses. By comparison, when was the last time anyone saw Steve Jobs making apologies for the Mac OS? Steve B. - ARE YOU LISTENING?
I just wanna say that in 2013, in a dual boot of Windows Vista and 7 Ultimate 64-bit, Vista doesn't run that bad. Certainly not as good as my 7, but still good enough for me not to go back to XP. I switched my dual boot from XP/7 to Vista/7 because XP is going to lose support soon, whereas Vista has a few more years and plays my legacy games as well as XP does. I have a Linux copy that I'm hesitant to try simply because I don't want to figure out command lines and all that mumbo jumbo. I use my computer mostly for gaming, some schoolwork, and the internet.
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.