Howard Schmidt on Software Vulnerabilities

Howard Schmidt was misquoted in the article that spurred my rebuttal.

This essay outlines what he really thinks:

Like it or not, the hard work of developers often takes the brunt of malicious hacker attacks.

Many people know that developers are often under intense pressure to deliver more features on time and under budget. Few developers get the time to review their code for potential security vulnerabilities. When they do get the time, they often don’t have secure-coding training and lack the automated tools to prevent hackers from using hundreds of common exploit techniques to trigger malicious attacks.

So what can software vendors do? In a sense, a big part of the answer is relatively old fashioned; the developers need to be accountable to their employers and provided with incentives, better tools and proper training.

He’s against making vendors liable for defects in their products, unlike in every other industry:

I always have been, and continue to be, against any sort of liability actions as long as we continue to see market forces improve software. Unfortunately, introducing vendor liability to solve security flaws hurts everybody, including employees, shareholders and customers, because it raises costs and stifles innovation.

After all, when companies are faced with large punitive judgments, a frequent step is often to cut salaries, increase prices or even reduce employees. This is not good for anyone.

And he closes with:

In the end, what security requires is the same attention any business goal needs. Employers should expect their employees to take pride in and own a certain level of responsibility for their work. And employees should expect their employers to provide the tools and training they need to get the job done. With these expectations established and goals agreed on, perhaps the software industry can do a better job of strengthening the security of its products by reducing software vulnerabilities.

That first sentence, I think, nicely sums up what’s wrong with his argument. If security is to be a business goal, then it needs to make business sense. Right now, it makes more business sense not to produce secure software products than it does to produce secure software products. Any solution needs to address that fundamental market failure, instead of simply wishing it were true.

Posted on November 8, 2005 at 7:34 AM • 57 Comments

Comments

another_bruce November 8, 2005 8:41 AM

i demand the right to sell you kludgy, insecure software with absolute immunity from any consequences!

Roy Owens November 8, 2005 8:51 AM

Generally, the company gives management all the decision power, including vetoing any initiatives from the developers. Those worker bees who toe the line now will be around later to reapply for their jobs; those who don’t get downsized. Meanwhile, managers keep getting promoted.

Engineers don’t get to do things the way they’d like: they are forced to appease management.

Remember when automotive management introduced ‘planned obsolescence’? Cutting corners became the spirit of the day, sending Detroit into the cellar and helping Japan become preeminent through better engineering.

I suspect Schmidt’s angle is to promote R&H Security Consulting as allies of management to protect them from stockholders so they don’t get fired for bad policy decisions.

Lyger November 8, 2005 9:27 AM

“Right now, it makes more business sense not to produce secure software products than it does to produce secure software products.”

This is a problem with the customer base, really. A more educated customer base that chooses to purchase the most secure products it can get would change this. Before you complain that this requires all of us to be security experts, I would submit that it’s possible to be a layperson and still make informed decisions, even if only by knowing whom to ask for information. There are plenty of people, like our good Mr. Schneier, who can speak to such things in terms that allow the non-technical among us to say, “This is a good product in terms of addressing concerns about X threat.”

Of course, whenever you rely solely on others to provide information, you’re going to have a problem – some basic understanding is likely a must, to allow you to spot blatant BS from vendors or analysts.

Tim Vail November 8, 2005 9:33 AM

I might not know as much as you do about this, Bruce, but I think, based on his new article, Howard has quite a number of good points. It is a game of how to get the right people to have the incentive to make their product secure while at the same time satisfying customers and keeping costs reasonable.

Sure, liability could help make management more interested in that, which in turn would trickle down to the programmers. But I think Howard is concerned about the cost that liability would impose on the software industry: driving up prices, and perhaps hurting the programmers by cutting their pay. I feel this is a valid concern. I feel that our society has become too lawsuit-happy, and in some industries it is getting way out of control. Just think medical malpractice…the cost of malpractice insurance for OB-GYNs is in the stratosphere in some areas. Doctors are quitting that field because they cannot afford it, and I’m sure the cost of delivering babies is going up to cover it. Basically, people need to understand they do not really have the right to be protected from every type of harm. People can try their best, but at the end of the day, bad things will happen sometimes, somewhere.

Sure, some protection is necessary, but it is a fine balance. Right now, in the computer industry, the balance is probably leaning too much in favor of the vendor.

Andre LePlume November 8, 2005 9:36 AM

This sounds like blaming the victim to me.

Is SW quality low? It must be that coders don’t know or don’t care. Let’s give them training and “incent” them to produce better code.

Meanwhile, let’s specifically preclude the market from applying the same slap of the invisible hand to us as managers of the business.

Roy Owens’ conjecture about the “angle” to this resonates, big-time.

FG November 8, 2005 9:44 AM

Can you point me to the best secure-coding book? I want to read it, see if my code is secure, and improve. My opinion is that companies have the power to choose the best developers they can afford from the market. More money, better performance.

Tim Vail November 8, 2005 9:56 AM

Lyger:

Yeah, it is easy to say “oh, it is a problem with the customer base.” But the problem is that the customer base often does not have all that much control over what products it uses. A lot of security decisions are really out of our control. There is nothing in the industry that compels vendors to make those security minutiae public. Even security experts are often forced to use insecure products simply because everyone else is using them.

Perhaps that is the goal for the long run; the computer industry is in its infancy. You can’t expect every customer to go researching for information on this — we are already bombarded with enough information as it is. Even what security experts like Bruce say is drowned out in the sheer amount of information we have to process daily.

another_bruce November 8, 2005 10:25 AM

@tim vail
your post reveals a number of common misconceptions. excessive liability is not the cause of higher malpractice premiums for ob-gyn’s, rather, 1) the insurance industry lost a lot of money when the stock market tanked in 2000 and needed to get it back somehow, and 2) the insurance industry enjoys an exemption from antitrust law (unique in american business except for major league baseball) which enables them to collude; the only tort reform needed is the repeal of this exemption. america is not a “lawsuit happy” society; the media, which is owned by big corporations, distorts this to gull the sheep. taking your position further, why should automakers be liable for gas tanks that explode and suv’s that roll over due to negligent design? if you’re stopped at a light and lightly tapped in the rear and your tank blows, crisping you and your family, at the end of the day that’s just something bad that happened to you, part of the cosmic jest, so why should this cost be distributed to me in the form of higher prices?
a corporation is a charter issued by the state to a group of people enabling them to invest money in a new, legally recognized but artificial person under the law upon the assurance that their potential liability for the acts of the artificial person will be limited to the amount of their investment. these artificial people have evolved to possess huge agglomerations of capital. the supreme court foolishly granted them some of the same 1st amendment free speech rights we natural people used to possess, thereby proportionately diluting and muting our own free speech. the artificial people are now in the process of gaining control of our republic from us. when next you look in the mirror, ask yourself if the man looking back at you stands on the side of the artificial people, or on the side of the natural people where i am. do this primarily for your own benefit, you don’t have to tell us what the answer is.

Lee Short November 8, 2005 10:26 AM

Roy & Andre are on the right track…the 800-lb gorilla that Schmidt is studiously avoiding is management incentives. Programmers don’t set schedules; managers do. If managers don’t allocate time for security reviews, they won’t happen consistently. The fact is, giving programmers tools, training, and incentives is just not enough: you need to give them budget & schedule too…and Schmidt’s proposal does nothing to address this issue. Any real solution to the problem must give management an incentive to allocate budget and schedule for security reviews. Schmidt’s proposal is only so much security theater, and this should be obvious to anyone who’s really done software development. Schmidt points out the downsides to the liability solution (no solution is perfect), but it at least strikes at the real problem.

Once you give incentives to the management, then everything else will eventually follow.

Bryan@adminfoo November 8, 2005 10:29 AM

Consumers, generally speaking, are not equipped to make good software security choices. We need some relatively neutral third party to start doing objective security reviews of software as it hits the market.

But imagine being that reviewer: what do you test for? And how do you keep yourself from just becoming another bugtraq, listing vulns found by everyone else?

Or imagine making software vendors liable for security issues. First you have to define the scope of security issues your law covers. Which won’t be a simple problem.

Richard Rodger November 8, 2005 10:44 AM

I agree completely with Lee Short – forget about tools and training, developers need TIME to do security reviews and analysis. I have never worked in an organisation (except the one I founded) where any time was given to doing real security analysis.

And penalties against the company will provide this time by creating the management incentive to invest in security.

On the other hand, maybe management should invest in a polyphasic sleeping course for all developers so that they have 8 extra hours a day – now that’s a real management initiative!

Mike Sherwood November 8, 2005 10:52 AM

“If security is to be a business goal, then it needs to make business sense.”

Any ideas how to do this other than by holding businesses liable for their products? Security isn’t a selling point for most products, so it won’t increase revenue. However, it will add to the cost of development. The cost of implementing security has to be offset somehow. Risk of litigation is a cost that influences many business decisions.

The problem with liability as a solution is that it’s easy to circumvent. Trying to apply liability to internally developed applications and free software would not make any sense. Small vendors would be able to claim that they aren’t commercial software vendors as much as custom application development service providers with common requirements across their customer base.

Liability only serves as a way to go after large companies. This doesn’t seem too unreasonable since a large software company is more likely to have the resources to improve their products and the number of users who would benefit would be larger.

Mike Sherwood November 8, 2005 11:06 AM

@Bryan@adminfoo

The benefit of legal liability is that it does provide the incentive to management to provide the resources needed.

Liability also doesn’t require a law that enumerates which security tests must be done and, by omission, which you can skip. It’s open-ended and lets the vendor decide which risks to accept and which to ignore at their own peril.

For a random made-up example: what would you do if you needed to be able to move files from one system to another as part of your program, and you would be liable for any security problems you caused? Would you write something quick and dirty to do the job, or build on top of something like scp, which provides the needed capability and transfers the risk to another party that specializes in that functionality?
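
To make that concrete, here is a minimal sketch of the second option, assuming a Unix-like system; the host, paths, and flags are invented for illustration:

    /* A minimal sketch, not a real design: delegate the transfer to scp
       instead of writing quick-and-dirty transfer code. The host, paths,
       and flags below are invented for illustration. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    int main(void) {
        /* execlp() replaces this process with scp, so the crypto and
           protocol work is done by a tool that specializes in it. */
        execlp("scp", "scp",
               "-B",                             /* batch mode: never prompt */
               "/var/tmp/report.dat",            /* local source (example)   */
               "backup@example.com:/incoming/",  /* remote target (example)  */
               (char *)NULL);
        perror("exec scp");                      /* reached only on failure  */
        return EXIT_FAILURE;
    }

The point is the division of labor: if there is a flaw in the transfer protocol, the risk has been transferred to a party that specializes in exactly that problem.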

Antonomasia November 8, 2005 11:14 AM

“…faced with large punitive judgments, a frequent step is often to cut salaries…”

Which is one reason why punishment must affect the senior managers personally. Disqualification, jail, criminal records …

Gerd Rausch November 8, 2005 11:23 AM

Only people who do work are able to make mistakes. A system like this will render companies entirely dysfunctional, and Wally will become the model worker of the future. Every developer who has some brain left would change profession. Just being assigned blame while not having the matching decision-making power is not a good place to be in.

Alan De Smet November 8, 2005 11:42 AM

Howard apparently works in some alternate reality software industry where the problem is that the developers aren’t being held accountable.

While only anecdotal, my experience and that of others I know in the industry is the opposite. Most developers want to do the Right Thing, to write secure software. They know how; the problem isn’t “tools and training,” it’s time. When schedules get tight (and they always do) something needs to be sacrificed. Security is often one of the first things to be jettisoned.

Zwack November 8, 2005 11:45 AM

Tim Vail ->
“I feel this is a valid concern. I feel that our society has become too lawsuit-happy, and in some industries it is getting way out of control. Just think medical malpractice…the cost of malpractice insurance for OB-GYNs is in the stratosphere in some areas. Doctors are quitting that field because they cannot afford it, and I’m sure the cost of delivering babies is going up to cover it. Basically, people need to understand they do not really have the right to be protected from every type of harm. People can try their best, but at the end of the day, bad things will happen sometimes, somewhere.”

I really hate that comment. Why? Because it’s not strictly speaking true…

From the first hit on Google “The cost of medical malpractice insurance has been rising, after almost a decade of essentially flat prices.”

So, for one decade, medical malpractice insurance stayed at the same price. I can guarantee that my car insurance didn’t. Even shopping around, I’m paying more now than I was five years ago.

“Rate increases were precipitated in part by the growing size of claims, particularly in urban areas. Among the other factors driving up prices is a reduced supply of available coverage as insurers exit the medical malpractice business because of the difficulty of making a profit and rapidly rising medical care costs. ”

Despite that, medical malpractice insurance for ob/gyns costs approximately 5.5% of their gross receipts. To put that into perspective, their medical supplies cost 2.6% and their rent 4.8%. These figures are of course averages. But their median income in 2001 was $231,000 which was up 3.5% from the year before.

These figures came from “http://www.centerjd.org/private/papers/MDHypocrites.pdf”
which cites a long list of sources.

These are the people who want to limit damages to $250,000, when their annual salaries are in that range.

Given that I work for a healthcare organisation, I personally don’t feel that damages should be limited so low when you could be talking about compensation for a lifetime of pain and suffering.

I realise this is totally off topic, but the argument is the same as the one used by software companies… Liability insurance/training our staff/providing tools would cost too much for us, so we’re not going to do it. They need the incentive where training their staff and providing tools is considered a sensible use of money (maybe because it lowers the rates for their liability insurance)…

Z.

JohnJ November 8, 2005 11:57 AM

@Mike Sherwood: “Security isn’t a selling point for most products, so it won’t increase revenue. However, it will add to the cost of development.”

I wonder about this. First, there’s no reason why security can’t be a selling point. Safety used to not be a real selling point for cars, but look at car ads today. They frequently tout NHTSA or IIHS ratings. Also, consumers buy into security software already by way of AV, anti-spyware, and anti-spam solutions. They may not understand it, but they have seen enough media reports to realize there’s some vague importance to it.

Second, when doing new, ground-up development, I don’t believe developing securely has to add a lot of cost to the development process. Once secure development is a part of your design philosophy, the ‘overhead’ of secure development should be minimal. It would boil down to refinement of techniques and testing methodologies and would not require a lot of extra processes.

Bruce Schneier November 8, 2005 12:23 PM

“‘Right now, it makes more business sense not to produce secure software products than it does to produce secure software products.’ This is a problem with the customer base, really.”

Yes and no. There are economic models for situations where the seller knows much more about the product than the buyer. They’re called “markets for lemons,” and bad products drive good products out of those markets.

This is pretty much where we are in the software industry.

R... November 8, 2005 12:49 PM

I find Howard’s essay to be consistent with someone whose contributions to Information Security are limited to reciting the same old platitudes to incipient executives on the lecture circuit. I work in Information Security for a major institution where 75% of our applications are developed in-house. The schedule demands of our internal customers are so great that most code would not be validated if my department didn’t get involved.

It is human nature: the project is due in a week, the function works fine with anticipated input, so why validate the code and be forced to re-write the algorithm if there are errors? The Project Managers aren’t pushing for the code to be secure, just done quickly so they can appease their stakeholders.

Major software houses aren’t going to give the developers the incentive to write secure code unless they see something in their bottom line. The best way to affect the bottom line of these companies is to force them to write secure code as part of a litigation avoidance plan. I don’t think anyone is proposing a system where frivolous lawsuits are the norm, but companies need to be held accountable for these issues. Especially since much of it appears to be avoidable. If companies want to avoid being sued, they should put in a security process and document that they’ve taken reasonable steps to make sure their code is safe.

Hoping developers take “pride in their work” is a laughable alternative to a sound, risk-based plan for eliminating defective software.

Paul Crowley November 8, 2005 12:55 PM

I see a slight problem in applying the “lemons” model to software, in that it’s usually used for goods with significant unit costs. Once you sell a used car, you don’t have it any more. The fact that in software the unit costs are minuscule compared to the fixed costs may change things.

Chase Venters November 8, 2005 1:07 PM

Holding either developers or vendors liable for security bugs in software products could be very damaging to the open source industry.

In one scenario, where a developer is held liable, you’d see a large drop-off in the number of open source contributors, as the required “programmer’s insurance” would probably be expensive.

In the scenario where the developers aren’t held liable since they are giving something away for free, you still have a problem, because the big powerhouses like Red Hat, which end up funding a lot of development and promoting the product in the first place, would then have to take responsibility themselves for the software they ship.

Even if it was just, say, the Linux kernel, this would be a nightmare – their developers didn’t write every line of code, and they’d be taking a huge risk to sell it under those circumstances.

Also keep in mind that your typical Linux distribution isn’t just the core operating system, as in the case of Windows – they’re shipping the OS along with an entire application suite.

Can anyone take responsibility for every line of code? Absolutely not!

Don’t get me wrong – I think the proprietary software industry needs a big firm kick in the ass to make them care at all about security. But big companies like Microsoft can still afford not to care – they have their attorneys. The costs will probably just end up getting passed back down to the consumer (especially if the liability were a magic bullet to put a stop to their open source competition).

I think in the long run open source is a better approach anyway, and I’d hate to see anything happen to hurt the excellent rate of adoption it has finally achieved.

Pat Cahalan November 8, 2005 1:16 PM

I’ve brought this up on other threads on this same topic, but to add it here:

One specific problem with the software marketplace is that there is a critical mass of customers who don’t do a risk analysis on their product purchasing decisions. Since this critical mass outweighs the population that cares about security by a significant margin, companies have to meet the expectation of this population.

A vendor can’t compete in the marketplace, even with the best, most secure product ever, if it has to charge 10x what its competitors charge because they don’t spend development time on security. A technically better product does not always equal a competitive advantage, unfortunately.

Enforcing product quality (through legislated regulation or liability) removes the incentive to favor lower price at the expense of security. The marketplace becomes rebalanced in favor of product quality.

ARL November 8, 2005 1:28 PM

If software publishers should be liable for mistakes in their code (security problems), should book publishers be liable for mistakes in their publications? Security experts liable if their encryption method is cracked?

Davi Ottenheimer November 8, 2005 1:32 PM

@ Chase

Valid concern, but open source is exposed for review and therefore has a far better foundation (moral and technical) from which to beg off any liabilities. You don’t have to trust the person/company releasing the code, because you can evaluate it yourself or hire someone to do it for you.

Consider the movement in the US that says the food producers/sellers should be responsible for side-effects unless consumers can read all the ingredients and properly evaluate risks. This seems to have finally had an impact even on McDonalds:

http://davi.ottenheimer.com/blog/?p=56

Proprietary code requires some kind of assurance from an independent assessor to have any credibility at all (e.g., “certified organic”). But then again, look what happened with Skype. They hired a big gun to come do a security review that lasted four months, and just as he was about to release his report, other critical flaws were announced in the wild:

http://www.theregister.co.uk/2005/11/07/skype_vuln_analysis/

I find myself more in agreement with Howard Schmidt in this document than I have in my prior discussions with him. However, this part gets me:

“introducing vendor liability to solve security flaws hurts everybody, including employees, shareholders and customers, because it raises costs and stifles innovation.”

No, it shifts costs and shifts innovation. Sometimes all the whiz-bang functionality really is worth getting to market on time, and sometimes critical security bugs should be addressed as well to prevent a predictable disaster (especially where the true costs are an externality to the vendor). The market is broken right now, and without the pressure of liability on vendors, what will prevent unsafe and insecure development tomorrow? Not fifty years from now, but tomorrow…

Chase Venters November 8, 2005 1:33 PM

I think Howard is really on the right track. I agreed with his essay completely.

You don’t want to make your developers pay for their mistakes, but you do want to reward them for trying hard to avoid making them.

This is similar to how things work in open source. As an open source developer, I do not want to publish any code where I’ve been obviously negligent – it’s my reputation at stake! (Moreover, when the incentive is developing quality tools for me first, and then others, to use, I’ll be very anal about architecting perfection.)

Asking vendors to be liable is a big problem. Does “vendor” in this case exclude the case of a non-profit foundation like KDE eV? Or is “vendor” limited to for-profit companies like Red Hat, whose very existence would be threatened by the introduction of liability for software they bundle but don’t personally produce?

I think a lot of the comments on this subject are along the lines of “software companies don’t care about security because they don’t have to.” You’re absolutely right. But even if you win the battle of forcing them to care, even somewhat, about security, then you’ll find that their products start to suck in new and exciting ways.

I think consumers and technology would both be better served by watching proprietary software get eaten alive by the open source movement. In fact, if you want to talk about legislation and liability, there are some places I think “open source” should be mandatory, such as breathalyzers, voting machines and e-commerce systems.

In short, making vendors liable is treating the symptoms rather than the disease.

Chase Venters November 8, 2005 1:46 PM

One other thing that should not go overlooked in this discussion is that there is a critical difference between a mere product flaw and a security bug.

If a car has a problem with tires exploding, hold the manufacturer liable. But don’t hold them liable if the tires fail to hold up against the neighborhood punk kid’s pocket knife.

I got very irritated last week because one of my co-workers was encountering great success in fathering the design of a network service for NAT traversal that he claimed to be ‘more secure than TURN’.

You see, limitations in SDP exchanges in VoIP protocols, and the lack of mandatory cryptography, mean that you can’t make any assumptions about where someone’s voice data might be coming from. TURN thus will act just as a phone does and accept any RTP payload delivered to the right spot.

His “solution” was to “lock in” on the source of the first voice packet. Unfortunately, this design is terribly stupid – by spoofing DNS requests from a series of ports on his so-called TURN killer to random DNS servers on the internet, an attacker can mount a very effective denial-of-service attack that will drop one leg of voice on a large percentage of calls serviced by the device. Worse yet, the attacker doesn’t just get the cover of spoofing his packets; the DNS servers do the work for him, making it all the more difficult to track.

(The product was also vulnerable to other attacks that could be used to initiate a man-in-the-middle attack.)

I pointed out these vulnerabilities soon after reading the product design. The response I got was “this isn’t intended to be a security product” (ignoring the claims of beating TURN in security before). That is of course like Microsoft saying that Outlook isn’t intended to be a security product…

The best indication I have at the moment is that this product is ‘on hold’. Had the company decided to really develop and sell it after having been told of its flaws, I would hope the deliberate negligence would have been actionable.

But this was a very specific instance under radical circumstances. Liability for accidents is an entirely different thing, especially when they require the willful crime of a third party to become harmful.

ARL November 8, 2005 2:31 PM

Most/many security issues in software are the result of some kind of defect. “Not a security product” is an interesting statement. Is it the product’s fault that DNS is not a secure method of finding the source of an IP connection? Is it the software’s fault that the OS uses a resolver that relies on DNS to get an answer?

Is it the fault of the lock company that someone makes a copy of your key and then uses it to open your door? Is it the fault of the lock company that most locks in use today can be picked? We allow a device that protects us against real physical attack to be pickable but want to sue when the software does not provide better protection?

Complex issue to me.

Brent Dax November 8, 2005 2:41 PM

To avoid causing problems for Red Hat, simply declare that damages are limited to (some multiple of) the cost to license the software. Red Hat can explicitly charge for support, not licensing, making their liability some multiple of $0. (So can Microsoft, but only by giving away Windows.)

Davi Ottenheimer November 8, 2005 2:58 PM

@ ARL

“We allow a device that protects us against real physical attack to be pickable but want to sue when the software does not provide better protection?”

Interesting analogy. In most cases I think people would say their locks, even though pickable by some, still manage to provide a “reasonable” amount of security, whereas their software does not.

Using a standard risk model shows the difference. For example: risk = (assets / countermeasures1) × (vulnerabilities / countermeasures2) × (threats / countermeasures3). The locks factor into countermeasures2, but you still have to assess the asset value, threats, etc. to understand your level of risk.
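
Spelled out with invented numbers (the scores, scale, and countermeasure values below are all illustrative, not from any standard), the model computes like this:

    /* Minimal sketch of the ratio-style risk model above. All scores
       are invented, on an arbitrary 1-10 scale. */
    #include <stdio.h>

    int main(void) {
        double assets = 8.0, vulnerabilities = 6.0, threats = 7.0;
        /* countermeasures1..3 discount each factor (assumed values) */
        double cm1 = 2.0, cm2 = 3.0, cm3 = 2.0;

        double risk = (assets / cm1)
                    * (vulnerabilities / cm2)
                    * (threats / cm3);

        printf("relative risk score: %.1f\n", risk);  /* 4.0 x 2.0 x 3.5 = 28.0 */
        return 0;
    }

Halving a countermeasure value doubles that factor’s contribution, which matches the intuition that a weaker lock raises risk proportionally rather than absolutely.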

The risk related to door locks is something most people can easily calculate and weigh the costs of. I believe it is much harder, if not impossible, for anyone to comprehend how Microsoft could raise prices for their OS in 1994 due to a new set of “security” features and then proceed to produce a decade’s worth of known insecure code, as I commented here:

http://www.schneier.com/blog/archives/2005/10/windows_onecare.html#c17398

David Mohring November 8, 2005 8:50 PM

To quote the movie “Fight Club”:

Narrator: A new car built by my company leaves somewhere traveling at 60 mph. The rear differential locks up. The car crashes and burns with everyone trapped inside. Now, should we initiate a recall? Take the number of vehicles in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X. If X is less than the cost of a recall, we don’t do one.

Business woman on plane: Are there a lot of these kinds of accidents?

Narrator: You wouldn’t believe.

From “Our Data: an appeal – a ‘Plimsoll line’ for applications”
http://groups.google.com/group/comp.security.unix/msg/3b07f66108b54ca5
QUOTE
Bruce Schneier claimed that for change to occur, the software industry must become liable for damages from “unsecure” software; however, historically this has not always been the case, since most businesses can insure against damages and pass the cost along to the consumer.
UNQUOTE

Bruce Schneier November 8, 2005 9:30 PM

“‘If security is to be a business goal, then it needs to make business sense.’ Any ideas how to do this other than by holding businesses liable for their products?”

Basically, the idea is to make it more profitable for software companies to produce secure software. That can be accomplished two basic ways: increase the benefits of secure products, or increase the costs of insecure products. If people would buy based on security, then companies would naturally produce secure products. But since they do not, I look towards the “increase the costs of insecure products” solutions: regulation and liabilities.

Bruce Schneier November 8, 2005 9:32 PM

“I wonder about this. First, there’s no reason why security can’t be a selling point. Safety used to not be a real selling point for cars, but look at car ads today. They frequently tout NHTSA or IIHS ratings. Also, consumers buy into security software already by way of AV, anti-spyware, and anti-spam solutions. They may not understand it, but they have seen enough media reports to realize there’s some vague importance to it.”

Of course security can be a selling point. But it isn’t. The appearance of security is a selling point. Actual security is more or less irrelevant.

And yes, it would be great if the software buying public would wise up. But I’ve pretty much given up waiting.

Bruce Schneier November 8, 2005 9:34 PM

“Holding either developers or vendors liable for security bugs in software products could be very damaging to the open source industry.”

There are lots of ways to deal with open-source in a world of software liabilities. The easiest way is to exempt goods where there is no contractual relationship between producer and user. Then, interestingly enough, you’ll see companies pop up that “support” open-source software by, in part, accepting liabilities.

Davi Ottenheimer November 8, 2005 10:21 PM

@ Mark J

Good link. I fished around and found this awesome quote that seems to tie software bugs to automobiles, and then back to open source again:

http://wired.com/news/autotech/0,2554,63615,00.html

“‘There is really no time in my schedule for sitting around a car dealership listening to some fat guy in a clip-on tie tell me that the problem is my fault,’ [a 2002 car owner] said. ‘Instead of explaining anything to me they just pull out a warranty sheet with a highlighted portion indicating that they don’t cover Check Engine light problems.’

A bill floating through Congress could help people like Seymour by forcing automakers to share diagnostic codes with car buyers and independent mechanics. The Motor Vehicle Owners’ Right to Repair Act would give Seymour the means to determine whether the Check Engine light signaled another gas cap vagary or a major oil leak. The legislation would also allow Seymour to choose an independent — and possibly cheaper — repair shop instead of being forced to go to the dealership.

The legislation argues that consumers own their vehicles in their entirety and should be able to access their onboard computers.”

I think that’s “own” as in “beer”, not speech…

Davi Ottenheimer November 8, 2005 11:05 PM

@ Bruce

“Of course security can be a selling point. But it isn’t. The appearance of security is a selling point. Actual security is more or less irrelevant.”

I agree with that to a point, but I am also optimistic about the standards and regs getting established, which will help a great deal with getting beyond appearances. ISO 17799:2005 is much improved over the prior release, for example, and more likely to translate to real security. The NHTSA ratings are relevant, but only for crashes. I think a better example is the ISO, or even the J.D. Power and Associates awards for quality, even though that’s not a regulation (still an incentive):
http://www.jdpower.com/awards/industry/

At the end of the day, I think you should try to get Howard to bring more clarity to this sentence in his article:

“employers should consider providing a system of financial rewards for developers who write secure code as a way to offer positive incentives”

He totally skirts the issue of what will cause employers to start rewarding developers for secure coding practices. I certainly don’t think QA tags are the answer — why would tags mean anything more to the software consumer than they do for people buying clothing (his example)?

Speaking of car companies with quality awards, I’m starting to wonder what will be done about the Toyota dealers who ignore the Prius service bulletins or otherwise knowingly sell cars with known critical software bugs:

http://davi.ottenheimer.com/blog/?p=77

Bryan@adminfoo November 9, 2005 3:04 AM

I wanted to call attention to what Chase Venters said:

“If a car has a problem with tires exploding, hold the manufacturer liable. But don’t hold them liable if the tires fail to hold up against the neighborhood punk kid’s pocket knife.”

It’s a damn good point!

ARL November 9, 2005 6:44 AM

I keep hearing that we should hold “closed source” companies liable for defects but “open source” or “free software” companies should have no burden. I thought the goal was to push for secure software?

If making the producer liable will put pressure on them to write good code, then it should be applied to all products evenly. What good does it do if the “free” web server has an exploit that costs billions of dollars?

This logic seems to be pleading a special case to me.

Eric November 9, 2005 8:39 AM

ARL, placing liability on open source software authors won’t cause them to write more secure code. It will cause them to write no code at all. Would security really be better off without OpenBSD?

I’ve written plenty of open source software, and I’ve always been extremely careful about security. I audit my code, I write test cases, and in some cases, I even build attack tools for my own software–and I test that those attacks fail before every release.
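
A minimal sketch of what such a release-gate test can look like; parse_name() here is a hypothetical stand-in for real input-handling code, not anything from my actual releases:

    /* A sketch of an "attack must fail" release-gate test. parse_name()
       is a hypothetical stand-in for real input-handling code: it must
       reject any name that does not fit the caller's buffer. */
    #include <assert.h>
    #include <stdio.h>
    #include <string.h>

    static int parse_name(const char *in, char *out, size_t outlen) {
        if (strlen(in) >= outlen)
            return -1;            /* reject oversized input outright */
        strcpy(out, in);          /* safe: length was just checked   */
        return 0;
    }

    int main(void) {
        char name[32];
        char attack[1024];

        memset(attack, 'A', sizeof attack - 1);    /* oversized "payload" */
        attack[sizeof attack - 1] = '\0';

        assert(parse_name(attack, name, sizeof name) == -1); /* attack fails */
        assert(parse_name("alice", name, sizeof name) == 0); /* normal input */

        puts("attack regression passed");
        return 0;
    }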

But writing truly secure software is basically impossible. I’m sure that despite all my efforts, there’s some way to remotely compromise my software. (For example, I trust snprintf to actually work as documented, which is not a safe assumption on a very few old Unix systems.)

As a commercial software author, I’m prepared to discuss liability. I do have some doubts–should we really punish people for failing to secure Clausewitz’s “position of the interior” against every possible attack? Is it even economically rational to make software that secure? Eliminating every bug typically raises the cost of software by one thousand percent or more–do you really want to pay $5,000 for a word processor? Counterpane’s solution may make better economic sense: accept that software is imperfect and watch it like a hawk. But if society really prioritizes security over all else, perhaps strict liability is worth considering.

As an open source software author, though, I wouldn’t risk liability. I’d simply stop releasing my work to the public. And there’s some pretty big companies who are using my code who would (ironically enough) be forced to switch to less secure commercial alternatives.

Chase Venters November 9, 2005 9:46 AM

@Bruce

The other thing that worries me is the notion that this issue should be hammered out by the courts. It is rare that I see a technology case before the courts in which I feel that the judge knows enough about the technology to make a fair decision. The uneducated decision of a few people could result in great harm to the masses.

You can bet that if Microsoft thought it was going to be held liable for its software bugs, it would campaign and lobby to make sure that its powerless open source competitors were included along with it. And if you take a look at what’s happening with Massachusetts’s decision to go to OpenDocument, you’ll see that they just might have the capability to be ‘political’ to a scary degree just when it suits their interests best.

JohnJ November 9, 2005 11:22 AM

Perhaps the solution to the open vs. closed source liability question is a limit on liability. In general I’m not in favor of such limits, but it would be a starting point.

Say, per event, the liability limit is 10x the original purchase price (or MSRP, whichever) of the software. Free software’s liability limit is $0 × 10 = $0 (a true you-get-what-you-pay-for scenario), while a liability victory against Windows XP could run as high as $299 × 10 = $2,990 (perhaps less for OEM/upgrade versions).

another_bruce November 9, 2005 11:29 AM

@chase venters
your worry about this issue being “hammered out by the courts” is baseless. the courts are the very best place for this hammering, because they belong to you and me and everyone else. your courthouse is the palace of your rights as a citizen, do not succumb to industry blandishments to hand over your key to this palace, you’ll never see it again. your apprehension that judges don’t know enough about technology to make fair decisions is also baseless. it is unreasonable to expect that a judge, learned in the law, will also be an expert in the subject matter that comes before her, that’s why we use expert witnesses to teach the judge what we want her to know. like software itself, our system isn’t perfect but it’s the best available. lawyers aren’t really experts in anything, but we’re very quick studies. finally, how do you objectively determine a “fair decision”?

Chase Venters November 9, 2005 12:35 PM

@another_bruce

You’re right, I suppose, you can’t objectively determine a “fair decision”.

But I think calling my concerns baseless is a bit too far. Look at what the courts have done for us so far – we have wonderful marvels like the banishment of DeCSS and the obnoxious abuses of the DMCA.

These cases are being driven by an industry trying to push back against the threat that technology might cause another change in paradigm.

I believe that the law and the courts are the bodies responsible for defining our freedoms, not taking them away. And lately I haven’t seen any landmark decisions designed to safeguard my freedoms.

So on what do I continue to rely? The people. We created open source and we’re growing up from underneath Microsoft and the rest of the industry with the strength of tree branches. They’re busy hacking at us and pouring more concrete to delay the inevitable.

All it takes is a few more landmark bad decisions to seriously constrain our movement. Do you want the last years of your life to be governed by free information and free technology? Do you really have enough faith in today’s legal system to defend that? Are you so confident that our corrupt political system and confused legal system would do the right thing that you’d look forward to litigation that, depending on the direction it took, could do serious damage to this movement?

Actually, pardon me, and I don’t mean this statement as a personal attack, but perhaps what you are interested in is not free technology or free information. I may presume too much.

Seriously guys, I think the best way to deal with security is this:

A. Make those in charge of safeguarding our information (e.g., credit card processors) financially responsible for the defense of that data. Once a few of them get burned, they’re likely to wake up and start being more careful about who and what implements their solutions.

B. Continue to allow open source software to take over the industry.

C. Since government institutions really can’t be held financially responsible for the defense of data, move to mandate open standards and open software for these parties.

ARL November 9, 2005 1:07 PM

Eric,

That is my point in a way. If liability will cause people to write better software then it needs to be applied everywhere, which will hurt open source as well as small companies. Large companies will have lots of lawyers who will find ways to not be held accountable.

My position is that I don’t want a lawyer making the call on what is secure and who is at fault.

The nature of the Internet today is part of the cause of these issues. Someone on the other side of the world can attack your computer and nobody can tell you who did it. That is going to need to be fixed. You have to make the criminal liable for breaches of reasonable security.

I also think third party certification is a good idea. A UL laboratory for the net. Nobody gets on unless they either use certified software or have insurance. The cost of certification for “free” software could be handled by foundations etc.

Until these things are done, maybe the ISPs should be liable for someone being able to attack my computer without anybody being able to tell me where it came from?

pessimist November 9, 2005 2:55 PM

If the ongoing security bugs were new every time, then educating developers might make sense.

But they’re not new. They’re the same kinds of bonehead buffer overflows that keep causing problems.

You would think that within a single company like Microsoft, or even Apple, the benefits of reusable and secure software libraries would be self-evident by now. But if developers keep causing buffer overflows because they do unbounded fread()s or other low-level error-prone things, then what the heck is the value of reusable software? Remember, I’m not talking about a multi-party marketplace; I’m only talking about within ONE company.

There was a time when writing C code that used gets() could get you a severe reprimand, but apparently not even the most basic automated checking of “forbidden functions” is doable these days:
nm *.o | grep gets
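
For the record, the fix has been understood for decades; a minimal sketch of the unbounded read and its bounded replacement (illustrative only):

    /* The textbook bug class behind most of those advisories: an
       unbounded read into a fixed buffer versus the bounded version.
       Illustrative only. */
    #include <stdio.h>

    int main(void) {
        char buf[64];

        /* gets(buf) would copy input of ANY length into the 64-byte
           buffer: the classic stack smash. (C11 finally removed it.) */

        if (fgets(buf, sizeof buf, stdin) != NULL)   /* bounded read */
            printf("read: %s", buf);
        return 0;
    }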

Pat Cahalan November 9, 2005 3:46 PM

@ Chase, Bryan

“If a car has a problem with tires exploding, hold the manufacturer liable. But don’t hold them liable if the tires fail to hold up against the neighborhood punk kid’s pocket knife.”

“It’s a damn good point!”

No, it’s not… or rather, it is only if (a) punk kids are rare and (b) punk kids can only attack a very limited number of tires in a given time frame.

That isn’t the case with network-aware software.

Now instead we have a case where not only are the tires incapable of standing up to a punk kid’s attack… but each time a punk kid successfully slashes a tire, the car trundles around the neighborhood slashing other cars’ tires until it runs out of gas.

Not only that, the infected cars suddenly gain the ability to travel at the speed of light, so it’s topologically possible for the punk kid’s original successful attack to lead to a case where every car in the world that has those types of tires is infected in under 5 minutes, even with a horrible searching algorithm.

In this case, the tires had better be able to withstand a punk kid’s knife. Heck, they better be able to take a .50 cal bullet…

Pat Cahalan November 9, 2005 3:56 PM

@ another_bruce, Chase, et al.

re: effective court rulings

Gentlemen, remember that this isn’t going to be a switch we can flip and suddenly everything turns up roses.

Imposing regulations and liabilities on the software industry is going to result in years of court fights, litigation, bad judicial decisions, bad jury decisions, Supreme Court challenges returned to lower courts, NAFTA/EU challenges, international trade agreements, and probably several new millionaires in the legal profession in countries everywhere.

Regulations will be neutered by politicians catering to special interests, and at least one large software company will go bankrupt or have to lay off a large portion of its developer base. Large software companies will try to choke off open source software by expanding regulations into the open source market in an attempt to cut off the competition.

Maybe 10 or 20 years after the start of the whole mess things will be cleaned up to the point where we have an effective set of regulations, a stable software industry, an insurance industry willing to cover the potential losses, and meaningful standards for everyone to follow.

Yeah, it’s not going to be very entertaining for those of us in the business for those 20 years (and unfortunately for me that’s still well before my projected retirement age), but this infant industry of IT/IS needs to be un-“Wild West”-ed at some point, and the sooner we start, the sooner we’ll get to a stable, sustainable marketplace that isn’t crowded with junk.

another_bruce November 10, 2005 12:59 AM

@chase venters
you are correct that i’m not interested in free technology or free information. i’m interested in inexpensive, reliable technology that doesn’t expose me to major risks like losing all my data or having hackers steal all my money. most information is already free if you know where to look for it, as an occasional professional content provider, if i can gin up any new information that didn’t exist before, i expect to be paid top dollar for it. i agree with your recommendations a, b and c. i don’t share your fear that the legal climate might stifle open source, it’s an evolutionary thang like when mammals took over from dinosaurs and i doubt that a law can stop it.

Ari Heikkinen November 10, 2005 4:02 PM

Software companies are out there to make profits. I can’t see any other way to increase the security of their products than making software vendors lose money for bad security. That way, paying more attention to security will actually minimize their losses, which in turn means their profits will increase, and it will thus make sense to pay more attention to the security issues of their products.

elegie November 12, 2005 5:34 PM

Companies can help their hired developers produce better code. Perhaps companies should have the responsibility to do that, assuming that the means to help developers exist. It is possible that a developer might try to sabotage things for whatever reason by purposely producing insecure code; the company could dismiss the offending developer. It is also possible that companies might start to monitor the actions and results of their developers in case insecure code does arise. Of course, it is possible for companies to interfere with developers trying to produce secure code, e.g., by tight scheduling. Though it may be unfortunate, perfectly secure code may be impossible to produce. This should be taken into account. However, it should be possible to improve the security of the code that is presently produced.

Legal liability for insecure code might or might not be a good thing. If the insecure code comes from a company, the liability should probably fall on the company itself as opposed to specific developers. When software is produced by a company, the company as a whole is considered to be the “author” of the software. In product liability cases, does the liability fall on the manufacturing company or on specific employees? The corporate culture might contribute to insecure code. Determining whether specific code was liable for a certain incident could be surprisingly tricky; the interactions with other code could play a part. Not all software is produced by corporate or large collaborative efforts. Individuals sometimes produce software in the form of shareware, freeware, etc. Legal liability should not assume that software always comes from larger parties.

Carlo Graziani November 17, 2005 11:00 AM

I read with interest your economic analysis of the perverse incentive system that gives home PC users such appalling security. I agree, by and large, although I must say that I believe the analysis does not actually get to its final destination.

It strikes me as crazy to pretend that users and ISPs have no responsibility whatsoever for the bad behavior of home computers. Windows security is undoubtedly awful, but even an OpenBSD box can be compromised if its administrator’s security policy is poor.

At the moment, if some small business gets their website DDoS-ed by some hacker’s botnet, they have no recourse whatsoever. They bear the entire cost of a situation they did nothing to create, even if their site is secure.

If they were allowed to hold liable the ISPs hosting computers that participated in the attack, if those ISPs did nothing to detect and thwart it, then those ISPs would start serious malware activity-detection programs, and would automatically disconnect from the net any computer that suddenly started sending thousands of e-mail messages per hour, or started indiscriminately portscanning entire Class-B networks, or triggered any one of a dozen other “misbehavior” criteria.

Then, when your Mom (or mine, for that matter) complained to her ISP that her “Internet doesn’t work any more” and was told the reason, and informed that there’s a clean reinstall of the OS in her future, and a new bond to be posted since the one she posted when she signed up for Internet service is now forfeit, she’d get mad at whoever sold her her software. Possibly legally mad. Multiply that by millions of Moms (OK, Dads too), and suddenly you have a serious and urgent reason for software vendors to get serious about security.

You might also wind up creating an industry of low-cost, bonded home-PC security consultants, who could be hired to install firewalls, scan for active ports, check for rootkits, create customized “known-good” disk images for quick restores of compromised systems, etc. Home malware insurance might also suddenly spring up. These might arguably be good outcomes.

The point is, you can’t secure the Internet against incompetent operators by shifting all liability to manufacturers, any more than you can secure the highway system against incompetent drivers by shifting all liability to automobile manufacturers.

All you can — and should — demand of the industry is diligence. But even if the industry hired Bruce Schneier and Theo de Raadt to form a committee to vet and sign off on every version of every OS, users would still get rooted and exploited because of their own ineptitude.

Networked computers are not toasters. If ineptly managed, they damage the entire commons, not just the operator. I don’t really know what the best way is to ensure that good system management practices are widespread, but I’m pretty sure that protecting all users from the costs incurred due to their bad computing practices perpetuates this new variant of the Tragedy of the Commons.

Anonymous February 24, 2009 3:06 AM

Security? What does it mean?

Many of the problems with software are due to bad programming techniques, including many that cause information loss. Anyone selling software should make their best effort to ensure its correct behavior. There are many verification/validation techniques completely ignored by programmers. A program is correct if it meets its specification, i.e., it does what it is intended to do.

What about security issues? Why doesn’t the software you buy ensure that it meets stated requirements about security vulnerabilities? It may seem that such a statement would tell crackers “this software is protected against this and that, so use something else to crack it,” but crackers learn about known vulnerabilities in the same security forums that programmers use to protect systems against attacks. A published vulnerability checklist should therefore be part of the specification of the security on offer, and software providers should warrant free security upgrades that keep pace with updated vulnerability reports. A notice disclaiming such responsibility is not justified at all.
