Liabilities and Software Vulnerabilities

My fourth column for Wired discusses liability for software vulnerabilities. Howard Schmidt argued that individual programmers should be liable for vulnerabilities in their code. (There’s a Slashdot thread on Schmidt’s comments.) I say that it’s the software vendors that should be liable, not the individual programmers.

Click on the essay for the whole argument, but here’s the critical point:

If end users can sue software manufacturers for product defects, then the cost of those defects to the software manufacturers rises. Manufacturers are now paying the true economic cost for poor software, and not just a piece of it. So when they’re balancing the cost of making their software secure versus the cost of leaving their software insecure, there are more costs on the latter side. This will provide an incentive for them to make their software more secure.

To be sure, making software more secure will cost money, and manufacturers will have to pass those costs on to users in the form of higher prices. But users are already paying extra costs for insecure software: costs of third-party security products, costs of consultants and security-services companies, direct and indirect costs of losses. Making software manufacturers liable moves those costs around, and as a byproduct causes the quality of software to improve.

This is why Schmidt’s idea won’t work. He wants individual software developers to be liable, and not the corporations. This will certainly give pissed-off users someone to sue, but it won’t reduce the externality and it won’t result in more-secure software.

EDITED TO ADD: Dan Farber has a good commentary on my essay. He says I got Schmidt wrong, that Schmidt wants programmers to be accountable but not liable. Be that as it may, I still think that making software vendors liable is a good idea.

There has been some confusion about this in the comments, that somehow this means that software vendors will be expected to achieve perfection and that they will be 100% liable for anything short of that. Clearly that’s ridiculous, and that’s not the way liabilities work. But equally ridiculous is the notion that software vendors should be 0% liable for defects. Somewhere in the middle there is a reasonable amount of liability, and that’s what I want the courts to figure out.

EDITED TO ADD: Howard Schmidt writes: “It is unfortunate that my comments were reported inaccurately; at least Dan Farber has been trying to correct the inaccurate reports with his blog. I do not support PERSONAL LIABILITY for the developers NOR do I support liability against vendors. Vendors are nothing more than people (employees included) and anything against them hurts the very people who need to be given better tools, training and support.”

Howard wrote an essay on the topic.

Posted on October 20, 2005 at 5:19 AM • 76 Comments

Comments

Sorvi October 20, 2005 6:01 AM

I don’t think anyone should be sued until certain conditions under which a lawsuit is acceptable are clearly defined. Otherwise we’ll have a lawsuit every time someone’s XP crashes. (“I lost data worth millions!!”)
In fact, I would be more comfortable defining a narrow range of things you could sue for, and declaring everything else off limits by default. I’m leaning toward suing for hidden spyware, the kind of thing that demonstrates malicious intent, not just the inevitable bugs that show up. And certainly not allowing a lawsuit if someone discovers personal information is included in every Word file, claims he didn’t want it and that it should have been excluded by default, and now wants a million dollars.
Otherwise nobody will develop software anymore.

Julian Morrison October 20, 2005 6:10 AM

Liability is just another commodity: it’s basically bundled insurance. The right person to decide if insurance is useful would be the paying customer. The proper means of determining what’s on offer by default is according to what customers typically prefer to buy.

To force an industry to always bundle liability is to force prices up and options down, when (quite obviously) there’s plenty of market for the product on its own.

Yes, other industries are forced to bundle liability. No, it isn’t a good idea there either.

Tim October 20, 2005 6:13 AM

It’s difficult to see how the software industry could continue to operate in anywhere near its current state if such a law was passed. The cost of developing software would be so huge that most home users would be completely priced out of the market.

Ian Eiloart October 20, 2005 6:15 AM

Making the software manufacturers liable will help corporate users sue the right people, but what about domestic consumers? A consumer isn’t going to sue a software manufacturer when they get a virus.

Here in the UK, retailers are liable for replacing, repairing or refunding faulty goods. The choice of remedy is the consumer’s.

If I could take my virus infected PC back to the retailer for a replacement, then the retailer would have some incentive to sell more secure systems.

Ian Eiloart October 20, 2005 6:20 AM

In the article, you ask why software manufacturers rarely market their products as secure. I’ve often wondered, for example, why Apple never advertise their software as virus free.

I think the answer is threefold. First, one can never be sure any software has no security holes. Second, they’d then become a prime target, if just for the kudos. Third, if a virus hit, then consumers could argue that they’d been mis-sold, and would have a basis for asking for their money back.

Erik N October 20, 2005 6:38 AM

When writing general-purpose software, the developer, software company or distributor has no realistic chance of evaluating the risks involved in the vast number of possible uses. Nor have they any realistic chance of evaluating how the software will work when other software is installed on the same system. There are simply too many combinations.

Taking responsibility has an extra cost; are consumers willing to pay that extra cost? I think not, because consumers cannot compare that extra cost against the possible losses they would be insured against. The less you are willing to pay, the more responsibility you must take on yourself. It’s like an ordinary insurance: if you don’t have it when an accident happens, you alone have to cover the costs.

Of course, the developer should not treat the lack of legal responsibility as a pillow to rest on, but should always do his best, as should anyone else in whatever they do. But also, the user must take the required time to learn how to use the product correctly and safely.

For driving, people are required to get a driver’s licence, learn the rules of the road, take courses to stay in control on slippery roads, etc. They are also educated about security features, the need for seatbelts and the benefit of airbags.

There is no such requirement for the use of computers, even when these are connected to the internet. People just connect to the internet and of course they crash! And then they go whining about it being somebody else’s fault that they didn’t read the manual.

Whenever I try to explain security, people say it’s too difficult to understand, or at best they recognize the problem but deny that they should take part in the solution.

People perfectly understand the need to take their car to authorized mechanics, yet when they have a problem with their computer they let their next-door neighbour’s 12-year-old kid “fix” it. And people continuously ignore the need to update their system when new updates are out: problems only matter when they can’t get on with what they’re doing.

You can only hold vendors liable for insecure software if you also hold users liable when they do not follow secure practices. If a car thief steals your car and crashes it, it’s your car insurance that covers it. Why shouldn’t there be the same kind of responsibility if a hacker installs a bot on your PC and sends spam or hacks some other site?

Marc October 20, 2005 7:19 AM

Making companies liable (with some reasonable limits) is a good idea, but making programmers liable is not.

I am a software engineer, and 99% of the time you are working somewhere where no matter how much you wave your hands about how insecure systems are, they will not give you the time / permission to work on these issues. It’s always “more important” to get the next product out of the door rather than improve the quality of existing products or versions.

Michael Thompson October 20, 2005 7:32 AM

What a lot of people don’t seem to understand is that the quality of software is dependent on many more things than just the programmer. If an engineer had to build a bridge but was told to be finished in a month with a tiny budget, do you think it would be safe to drive on? The developer rarely gets to choose the platform, tools and definitely not the schedule (which is almost always driven by the marketplace need). If I’m given a short amount of time to develop part of an application, it will clearly have more bugs than if I had more time to develop meaningful tests, spend more time up front on design, etc.

There’s no way I should be liable for writing something that I was TOLD to write while under possibly unreasonable constraints.

This, of course, is a completely different issue than that of someone hacking together a virus or phishing site. These people should be flogged!

ARL October 20, 2005 7:54 AM

Software problems are an interesting issue. The idea that you could sue for a software defect would put an end to free software systems. After all, who wants to write a better web browser, only to find you are liable for damages? Exempting “free” software would be a bad idea as well; special privileges are not good for anyone.

Small companies would have an equal problem. Who wants to get involved in an industry where a complex situation could result in a huge lawsuit? Only large companies, with lots of lawyers, would be able to play. And they would demand contracts that limited liability, and we would be back in the same boat.

Turning the security problem over to lawyers and courts seems like a bad idea to me.

The bulk of the security threats to “defective” software (is the lock on your front door defective because it is easy to pick?) is a result of the nature of the Internet. It is very hard to trace where something comes from. I can’t make a crank phone call without someone being able to trace the source of the call, but the same is not true for email. ISPs are going to need to provide the same ability if we want to be able to hold criminals accountable and not software vendors. They are getting there, but privacy goes away.

mpd October 20, 2005 8:12 AM

Instead of assigning blame we should be spending our time finding solutions to the problem.

Current developers need to get up to speed on computer security and then be held to a certain standard. Maybe we need to revisit the idea of registering/licensing programmers?

New programmers need to be better educated. Schools need to require security courses in their curriculums. Though where they’re going to get qualified instructors, I don’t know.

And software companies need to bite the bullet and take the time and money needed to tighten up their software. They’ll just pass the cost on to us. We’re already spending tons of money on buggy software; I don’t think people would mind spending a bit more for software with a “Good Security Standards” stamp on it.

Finally, the end users need to take responsibility for what they do with their machines. They need to better understand what malware really is, how it works (even at a layman’s level) and how to fight it. AOL doesn’t help with their commercials that essentially say “let us worry about it for you.”

Once we get there we can start thinking about who’s to blame.

And instead of using negative reinforcement (threat of litigation) to get companies to start this stuff, why not use positive reinforcement (tax incentives)?

gulfie October 20, 2005 8:27 AM

Take a hypothetical security bug in XP; I know there are none, but imagine it.

For a lawsuit to be correctly targeted at a developer, I would have to:

A) track down the bug (get/learn the source)
B) track down every developer who has ever touched the code
C) attempt to figure out who is to blame in possibly millions of lines of code
D) figure out whether American law is even binding in Bangalore, and how much money they have to give

A) It’s very hard to track down a bug without source code. The only code this would be possible on is OSS… unless you can convince Microsoft to lend you a copy of XP’s source for doing some legal discovery.

B) You’ll also need a fair amount of business information from whoever the code came from. You’ll need the complete revision history of all files involved, and probably all the specification documentation at least. Now you not only have the problem of sorting through millions of lines of code, but you need to understand them as they change, grow and get extended over time.

C) If there is a simple security bug like a string on the stack that is incorrectly bounded, well then that is relatively easy to assign fault… or is it? After the string has been passed around a bunch of different subsystems and APIs, whose fault is it? If an intermediary API changes its behavior and thus exposes a security bug in software the API or library developer has never seen and has no access to modify, can it be their fault? (There’s a contrived code sketch of this situation after this list.)

D) Outsourcing? Is American law binding in India? Even if it was, say any law changes make their way into some GATT or WTO treaty, how much can you really recover from a programmer living in Bangalore? Especially after lawyers’ fees?
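
To make (C) concrete, here’s a contrived sketch in C (hypothetical code, invented for illustration, not taken from any real product): the unchecked copy lives in one team’s library, and the unchecked input is handed over by a different team’s caller. Which line, and which developer, is “the” bug?

    #include <stdio.h>
    #include <string.h>

    /* Team A's "library" routine, written years ago on the assumption
       that callers validate the length of name first. */
    static void format_username(char *out, size_t out_len, const char *name)
    {
        char buf[32];
        strcpy(buf, name);               /* no bounds check: overflows buf if name is 32+ chars */
        snprintf(out, out_len, "user:%s", buf);
    }

    /* Team B's "application" code, written later; it never sees the library internals. */
    int main(int argc, char **argv)
    {
        char line[64];
        const char *input = (argc > 1) ? argv[1] : "guest";
        format_username(line, sizeof line, input);  /* hands possibly attacker-controlled input straight through */
        printf("%s\n", line);
        return 0;
    }

Neither function looks wrong in isolation; the vulnerability only exists in the combination, which is exactly why pinning it on one programmer is so hard.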

The results of such a proposal are interesting.

Programmers would have to start carrying liability insurance like engineers or doctors. The cost of American programmers would go up, and thus speed outsourcing. OSS coders might also need to get liability insurance. Most programmers will probably only be able to carry a few million dollars in insurance, and most of them are not rich, so the available damages for anyone bringing a lawsuit are going to be relatively small. Businesses will have successfully externalized the cost of their poor quality control, and effectively nothing will change for large companies. Workers will suffer more. Smaller companies and OSS developers will get hurt more than the big guys.

Look at where the guy has come from. He’s not looking for workable solutions to the problems we have. I’m not sure what he’s looking for, maybe fear, but it doesn’t look like a solution to our problem.

Shura October 20, 2005 8:46 AM

gulfie wrote:
:Is American Law binding in India?

Strange question. Have you ever stopped to consider whether your actions are legal under Indian law? No? There you have it. 🙂

Also, for the record, there is no such thing as “American law”, anyway – there is “US-American law”, but that’s it (“America”, as opposed to the USA, consists of many countries, from Chile to Canada). (Yeah, I know, it’s nitpicking, but it’s a pet peeve of mine.)

Dan Holzman October 20, 2005 8:50 AM

I know I wouldn’t be willing to work as a programmer if I was liable for the bugs in the code I wrote, but had no control over the amount of debugging and QA that took place before product shipped.

J. Oquendo October 20, 2005 8:58 AM

// He wants individual software developers to be
// liable, and not the corporations. This will
// certainly give pissed-off users someone to sue,
// but it won’t reduce the externality and it won’t
// result in more-secure software.

Where will the line be drawn on this? So MS releases shoddy software, then releases a patch to absolve itself. What will it take to minimize shoddy coding? Millions of class-action lawsuits? What about when greed kicks in and other companies in the industry fall prey to it? E.g., “Well, it wouldn’t have traversed the network if COMPANY’s routers took the necessary precautions.” Or, as Bruce pointed out on The Register, “ISPs should bear the burden.” Should they? Why not make the router companies responsible for not taking the time to create a better filter before traffic passes through anywhere? This method works better, since if it were a standard, ISPs would have less to worry about.

Bruce Schneier October 20, 2005 9:18 AM

“This reads like a call to make development of free software legally impossible.”

Is not. There are lots of ways to deal with free in a liability environment. You could imagine, for example, that free software does not carry with it any liabilities, and that anyone using it has to take the liability themselves. Then you could imagine companies emerging, a la Red Hat, that take free software and sell liability protection around it. You could also imagine contracts that move liability from one party to another. You could also imagine something like a “Good Samaritan” law protecting those who write free software.

Bruce Schneier October 20, 2005 9:19 AM

“Liability is just another commodity: it’s basically bundled insurance. The right person to decide if insurance is useful would be the paying customer. The proper means of determining what’s on offer by default is according to what customers typically prefer to buy.”

Almost. The problem is that there is an externality here, and therefore you have a market failure. If you just look at the buyer and the seller, you end up with insecurity.

“To force an industry to always bundle liability is to force prices up and options down, when (quite obviously) there’s plenty of market for the product on its own.”

Yes, it forces prices up. But not bundling liability also forces prices up. Honestly, if I’m going to pay the higher prices I’d rather the money not go to the criminals.

Bruce Schneier October 20, 2005 9:20 AM

“It’s difficult to see how the software industry could continue to operate in anywhere near its current state if such a law was passed. The cost of developing software would be so huge that most home users would be completely priced out of the market.”

I honestly don’t think so. Many industries are highly regulated and operate in a high-liability environment, and they manage.

The “kill the golden goose” arguments come from those that don’t want to be fettered with the responsibility.

J October 20, 2005 9:21 AM

I’m a little puzzled by this whole brouhaha. I work for a software company (as a developer — I fix bugs), and one of the first things I noted about the biz is that when selling “solutions” that are not shrink-wrapped, the terms of each sale (usually based on licensing) allow for all kinds of agreements on software quality. These guys have already solved this problem, to a greater or lesser extent.

We have “repair or replace” options for bugs in the form of upgrades and patches, and there are parameters all parties agree to that describe what a failure is, and who is responsible.

I’m just saying that there is a clear line between typical end-user stuff and enterprise level software. Basically, the more money you pay, the more liability the vendor takes on.

If we want more security and fewer bugs from our shrink-wrapped software vendors, there is an obvious business model to adopt: carrot-and-stick.

We can certainly introduce subtle and useful laws that bolster existing consumer rights (remember those?). This will only go so far, as a look in any department store will show you. Most consumer-grade consumables (and I suggest that most software is a consumable) are barely acceptable in terms of safety and quality. But they are acceptable by the laws of the land, and (mostly) children and small animals are not killed as a result of poor design or build because of these laws. That’s the stick.

The carrot will be money. Our software, all of it, will have to cost more, and free-upgrades might transform into a subscription model.

The model is already in place, and working along quietly while you read this. Corps pay more money for software, and extract promises from their vendors, so as to cover their own asses when things go pear-shaped.

If consumers want a little of that, they will have to pay more. Until now the end-user software model has been cheap and cheerful, with only the “pro” apps going much over the magic $69 mark (and even those guys don’t offer much liability). If rules like this are applied to software and consumers can be sold on the need for such changes to the laws of the land, will they also accept a 50% increase in cost? 100%? More?

Because if the commodity software companies have to incur more legal risk (because bugs will always happen, and bug-free software is, right now, an unrealizable dream) then costs will go up, and those costs will be dumped right onto the customer.

Like so much of life, this is going to be a trade-off between “bug-free enough” and “too expensive for what we really do with the software.”

By the way, my company makes software management tools to help vendors, well, make better software (i.e., less buggy) for lower overall cost (i.e., less time between releases, fewer regressions). So, we stand to benefit from any changes to the commodity software landscape.

Bruce Schneier October 20, 2005 9:23 AM

“Instead of assigning blame we should be spending our time finding solutions to the problem.”

The neat thing about dealing with the economic externality is that you set the natural innovation mechanics of capitalism to work. Once you align the business motivation properly, there will be all kinds of advances and innovations in secure software design. Because it will be in the economic best interests of companies to come up with them.

If you don’t align the business motivations properly, all the clever PhD theses on secure software design will remain in libraries.

Ryan Sommers October 20, 2005 10:12 AM

Great column in Wired.

One thing I haven’t been able to answer, though, is how you get from security meltdown to court. Security breaches are almost never made public, and if they do go public it’s not because the company wanted it. After all, the masses still believe an electronic security failure is a failure of the victim rather than of the victim’s tools. In order for something like this to be taken to court, companies would need to go public with their security claims. What do you think would be necessary for that to happen?

ARL October 20, 2005 10:43 AM

Allowing free software to not have liability but commercial software to bear the cost of liability is a very bad idea. You allow the free software to continue to gain popularity with no reason at all to address security issues, and the cost stays with the users.

Liability companies would have a hard time finding software perfect enough to insure.

If cost is the issue, then the consumer will start to move away from insecure products when the cost gets high enough.

I see the same in home security. The locks on the doors are “good enough” for the majority of problems. The solution to the rest is long and expensive for the homeowner.

Just as people in rural areas had to learn to lock their doors, the citizens of the ‘net will have to learn to do the same. After all, security is, for the most part, a people problem.

Preetam October 20, 2005 10:46 AM

It’s definitely not fair to hold the developers responsible for security vulnerabilities. Having been a developer myself for the past 5 years, I have seen that the pressure on developers from management to complete the coding is quite high. This forces developers to somehow complete the work and release the product.

Security testing and security design are given the least priority!

Brock October 20, 2005 10:52 AM

You said:

“In economics, this is known as an externality: an effect of a decision not borne by the decision maker.”

No doubt you know a lot more about IT security than I do, but this statement is ass-backwards wrong. There is no externality here. The economic decision-maker in this equation is the purchaser of software. They are free to buy cheap, insecure software or expensive, secure software. They’re also free to buy cheap, insecure software and then spend more money on third-party software to upgrade the security. In any case they fully internalize the costs of software vs. the risk of insecurity.

The real question is: who bears the risk of insecure software, the user or the developer? At the moment, the buyer/user bears the risk. Your argument is that the developer should bear the risk. I disagree, for many of the reasons already put forward, but it’s a valid argument. That doesn’t mean there’s any externality.

Jim McNeely October 20, 2005 11:15 AM

I agree that Schmidt’s idea won’t work, however I disagree with the proposed solution.

I run a small custom database development company, and we do projects for clients from 3 employees up to American Airlines and Lockheed Martin. I think the problem with affixing blame to software companies for security problems is that security problems are dependent on many environmental variables. You have to look at the physical environment; I had a client with very sensitive information that was running a cable out the door up some stairs into another room, all publicly available. How is that any software vendor’s fault? It could be users surfing Russian porn/gambling sites; they are inviting problems. It could be users sharing their passwords with people they shouldn’t. I had another client with servers out in the open who got broken into and all their physical equipment stolen. Many businesses I encounter have consumer grade routers that are not set up properly with 10 to 20 users hammering them. Many have an “IT guy” who doesn’t really seem to understand what he is doing.

These problems can be solved of course. However, the whole environment must be considered, not just particular pieces of software. I am concerned that because I installed a little contact management/invoicing system for some small client, and they are too cheap to take care of their security environment, ignoring our recommendations, that we will be spuriously targeted for lawsuits. We are already careful about what we touch in a client’s environment – we become responsible for everything we change and everything our changes touch. If their contacts get hacked, exactly who is responsible for it? My company? The firewall company? Microsoft? The employee who took screen shots and printed them off? If we start getting targeted for lawsuits for this I’m going to make a career change, that is for certain. It will become a MAJOR impediment to innovation and growth.

So I am perhaps a bit laissez faire on this, but I think the market will take care of the problem. I think that operating systems and software that are inherently insecure will sting enough people that they will either be forced to change or will fail in the marketplace. If I build a database that is insecure I should get spanked. We already see this with Microsoft; why are they so concerned about security? They are big enough, they don’t need to worry, right? That’s why we need the government, right? But Microsoft worries, because the market demands it, and they are subject to the pressure. It is purely anecdotal, but I have seen major clients like banks and defense contractors going to the Mac; they want the stability and the security.

Also, companies that get burned for bad security practices should not go looking anywhere but to themselves for their ills. In their own market it is probably quite competitive, and the companies that see to these issues and take care of business will win. We use our security knowledge as a selling point, and we get work because of it. Why pull the government into this? It is generally because of bad management and IT practices that security problems happen, not software. Legislation is the wrong answer; this is nothing more than a politician recognizing a point of pain, and trying to politically capitalize on it. It would ultimately be of no benefit to anyone. We need to collectively say NO to all of it, even for large companies. The market itself will hold them accountable in a much more meaningful way than the government could.

Jim McNeely

President, New Century Data Inc.
http://www.newcenturydata.com

ARL October 20, 2005 11:19 AM

One issue not yet considered is damage done to a third party. For example, if someone breaks into your machine and then uses it to DoS-attack someone’s web server.

In that case the third party is damaged as a result of the software creator’s product, but not as a result of their own choice.

I think we need something like a UL for software. Hopefully then people would want to buy the software that has undergone independent testing. If not, then they could be liable if their machine caused someone else a problem.

Pat Cahalan October 20, 2005 11:21 AM

@ Brock

There is no externality here. The economic decision-maker in this
equation is the purchaser of software. They are free to buy cheap,
insecure software or expensive, secure software.

You’re kidding, right? You may argue that liability isn’t the right way to go, but as the column points out:

Normally, you would expect users to respond by favoring secure products
over insecure products — after all, they’re making their own buying
decisions based on the same capitalist model. But that’s not
generally possible. In some cases, software monopolies limit the
available product choice; in other cases, the “lock-in effect” created
by proprietary file formats or existing infrastructure or compatibility
requirements makes it harder to switch; and in still other cases, none
of the competing companies have made security a differentiating
characteristic.

This is a fairly spot on analysis of the state of the market right now. The general populace doesn’t understand security, isn’t driven to purchase software based upon security decisions (or isn’t able to for the reasons above), and as such the market doesn’t reward security-minded products. This means that the security-minded consumer has no market to draw from.

FreeBSD, OpenBSD, NetBSD, Solaris, Windows, Linuxes galore -> they all show up in CERT advisories fairly regularly. Sure, you can make all of them “more” secure than they are “out of the box”, but where, pray tell, is the secure operating system I’m supposed to buy? Marcus might argue I’m supposed to write my own, but that’s a market of 1.

Pat Cahalan October 20, 2005 11:39 AM

@ Jim

A couple of points.

I think the problem with affixing blame to software companies for security
problems is that security problems are dependent on many environmental variables

This is true. But this is why regulation (as opposed to just liability) is a good idea -> if there are acceptable standards for databases, for example, and you as a DBA contractor follow the standards, you might be named in a lawsuit and have to deal with hiring a lawyer, but you’ll have the defense of, “My product follows USSSM practices for data security in a DBMS, my client was violating the use model for my product, and as such I have no liability. Go sue them.”

Seat belt manufacturers aren’t held liable if a driver is stupid enough to wrap the belt around their neck 🙂

Also, there is a significant problem right now in that IT professionals don’t get to make security decisions. How often do IT professionals get overruled by management, who don’t understand technical issues? On the other hand, if your CEO comes to you and says, “I want it to be this way,” and you can say, “We can’t do it that way without violating the USSSM standard for our accounting practices, and the next time we get audited we’ll get slammed with a huge fine,” you’re much more likely to get your point across than if all you have to say is, “We might get hacked if we do it that way, boss.”

I think that operating systems and software that are inherently insecure
will sting enough people that they will either be forced to change or will fail
in the market place.

This may be the case. It’s hard to say for sure, as the individual networked PC hasn’t really been a part of the business world for very long, relatively speaking. However, the number of computers hooked up to the Internet has grown rapidly since 1993, and the general quality of software (re: security) has not gone up. 10-15 years of garbage is probably indicative of a persistent problem.

sidelobe October 20, 2005 11:54 AM

A human person, not a corporate “person”, still needs to be liable. That can be either the company’s president or a responsible person within the company. The latter opens the topic of licensed software engineers.

There is no point in licensing the individual developer. He isn’t responsible for the architecture, development environment, schedule, or budget. The person who is responsible for these things is the development manager or architect. I advocate licensing these professionals the same way we license building architects, lawyers, doctors, and engineers.

Not all software needs to be built under the oversight of a licensed software engineer. End-user games clearly fall under this category. At the other end of the spectrum, commercial aircraft flight control software clearly requires certification of the responsible engineer. Under Sarbanes-Oxley in the US, financial software could benefit from licensing of software engineers.

I suggest that companies would naturally seek licensed software engineers if they produce software that carries a significant liability. They will then be able to manage their risk. This is a well-tried method that has worked well enough for other industries. Why should software be any different?

Chris October 20, 2005 12:29 PM

I have two words for this idea: nuisance lawsuit.

At what point do we declare a “bug”, “crash”, or “software failure” so minor that it doesn’t warrant action? There’s no objective answer to this; every claim will have to be evaluated on a case-by-case basis. Has Johnny’s future earnings potential been harmed because Word crashed and he had to retype his term paper and lost 10 points because it was late? Johnny saved his document every ten minutes, but when he tried to open it up again it wouldn’t — is this a bug in Word, or a failure of a sector on his hard drive? Maybe his power supply is old and can’t put out enough juice to run his computer after he upgraded his video card to play Shoot ‘Em Up XII. Maybe it’s the Mozilla Foundation’s fault, because their browser let some little piece of malicious software in and we haven’t found it yet? Is his B- in American History because he played video games too much or because his computer has a problem?

Remember, in the U.S. you can sue anyone for anything if you’re willing to file the papers. Sure, the claim may be entirely baseless and without merit. And that may be obvious to technically savvy people such as those that read this blog. But courts aren’t in the habit of dismissing suits because one side asserts they’re without merit. You’ve got to convince a judge and the other side gets to rebut. Throw in a bunch of potential plaintiffs and couch your argument in the right language and, well, now there’s enough here to let a jury decide.

There’s a reason many companies pay off nuisance suits — it’s cheaper than fighting even if you’re sure you’ll win. The plaintiff has only his own time on the line; the defendant may have millions to lose and has to dot his I’s and cross his T’s to be sure not to lose on a technicality. Lawyers can be disbarred for frivolous litigation; plaintiffs can be counter-sued. But this hardly ever happens because it’s hard to prove and, yes, costs even more money to do.

Now, once you’ve been sued, there’s the issue of complying with discovery motions. Suddenly every QA test you ever ran has to be turned over. That “minor defect” that’s “too small” to delay your release is suddenly a smoking gun. What about the really obscure condition nobody thought to test? Try explaining to a jury formal proofs and that, no, Intel and AMD can’t prove that their CPU chips actually work. And you sure as hell don’t want to take any initiative and keep going over your software looking for problems and issuing patches for things nobody has reported. That would just inform a whole host of potential litigants that they may have a cause of action.

jammit October 20, 2005 12:33 PM

Maybe I oversimplify things, but isn’t this a case of buyer beware? Yes, there is software out there that runs the full gamut of crappy+cheap, crappy+expensive, secure+cheap, secure+expensive. To use the seatbelt analogy, if I don’t wear my seatbelt, it’s my fault. But is it up to the laws to make every car manufacturer include seatbelts? Does not wearing a seatbelt have to be a crime? And if I do wear a seatbelt and the manufacturer includes a good seatbelt and I still die in a wreck, whose fault is it now? It gets too complicated to nail it down to user or engineer error. Maybe adding an Underwriters Security Label to something would mean it follows the basic security requirements but isn’t fully secure. All the UL-listed stuff I have passes UL, and the manufacturer also states it’ll handle twice the current necessary to pass UL. Maybe having a standard of acceptable security will cause others to focus on meeting and exceeding the requirements to try to outsell a competitor.

mpd October 20, 2005 12:36 PM

@sidelobe

I don’t think you can differentiate the roles a person plays.

There is no point in licensing the individual developer. He isn’t responsible for the architecture,
development environment, schedule, or budget.

In some environments, the same person does all, most or parts of these things. Where do you draw the line?

Maybe we just need certifications like the MCSE, where a developer can say, “I worked through these industry-approved programs and know my stuff.” Are there any security certifications for developers? I’ve seen them for sysadmins.

Joe Buck October 20, 2005 12:42 PM

The way to deal with the free software issue is to require either that the software manufacturer be legally responsible for fixing all problems, or else that they impose no barriers on others that want to fix the problems. “Impose no barriers” means that they provide source code as well as the legal right for third parties to publish derivative works that fix any problems.

tim fong October 20, 2005 1:05 PM

As far as liability in free software: we could look at this from a reasonable-expectation point of view. The user of a freely downloaded software package would have no reasonable expectation that the thing would not destroy his hard drive. He paid nothing for it, so he has no reasonable expectation that it would be secure.

Give me Laws October 20, 2005 1:08 PM

I agree that software manufacturers need to be held liable for their products in the same way manufacturers of physical goods are liable. Whether this liability extends to the engineers who design and develop the software depends on the intended environment and use of the software.

I would expect that any laws specific to software, will manifest as consumer protection laws, like we have for other consumer products.

For the same reason that no one has to actually read the reams of documents associated with signing a mortgage, or the owner’s manual that comes with their automobile, no one should ever have to read a software product’s EULA to know that they are “safe” in their use of the product.

There are consumer protection laws that, for the most part, guarantee that the language in mortgage documents will not attempt to defraud you, or to ensure that an automobile will operate as expected. If these products don’t live up to the general expectations as outlined in the consumer protection laws, and the consumer ends up in court, the consumer protection laws will still generally favor the consumer in any outcome.

Davi Ottenheimer October 20, 2005 3:07 PM

Wow. Lots of great comments. I think Marc put it best right at the start of the thread. Unless a developer is acting alone and releasing code directly to you, it is absolutely ludicrous to hold the poor developer accountable for the myriad of management decisions.

This should be obvious to anyone who has tried to introduce security into software written in the bowels of a corporation, and brings us back to the discussion about Airbus engineers being fired for trying to do the right thing:

http://www.schneier.com/blog/archives/2005/10/potential_airbu.html

@ Bruce

Suing individual developers does in fact shift the liability, and therefore moves the externalities onto the corp, but it creates a scapegoat incentive rather than an incentive to fix the software.

In other words if a corporation is formed to build bridges, and their bridges fail, the corporation needs to be held liable from the top. If they are allowed to shift blame to someone buried in the organization (someone who does not make final decisions and therefore unable to influence the safety of future bridges, or someone like Oliver North) then this is no different to them than no liability at all.

SOX is unique in this way as it specifically calls out the CEO and CFO as personally liable for fraud during their watch.

Incidentally, as I’ve said before with regard to Schmidt, bad regulations should not be confused with all regulations.

For example, energy companies are rumored to use an evasive maneuver that completely defeats the purpose of their regulations. Maybe that’s where Schmidt gets his idea from?

Apparently some corporations hire some poor guy to be an executive “felon on staff”. Whenever the corp is found in gross violation of environmental, etc. regulations, they send this pre-designated “felon” off to jail to serve his term on behalf of the corporation. If true, it is the kind of stupid loophole that corporations love because they can appear to be doing the right thing while continuing to externalize risk.

As much as I think Schmidt is a great guy personally, he needs a lot of work on his security policy recommendations.

Bruce Schneier October 20, 2005 3:56 PM

“No doubt you know a lot more about IT security than I do, but this statement is ass-backwards wrong. There is no externality here. The economic decision-maker in this equation is the purchaser of software. They are free to buy cheap, insecure software or expensive, secure software. They’re also free to buy cheap, insecure software and then spend more money on third-party software to upgrade the security. In any case they fully internalize the costs of software vs. the risk of insecurity.”

Unfortunately, there is. Your security is worse because of everyone else out there who isn’t securing their computers. Your security is worse because my mother constantly gets infected with trojans. Our nation’s security is worse because the threats to our infrastructure are greater than the threats to the companies buying and selling the computers.

If you like the current state of computer security — both on the product side and on the customer side — then you’re right, there’s no externality. But I think we need more security, and that “more security” is the externality.

And even if there weren’t, there are clear benefits to forcing vendors to assume liability. One, competition isn’t working. Competition isn’t working because of monopolies. Competition isn’t working because of lock-in. Competition isn’t working because software vendors find it easier, and cheaper, to compete with press releases than with real security. (It’s much the same reason why competition, in the current economic playing field, won’t ever give us reliable cellphone service.) Competition isn’t working because it’s difficult for a customer to tell an actual secure product from an insecure one — until it’s too late. Technology moves too fast for the customers to deal with this.

And even if it weren’t, the economic principle of loss allocation says that you should assign the risk to the party who can most cheaply mitigate it. It is better for the economy for the software vendors to have the liability; it’s much more inefficient for the customer to have it.

Bruce Schneier October 20, 2005 3:58 PM

“I have two words for this idea: nuisance lawsuit.”

Agreed. But the fact that the legal system isn’t perfect does not imply that it is useless.

Like any solution, there are losses in the process. Legal fees, and nuisance lawsuits, are part of the losses in this system.

I believe it is still more efficient than the alternatives, although I believe that regulation also plays a part in the overall solution.

Bruce Schneier October 20, 2005 4:00 PM

“Maybe I oversimplify things, but isn’t this a case of buyer beware?”

Yes, it is. But as in the pharmaceutical industry, we have buyers who don’t have the expertise to make intelligent buying decisions.

“Buyer beware” is a recipe for, well, for the situation we now have: “Software company A is pleased to announce this splendid and worthwhile press release saying how really impressive our security is now, please believe us, and please forget about all the times we lied to you about security in the past.”

Buyer beware, yep. It’s working just fine….

Dagon October 20, 2005 4:09 PM

Users, not publishers, vendors, or devs, have final responsibility for what they do and what they allow to be done on their systems.

It helps to be clear what externality you want to address. The harm caused to the purchaser (who is presumably the one who would sue under this proposal) is not external in any way. It’s a risk accepted by one of the participants. The harm caused to network owners? They’re not external either – they have VERY restrictive contracts that could restrict what can be run. The harm caused to innocent third parties who share a network? Maybe, but they should perhaps sue the network provider (who failed to take basic precautions to provide a safe environment) or better yet, sue the human who caused the harm by running the software.

Software is just an agent. It’s the human who allowed it to run on her behalf in an environment which caused harm who should be held liable. In some cases, this liability can flow upward to the publisher or developer in cases of fraud or indemnification.

But always be clear. Defective software doesn’t cause harm. Users who run defective (or non-defective but ill-suited to the environment) software without appropriate safeguards cause harm.

Peter October 20, 2005 4:50 PM

When you hear that the latest virus/trojan cost users X billion dollars, you are hearing the cost of insecure software. You’re already paying for that (in)security. You. Your company. Your family member, whose computer you have to fix. You.

Schmidt’s wet dream of making the actual developer liable is ludicrous. And it’s a thinly veiled attempt to destroy open source software. Pick up the book Death March for examples of the norms of software mismanagement.

As an example of how mismanagement screws up software development, I refer you to the Denver Airport baggage handling system. Previous installations took 4 years to install such an automated system. So the Engineering managers said it would take 4 years to build. The sales department said the airport would open in 2 years, so you have 2 years to build it. It took 4 years to build and install. Was it on time? or was it 2 years late?

Brian Thomas October 20, 2005 5:17 PM

What I expected someone to say – especially you, Bruce – is that putting the responsibility on the coder is the best possible way to assure that no coding ever gets done, because the relationship between the real quality of a coder’s work and his risk exposure is too diffuse. Besides, how likely is it that a given flaw can reliably be tracked to exactly one programmer’s work?

Of course, companies would form to share the risks, in the insurance model, and they would mitigate their risk by actuarial methods, leading to standards of assurance, and… bingo! secure software. And captive programmers.

Of course, individual developers working on free software would have no hope, but they could form into guilds that served the same purpose but instead focused on certifying the programmer’s skills and conformance to standards, resulting in no free software. And captive programmers.

But if manufacturers were held accountable for the quality of their goods – an idea I think we can all get behind – then it would be in their interest to find and hire the best programmers and keep them happy, in addition to refining their processes and standards. And then we would have secure software…

Well, we would if there were any hope of getting equitable rules of evidence to establish the actual cause of a security failure…

Julian Morrison October 20, 2005 6:27 PM

“Unfortunately, there is. Your security is worse because of everyone else out there that isn’t securing their computers.”

Ought the same argument to be used to force you to buy a gun and a guard dog, to reduce burglary? Or to fit armoured glass non-openable windows? Where does this stop?

In a society where property rights are recognised and respected, externalities are reduced to an unavoidable minimum. That’s because negating property rights to go on an externality-internalizing crusade is a huge externality in itself! Who will save us from the saviours?

“And even if there wasn’t, there are clear benefit to forcing vendors to assume liability. One, competition isn’t working.”

Competition is delivering to customers the use of their money that they most prefer. What gives you a privileged position to decide on their behalf what does or does not constitute “working”, and reallocate their spending accordingly?

Davi Ottenheimer October 20, 2005 7:39 PM

@ Julian

Those are amusing examples. Yes, in fact, if you feel threatened you can choose to buy a dog (might as well avoid the gun debate, eh), but you can also rely on an institutionalized set of laws and the enforcement of those laws.

What dog is big enough or what institutionalized lever carries enough weight (e.g. in the sense of multi-national corporation) to counter the insecure practices of a software company?

The whole philosophy of placing blame on intelligent consumers for poor choices is so fundamentally flawed that I can’t believe any individual goes for it as a workable system.

Take my bridge example again, since it’s more comparable. Let’s say you’re about to drive over a bridge and you wonder about its safety. Those who want to hold the consumer wholly accountable would say you should inspect every bridge before you cross it. This is not only an unreasonable burden on the individual but it is practically impossible. How would you inspect the rebar welds inside the concrete, for example? And let’s just say you did happen to have an x-ray machine handy and you found the welds cracked; would you hold each individual welder responsible, or the bridge-building company that allowed the cracks to go unnoticed until now?

Anyone who says software buyers or the individual developers within a large corporation should be held wholly liable for insecure software clearly knows little or nothing about how to actually make software secure.

jammit October 20, 2005 8:45 PM

“Isn’t this a case of buyer beware?” I did speak too soon. If the buyer doesn’t have a choice, or doesn’t receive the correct info because they are buried under marketspeak, then there is a problem with the security. Since I’m a little better at ascertaining the security of a system for myself than the “unwashed” masses, I jumped the gun in assuming everybody could make a good decision. The latter half of my previous statement, about a UL for security and the manufacturers trying to one-up each other, still stands for me. Perhaps throw in a Consumer Reports way of comparing security measures taken by companies, much like the way they compare cars and TV sets. But I agree that putting the blame on the little guy (programmer) only insulates the big guy (boss) to make bigger mistakes in order to make up for the money lost in replacing the terminated programmer.

sward October 20, 2005 9:59 PM

C’mon, liability lawsuits are a wonderful way to redistribute the “external” costs – just look at all the good they’ve done the health-care industry!

NOT.

I think you’ve got the right approach (economics), but the wrong tool for the job. This helps our legal profession far more than it helps consumers.

Davi Ottenheimer October 20, 2005 11:08 PM

“liability lawsuits are a wonderful way to redistribute the “external” costs – just look at all the good they’ve done the health-care industry”

Ah, a good point. Reports say “misuse” of health-care liability laws, or “frivolous” lawsuits with “excessive” or “no-limit” awards, are a problem. Note that all of those terms point to the abuse of a system by clever lawyers, which means those who craft the laws need to be careful they do not create opportunities for legal abuse.

Health-care liability also seems quite different when you compare the type and magnitude of “harm” that could be claimed.

Sundar S October 20, 2005 11:16 PM

We cannot sue software vendors as and when we find a bug in their products. All software vendors are trying their best to deliver a bug-free product. So, we can make the software vendors sign an SLA to deliver a patch within, say, one day of an exploit being found.

Some users go to the internet just to download some virus. When your software is exposed to the internet, there is always a chance of a problem. We cannot sue the vendor for this.

Suppose you buy a nice suit from a department store that gives you a replacement guarantee if the suit has any flaw. Can you sue the store if some mad man in the alley throws a tin of grease on you and spoils your suit?

sward October 20, 2005 11:47 PM

@Davi

“…which means those who craft the laws need to be careful they do not create opportunities for legal abuse.”

Given how many of those who craft the laws are lawyers, relying on their ability to do so seems terribly naive. One might even say culpably negligent… These “abuses” are an inherent feature of the system, not an aberration.

Julian Morrison October 21, 2005 3:12 AM

@ Davi Ottenheimer
“Those who want to hold the consumer wholly accountable would say you should inspect every bridge before you cross it.” – well, no. To allow a driver onto an unsafe bridge in ignorance would be reckless endangerment. But if you as a driver want to cross the bridge and have to press an “I agree” button on a posted disclaimer saying “it’s unsafe” before the barrier lifts to let you on, then the risk is rightly your problem. Caveat drivor, or something.

Bruce Schneier October 21, 2005 4:25 AM

“What I expected someone to say – especially you, Bruce – is that putting the responsibility on the coder is the best possible way to assure that no coding ever gets done, because the relationship between the real quality of a coder’s work and his risk exposure is too diffuse. Besides, how likely is it that a given flaw can reliably be tracked to exactly one programmer’s work?”

That’s certainly true.

Bruce Schneier October 21, 2005 4:26 AM

“‘But I think we need more security, and that ‘more security’ is the externality.’ I’d say better or more effective security, not just more…”

Agreed. That’s a better phrasing.

Bruce Schneier October 21, 2005 4:30 AM

“‘Unfortunately, there is. Your security is worse because of everyone else out there that isn’t securing their computers.’ Ought the same argument to be used to force you to buy a gun and a guard dog, to reduce burglary? Or to fit armoured glass non-openable windows?”

I’m not sure how. In your two examples, there’s no real externality. The security of your home does not affect the security of my home. But the security of your computer directly affects the security of my computer. Think of DDOS attacks. Think of DNS attacks. Think of rapidly spreading Trojans. There are lots of examples where my computer security is affected by the security of computers that I have no control over.

A better analogy is vaccinations. Because of the way many diseases spread, my health is directly affected by whether or not everyone around me is vaccinated or not. Vaccinations are an externality. The smart trade-off for each individual person is for him not to be vaccinated, but for everyone else around him to be. But if everyone followed that logic, no one would be vaccinated and everyone would be worse off. So, the reasonable thing for society to do is to require that everyone get vaccinated. That way, everyone benefits and there’s no free-rider issue.

Computer security is a lot like that.

Bruce Schneier October 21, 2005 4:30 AM

“The whole philosophy of placing blame on intelligent consumers for poor choices is so fundamentally flawed that I can’t believe any individual goes for it as a workable system.”

Hear hear.

Bruce Schneier October 21, 2005 4:31 AM

“I think you’ve got the right approach (economics), but the wrong tool for the job. This helps our legal profession far more than it helps consumers.”

I know a lot more about security than I do about economic tools. My guess is that the correct tools will be a combination of regulation and liability, with some competition thrown in. But I’m interested in new ideas along these lines.

Bruce Schneier October 21, 2005 4:33 AM

“Reports say ‘misuse’ of health-care liability laws, or “frivolous” lawsuits with ‘excessive’ or ‘no-limit’ awards, are a problem. Note that all of those terms point to the abuse of a system by clever lawyers, which means those who craft the laws need to be careful they do not create opportunities for legal abuse.”

Agreed. There are a lot of problems with liability law as implemented, but the basic economic idea is sound.

A decent counter-argument here is that we can’t be trusted to do it right, so we shouldn’t even try.

Bruce Schneier October 21, 2005 4:34 AM

“Suppose you buy a nice suit from a department store that gives you a replacement guarantee if the suit has any flaw. Can you sue the store if some madman in the alley throws a tin of grease on you and ruins your suit?”

Of course not. But that’s not a flaw in the suit.

It might be different if the suit had a hidden design error that somehow made the grease extra nasty or something.

Henning Makholm October 21, 2005 8:41 AM

“There are lots of ways to deal with free [software] in a liability environment. You could imagine, for example, that free software does not carry with it any liabilities, and that anyone using it has to take the liability themselves.”

But that is the current state of the world.

Davi Ottenheimer October 21, 2005 11:20 AM

“These ‘abuses’ are an inherent feature of the system, not an aberration.”

Yes, although I might have said “inherent risk” as opposed to “feature”.

I see that as the same fundamental struggle in any system. But I’m not ready to say we should do nothing just because we can’t do something perfectly.

That line of reasoning typically does not go over well in security practices — “sorry, can’t get to 100% accuracy so we’re not going to use any spam filtering, ok?”

I’m also not going to try to suggest that a perfect point-in-time answer even needs to exist, given the body of work related to managing constant change via a Shewhart cycle: Plan, Do, Check, Act… or something like that.

Davi Ottenheimer October 21, 2005 11:47 AM

“if you as a driver want to cross the bridge and have to press an “I agree” button on a posted disclaimer saying “it’s unsafe” before the barrier lifts to let you on, then the risk is rightly your problem”

Ugh. That’s just about the most unbelievably silly thing I’ve heard in a while.

A button is essentially meaningless when insufficient information is available to make a decision. Are you really choosing to be rich or poor when you buy a single lottery ticket, or is that called “gambling”…?

We drive over bridges because we actually have some level of assurance that they will support us, based on knowledge of a system of independent inspection and certification, or at least on the empirical evidence of that system’s successes. And failing that, it’s because liability-based regulations (even if informal) ensure that bridge builders do a reasonable job.

Now, if you say we should cross bridges without holding the builder liable at all because you believe that everyone can gather enough information to assess the structural safety from inside a car at 40+ mph with absolutely no specialized knowledge…ha ha ha! ROFL

I’m not even sure why I’m responding to this nonsense. Back to security…

Xander Lebrun October 21, 2005 12:28 PM

There’s at least one other way to put pressure on vendors that won’t wipe out F/OSS on the spot. Why not set up a certification system for warranties, based on how much responsibility the vendor takes? Then the market would be easily able to assess, and therefore to respond to, the vendors’ willingness to stand behind their product. Even a marketing director or a vice-president would be able to tell when a company was attempting to weasel out of all liability, as with Microsoft’s infamous EULA.

Free or Open Source software like Linux wouldn’t be penalised at all, as anyone who wished to download a copy under the (presumably uncertified) GPL could do so, whilst anyone who wanted to pay Red Hat et al. for a warranty that would match the certification standards could just cough up the cash.

What sort of thing the certification should test, I have no idea. Big, bold letters saying “we accept no liability if our product sends your company into liquidation” would be a no-no, but beyond that I haven’t given it much consideration yet.

Any thoughts? Is this something worth devoting more braintime to, or is there an obvious way in which it’s bad and wrong?

Pat Cahalan October 21, 2005 12:34 PM

@ Bruce

Agreed. There are a lot of problems with liability law as implemented, but the basic economic idea is sound.

A decent counter-argument here is that we can’t be trusted to do it right, so we shouldn’t even try.

It’s a decent counter-argument, but I don’t buy it (personally). I use the same justification I used on Marcus in the dumbest computer security ideas thread ->

Regulations and liability are usually done badly in the beginning. Things are over-regulated or under-regulated or companies are held liable for things they perhaps should not be, because the populace isn’t well informed. In the long run, however, things get better with regulation and liability (look at the meat packing industry, and compare it now with what it looked like in the days of Upton Sinclair’s “The Jungle”).

Sure, implementing regulation and increasing liability is going to result in some disagreeable consequences in the short run, and some companies are going to be held accountable past all reason and suffer for it. You have to start sometime, unfortunately. It would be nice if we could start after I retire, but I doubt we can wait that long…

Bruce Schneier October 21, 2005 1:15 PM

“‘There are lots of ways to deal with free [software] in a liability environment. You could imagine, for example, that free software does not carry with it any liabilities, and that anyone using it has to take the liability themselves.’ But that is the current state of the world.”

Yes, you could imagine that paid-for software comes with a liability, while free software is the same as the current state of the world.

I’ll take other suggestions. This is clearly something that has to be figured out.

Davi Ottenheimer October 21, 2005 2:01 PM

“imagine that paid-for software comes with a liability, while free software is the same as the current state of the world”

I was actually thinking “open source” software (rather than free — as in beer) would be the same as the current state, whereas closed software would have liability placed on the people responsible for licensing/managing the code.

elegie October 22, 2005 12:26 AM

When a company produces software, the company (and not the hired programmers) is usually viewed as the “author” of the software. There could be something to be said for having the company as a whole be liable instead of specific individuals. Of course, one individual might be more to blame in some cases. Companies might come down harder on hired programmers to get them to write more secure code (even if it is not completely or perfectly secure). That might not be a bad thing for some programmers. The “culture” of a company might affect what the programmers do, including the security of the software produced.

In addition to free (as in freedom) and open source software, there is software produced by individuals in the form of shareware, freeware, postcardware, etc. Liability rules should not assume that all software results from a significant collaborative effort or a large corporate effort.

It would be important to determine exactly which software was at fault. Software can be modified or “enhanced” by users. This could well affect whether a vendor is liable. Some interactions involving multiple items of software might not be easy to anticipate.

Though opinions vary, some would say that legal action against a larger entity is not what it may seem. It may be difficult, and it may become a case of “might makes right.” See http://www.avault.com/developer/getarticle.asp?name=bwardell8 Of course, if someone is hurt then the effort may be worth the compensation (and perhaps the deterrence effect). It depends.

Bob October 22, 2005 10:49 AM

I largely support Bruce’s view. I have worked for a company that did not spend a cent training its staff, nor did it hire people with the required skills.

The management keeps pushing software out without a clear understanding of the underlying technology. They cannot even tell a prototype from a properly designed piece of software. Consumers are being sold experimental stuff.

If something compiles and seems to run fine, it is declared shippable. They release their software without any formal beta program.

Developers working in this environment just crank out stuff without knowing whether it is technologically right or wrong.

I am not so harsh as to advocate suing companies for normal human error (life-threatening classes of software are a different matter), but companies that blatantly violate published industry specifications with no remorse or care deserve to be sued by anyone.

Sadly, I do not see any government or consumer-protection groups going after these irresponsible merchants.

If you examine their licences, they literally want to take your money while accepting no responsibility and imposing all manner of restrictions that often violate fair use. The only way to curb this kind of raw deal for consumers, and this malpractice, is a hefty fine.

Software companies’ main role is to make money (producing software is of secondary concern to them), and hence the only way to steer them into being responsible producers is to hit them where it hurts.

You do not find this kind of consumer abuse in other industries and professions.

Time to clean up the act.

Bob

Abdelhamid Salaheddin October 24, 2005 6:34 AM

I think there are two points that affect the security or quality of software:
1. In software development, technology advances more rapidly than process does. There are many methods and tools that help developers produce software, but the tools and methods used to perform quality control on that software are not at the same level.
2. Software development has an artistic side that in many cases conflicts with the work environment in many software houses, which makes the result more vulnerable to errors. Developers in many places have to work long hours on fixed schedules, with the sword of deadlines over their necks.
I think that as long as these two points persist, developers can never be blamed for bugs.

Kevin Davidson October 24, 2005 6:03 PM

As a software vendor, I think that market forces should decide if customers want to pay for invulnerable software. But if software vendors become liable either through market forces or government action, what consumers will pay for is the passed-on cost of liability insurance, not invulnerable software.

I think medical malpractice is a good model to look at when considering liability for software errors.

Anonymous October 25, 2005 10:07 AM

Thank you! Not only would being individually liable for flaws be a disincentive to become a programmer, it would be difficult in many cases to determine whether the programmer knew in advance that his or her code, when combined with the pieces written by others, would result in a security problem. If you can’t prove that the person knew it was problematic, when attacks come from new quarters all the time, how do you prove culpable negligence?

The people who write code are often not of a hacker mentality, and therefore I suspect they don’t necessarily know that what they are writing has a hole. When you are meeting a spec of what the code is supposed to do, your attention is focused on getting it to perform properly for that purpose. Proving that the code is not vulnerable to any of a number of attacks involves looking for a different set of things. If that set of things isn’t written into the spec in the first place, coders would have to be intensively trained in a different skillset to be aware of all possible problems. But even then, the modular nature of most software would make it difficult for the original programmers to know that the totality was secure. It could happen that when written it was secure, then a mod was made to another piece, and suddenly it was no longer secure.
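
As a purely hypothetical sketch of that last failure mode (the module and function names are invented for illustration, not drawn from any real product):

```python
# Hypothetical illustration: module A was safe when written because module B
# sanitized its input. A later change to B removes that guarantee, and A --
# untouched the whole time -- is suddenly the "vulnerable" code.

# --- module B, original version ---
def build_user_query_v1(raw_name: str) -> str:
    # Strips anything that is not alphanumeric before building the query.
    safe = "".join(ch for ch in raw_name if ch.isalnum())
    return f"SELECT * FROM users WHERE name = '{safe}'"

# --- module B, after a later "enhancement" to allow spaces and apostrophes ---
def build_user_query_v2(raw_name: str) -> str:
    # Sanitization quietly dropped; callers that relied on it are now exposed.
    return f"SELECT * FROM users WHERE name = '{raw_name}'"

# --- module A, unchanged throughout ---
def handle_request(user_input: str, build_query) -> str:
    # Trusts whatever query-builder it is given.
    return build_query(user_input)

payload = "alice' OR '1'='1"
print(handle_request(payload, build_user_query_v1))  # payload stripped out
print(handle_request(payload, build_user_query_v2))  # payload passes straight into the query
```

Which of the two programmers should be personally liable for that is exactly the kind of question that makes individual liability so hard to assign.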

Joerg November 17, 2005 11:06 AM

Yesterday I commented via email:

I have been a software developer for about 25 years now, and in my opinion there’s no way to achieve truly ‘secure’ software at the current level of system complexity and at current market prices. Now maybe I’m just a very bad programmer, but I would guess that prices could easily rise by an order of magnitude if we tried to make all the programs ‘secure’. As a result almost all programs would vanish from the market, and since all kinds of industries from TV broadcasting to aircraft design depend on new software every day, we would see a huge crisis. I would be using my Office 97 for eternity.

So you answer that by ‘secure’ you don’t mean ‘entirely secure’… just ‘reasonably secure’, maybe? But then what would a lawyer make of such a definition? What small software company would risk selling anything and then wait for the big competition to sue them? Forget it. Our business has been so successful over time because of the shared risk and the shared knowledge. A nonprofessional has to know what he doesn’t know. So if he’s crazy enough to buy and install a wireless network without knowing how to secure it – his problem. I have a driving license, but I know that I have only a little experience. If I go and rent a Porsche and drive it into a bridge support at 200 – my problem. Should Porsche be liable? Then they would limit the top speed to 50 and go out of business.

Obviously people can live with the current situation. I couldn’t live with a situation where software sells for 10x the price and only big companies, the ones able to pay the lawyers, are able to sell anything at all. Nobody is forced to use computers or software. Nobody is forced to upgrade every year. I still use quite old software in order not to touch the new stuff that is attacked by the new viruses. My keyboard has a wire, my network goes through wires, my internet access is dial-in, and I disconnect after finishing even though I pay a flat rate. Of course I’m still totally insecure, so I have a backup of my disks in some safe locations… Software liability is just one more regulation on a very high pile. It kills the spirit.

Then Bruce Schneier was kind enough to answer:

Liability does not equal 100% liability. Porsche is already liable for their automobiles, unlike software manufacturers. Actually, the current liability system with automobiles is a great model.

Let me comment on that again …

I agree that my Porsche example was not really good in all aspects, since I confused liability for manufacturing defects with liability for misuse by the customer. So indeed the question is: ‘Why should all manufacturers be liable for faulty production, with the exception of the software manufacturers who sell faulty products?’ In my opinion the answer is that, in reality, the complexity of current hardware/software systems is pushing the envelope of human abilities. In certain cases 100% faultless systems are required – aircraft control, control of atomic power plants, and some medical systems. In these cases the customer willingly trades functionality and price against security. But in most other areas the customer is not willing to do that. The customer actively seeks more complex and less tested software (insecure software) if he gets new features and compatibility. As long as the customer makes that choice, he’s actually expressing what’s important to him. I, for instance, do not install new releases of the Real player, QuickTime player and Flash player, since the manufacturers decided to integrate all kinds of backdoors, content management, etc. I stay with the old versions, which limits my access, but I decide for higher security. I also still use Win 98SE on two PCs, etc.

Should serious software liability become law, then we would see a shocking dip in the advance of technology and industry. Effectively I would even go as far as to predict that the consequences would prove to be unbearable for our society, and the liability would be made ineffective somehow.

Sandro Rafaeli December 3, 2005 2:30 PM

There is a big problem with holding software programmers liable for software problems: they are not solely responsible for the final product. A programmer finishes writing his piece of code, and it is not sent straight to the shop shelves. In a good software life cycle, the programmer finishes it, then he tests it. Then he sends it to another programmer to review it. Then it is sent to testers to verify it. Who should be held responsible? It is the manufacturer who is responsible for enforcing the correct cycle.

Dave Morgan February 23, 2006 4:37 AM

In regard to programmer liability, I don’t believe that a programmer should carry any liability for the product in the context of end use, but they should be held liable in the context of the process they use to develop that product. Our industry has been dancing around the issue of professional certification based on foundation standards and basic business ethics for years, and I am a strong believer that this is the only practical way to elevate software quality to the next level. If the processes used to develop software improved consistency, accuracy and quality through the application of software-engineering standards, and those standards were attached to professional certification, I believe rational accountability begins, and it begins at the right place. The next level of accountability lies in the marketing layer, and with that tier heavily populated with salesmen I’m not even going to comment.
The way things are now, any attempt to push liability to the programming layer would be a disaster. If legal statutes and precedents start to appear focused on the programmer layer of this industry, the legal feeding frenzy would be akin to dropping steaks into a piranha tank. In any case where the law gets ahead of the industry, the missing professional standards would be developed predominantly in the courts rather than by those in the industry, at an extremely high cost.
