Schneier on Security
A blog covering security and security technology.
July 23, 2008
Information Security and Liabilities
In my fourth column for the Guardian last Thursday, I talk about information security and liabilities:
Last summer, the House of Lords Science and Technology Committee issued a report on "Personal Internet Security." I was invited to give testimony for that report, and one of my recommendations was that software vendors be held liable when they are at fault. Their final report included that recommendation. The government rejected the recommendations in that report last autumn, and last week the committee issued a report on their follow-up inquiry, which still recommends software liabilities.
Good for them.
I'm not implying that liabilities are easy, or that all the liability for security vulnerabilities should fall on the vendor. But the courts are good at partial liability. Any automobile liability suit has many potential responsible parties: the car, the driver, the road, the weather, possibly another driver and another car, and so on. Similarly, a computer failure has several parties who may be partially responsible: the software vendor, the computer vendor, the network vendor, the user, possibly another hacker, and so on. But we're never going to get there until we start. Software liability is the market force that will incentivise companies to improve their software quality -- and everyone's security.
Posted on July 23, 2008 at 3:09 PM
While not all software flaws should trigger liability, those involving security should, particularly where the vendor knows about the problem and does nothing. For an in-depth article on the current U.S. law on the legal liability of software vendors for security flaws see: "Tort Liability for Vendors of Insecure Software: Has the Time Finally Come?" http://tinyurl.com/43aelw
There is a fundamental problem with what you're proposing with browsers in particular. In many cases the software is not sold to the end user, so there is no "vendor". Opera and Firefox, both of which your article mentions, fall into that category.
It gets worse with Firefox and other Mozilla-based browsers, which are released under the Mozilla Public License, which specifically denies any warranty whatsoever. If what you are proposing was taken seriously, then this software, and all other software developed under similar open source licences such as the GPL, would have to cease in this country, as developers would find themselves liable for something from which most of them are making no profit. In addition, the overseas developers would have to try to bar people in the UK from downloading the software.
Unless you really are a Microsoft fanboy I don't think that's what you intended.
As long as a quick EULA popup can shift liability to the poor schmuck that installs the software, why would anyone waste time making truly secure software or operating systems? It's still absolutely ridiculous, for a stupid example, that calc.exe has no file capabilities or network awareness but it has permission to access the network or the hard drive.
I don't think Bruce is a Microsoft fanboy either.
It's just that Theo de Raadt really pisses him off.
OpenBSD has been dying long enough. Now it's time to kill it off for good!
There are some good arguments for the idea that software vendors should be liable for security flaws.
But I can see at least two major problems:
1) As an open source software developer, I'm very careful about security. But that doesn't mean I want to risk a lawsuit every time I give a library away for free.
2) Security bugs are harder to eliminate than regular bugs, because you have to assume a malicious adversary. Unfortunately, even non-security bugs can only be eliminated at tremendous cost. NASA, for example, has written 750,000-line programs that are virtually bug free--but each line cost something like 10x the industry average. Eliminating all security bugs would be at least as expensive.
As a software user, the patch treadmill is a pain. But if the alternative is a world with (1) no open source software and (2) hugely expensive commercial software, well, the cure might be worse than the disease.
Do software liability proponents have any detailed proposals?
To further the completely inappropriate car analogy, making vendors liable for security flaws is like making car manufacturers liable for each new way criminals figure out how to steal cars or otherwise profit from the automobile ecosystem.
I'm a software developer. To meet such liability requirements, we'd have to raise our cost of development at least tenfold, though I suspect it would be a lot higher. This equates to much higher end-user costs.
You can mandate such requirements. But the market can't afford them. Either they get ignored, or the economy goes into a long term recession.
Much as I agree with the idea of Lemon Laws for software there is an important area that needs consideration.
Software is not really engineered; it is made in an artisan fashion.
In much the same way that the design of the cart wheel owes its form not to mathematical rigour and an understanding of science but to the "fixing of failures and experimentation", software likewise owes its form to the experience of failures and the "bodge a bit on" mentality of our Victorian forefathers.
Even with all the tools available, software is still very much in the artisanal camp, not that of engineering or science. In many ways its issues can be likened to boiler design in the UK a couple of centuries ago.
The boilers used to explode on a regular basis, often killing or maiming the operators or others in the vicinity. The rising level of carnage pressed Parliament to act.
The result was funding of scientific methods to understand why boilers failed, and a set of inspectors and tests by which a new boiler could have its safe operating limits established.
Until software ceases to be art and actually becomes engineering and science, legislation will be at best ineffective, at worst a tool for oppression wielded by those with power and influence.
When it comes to security we have not put on our running shoes, let alone made it to the starting blocks. We have no metrics that are deserving of the name, so how can we realistically say how good or bad a piece of software is?
And without a reliable and trusted measuring stick the insurance companies will not underwrite software companies.
Thus the result will be that software development will become too high-risk to carry out in the legislating jurisdiction.
Therefore the most likely eventuality will be stagnation in the home software market, with strong competition outside. The result: to stay competitive, companies will have to buy from abroad, with less warranty and less support than they currently get.
What is needed is some method of getting maturity into software to make science and engineering the predominant methodologies not "bodge a bit on and hope" or worse design by "it feels right".
Before I get flamed, I know my words are somewhat harsh but step back from the problem and take an honest look, then think.
Having worked in a field where liability for security defects in software was mandated by the governments in question, I didn't feel that dealing with security flaws was an unreasonable cost. The worst security problem we had was with our Pseudo-Random Number Generator, and that ended up being a patch followed by a much better fix (for those who are curious, one of the potential attacks and the defense are coincidentally described in http://www.schneier.com/paper-prngs.html which was published after this occurred).
"Even with all the tools available software is still very much in the artisanal camp not that of engineering or science."
While I've seen such work, the first thing I typically do with such a program is to reverse engineer it, then re-engineer it. Art is fun, but it doesn't result in comprehensible code (outside of the creator's mind). In my opinion, there are two valid ways to write software: via proof, and via engineering methods.
"When it comes to security we have not put on our running shoes let alone make it to the starting blocks. We have no metrics that are deserving of the name so how can we realistically say how good or bad a piece of software is."
There are tools (like TSP) which can measure (general) defect density. With a reasonable process in place, I suspect that measures for security could be generated, although unless you have an outside auditing mechanism, I wouldn't put much stock in such externally released numbers. One could, however, create metrics for internal purposes (such as deciding if the software should be released).
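A defect-density measure of the kind described is trivial to compute once a process records defect counts and code size. Here is a minimal sketch; the function name and the sample figures are illustrative, not taken from TSP itself:

```python
# Minimal sketch of a defect-density metric: defects per thousand
# lines of code (KLOC), the kind of internal number a TSP-style
# process might track. The sample figures below are made up.

def defect_density(defects_found: int, lines_of_code: int) -> float:
    """Return defects per KLOC."""
    if lines_of_code <= 0:
        raise ValueError("lines_of_code must be positive")
    return defects_found / (lines_of_code / 1000)

# e.g. 42 recorded defects in a 120,000-line codebase:
print(defect_density(42, 120_000))  # -> 0.35 defects per KLOC
```

An internal release gate could then be as simple as refusing to ship while the density exceeds an internally agreed threshold.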
Perhaps it can be claimed that people who use open-source software have no reasonable expectation or guarantee of quality, while users of purchased software have more rights. Then, where does this place companies such as Red Hat, who provide paid support for open-source software?
Clive, your words are far from harsh, they are consistent with the very license files distributed with software.
Consider Password Safe. It is usually attributed to Bruce, but I just read that it was originally written by Mark Rosen at Counterpane Lab. It is now available as an open source project. Here is the license section when you install:
"Password Safe is available under the restrictions set forth in the
standard "Artistic License". For more details, see the LICENSE file
that comes with the package."
So there you have it. Your artistic license argument is commonplace. It is right on the money, if you'll pardon the pun. Anyway, the LICENSE file for this software clarifies Bruce's liability in bold text:
"UNLESS REQUIRED BY LAW, NO COPYRIGHT HOLDER OR CONTRIBUTOR WILL BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, OR CONSEQUENTIAL DAMAGES ARISING IN ANY WAY OUT OF THE USE OF THE PACKAGE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE."
Bruce, how about you modify that to something that says you believe, as originator of the software, that you should accept liability when you are at fault? Might be a good first step if you really want people to believe in your message.
What is with your web site? For the last week now I can't get to it in Firefox on Linux; I have to use Opera. "Page Load Error"
...or you can add an exception
ISTM, that Mr Schneier has been working from a model of the software economy. His model leads him to certain policy prescriptions. And Mr Schneier may feel that he has invested some of his personal credibility in advocating those policy prescriptions.
But, I suspect, the model that he is working from simplifies and approximates. And, I believe, it fails to adequately describe the actual software environment that exists today.
Looking just at Microsoft or Apple or companies like those two, it makes sense that they would insufficiently internalize the costs of lax security. A prescription for vendor liability may make a lot of sense if Microsoft is the premiere exemplar for software vendors.
But, as dominant as Microsoft is, one cannot ignore companies like Internet Software Consortium (ISC), producing software like BIND.
I doubt that ISC could afford the lawyers it would take to defend all the lawsuits that the recent BIND upgrade would spawn in a regime of vendor liability: No more ISC. No more BIND.
Most people --given a mismatch between their model and reality-- will tend to stick with their model. They will simply not see things that refute or discredit their model. It isn't 'wilful blindness'. Instead it's a perceptual condition.
Security folks can't afford that type of all-too-common blindness. Security folks have to be able to see the potential for exploiting the gaps between theoretical models and deployed implementations.
I like the idea of liability ... I just don't see any way it can easily be instituted with regard to Open Source or Free (like Freedom) software.
Unless there were a government agency that would certify that the code complies with X standards.
That would be fine for some software (such as browsers) but it would be as corruptible as any government agency (and as prone to incompetence).
If the developer or distributor is going to be held liable then there needs to be a way for them to validate their products against a 3rd party standard.
Of course, that standard would be updated on a regular basis. And their current code would be checked against the updated standard.
Since we are talking about code here, the testing process would be completely automated.
The problem would be getting the funding to write the test suites in the first place ... and to keep the funding to keep the test suites up to date.
If the liability were limited to the cost of the software, free software would continue as it is. Only sold software would have any liability, and generally not enough to bankrupt its provider.
Oh, come on, Bruce! You're still advocating that? You're talking about making fundamental changes to the economics of software development, yet you repeatedly fail to adequately address the barriers-to-entry problems that this would create. Will people really be better off if they get only "high-reliability" software... from a small number of vendors? What makes you think that won't happen?
That software *vendors* aren't held liable isn't the problem. I recently saw a job posting that required that the applicant have both a SECRET security clearance and a laptop. Think about that. If some company puts classified information on its employees' personal laptops, and the information gets compromised because of a browser bug, should Microsoft, the Mozilla Foundation, Opera, or J. Random FOSS Developer be held liable because somebody used the software to protect something that it was never designed to protect?
Lots of software is designed to be used in non-adversarial environments. Lots of software also only has a lifespan of a few years. I agree that software would be more reliable if everyone had to pay for life-support-quality software, but most people don't *need* it, and vendor liability amounts to a subsidy of the small number of users who do. Should everyone have to subsidize the life-support software industry, just because a few idiots might one day use their software in a life-support system? I think not.
What about the effect on free and open-source software? Everything I've seen you say on the matter suggests that you barely have a clue about how it's developed. It's not all volunteer work, it's not all free of charge, and a lot of it only happens because the people publishing it don't have to take huge risks or get insurance policies in order to do so.
Like Davi Ottenheimer said above, why don't you accept liability for defects in your own software if you think this is such a great idea? You won't because it's not worth it to you? Neither will anybody else; That's the problem.
I'd like to add that there is one place where I think some narrow liability might help improve security without placing an unreasonable burden on everyone: when vendors make baseless claims about their software.
Where people distribute software with claims that it "keeps hackers out"---when it doesn't---or that "your computer is virus-free"---when it isn't---then I can see a good case for liability. This could encourage vendors to develop reliable metrics to back up their claims without burdening transactions where customers know that they're saving money by using cheaper, more error-prone software. In that case, the *customers* can be held liable if they use the software recklessly, such as by taking software designed to be used only on standalone machines and connecting it directly to the Internet.
In any case, I think it's important not to give an unfair advantage to proprietary software over free/open-source software, since the evidence so far seems to suggest that people are better off when they have access to the latter.
Broad vendor liability like what Bruce seems to be advocating could adversely affect free software, and I'm not convinced that he's really addressed that problem.
"If the liability were limited to the cost of the software, free software would continue as it is. Only sold software would have any liability, and generally not enough to bankrupt its provider."
You're smoking crack if you think you could get a "software vendor liability" bill through any legislature in the world--without having the final legislation shaped by Microsoft's lobbying muscle.
As enacted, it's likely that any liability legislation would bankrupt everyone except Microsoft and few other big "constituents".
Regarding liability and costs:
I'm not sure it makes sense to ask for complete liability from version 1. It makes much more sense to ask for liability according to industry standards.
In particular, a vendor will not be liable for writing a security bug. They will be liable if they get notified of the bug and take their sweet time fixing it.
The open source problem is also quite solvable. Just make the liability proportional to the entity's exclusiveness in being able to solve the bug. If a non-critical port was left open by mistake, and had a buffer overrun, a vendor could reduce liability by recommending that the specific port be blocked by a firewall. Likewise, if a bug is found in Apache, the Apache foundation is not solely liable for damages, because the users are also capable of solving the bug. Each user is somewhat less able to solve the bug than the core developers, but there are lots more users.
The system would have to be tuned so that we do not reach the point where no open source software vendor has any liability ever (also an undesirable result), but as a rule, I think it will stand up to common sense (read - will never get implemented).
Got to admit it makes you think. Prepare for internet insurance, mandatory internet zombie liability insurance, internet drivers licenses and systems accreditation. :)
We should simply make sure that there are lots of attackers.
Legalize all attacks that have been known to the vendor/author for more than two months.
As a result insecure systems will become unusable and stay that way until they get fixed or replaced with competing solutions.
"I'm not implying that liabilities are easy, or that all the liability for security vulnerabilities should fall on the vendor. But the courts are good at partial liability. Any automobile liability suit has many potential responsible parties: the car, the driver, the road, the weather, possibly another driver and another car, and so on..."
Erm... that works with cars because you know who made them, and nobody is making the cars for free.
I usually agree with Bruce's comments, but on this one I squarely disagree.
I believe it is completely unrealistic and unworkable. Others have already stated the problems, but here's my list anyway:
- Cost: Development costs would rise dramatically. Sale prices would increase dramatically in accordance, and small businesses would no longer be able to afford software. Piracy would surely become more rife.
- Barriers to entry: Small businesses can no longer afford to develop (or sell) software.
I think that the situation we have now is the lesser of two evils. At least now small businesses can afford to both develop and purchase software.
"Any automobile liability suit has many potential responsible parties: the car, the driver, the road, the weather, possibly another driver and another car, and so on."
That works for cars because no one is actively trying to make two cars crash into each other. I agree that vendors should be made liable for safety issues but not for security issues (the difference being that safety is protection from the environment and accidents, while security protects from attackers). Safety can be done; cars can be made safe (you don't die when the car is hit by lightning), locks can be made safe (they don't explode in your hand when you unlock them). But cars and locks cannot be made secure: no amount of vendor liability will defend against a man with an AK-47 shooting at your car, or a man with a sledgehammer breaking your lock.
Because what we really need is bigger government with more control and regulation. *sigh*
If there are laws in place, the lobbyists will just abuse and manipulate the system and we'll not likely be much better off.
Personally, I think it's better to let free market principles handle it. And I also believe that it's working. M$ writes crap code, Firefox market share shoots up. Sure, you're never gonna get perfect software for any purpose -- and Joe Six-Pack's computer will still get owned occasionally. But trying to fix that with government regulation isn't worth the multiple ways said regulation will inevitably be abused/fail and waste taxpayer money.
It's impressive how people are willing to blame others for not having secure products but are themselves unwilling to provide secure software.
As said above, software development nowadays is an "art": any 12-year-old can start learning the programming ropes via forums, friends, and so on, and at the age of 20 be a "full-fledged professional" without even caring to know the underlying principles (theories) of computer science. And in my opinion the problem starts there.
One would never ask an artist to design and implement the security door that protects your servers, for example. But in software development it's really hard to distinguish artists from professionals.
Liability would, in my opinion, help distinguish between the two.
Of course, companies like Microsoft would never have come to be, where the initial motto was "we promised A, we deliver B, but then we will make it better (patching) as we go along". So yes, it would create barriers to entry, and I think that's a good thing.
Security programming is not that hard (not saying it's easy) if you start programming with security in mind. A good example is the one given by Daniel Bernstein and his qmail. http://lwn.net/Articles/257004/
Of course, if you think patching is the way to go, think about building a boat with holes in the hull and then starting to patch when it hits the water...
@jurgen: The difference between your example and the software development reality is that, unlike car manufacturers, we can perfectly control our environment. That is to say, I can't control whether a user puts his password on a Post-it on his monitor, but I can control the way I store the passwords in my application. If I use a good hashing function to store the users' passwords, and the user gets owned by the use of the Post-its, it's not my fault, and no maliciously intended person will be able to compromise my software's security. But if I, on the other hand, use a faulty password hash, or store them in plain text, or fail to implement some kind of salt, then it really is my fault and therefore I'm liable.
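The storage practice described here (a good hash, a per-user random salt, nothing in plain text) can be sketched in a few lines. This uses PBKDF2 from Python's standard library; the iteration count is an arbitrary illustrative choice, not a recommendation from the comment:

```python
# Sketch of salted password storage: a per-user random salt plus a
# slow key-derivation function (PBKDF2-HMAC-SHA256 here), instead of
# plain text or a fast unsalted hash.
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # constant-time compare

salt, digest = hash_password("correct horse")
assert verify_password("correct horse", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

Because each user gets a fresh salt, two users with the same password store different digests, which defeats precomputed rainbow-table attacks.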
If I write open-source software, and don't make money doing it... Isn't it my First Amendment right to publish my code just the way I want to (with security bugs and everything)?
"as said above software development nowadays is an "art", any 12yo teenager can start the programming ropes via foruns, friends, and so on and at the age of 20 be a "full-fledged, professional" without even caring to know the underlying principles(theories) of computer science. and in my opinion the problem starts there."
I'll take it a step further and say the problem has been in existence for a lot longer than that. There are a ton of developers out there whose only experience comes from 6-month courses at various schools.
I'm not saying you need to have a degree in computer science to be successful, but there really should be certifications (think of building architects, contractors, etc) and software creation needs to be an engineering discipline.
The difference between art and computers -- art is a luxury/desire. The majority of society runs on computers now (agree or not) and just like buildings there should be some higher control over it.
What about vendors that implement a broken standard that they don't control?
Microsoft has produced many "standards", file formats, etc. that were insecure by design. Any vendor implementing such a standard according to the specification (if any), for the sake of compatibility, would be held liable to someone else's mistakes.
"... if I, on the other hand, use a faulty password hash, or store them in plain text, or fail to implement some kind of salt, then it really is my fault and therefore I'm liable."
In the US, with that kind of liability rule, any halfway competent lawyer can write a complaint that will survive a motion to dismiss.
So, suppose that after some discovery, your lawyer has obtained deposition testimony that the plaintiff's employee routinely wrote his passwords on Post-it® notes which were seen by two of the five people working in the same office.
But, plaintiff has an expert who has sworn in affidavit that MD5 is utterly broken. And that therefore the hashing scheme you used is beneath industry standards.
OK.... So you lost the motion for summary judgement.
Now, how much will you pay to settle this? Get out your checkbook.
@SecureApps: "The majority of society runs on computers now (agree or not) and just like buildings there should be some higher control over it."
The only people I know who insist on firm control, accreditation, licensing, annual fees for guild membership, and so on, are those who can't program a computer worth a damn in the first place.
Why can't these people just gather together, set up their own company of Duly Robed Programming Engineers, and create software according to their Sacred Principles, assume full and complete liability for it, and sell it on the open market like everyone else?
Prove that the Blessed Superior Processes you would force upon us actually work.
Until that happy day, I suggest they can take their rent-seeking behaviors and shove 'em.
Offtopic, but does Bruce ever take part in the commentary on his blog posts? I haven't found an instance of him replying to any of the comments posted, though my Google-fu may be weak.
@Marc: "If I write open-source software, and don't make money doing it... Isn't it my First Amendment right to publish my code just the way I want to (with security bugs and everything)?"
Practically speaking, you are not allowed to build a house all on your own these days. At the minimum you have the Building Inspector to deal with. Try and construct a large enough open space, and you must get a Sanctified Engineer to bless your design (since the inspector can't do that, according to the Rules and Regulations).
So, to answer your question in this analogical framework: "NO!" If we accept this entire argument, and carry it forward, either you would not be allowed to publish it, or, just as equally, no one would be allowed to use it. At least until one of these Noble Programming Engineers gives you the nod.
And how often do you think that will happen?
"Now, how much will you pay to settle this? Get out your checkbook."
If I use MD5 as my password hashing scheme in my applications today, I would "gladly" pay up. But since I really don't like to give my money away, I would make sure that my next password hashing scheme was within the industry standards, and not a broken scheme or a half-cooked cipher I made in my spare time, thus making ALL my clients safer.
I think that in the end it would be better for all of us...
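One common way to make "ALL my clients safer" without forcing a password reset is to upgrade legacy hashes transparently at the next successful login, when the plaintext is briefly available. A sketch, assuming the legacy records were unsalted MD5 tagged with an `md5$` prefix; both the prefix convention and the record format here are invented for illustration:

```python
# Transparent hash migration: when a user logs in and their stored
# record still uses the legacy (unsalted MD5) scheme, verify against
# it once, then replace the record with a salted PBKDF2 one.
import hashlib
import hmac
import os

LEGACY_PREFIX = b"md5$"

def upgrade_on_login(stored: bytes, password: str) -> "bytes | None":
    """If `stored` is a legacy MD5 record and the password matches,
    return a replacement PBKDF2 record; otherwise return None."""
    if not stored.startswith(LEGACY_PREFIX):
        return None  # already migrated
    legacy = hashlib.md5(password.encode()).hexdigest().encode()
    if not hmac.compare_digest(legacy, stored[len(LEGACY_PREFIX):]):
        return None  # wrong password: leave the record untouched
    salt = os.urandom(16)
    new = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return b"pbkdf2$" + salt.hex().encode() + b"$" + new.hex().encode()

stored = b"md5$" + hashlib.md5(b"hunter2").hexdigest().encode()
upgraded = upgrade_on_login(stored, "hunter2")
assert upgraded is not None and upgraded.startswith(b"pbkdf2$")
assert upgrade_on_login(stored, "wrong") is None
```

After every active user has logged in once, the remaining legacy records belong to dormant accounts, which can be expired or force-reset separately.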
"Microsoft has produced many "standards", file formats, etc. that were insecure by design. Any vendor implementing such a standard according to the specification (if any), for the sake of compatibility, would be held liable to someone else's mistakes."
If liability for software became a reality, I assure you that MS would drop most of its defective standards, as they would be a primary target for liability suits, since they have quite a lot of haters and a huge user base. And they do like their money...
"The only people I know who insist on firm control, accreditation, licensing, annual fees for guild membership, and so on, are those can't program a computer worth a damn in the first place."
And the only ones against it are the ones who are afraid they wouldn't succeed in obtaining it. It works both ways.
I suppose you think anyone should be able to make medicine and sell it for how much they want without oversight?
As Bruce says, liability isn't easy. We'd also have to place faith in the court system to properly discriminate between oversight, negligence and criminal negligence, and factor in the price of the software. So an open source developer who gives away software would be at most warned for any oversight, while a major company that neglects to address security would be highly punished.
Anonymous above already makes a comparison with house building, where the final result is inspected.
Another comparison could be made with aviation, where there are different degrees of certification. There are "experimental" aircraft which can be home built and are subject to only minor inspection, making sure that there's little chance to hurt someone else than yourself -- and the pilot has usually a certain self interest in not hurting herself. Then there are different kinds of certification for larger aircraft, with increasing amounts of review (type certificate, production certificate) and inspection schedules.
There are some highly innovative experimental aircraft on the market (usually sold as "kits", so that the manufacturer does not need a production certificate), while there is a high barrier of entry for building commercial aircraft. While it is easy to come up with a revolutionary design on paper (VLJ, supersonic, etc.), it is tremendously hard to get to market. A lot of start-ups have failed trying. This is not usually considered a bad thing -- after all, we want our passenger aircraft to be safe.
So even with liability for software, there is no reason why people couldn't privately (or companies internally) run "experimental" software, while demanding higher standards for commercial software.
@Anonymous at July 24, 2008 8:49 AM
(putting a name down makes it easier to respond to your comments)
Personally, I think (good) processes are worth far more than accreditation or licensing.
A number of companies do just what you describe (though they'd use different terms). Some of them are succeeding nicely.
I'm using Firefox 3.0.1, and I have no problems. Perhaps there's something local at your end?
"If I use MD5 as my password hashing scheme on my applications today, I would "gladly" pay up."
Did I say you used MD5 as your password hashing scheme? No. But I think you did just admit that MD5 is unsuitable for use with passwords.
Quoting from the Supreme Court's decision in Celotex Corp. v. Catrett, 477 U.S. 317 (1986) on the Federal Rules (FRCP):
"Under Rule 56(c), summary judgment is proper 'if the pleadings, depositions, answers to interrogatories, and admissions on file, together with the affidavits, if any, show that there is no genuine issue as to any material fact and that the moving party is entitled to a judgment as a matter of law.'"
If you don't settle, then you're going to a jury on the question of whether your choice of password hashing scheme was negligent.
So if I understand correctly (English is not my first language, and I'm not that familiar with US law), you said that even if I don't use MD5 as my password hash, and I can get "proof" that the user used Post-its, then just because an expert states that MD5 is broken I still have to go to a jury?
It seems strange, but even if I do go to a jury (since I'm not using a broken standard), what is the problem?
Again, I fail to understand, since where I live we are not that keen on being judged by a "jury of our peers"; we have a judge (sometimes more than one), lawyers, the law, and that's pretty much it. (Something like your supreme court, I presume.)
"And the only ones against are the ones who are afraid they wouldn't succeed in obtaining in it. It works both ways."
... except for the fact I know many people who _could_ obtain whatever crap 'license to program' you could come up with, but who would not support this kind of rent-seeking BS in the first place.
Maybe I need to lower my standards a bit?
"I suppose you think anyone should be able to make medicine and sell it for how much they want without oversight?"
Anyone who buys and uses medicine without doing their own research, checks, etc deserves what is coming their way.
Either it "works both ways", or it does not. Your choice.
@Fred P: "Personally, I think (good) processes are worth far more than accreditation or licensing."
"A number of companies do just what you describe (if they'd use different terms). Some of them are succeeding nicely."
... and most other companies are probably doing better, while producing code of comparable quality.
But feel free to provide company names and such. If these companies are in fact accepting liability for their bugs, I sense a very lucrative lawsuit!
"seems strange, but even if I still go to a jury (since I'm not using a broken standard) what is the problem?"
In the US system, it typically takes several years for civil litigation (iow, a negligence lawsuit) to reach the summary judgment stage. That's several years of paying expensive lawyers to do "discovery"--that is, to dig out the evidence that will be presented at trial. Several years when you're turning over all your developer email, your source code repository, preparing your developers for depositions, and so on. Several years of big bucks.
The case that I sketched out presumed that you (the defendant) made the motion for summary judgment. That is, your lawyers told the judge, "Look, the essential facts here are plain and we're not arguing over them. And the law says: just get rid of this case."
All that's needed to defeat the summary judgment motion is a real dispute over the facts. If there's a real dispute over the facts needed to resolve the case, then it has to proceed to the "fact-finder". Either party can demand that the "fact-finder" shall be a jury.
After motions for summary judgment have been ruled on by the judge, then typically the real haggling over settlement terms begins. Both sides should have a pretty good idea of what the case is going to look like and what the odds are.
Your average jury won't understand "hashing" and "MD5". They won't speak the language of software engineering or security. They won't really understand the difference between "SHA-1" and "SHA-256".
Instead, the jury will listen to both sides' expert witnesses. And then eventually make up their collective mind about which party is at fault. And which party contributed. They'll do their best to figure out who should be held responsible--who should pay.
Assuming that you aren't bankrupt already, then do you want to gamble on the risks here?
If software liability were actually imposed, expect software vendors to completely specify in what kind of environment their software can be used:
1. Hardware brand, model, firmware version exactly as specified.
2. OS version exactly as specified. OS hot fixes and updates? Forget about it.
3. System drivers and services versions and configuration exactly as specified.
4. Operating temperature and humidity exactly as specified.
5. Shielding against RF interference and cosmic rays as specified.
6. Only pre-approved third party applications. Vendor, software, version, configuration all exactly as specified.
7. Addons, extensions, in-house development? Are you kiddin' me?!
8. All vendor issued patches must be applied immediately. And we mean *immediately*.
9. All requirements subject to change between version w.x.y.z and w.x.y.z+1. Yes, that applies to required patches as well.
10. Any deviation from the above requirements indemnifies vendor from liability.
11. Oh and by the way, that'll be 100x what you paid last year. Per month.
This idea is sounding a lot like communism (or pure capitalism). Great in theory, but does not pass the reality smell test. The prime beneficiary will be lawyers.
Bruce, I think this is a terrible idea, and actually am quite surprised that you've brought it up so often. Not to mention the horridly irrelevant analogies you seem to be sticking to.
"If an automobile manufacturer has a problem with a car and issues a recall notice, it's a rare occurrence and a big deal – and you can take you car in and get it fixed for free. [...] The reason automobiles are so well designed is that manufacturers face liabilities if they screw up."
This has nearly no connection, as stated, to anything regarding software. It really doesn't, and you should know this. First of all, many of the reasons cars are recalled, as well as many of the flaws that the companies are held liable for, directly cause bodily harm, or death, of one or more people in the vicinity of the car under certain circumstances. Find me one piece of software that was *directly responsible for the death of a user. Secondly, I don't see one car manufacturer being sued over the fact that someone was able to plant a car bomb and kill the driver, cut the brake lines, steal the stereo, et al. Do you? Doubtful, but I'm open. Why? Because while the car manufacturer could absolutely armor the living flack out of the automobile, to prevent all kinds of common acts of malice to it or its payload, it's absolutely ludicrous in terms of expense, usability, and convenience. Not to mention statistical relevance... (ever had your brake lines cut? neither have I, although it's definitely a real 'security hole').
"This sounds a lot like blaming the victim: "He should have known not to walk down that deserted street; it's his own fault he was mugged." Of course the victim could have – and quite possibly should have – taken further precautions, but the real blame lies elsewhere."
Here, my first question is "Where does the blame lie?". Well, if you said the mugger, you're probably right, although being that mugging is generally a crime of opportunity, I'd say the split is about 70/30, with the victim owning the 30%. Either way, another terrible analogy. How would this be prevented by holding someone liable? Obviously the software vendor isn't the 'mugger' in this scenario, nor is the victim. So who is it? The government? The alley-way? So, do we sue the businesses on the street front for not keeping the 'riff-raff' out of the alley? Or do we sue the government for not preventing all possible crime? How about the wallet manufacturer for making such an easy-to-steal wallet? Any of the options are ludicrous and you don't need a point by point to know why.
So why the terrible analogies?
Frankly, I think the answer is simply that there is no good analogy for this issue. It is a fresh field of human interaction that we are only just getting comfortable with, and likening its legislative governance to that of anything contemporary or historical is not really suitable.
You cannot hold the world at fault for being a victim of a crime, unless the world allows the crime to happen unpunished, free of interference upon awareness of the act. You cannot hold the venue of a crime at fault, unless its owner(s) bore witness to the crime during its occurrence and failed to attempt to stop it, or alert the appropriate authorities. You cannot hold the manufacturer of the weapon used (be it a Grenade, Desert Eagle, or Bic™ fucking pen) unless the manufacturer sold the weapon in question directly to the malicious people knowing full-well their intent.
It takes three to dance to this music, and frankly I really don't think the software vendors should be held responsible for all (or nearly any) forms of malicious actions performed with, on, or against their products.
I think what you propose is a very impatient, messy, and poorly thought out attempt at roping in a wild bull. The next few generations will see a boom in overall end-user awareness, an increase in quality and security of software / networks, as well as increased competency of law enforcement and government agencies to track down and deal with more and more types of 'cyber' crime (or they damn well better, now, since we can't browse anything without being stored in some DB in a bunker somewhere).
Increased transparency of software design, as well as community (and/or consumer) commentary and awareness, is what is needed here. Hold the software vendors liable in terms of competing with better, faster, more secure versions of what they offer (which, at this rate, will generally be the open source versions, free of charge minus the learning curve of some). And yes, we've all heard about your lovely grandmother not being able to keep up with the times. Bleeding hearts aside, you're right of course, but they still have the power and choice to educate themselves, somehow, on the products they use to help secure them without much knowledge / hassle. I don't know shit about how cars achieve better gas mileage from an engineering standpoint, but I sure can do a little footwork on which cars have historically (both from a manufacturer's and driver's perspective) achieved great gas mileage, and choose my purchase from there... no 'inside knowledge' required, other than an acronym (MPG). And I damn sure wouldn't buy a car without door locks... You know, to keep on with the terrible car analogy.
I think this country needs a real big effing wake up call if all of its problems need to be solved by allowing the 'victims' to sue 'those at fault'. Because in the end of that model, the only real victims are the middle-to-lower class citizens who have to pay higher taxes (could they get any higher?), and pay through the asshole for products / services (esp. insurance, jesus FCK!) just to cover the costs of lawyers, legislation, increased vigilance and quality control, et al..
This is a criminal problem. Perhaps if they pulled their head out of the non-violent-drug-users' asses, they'd figure a simpler way to lower the incentive for the criminals... which, has, historically, helped lower the rate of the crime in question altogether, but nothing will ever stop all cyber crime, or any other form of crime for that matter. In fact, I think cyber crime will prevail over most other forms in the end because it helps to dissociate the crime from the victim, since you're not facing the victim, just the idea of the victim.
I know how to break in to some types of systems with some types of configurations, I know how to exploit poor programming in many bits of software, hell, I write it for a living, I ought to, like an architect knows the weak portions of his buildings, I also know how to rob a bank, kill someone, blackmail someone, et al..
Do I? No. Never. Honestly, and neither do most (if not all) of you. We all *know how to commit crime, what separates us from doing it is our consciences, our decisions, and our relatively hostile-free environments and comfort levels.
Do I believe that the government should be held responsible in some ways for disciplining and preventing malicious acts and intents? YES!! But that also goes hand in hand with not putting a dollar sign on everything, because dollar signs are generally the root of most, if not the vast oceanic majority of all malicious intent, and let's face it kids, poverty is the number one fueling factor in most petty crimes... and this 99% / 1% economy isn't helping to lower the poverty rates.
Do I believe that the government should be held financially *liable for crime that is committed?!? Hell no... how could society operate in such a way? If every time a criminal succeeded, the 'good' guys, or the justice system always got punished for it? They would wreck the streets, and the lives, of every citizen in this country until we were so under their control that nothing even remotely suspicious happened without 6 gubmt agents being en route to or on the scene, one way or another. Who is the criminal then?
This, to me, is a terrible idea... not to mention the inevitable outcome for (arguably) the most secure software out there (with many exceptions, obviously): open source. What then Bruce? How do we hold them liable?
It takes three to dance to this music, again, and the music shouldn't be punished because it enticed one of the two dancers to get a little fresh with the other one, without permission. I will say that I agree that software needs to be more secure, obviously, and I will also grant that your idea would seriously increase the security of software products out there. However, your idea should be cast away for many of the same types of reasons that the 1st Amendment should never be compromised because of some asshole named Reverend Phelps, or Johnny Nazi using it to their twisted advantage. It's much deeper than a security issue, or a financial issue, it's about the rights of the citizens of this country, and the lessening of power / oversight granted to our government. I truly do think, that in many ways, this country needs to take more effing responsibility for their actions. Educating yourself about anything is always a good thing, being a victim because you're an idiot is how we learn. If you burn your hand on the stove, you know it's gonna get burned there. Either build a new stove that doesn't burn hands, ask someone else to build it for you, or keep your fucking hand off of the stove. The stove company is not at fault, and it's especially not at fault if someone breaks into the person's home, cuts the gas line, and fills the room with gas awaiting ignition. (You know, that 'security flaw' in most gas stoves that allows for explosions and death?) Seriously.
Now, if the software companies created security flaws with the direct intention of exploiting them themselves, or using a third party to gain from the exploitation, THEN they should be held liable. Burn em' down. Anything other than that simply needs a warning label... or does it? Don't tell me you're all for someone suing the coffee vendor because they burned themselves on the hot coffee? Give me a break. No one ever needed a goddamn warning label to know that "hot things burn", and according to your own statement:
"There's no other industry where shoddy products are sold to a public that expects regular problems"
See? Right there... you said it yourself. The public 'expects regular problems'. Just like I expect the stove to burn my hand if I touch it. Just like I expect hot effing coffee to burn me should I pour it on myself, or it gets jostled in such a way. As long as the manufacturer / vendor gives me a stove that doesn't explode during normal operations (sans third party tampering), and a coffee cup that doesn't have an intentionally placed hole in the bottom, or is known to disintegrate quickly in-hand, I'm TOTALLY fine with knowing the risks / limitations / pros / cons of the products I use, and choosing accordingly. I'm not suing Starbucks® cause some ass flipped the lid off of my cup and seared me with the contents... it's not their fault that human interaction can create havoc, it's kind of a given.
Give it up already. Let the lawyers go hungry for once.
I'll just say this: using the Internet is not a friggin' right. We're a pretty privileged country, but everyone being able to drive a car, and 'surf' the internet is not a requirement. Life went on fine without either of those things. That being said, using a car has a TON of risks, as does using the internet, as does WALKING DOWN THE STREET. You make a clear choice to perform these actions knowing some of the risks, and experiencing real-world manifestations of others... you learn. You live, you learn, you adapt, you move on. If I sued a driver or the city every time some asshole cut me off, or rolled through a four-way stop, ran a red light, etc... nothing would ever get done, and no one would drive anywhere. Likewise, if I whined about every shady character I ran into on the street to the police, or the government, and sued them for not 'cleaning it up out there', the world would seriously just crumble into idiocy.
Take some responsibility for hell's sake, trust the Market to create the rest, or if you think it's so important, take a bite out of that American dream and do it your damn self. Pointing fingers and suing when the only true fault lies in human nature is simply idiotic, and incredibly daft. If you think Mrs. 90-something with no idea what is going on should be able to safely use an electronic world community without risk, or education, create some little Playskool 'Grandma's First Internet' and market it as the most secure, sweet, cutesy, user-friendly, featureless crap tool there is and make a billion dollars. Don't ask the government to force every software creator in the world who sells a product to assume all risks the user takes when launching out into the big scary world.
I still don't see one product on the market that you can buy that says you are 100% immune from all malicious human intent when walking down the street (or surfing through the internet) while using it, and if they do, we all know it's bullshit. You said it yourself. We know the world isn't safe, we know the internet isn't safe. We also ought to know where in each we can be relatively safe, and how best to balance that safety with our preferred level of convenience. The government has very little business whatsoever making those decisions or tradeoffs for us, and you know it... because you preach about it nearly every day. Punish the guilty, lower the incentive for crime, try to save the impoverished, all the while maintaining the privacy and freedoms essential to our lives.
@Anonymous, then how does anyone, be it a health care professional, or car manufacturer, survive one liability suit in the first place, let alone several?
if you can sue, based on procedure X being wrong, even though the defendant never even used the said procedure...
one thing you fail to grasp is that secure software is secure independently of its environment. clarifying: Password Safe is a secure application, even though it runs on Windows. why? because you have to break the OS first before you can break the application; in this case you have to break Windows in order to know what password the user is typing in the password box. so Windows is liable, not Password Safe.
Insecure applications, those that are at risk of liability, are those where you don't have to break the environment before you can break them. For instance, to break a web app that is vulnerable to an SQL injection attack, you don't really care what operating system it is running on.
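The SQL injection case can be made concrete in a few lines. This is a hypothetical sketch using Python's built-in sqlite3 module (table and column names invented for illustration); note that neither version cares what OS it runs on, which is the point:

```python
import sqlite3

# In-memory database standing in for a web app's backend.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_vulnerable(name):
    # Attacker-controlled input spliced directly into the SQL text:
    # name = "x' OR '1'='1" makes the WHERE clause always true.
    query = "SELECT secret FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def lookup_safe(name):
    # Parameterized query: the input is bound as data and is never
    # parsed as SQL, so the injection string matches no row.
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)
    ).fetchall()
```

The vulnerable version leaks every row to the classic `x' OR '1'='1` payload; the safe version treats the same payload as a (nonexistent) username.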
Same happens with cars: if a giant boulder crushes your car they are not responsible for any damages; on the other hand, if on your brand new car the brakes or the airbag fail, it's their fault.
that's why you don't see them specifying that you can only drive in this or that road, or region, that you have to wear a special suit to drive the car, and so on...
"Anyone who buys and uses medicine without doing their own research, checks, etc deserves what is coming their way."
So buyer beware? So much for market entry, as who is going to buy a product without any research/checks to go on? New car models? New software? Oh, the seller didn't disclose all information? Then what?
What's the difference between direct bodily harm and financial harm? It's both harming and hindering my right to the pursuit of happiness as defined by the first amendment. It seems like you are assuming that all attacks are of malicious intent. I use a computer and then sell it. I use a disk cleaner, but it fails to do its job as described. The next user finds my credit card information and bank account information and I'm wiped clean in 2 days before I can react. Shouldn't the software manufacturer be held liable to a degree for something?
@Luis: "that's why you don't see them specifying that you can only drive in this or that road, or region, that you have to wear a special suit to drive the car, and so on..."
Yeah, like we don't have the electronics manufacturers carefully specifying and controlling the realm of application for their products solely because of liability issues. Try the common-as-spit LM311 voltage comparator:
(see bottom of page 23). I haven't actually called up the president of National to ask about using an LM311 in a safety-of-life application, but I'm pretty damn sure that the normally $0.20 part would be significantly more expensive.
Naturally, this sort of thing will never come to software. I mean, it's completely different. Or something.
@SecureApps: "So buyer beware?"
You mean there is a market where this is not true?
"It's both harming and hindering my right to the pursuit of happiness as defined by the first amendment."
Yes, that there First Amendment of the Declaration of Independence. The famous one you learned about in school. Maybe this should be the first question on the Computer Programmer Accreditation Exam?
Somewhat related: in the Summer 2008 issue of SciTech Lawyer (which doesn't appear to be available online yet--check the libraries instead), there's an interesting article on a growing trend to use the Payment Card Industry Data Security Standard version 1.1 as a legal standard in tort liability. Essentially, when a merchant handling credit card data suffers a security breach, and banks sue to recover the cost of reissuing cards, banks are increasingly pointing to a failure to follow this standard as evidence the merchant didn't take reasonably prudent precautions with the data. Several states are also considering legislation to incorporate part of the PCI into their legal standards for imposing liability.
The author points out a couple difficulties. First, damages of $20-$50 to reissue a card multiplied by the number of cards reissued can wipe out a small or medium-sized merchant. Second, the PCI was drafted as a security standard, not a legal document, so it can raise ambiguities when interpreted from a legal point of view and the sysadmins who follow it could wind up on the wrong end of a legal ambiguity even if they're complying with technical best practices.
The article's light on legal jargon and is a good read to get a sense of some of the practical difficulties it would be necessary to resolve in imposing liability for breaches.
"The next user finds my credit card information and bank account information and I'm wiped clean in 2 days before I can react. Shouldn't the software manufacture be held liable to a degree for something?"
No. Why? Because just like I check that my door is in fact locked and secured to my taste before leaving the vehicle or house, after locking it, it falls on you to check that whatever software YOU purchased after research, performs what YOU want it to perform, before blindly trusting that it does so.
If it fails to do so on a number of occasions, the incentive will be there to make sure it doesn't fail, lest they go out of business.
Do I believe that my lock will prevent all access to my home or vehicle? F#@$ no. If anyone REALLY wants to get into my house or car, they will find a way. What matters is my level of comfort regarding the ease of operation, convenience, and the statistical feeling of safety I have in the area in which I live, and the relative effort it would take to get in. If someone manages to get into my house, I'm going to blame the fates or myself, not the Master™ lock on my door which was supposed to 'secure against unauthorized entry by using a unique key'. And, to keep in tempo with your example, if it was later found that the lock itself was actually just a trick lock, that opened for anyone (going against its advertised security), then curse me twice for trusting what a corporation tells me to get me to buy their product.
Shit, if we believed every single corp. tagline as blindly as you'd have to had trust your disk-cleaning software to screw up like that, we'd have a lot more problems than trojans, ID theft, and password sniffers to worry about I'm afraid.
Wake up. You have a brain. Try to use it. Christ, this country has gotten so lazy that now, no one even wants to have to THINK anymore... just DO DO DO, or more like SIT SIT SIT, ZING ZING ZING, NOW NOW NOW, ME ME ME.
Wake up. Read something. Test things. Be skeptical. Question everything. And for hell's sake stop pointing fingers and looking for a payday every time you get screwed. Even if some jack-a-muffin 'wiped you clean' in 2 days, you've still got it better than most of the world, due simply to geography.
So suck it up, and next time sell your computer to a kindergarten class or a trusted person instead of some shady guy named IcePick0666 on craigslist. Better yet, smash your hard drive and melt down the magnetic platters first, since you should know, as a user, that that is just about the only proven way to 'clear your hard drive'.
There's always a window to break, or a wall to smash, and there are too many 'victims' out there as it is, if you want someone to coddle you throughout your life, move into a retirement home. I, for one, don't need the government hassling me, or escorting me in my daily life or online commerce for my 'own good'.
Just look at our current trampling of privacy and freedom, due to the 'liability' which public opinion has placed on our government for 9/11.
Sorry to say, bleeding hearts, that laziness and/or incompetency aside, the government is not responsible or liable for what happened on 9/11. Terrorists are. But we want some kind of guarantee from that very same, incompetent government that it won't happen again (fucking laughable, we want a guarantee that religious zealots will not succeed in performing acts of violence, something that has gone on since the dawn of humanity)... so what happened? WHAT HAPPENED? We lost a shit-ton of privacy rights, went to a useless money-pit of a war, screwed our economy, lost citizen morale, trust in our president, trust in 'foreigners', and just reverted back into a modern-day mini-dark-ages, well on our way to police state funny farms... and have we even caught the terrorists behind the attack???? I don't even have to answer that...
Real good. Liability. Blame. Seems to work... just blame the middle man, or the venue, and everything is right as rain.
"how does anyone, be it a health care professional, or car manufacturer survives one liability sue in the first place, let alone several?"
Both medical professionals and the automotive industry have adapted to the US liability environment--although in somewhat different ways.
Common to both is massive government regulation. That helps to insulate from liability, since the defense can be raised that they're just following government-approved procedures.
Consolidation of the automotive industry probably has resulted from --as much as anything else-- the large capital requirements for efficient assembly-line production. But that consolidation has left only a few large producers for the American market, and those manufacturers are well-known for retaining some of the meanest, nastiest lawyers around. There aren't any small, nimble innovative automotive manufacturers in America. And --other than style-- automotive designs have been relatively stagnant.
The health care profession has also consolidated. That's arguably more traceable to litigation and malpractice insurance costs. It's debatable how much innovation is actually taking place in medicine--but we might grant that innovation hasn't really suffered to a harmful degree. There is a shortage of health-care professionals serving rural America, and the government has attempted to remedy that situation.
I have no doubt that the software industry would survive a "vendor liability" regime. But the industry would be profoundly changed.
I think it would be changed for the worse. Much worse.
You misunderstand. I am against frivolous lawsuits and do my own research on things. But your presumption that the data was recovered by force is invalid. Maybe the installation of a different OS caused the data to be recovered unintentionally. Maybe installing an app caused it. Who knows. We don't. What we do know is the vendor said it would prevent data recovery and it failed to do its job -- at worst that is false advertising.
There is a clear difference. Based on what you are saying, Ford should not have been held liable for the issues with the Pinto's fuel tank. After all, if the person wasn't rear-ended there would have been no damage.
All I am saying is that there are cases where software vendors need to be held liable and their EULA precluding it does not mean they escape it.
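As a hypothetical illustration of why a disk cleaner's "prevents data recovery" claim is hard to honor: even an overwrite-before-delete approach, sketched below in Python, can't guarantee erasure on journaling or copy-on-write filesystems, or on SSDs with wear-leveling. The function is my own sketch under those stated assumptions, not any vendor's actual product.

```python
import os

def overwrite_and_delete(path, passes=1):
    """Overwrite a file's contents in place, then unlink it.

    This defeats casual undelete tools, but NOT forensic recovery on
    journaling filesystems, copy-on-write filesystems, or SSDs with
    wear-leveling -- old copies of the blocks may survive elsewhere,
    which is why blanket 'wipe' claims are risky to make.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents with random bytes
            f.flush()
            os.fsync(f.fileno())       # push the overwrite to the device
    os.remove(path)
```

The gap between "the file is gone from the directory" and "the bytes are unrecoverable from the medium" is exactly where a liability claim like the one above would be litigated.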
"Yes, that there First Amendment of the Declaration of Independence. The famous one you learned about in school. Maybe this should be the first question on the Computer Programmer Accreditation Exam?"
Your assumption that I'm a believer in an accreditation exam as the solution for licensing is amusing, since I never said how I would do it. But I suppose you'd be okay if people who did contracting work came to work on your house uninsured and unlicensed as well.
"Wake up. Read something. Test things. Be skeptical. Question everything. And for hell's sake stop pointing fingers and looking for a payday every time you get screwed. Even if some jack-a-muffin 'wiped you clean' in 2 days, you've still got it better than most of the world, due simply to geography. "
So do you check to see if all your pills have the right amount of whatever they are supposed to have, or do you trust whatever is on the label, because you know that your government has "got your back"?
I for one have only a basic knowledge of chemistry, which does not enable me to check said pills. and I really don't have time to learn everything I would need to know to be able to test the pills. so I trust...
when you buy or rent a house/apartment, do you check the plans, and whether the structure has any "holes", or do you just look around slightly to see that there are no obvious holes (like a wall being missing) and for the most part trust that the foundations meet the required safety standards?
again I for one don't have the time/knowledge to "test" the apartment I just rented, and if the building collapses for no good reason (a good reason being, for example, an airplane colliding with it) you can be sure as hell I will sue those who built it for liability -- in case I don't die, that is...
when a merchant loses your credit card information do you "suck it up" and feel thankful that you were born in the "right" place? or do you go to your credit card company and say that you did not make those purchases and that you want your money back? do you think they will give you the money back just because? no, they will give it back because they are required to by law...
and like these there are a lot more examples where you don't have the knowledge to "test" or even be skeptical, and you trust that your "service providers" will test and be skeptical for you, because, if they are not, you will be able to sue them and win.
Most people don't have, and never will have, the time to know everything you have to know to make sure that a certain software is secure or not.
even most computer geeks can't really tell if a piece of software is safe or not - otherwise there would be no unsafe software, since whoever is programming it would program it in a secure fashion - but they spend hours upon hours surfing the net to see the latest vulnerability releases...
most people don't have that luxury.
The fundamental problem is not software liability, but liability in general.
Suppose that a company loses full identifying information on several million people. Right now, in some states, they are legally liable to inform people that they screwed up. That, as far as I know, is as far as it goes.
Now, suppose that such a company was liable for serious money in this case (and couldn't just cover it up, for some reason). That is where liability can enter the picture.
If a company can be held liable, it will try to limit the liability. It will likely seek insurance, and the insurance companies will have standards. It will try to unload some of the liability onto assorted suppliers, and this can be done in a free market way: suppliers that offer greater security (through reputation or partial indemnification) will have a competitive advantage. If they offer partial indemnification, they will have contracts detailing how their software may be used.
Free software will not be affected. If it's economical for the job, somebody will take it and certify it, saving money over writing their own. (Right now, Red Hat will happily sell you OS packages, and support them and stand behind them, despite the fact that they didn't write or pay for most of what's on them.)
Similarly, if people were held liable for at least some of the damage done by their pwned computers connected to the internet, there would be a reason to try to get more secure computers. Right now, there is none; a user may notice a computer getting slower, but that's about the extent of it.
There is no need to slap liability onto software producers per se. All that is necessary is to slap liability for actual harm done, and the rest will follow as the market dictates.
My two thoughts:
Firstly, I can't see how vendor liability could be workable. Is it personal liability for the coder who wrote the line of code with the vulnerability, or company liability? What if the coders were working in pairs? What if the vulnerability spans several lines of code written by several different people? Should the liability rest with the company that wrote the code or the company that published the code? Do the companies only have to live up to their claims, or will there be some "minimum standard" for each product type? Will the "minimum standard" be decided in advance, or will the courts define it only when the company is sued? Can a company transfer its liability by hiring an external security auditing company to audit its code? How do you objectively measure the security of a product?
Secondly, companies will find it easier to avoid the liability by moving the relevant part of their operations offshore than by making sure their code is secure.
Vendor liability is a proposal attempting to make the average security of code better. The proposal comes from the perception that the market forces have failed to produce secure code. I think that market forces haven't failed. They just haven't succeeded yet.
Despite all of this, I think that there could still be some advantage in having legislation that covers code security. (Maybe not vendor liability, but something else that affects security.) It should be fairly soft to start with and provide only mild pressure on smaller vendors (effectively none at all on large vendors), but this pressure would make at least some of them improve. Once they have spent money on security, they will want to market that security. Greater marketing focus on security will polarise some consumers, which will affect the marketability of security and therefore the incentive for other vendors to provide and market their own security.
The legislation should be able to be tuned to match the current state of security in the industry.
"All that is necessary is to slap liability for actual harm done, and the rest will follow as the market dictates."
With high transaction costs imposed by the legal system, imperfect information among market participants, emerging technology, and a dynamically evolving threat environment--with all that--there's just no reason to believe that any initial allocation of liabilities will result in a system which approaches a stable, efficient equilibrium.
Unless --of course-- you happen to believe in the magical market fairy.
Correct me if I am wrong, but most of these arguments boil down to:
1) A software product fails
2) This incurs a loss
3) The software user wishes to recover the loss
4) They approach the software vendor
5) The vendor points to the End User Licence.
And the result is stalemate.
There are two "free market" solutions,
1) Do not buy software
2) Only buy software from vendor with liability protection and appropriate end user licence.
Since option 2 is usually unavailable (except for bespoke software, which only governments can afford these days), you effectively have only option 1, which in a vendor-only world is the same as going without.
Variations on the "restrictive licence" problem were one of the reasons the Free/Open software movement got going.
There is however another solution other than Free/Open software, we use it when we drive a car and usually for our homes and holidays etc. It is called Insurance.
In the case of drivers, most countries require you to have insurance before you get behind the wheel of a car. Likewise, most sensible people insure their homes and buy holiday insurance for their possessions and health whilst travelling.
The price we as users pay for insurance is usually dictated by "known risk" and is calculated by a very well known process.
So why not require the following:
1) Software vendors have product liability insurance.
2) Users have loss/liability insurance.
All that is actually required is a method (metrics) that insurance companies can use to rate products. As many know, Underwriters Laboratories (UL) already does this for all sorts of products.
To encourage users to use more secure products (i.e. window locks, alarms, etc.), insurance companies offer discounts for their use. Likewise, if you have high-risk items, you need to declare them and pay a higher premium for the increased risk.
Perhaps we should try mandatory insurance for users before they connect to the Internet etc. (yes, I know it has a lot of issues in itself) as an alternative to making vendors 100% liable for all eventualities.
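The "known risk" pricing mentioned above is essentially expected-loss arithmetic: the insurer charges the expected annual loss plus a loading for expenses and profit, with a discount for demonstrably more secure products. A minimal sketch (all probabilities, loss figures, loading, and discount values below are hypothetical, purely for illustration):

```python
def annual_premium(p_incident, expected_loss, loading=0.3, security_discount=0.0):
    """Toy actuarial pricing: expected annual loss, plus an expense/profit
    loading, reduced by a discount for more secure products/behaviour."""
    pure_premium = p_incident * expected_loss        # expected annual loss
    gross = pure_premium * (1.0 + loading)           # add insurer's loading
    return gross * (1.0 - security_discount)         # apply security discount

# Hypothetical numbers: a 2% chance per year of a $50,000 breach.
base = annual_premium(0.02, 50_000)                            # 1300.0
with_discount = annual_premium(0.02, 50_000,
                               security_discount=0.15)         # 1105.0
```

Under this toy model, the premium gap (here, $195 a year) is the market signal: the measurable value of running a product the insurer rates as more secure.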
"Offtopic, but does Bruce ever take part in the commentary to his blog posts? I haven't found an instance of his replying to any of the comments posted, though my Google-fu may be weak."
Yes, I do.
I read the comments here, and occasionally I respond. Unfortunately, the comment sections have gotten so big and I have gotten so busy that I don't have time to keep up with 100% of it. Often, by the time I have a chance to read a thread, everyone else has moved on.
Regarding software liabilities and free software, I am preparing a post right now. We can debate it there.
But honestly, I worry about the knee-jerk "lawyers are evil" people. Lawyers are neither good nor evil. Like everything else in a market economy, they are self-interested. And, like everything else in a market economy, they work if their self-interest furthers societal interest.
It seems to me that what we need is not a one-size-fits-all legal standard of strict liability, but a marketplace in which liability is one of the points to be bargained over by buyer and seller.
And the best way to get there, in my view, is to explicitly void all "clickwrap licenses" and replace them with the old common law principle that states: if you purchase something without a written contract, then the deal ends when the money and goods change hands (that is, all terms of the deal must be stated by then, or it's too late).
This would mean that Microsoft or McAfee could still offer their software under the extreme one-sided licenses they do now, but only if they print the license on the OUTSIDE of the package so you can read it before you get to the cash register at Fry's. (Similarly for sales-by-downloading, the "clickwrap" would have to occur BEFORE the checkout screen.)
A nice side effect of this requirement would be that consumers might actually take the time to read these things more often before making the purchase -- and would have more of an opportunity to compare the terms offered by different vendors.
I would also like to see some regulation of what the licenses can say, but that will be less necessary once the licenses are available to be read before the decision to purchase has been made. (Mainly, I would ban license terms that prohibit the purchaser from sharing his experiences with that software with other potential buyers, as certain antivirus package publishers have been known to do.)
Actually I'm very interested in
a) a user with little knowledge installs software
b) the software fails to provide basic protection
c) the user's software gets taken over and used for bad things
d) the user doesn't care because they don't realise and wouldn't understand even if they did.
The free market doesn't have any influence on this.
Please don't use the word 'incentivise' again, it's a made-up business word. There's a perfectly good alternative which already exists - 'motivate'. Failing that, just be a little more verbose, 'give incentive to'.
PS - Loving your work.
Schneier.com is a personal website. Opinions expressed are not necessarily those of BT.