Schneier on Security
A blog covering security and security technology.
August 13, 2007
House of Lords on Computer Security
The Science and Technology Committee of the UK House of Lords has issued a report (pdf here) on "Personal Internet Security." It's 121 pages long. Richard Clayton, who helped the committee, has a good summary of the report on his blog. Among other things, the Lords recommend various consumer notification standards, a data-breach disclosure law, and a liability regime for software.
Another summary lists:
- Increase the resources and skills available to the police and criminal justice system to catch and prosecute e-criminals.
- Establish a centralised and automated system, administered by law enforcement, for the reporting of e-crime.
- Provide incentives to banks and other companies trading online to improve data security by establishing a data security breach notification law.
- Improve standards of new software and hardware by moving towards legal liability for damage resulting from security flaws.
- Encourage Internet Service Providers to improve the customer security they offer by establishing a "kite mark" for internet services.
If that sounds like a lot of the things I've been saying for years, there's a reason for that. Earlier this year, I testified before the committee (transcript here), where I recommended some of these things. (Sadly, I didn't get to wear a powdered wig.)
This report is a long way from anything even closely resembling a law, but it's a start. Clayton writes:
The Select Committee reports are the result of in-depth study of particular topics, by people who reached the top of their professions (who are therefore quick learners, even if they start by knowing little of the topic), and their careful reasoning and endorsement of convincing expert views, carries considerable weight. The Government is obliged to formally respond, and there will, at some point, be a few hours of debate on the report in the House of Lords.
If you're interested, the entire body of evidence the committee considered is here (pdf version here). I don't recommend reading it; it's absolutely huge, and a lot of it is corporate drivel.
EDITED TO ADD (8/13): I have written about software liabilities before, here and here.
EDITED TO ADD (8/22): Good article here:
They agreed 'wholeheartedly' with security guru, and successful author, Bruce Schneier, that the activities of 'legitimate researchers' trying to 'break things to learn to think like the bad guys' should not be criminalized in forthcoming UK legislation, and they supported the pressing need for a data breach reporting law; in drafting such a law, the UK government could learn from lessons learnt in the US states that have such laws. Such a law should cover the banks, and other sectors, and not simply apply to "communication providers" — a proposal presently under consideration by the EU Commission, which the peers clearly believed would be ineffective in creating incentives to improve security across the board.
Posted on August 13, 2007 at 6:35 AM
Unfortunately in the UK at the moment the (elected) Government wants to take the opposite approach.
1. The banks should investigate card fraud.
2. Auction houses etc. should investigate art fraud.
and so on.
I guess we should let the winning politician investigate any possible electoral fraud enquiry as well...
>(Sadly, I didn't get to wear a powdered wig.)
A sad day indeed. Did you try to get a wig past courthouse security?
"Sadly, I didn't get to wear a powdered wig"
The history of wigs might just make your skin crawl.
Back 300 years ago it was quite common to shave all your hair off as a cure for / prevention of body lice, fleas and other such parasites, which then of course necessitated the use of wigs.
A lot of wigs had a base of highly flammable horse hair and wicker, and due to the use of candles for lighting, some of which were in chandeliers, it was not unknown for the more ornate ladies' wigs to catch fire. They also often had, amongst other things, mice living in them.
Also, the white powder used on wigs in later years was formed by putting rolls of lead sheet in pots of urine to produce a brilliant white lead compound, which was also used in lead-based paints. As we know, lead is very undesirable in the body: in sufficient quantities it can cause infertility (supposedly what killed off the Roman aristocracy) and many forms of madness, open running sores and other less than pleasant medical conditions...
"legal liability for damage resulting from security flaws."
Who would you hold liable? Presumably not just the author of the software, who might be, say, an anonymous contributor to an open source project based in Sweden, and hence untraceable by US courts.
Would liability only exist where there is a commercial contract to supply software code and/or service? Contractors would then supply FOSS if they were willing and able to assume the risk of any flaws in the code. But that doesn't help the sysadmin of an average mid-sized company, who wants to use open source products without contracting an outsider to install and maintain them. Would that mid-sized company expect to sue individual contributors to the linux kernel, or Apache, or Firefox, or whatever? Or would they sue the Mozilla Foundation for giving them defective software?
Open-source or no, if you are paid money to provide a secure system and do not take due care or due diligence to ensure that security, you should be held liable for that failure. Just who gets held liable may be hard to trace, unfortunately. Still, it's taking the easy way out to pretend that nobody can be blamed. As long as you do what you are told and CYA appropriately, you should be able to prove that A) You weren't the weak link, or B) Despite your best efforts this was a failure that could not be prevented.
Following on from this what's the odds of the next American president inviting you into the Oval Office for a chat about Security Theater and how the US can best spend its vast security budget in future?
"Who would you hold liable? Presumably not just the author of the software, who might be, say, an anonymous contributor to an open source project based in Sweden, and hence untraceable by US courts."
Wow, I didn't realise you had a House of Lords in the USA, too!
This is an interesting example of one of the pros of an aristocracy. If one doesn't have to worry about getting re-elected, one doesn't have to worry about kowtowing to whoever gives one the largest campaign contribution.
Unfortunately, there are a few cons as well.
The UK government is still leaking out metadata.
From the metadata, the last editor of this document was 'hawkinsm'.
Searching for firstname.lastname@example.org turns up
So the last editor is almost certainly one Meg Hawkins and 'hawkinsm' is almost certainly her network user id.
Anyone with access to the network care to start banging away on this id?
The Wikipedia article on wigs claims that wig powder was in fact made from starch, not lead. It's true that white lead was used for other cosmetic purposes in earlier times, though.
The House of Lords has been moving away from hereditary peers. There are now only 92 left, the rest being appointed or 'spiritual peers'.
It seems likely that the House of Lords will eventually be replaced with a fully or largely elected House, as earlier this year the majority of members of the House of Commons voted for a 100% elected Upper House with a long term of office.
I'd like to do something similar with the Monarch.
@Anonymous, "Wow, I didn't realise you had a House of Lords in the USA, too!"
yeah, but we call it the Senate.
@SteveJ, liability should be assigned to whoever screwed up.
If a vendor sells you a license to use software rife with buffer overflows, they should have some liability for the consequences of their careless development and testing practices.
Perhaps there should be some punitive measures as well; things like buffer overflows are so totally preventable that it borders on malice to leave them uncorrected. Arguably, such egregious systemic flaws in the development process should bear more serious penalties than simple coding or implementation errors. Vendors choose not to address the issue for financial reasons, so give a strong financial incentive for them to do the right thing (or a penalty for doing the wrong thing)!
As others have stated, removing the ability for vendors to avoid liability for software errors would have an extremely chilling effect on both hobbyist programmers and open source development. Not many individuals are going to put their assets on the line to contribute to open source development or to release some niche freeware tool.
The right to damages should be constrained to situations involving commercial transactions and cases of malware installed without informed consent. There should be no, or extremely little, liability for individuals except in cases of deliberate backdoors and similar acts of fraud.
I think that liability as strict as you're suggesting would be somewhat chilling of software development. I think it's reasonable to post software publicly, saying, "I don't know whether this is secure or not, but you might find it interesting or useful", or "I know for a fact that this isn't very secure, so I only use it in a sandbox", or "this code is very definitely insecure: I post it as an example of what not to do". But I can't do those things if I'm going to get sued should someone choose to run my code and get owned as a result.
This is not least because it's not possible for a software library or routine to "be secure". Only a complete system can be secure, and the best code in the world can be mis-used and made insecure. I don't think it's reasonable for a programmer to have courts make a presumption that he is liable for the results of someone running code that he offers them. It should be possible to publish source without taking responsibility for it indefinitely.
Companies are allowed to disclaim that their software is not for use in safety-critical systems: I think there should likewise be a way to write software not suitable for use without further security audit.
There's a serious question here, of whether it's reasonable to run code without assuming liability yourself, and conducting a security audit if you think it's appropriate. I think that responsibility should lie with systems designers, not with someone who gives away code for the good of the community. If you sell a "solution", as opposed to giving away source, then I think that's a different matter, and it would be helpful to put some liability on people for the code they sell.
@Anonymous: I am British, I said "US courts" in that post because by "you" I meant Bruce (who also advocates liability for software defects), not the House of Lords.
Liability doesn't belong on software developers, and as much as I consider the needs of security overlooked in our networked era, I will do everything in my meager power to fight software author's liability.
Others have pointed out the chilling effect this could have on hobbyist and open source programmers, but it goes further. If a credit card company uses Windows on their network and that results in a breach that leaks consumer data, we might want to hold Microsoft responsible. But what if the NSA is using Windows to hold top-secret information? Should Microsoft's security have been "so good" that anyone could use it for any purpose and expect it to work?
What about disclaimers of warranty? These are present in practically every end-user license agreement, and they appear to have the effect of nullifying any attempted liability law. You might decide that liability laws supersede disclaimers of warranty, but then how could someone possibly provide cheap programming services? Any real warranty comes with huge built-in costs (for insurance and engineering). So you've just screwed over self-employed programmers.
Liability belongs with the people who collect sensitive data. They might misuse Windows (say, not keep the software up to date, or not turn on the firewall) and that could very well lead to a breach. They might also have bought the wrong product. Placing the liability on the data miners has two effects:
1. Increased privacy resulting from more diligence on the part of data miners and some companies deciding collection isn't worth the risk, insured or otherwise;
2. Better computer security, resulting from the market demand data miners will put on software vendors to bring their insurance costs down.
Liability again.... There's nothing stopping anyone from buying software for which the vendor accepts complete liability right now.
Oh wait, no one knows how to write software like that at prices people can afford.
Mandating liability will simply create a black market. (legal supply cannot meet demand)
I think the operative concept here should be 'reasonable expectation of security.' If a software product is marketed for purpose X, it should be secure when used for this purpose. Software vendors should only be liable if their products are insecure for the purpose they were marketed towards. If a customer misuses a product, then the liability should fall on their shoulders.
For example, if a bank suffers a data breach because they used software designed only for home use, the bank should be held liable for using inappropriate tools. It is not the software developer's fault that some numbskull used the wrong software for the job.
In contrast, if an end user plugs a WinXX box into a broadband connection and gets owned within minutes, Microsoft should be liable. There should be a reasonable expectation that e.g. XP Home is secure enough not to get owned when used by the typical home user.
To protect hobbyist developers and open source projects, there could be a class of software designed for experimental use. Like experimental aircraft, use would be on an 'at your own risk' basis. Also like experimental aircraft, experimental software should not be made available commercially. If money changes hands for anything more than the cost of distribution, software should not be allowed to be disclaimed as experimental.
Liability for consultants and employees in work-for-hire situations should be left up to their contracts. If a purchaser wants to assume all liability themselves, they should be free to do so.
Do any banks or brokerages let me paste a PGP public key into an account webpage, so that all email they send me is encrypted with that key? What is this infatuation with banks and brokerages demanding and begging me for an email address where they send clear-text email to?
Do any banks or brokerages have a web-based account where I can go to a page and see all the IP addresses where I have supposedly logged in from?
Something fishy is going on. It's not believable that crypto experts are this incompetent at convincing people to implement basic security enhancements. Oh, let's talk about liquids on planes instead. Cool distraction there.
Another problem with software liability laws is that software is made up of many components, possibly from many vendors. Given the heavily litigious nature of America, and the number of class-action ambulance chasers, what happens when a data miner has a break-in? Let's say a solution uses Windows and Oracle's database. Attorneys, looking for a big settlement, will probably sue both companies. If the security problem were Microsoft's fault, should Oracle be forced to pay a settlement, or pay attorney's fees to prove it was Microsoft's fault?
Worse, courts really don't understand technology. Courts also favor parties with more money - not intentionally, but to some natural extent. Trying to litigate the cause of a security breach in an age when software is getting more and more complicated seems doomed to fail. And it will drive the cost of software way up.
Put the liability on the party that loses the data. We need liability there anyway, in case an employee is storing 1,000,000 unencrypted social security numbers on a laptop that they leave at the airport.
I'm really in my armchair at this point, but I suspect, based on what I've seen in the press, that most breaches of security happen due to bad practice rather than bad programming. That's not to say bad programming isn't rampant, but proper compartmentalization, defense in depth and patching practices are often sufficient to safeguard data.
Something like limiting software liability to e.g. 1000 times sales cost less physical media cost makes hobbyists immune and also provides a reasonable limit for others. A requirement that administrators demonstrate that they have followed, as far as reasonable, the instructions that came with the software will also help.
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.