Schneier on Security
A blog covering security and security technology.
May 1, 2013
Google Pays $31,000 for Three Chrome Vulnerabilities
Google is paying bug bounties. This is important; there's a market in vulnerabilities that provides incentives for their being kept secret and exploitable; for Google to buy and patch them makes us all more secure.
The U.S. government should do the same.
Posted on May 1, 2013 at 1:58 PM
Sure, why not? Just add it to the whole Federal CyberSecurity budget :P
This will work about as well as the rat bounty did in Ankh-Morpork :-)
Actually, the US government does collect/pay for vulnerabilities. But it's to keep them secret and exploitable...
Set up a honeypot to grab the 0days Google is paying for.
This is nice because it takes a bug off the black market and to the vendor, and it makes money for you.
The TLAs are already running such honeypots, but unlike you, they won't sell the 0days back to the vendors.
Google paying for bug bounties has the main effect of increasing the prices that intelligence agencies, bot herders, and other blackhats pay for 0days. These guys will always buy the best (most exploitable, hardest to fix) vulns. If the vulnerability finders need cash, they can always sell their second-tier vulns to Google. So this will result in more vulnerabilities being fixed before they're exploited, but probably won't affect the high-end trade at all (other than increasing prices).
There's a Dilbert cartoon for this, something like:
"I just wrote me a new car".
It's all good until Google employees figure out that they can outsource the discovery claims and share the profit.
Google paying for bug bounties has the main effect of increasing the prices that intelligence agencies, bot herders, and other blackhats pay for 0days.
Which will directly affect their cash flow, so is a good thing. More important however is that more folks will jump on the bandwagon given a fully legitimate incentive to compensate for their time and hard work. Not everyone is driven by selling to the highest bidder only. Any vulnerability found, reported to and paid for by a vendor is one less we need to worry about.
As for the US and other governments doing the same, I highly doubt it. There are probably already way too many "cyberdefense" folks, agencies, and companies out there making a career and a mint out of weaponising vulnerabilities and exploits. As if they would care about common-sense solutions that actually benefit the general public instead of themselves.
Boss: Our goal is to write bug-free software. I'll pay a ten-dollar bonus for every bug you find and fix.
Alice: WE'RE RICH
Wally: YES!!! YES!!! YES!!!
Boss: I hope this drives the right behavior.
Wally: I'm gonna write me a new minivan this afternoon!
IIRC Ankh-Morpork "rat farming" is based on historical events.
For something similar to apply to "bug bounties" it would need to be possible for those claiming for the bugs to somehow create them. Most likely involving conspiracy between claimants and developers.
If you have employees dishonest enough to add security bugs to your product, and then reveal them to 3rd parties so they can cash in and split the winnings, then you have more problems than your monetary loss.
I would have thought this sort of thing would become obvious after a while, because you would find that some people seemed to be introducing more security bugs than you would expect. Also, code review makes it non-trivial to introduce an obvious one; you'd need to be pretty creative.
Mozilla also pays security bug bounties: http://www.mozilla.org/security/bug-bounty.html . We've had a lot of success with it.
@Gervase Markham "Also, code review makes it non-trivial to introduce an obvious one; you'd need to be pretty creative."
underhanded.xcott.com shows that code review is not useful when the evil programmer is bribed with a promise of $100.
A developer can introduce a backdoor that evades code review simply by introducing a business-logic vulnerability into the code.
Even if they introduce a critical vulnerability that is not based on business logic, and it is found by code review, it will be chalked up as an accident.
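A minimal sketch of the point above, with hypothetical function and variable names of my own (not from the thread). Every line type-checks and reads plausibly in review; the flaw is a missing business rule, not a memory-safety or injection bug:

```python
# Hypothetical checkout routine. Nothing here is wrong at the language
# level -- the vulnerability is that quantities are never required to
# be positive, which is a business rule a reviewer must know to enforce.

def order_total(prices, cart):
    """Sum line items for a cart given as {item: quantity}."""
    total = 0
    for item, qty in cart.items():
        total += prices[item] * qty  # no check that qty > 0
    return total

# A negative quantity on a cheap item silently discounts the order:
# order_total({"tv": 500, "cable": 10}, {"tv": 1, "cable": -40}) -> 100
```

Static analyzers and casual review rarely flag code like this, because only someone who knows the business rules would object; that is what makes business-logic flaws attractive for a deliberately planted backdoor.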
The US Government does pay for security bugs. There are a lot of defense contractors (and consultancies) who do this work for them. This is no secret.
They probably do it for offense, though they could also be using it to detect attacks from foreign powers ahead of time, for counterintelligence purposes. There is enormous value in that, more so than in attacking foreign countries for spying purposes.
$31,000 (roughly $10K per bug) is better than the complete crap legitimate vendors often offer for this. But it is still complete crap.
Companies pay for the creation of ideas and functionality, so why shouldn't they pay for vulnerabilities, which are just the inverse of that same kind of creative work? I applaud this. Security researchers deserve to get paid for making products more solid. In a perfectly competitive market, the company would have to pay for those services anyway; it's only the frontierist, perma-beta mentality of the software industry that allows them to avoid it. If the software industry were as mature as the thousands-of-years-old carpentry trade, easily breakable products would not be acceptable, and the normative consequence of this negligence would be easily winnable lawsuits.
"The U.S. government should do the same."
Aren't we giving enough to China?
I think it's a great idea. If you haven't checked it out, Synack (www.synack.com) is bringing this to a larger swath of the enterprise.
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.