A Security Market for Lemons

More than a year ago, I wrote about the increasing risks of data loss because more and more data fits in smaller and smaller packages. Today I use a 4-GB USB memory stick for backup while I am traveling. I like the convenience, but if I lose the tiny thing I risk all my data.

Encryption is the obvious solution for this problem—I use PGPdisk—but Secustick sounds even better: It automatically erases itself after a set number of bad password attempts. The company makes a bunch of other impressive claims: The product was commissioned, and eventually approved, by the French intelligence service; it is used by many militaries and banks; its technology is revolutionary.

Unfortunately, the only impressive aspect of Secustick is its hubris, which was revealed when Tweakers.net completely broke its security. There’s no data self-destruct feature. The password protection can easily be bypassed. The data isn’t even encrypted. As a secure storage device, Secustick is pretty useless.

On the surface, this is just another snake-oil security story. But there’s a deeper question: Why are there so many bad security products out there? It’s not just that designing good security is hard—although it is—and it’s not just that anyone can design a security product that he himself cannot break. Why do mediocre security products beat the good ones in the marketplace?

In 1970, American economist George Akerlof wrote a paper called “The Market for ‘Lemons’,” which established the theory of asymmetric information. He eventually won a Nobel Prize for his work, which looks at markets where the seller knows a lot more about the product than the buyer.

Akerlof illustrated his ideas with the used car market, which includes both good cars and lousy ones (lemons). The seller knows which is which, but the buyer can’t tell the difference—at least until he’s made his purchase. I’ll spare you the math, but what ends up happening is that the buyer bases his purchase price on the value of a used car of average quality.

This means that the best cars don’t get sold; their prices are too high. Which means that the owners of these best cars don’t put their cars on the market. And then this starts spiraling. The removal of the good cars from the market reduces the average price buyers are willing to pay, and then the very good cars no longer sell, and disappear from the market. And then the good cars, and so on until only the lemons are left.
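To make the spiral concrete, here is a minimal sketch of the unraveling, under illustrative assumptions that are not Akerlof’s: quality is uniformly distributed, sellers won’t part with a car for less than its quality, and buyers value a car at 1.5 times its quality but can only observe the average.

```python
# Akerlof's unraveling under the assumptions above. Quality q is
# uniform on [0, ceiling]; buyers bid 1.5x the average quality on
# offer; sellers withdraw any car worth more than the bid.

ceiling = 1.0  # quality of the best car still offered for sale
for round_no in range(1, 11):
    avg_quality = ceiling / 2      # mean of uniform [0, ceiling]
    price = 1.5 * avg_quality      # buyers bid on average quality
    ceiling = price                # cars with q > price withdraw
    print(f"round {round_no:2d}: price = {price:.4f}")

# The price falls by 25% each round; in the limit only the worst
# lemons remain on the market.
```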

In a market where the seller has more information about the product than the buyer, bad products can drive the good ones out of the market.

The computer security market has a lot of the same characteristics as Akerlof’s lemons market. Take the market for encrypted USB memory sticks. Several companies make encrypted USB drives—Kingston Technology sent me one in the mail a few days ago—but even I couldn’t tell you if Kingston’s offering is better than Secustick, or better than any other encrypted USB drive. They use the same encryption algorithms. They make the same security claims. And if I can’t tell the difference, most consumers won’t be able to either.

Of course, it’s more expensive to make an actually secure USB drive. Good security design takes time, and necessarily means limiting functionality. Good security testing takes even more time, especially if the product is any good. This means the less-secure product will be cheaper, sooner to market and have more features. In this market, the more-secure USB drive is going to lose out.

I see this kind of thing happening over and over in computer security. In the late 1980s and early 1990s, there were more than a hundred competing firewall products. The few that “won” weren’t the most secure firewalls; they were the ones that were easy to set up, easy to use and didn’t annoy users too much. Because buyers couldn’t base their buying decisions on the relative security merits, they based them on these other criteria. The intrusion detection system, or IDS, market evolved the same way, and before that the antivirus market. The few products that succeeded weren’t the most secure, because buyers couldn’t tell the difference.

How do you solve this? You need what economists call a “signal,” a way for buyers to tell the difference. Warranties are a common signal. Alternatively, an independent auto mechanic can tell good cars from lemons, and a buyer can hire his expertise. The Secustick story demonstrates this. If there is a consumer advocate group that has the expertise to evaluate different products, then the lemons can be exposed.

Secustick, for one, seems to have been withdrawn from sale.

But security testing is both expensive and slow, and it just isn’t possible for an independent lab to test everything. Unfortunately, the exposure of Secustick is an exception. It was a simple product, and easily exposed once someone bothered to look. A complex software product—a firewall, an IDS—is very hard to test well. And, of course, by the time you have tested it, the vendor has a new version on the market.

In reality, we have to rely on a variety of mediocre signals to differentiate the good security products from the bad. Standardization is one signal. The widely used AES encryption standard has reduced, although not eliminated, the number of lousy encryption algorithms on the market. Reputation is a more common signal; we choose security products based on the reputation of the company selling them, the reputation of some security wizard associated with them, magazine reviews, recommendations from colleagues or general buzz in the media.

All these signals have their problems. Even product reviews, which should be as comprehensive as the Tweakers’ Secustick review, rarely are. Many firewall comparison reviews focus on things the reviewers can easily measure, like packets per second, rather than how secure the products are. In IDS comparisons, you can find the same bogus “number of signatures” comparison. Buyers lap that stuff up; in the absence of deep understanding, they happily accept shallow data.

With so many mediocre security products on the market, and the difficulty of coming up with a strong quality signal, vendors don’t have strong incentives to invest in developing good products. And the vendors that do tend to die a quiet and lonely death.

This essay originally appeared in Wired.

EDITED TO ADD (4/22): Slashdot thread.

Posted on April 19, 2007 at 7:59 AM • 51 Comments

Comments

Clive Robinson April 19, 2007 8:40 AM

“If there is a consumer advocate group that has the expertise to evaluate different products, then the lemons can be exposed.”

Physical security is usually rated by UL (in the U.S.) and given an appropriate rating.

Is there any reason (apart from money and expertise 😉) why they or somebody similar should not do it for computer security?

I guess the market first has to ask for it, and then encourage it by buying only rated products.

However, computer security is (almost) infinitely more difficult to evaluate than physical security, and the market is evolving so quickly that getting the required expertise is going to be difficult at best.

So I don’t see it happening any time soon, which kind of suggests that it requires legislation (as with lemon cars), but that is a very, very thorny issue, especially with Open Source software.

Mike Sherwood April 19, 2007 9:02 AM

There is no market for good security.

Why do you use PGPdisk? Have you personally audited the code? Or is it based on the reputation of PGP? Do you feel that it’s more likely to be based on good security practices because the original author took great personal risk instead of compromising his integrity? Reputation and widespread adoption by people far more knowledgeable than myself are my reasons for trusting PGP. Of course, it didn’t hurt that when I started using PGP, there was no commercial version. It was written by people motivated by something other than profit.

All of the products in any market segment are differentiated by price, features*, price, company reputation, price, politics and price.

Features is a tricky one to compare because you’re really comparing claims of company A against claims of company B. Both companies know this will happen, so the marketing people on both sides will bend the truth (like saying “yes” when “no” would be more factually accurate) to make their product compare favorably.

In any company, price is always going to end up being the most significant deciding factor. The rest of the substance will be lost by the time someone with the money to authorize the purchase decides between Widget X at $A or Widget Y at $B. The lower of $A and $B will be the decision most of the time. The rest of the time can be explained by kickbacks, golf buddies, or other personal relationships that override merit.

When a company looks at security, it’s a return on investment calculation and nothing more. It’s not like the NSA, where a lot of people’s lives can be on the line and so it’s quite reasonable to spend several million dollars to design and test something to be as secure as anyone can make it. No company could ever survive in the business world with that type of attitude. The only private companies that get away with it are large-scale government contractors, and that’s only because the client understands the impacts of purely cost-driven decision making.

Mediocre products beat good ones because they’re cheaper. Bottom line price replaced value for expense as a consideration a long time ago.

Timm Murray April 19, 2007 9:09 AM

“In a market where the seller has more information about the product than the buyer, bad products can drive the good ones out of the market.”

Interesting, I’ll have to track down that paper. It’s similar to ideas I’ve had kicking around a while. Specifically, capitalism works really well for products that don’t have a large variation in quality. The product from one cotton distributor is more or less identical to the product from another. Distributors can only compete on price and customer service.

However, capitalism tends to break down when the products become more complex. How does the average person choose between computers or home entertainment systems? Even the new car market has these problems. Design and manufacturing problems may not be apparent to anyone (including the manufacturer) for several years. The more complex the product, the more difficult it is to understand if you’ll like it or not.

When that happens, the old argument “if you don’t like that product, don’t buy it” breaks down because it’s difficult or impossible to know if you won’t like it.

Michael Richardson April 19, 2007 9:19 AM

There are now USB sticks that run the “U3” (I think it is) software. They have a “standard” VFAT partition on which there is a Windows XP driver that installs itself (without being asked), so that you can access the rest of the stick. These sticks seem to sell for LESS than the equivalent stock sticks.

Naturally, they don’t work in anything other than XP (making USB for transferring files useless), the “driver” is very difficult to get uninstalled (you have to download another program of dubious origins to uninstall it. Sony rootkit, anyone?), it’s very hard to determine if a given stick for sale at FutureShop is U3 or not, and it’s very difficult to remove the “protection”.

What all of this really says is that the whole SCSI-over-USB with a VFAT file system interface for USB sticks is wrong. We should have been running NFS over USB, and left the file system structure (and whether it is encrypted or not) up to the USB stick, with its myriad of fingerprint scanners or whatever.

I’ve been told that among the DRM/TPC crap of the WMA architecture there is actually a file system access protocol over USB that is pretty good. If someone could convince MS to liberate and standardize that code, they would get more adoption of WMA into USB devices (including, of course, MP3 players), and perhaps one could have better USB devices.

As for 4G USB sticks, I just say no. I buy a dozen of the smallest USB sticks I can find (last time it was 128M, but those are gone), or for my camera, the smallest CF cards I can find. Much better to lose 128M of stuff when the device is lost than 4G of stuff. I bet 90% of USB data loss is not due to loss or theft, but rather due to laundry machines.

Assess your risks properly.

Sheila April 19, 2007 9:28 AM

This publicly held company has a stick that has been tested worldwide. It also has a feature that causes it to automatically erase itself.

It is interesting that their warranty states the company “makes no representations or warranties that Your Information will not be lost, misused or altered”, even though the company touts that it invented (and has a patent on) the first “transfer-resistant product” in the industry.

From over a million units sold and an estimated 200,000 lost sticks, there has not been a single report of the product being compromised. I always carry one with me.

http://www.revlon.com/Corporate/History1990.aspx

John Neystadt April 19, 2007 9:31 AM

I think that the comparison is inadequate. What characterizes the used car market and other lemon markets is that the seller is doing one transaction and doesn’t care about his reputation.

The choice of technology providers, and especially of software, is often long term, as I expect to upgrade to the next version once it is available. Therefore, a company that sells a lemon will quickly acquire a bad reputation and lose.

Bruce Schneier April 19, 2007 9:34 AM

“Why do you use PGPdisk? Have you personally audited the code? Or is it based on the reputation of PGP?”

It’s not the reputation of the company; it’s the reputation of the technical team working for the company. I know CTO/CSO Jon Callas personally, and trust him. I trust him to hire a good technical team and develop a secure product. I also know some of the programmers personally.

So yes, it’s trust. But it’s more trust in individuals.

casey April 19, 2007 9:42 AM

Thanks for your insights on “secure USB sticks”. I too have long doubted their security due to the lack of any real specs.

Jack C Lipton April 19, 2007 9:54 AM

Realize that “good quality” is part of “customer service”: those who invest in good quality are investing in good customer service, and so don’t survive, because there’s no boost to shareholder value from good quality… this financial quarter.

Stephan Samuel April 19, 2007 10:09 AM

In some industries, there are places one can go that only sell reliable equipment. For instance, IBM doesn’t sell consumer equipment to their corporate customers. You pay a premium for everything, but your server hard drives are SCSI and your RAM is registered. Is there something similar in the security world? If not, maybe someone should start one.

bzelbob April 19, 2007 10:11 AM

In this context, good “security” is like good “quality”; i.e., you aren’t aware of it until something goes wrong. A lot of people aren’t aware that they need security until someone steals their info or does something else with a negative effect on the purchaser of a product.

I remember the big buzz for quality in the ’80s, when the work of Deming and others showed that rather than testing, testing, testing, we need to build quality in from the outset. The only problem with that approach is that it required re-education, and that takes time and effort. This explains why many companies won’t make the effort. Why should they spend money providing something the customer doesn’t even know they need?

Of course, smarter companies could get ahead of the curve. Having some sort of standards, even small ones, would be a lot more than what we have now.

Roy April 19, 2007 10:43 AM

‘Lemonizing’ was Microsoft’s strategy when they debuted Windows as ‘just as good as a Macintosh — for less money’.

Matt from CT April 19, 2007 10:56 AM

@ Michael Richardson

“There are now USB sticks that run the ‘U3’ … These sticks seem to sell for LESS than the equivalent stock sticks. Naturally, they don’t work in anything other than XP (making USB for transferring files useless).”

I transfer files regularly between XP, Fedora, Ubuntu, and OSX with a U3 stick. It’s not a problem whatsoever.

The U3 software will only load on supported Windows platforms. Annoying, since I don’t use those features, but that’s all I consider it.

Misc April 19, 2007 11:02 AM

Jay,
I also use Truecrypt. It’s pretty simple to use its “traveler disk” setup to write truecrypt drivers/autorun scripts to the flash drive, followed by filling up the rest of the drive with a Truecrypt volume.

Roger April 19, 2007 11:06 AM

“Physical security is usually rated by the UL (in the U.S.) and given an appropriate rating. Is there any reason (apart from money and expertise 😉) why they or somebody similar should not do it for computer security?”

Yes. UL does security evaluations because they underwrite insurers, and will lose less money if people have good security products (and good fire protection, which they also evaluate).

Until computer security insurance becomes a big business underwritten by an organisation like UL, you won’t see the same thing with computer security.

Beta April 19, 2007 11:18 AM

Mr. Schneier, did you ask Kingston Technology for that stick? Or are you telling us that you would take sensitive unencrypted data and pour it into a little device that just turned up in the mail in a respectable-looking box?

Bryan April 19, 2007 11:31 AM

@Roger

That is an excellent point; however, it will be a very different scenario for information “damage” as opposed to physical damage. Most, if not all, software and hardware developers sell their products with extremely strong disclaimers about lost data. In order for insurers to create an underwriters’ lab for software/hardware security, companies must first require insurance against lost/damaged consumer information. Because most damage to consumers’ information/data has been disclaimed (and most courts uphold such disclaimers), manufacturers and software engineers have no impetus to insure against a risk they do not bear. It will be left to the government to invent this market, i.e., create legislation punishing manufacturers of weak/non-secure software and hardware.

vedaal April 19, 2007 11:38 AM

“… Today I use a 4-GB USB memory stick for backup while I am traveling. I like the convenience, but if I lose the tiny thing I risk all my data.

Encryption is the obvious solution for this problem — I use PGPdisk — ”

this requires pgp/pgpdisk to be installed on the host computer

truecrypt has a traveller mode,
which can be run from the usb stick,
but which still requires registry entries
on the host computer

theoretically,
if one can boot from a read-only linux boot cd, and run the usb stick,
then there might be a way to access encrypted data from the usb stick without any traces on the host computer
(although hardware keyloggers are always a problem)

here is a possible solution i brought up in sci.crypt, but it was not followed up further:

http://groups.google.com/group/sci.crypt/msg/38f6725d1c47d1e2?dmode=source&hl=en

anyone have any further suggestions,
and/or recommendations for a self-contained text editor with no registry traces?

tia,

vedaal

markm April 19, 2007 12:14 PM

“It is interesting that their warranty states the company ‘makes no representations or warranties that Your Information will not be lost, misused or altered’, even though the company touts that it invented (and has a patent on) the first ‘transfer-resistant product’ in the industry.”

The exclusion of lost or altered data is reasonable, as that can easily happen through inadvertent mishandling by the user – one good shot of static, and if you can read anything at all, it’s not necessarily what used to be there before. (And that’s not even considering the people who’d try to collect on a warranty after accidentally hitting Delete, running the stick through the laundry, or breaking it in half.)

OTOH, the exclusion of “mis-used” ought to be a red flag. That means they are not guaranteeing you against hackers breaking into the device – or in other words, that your best security is to be sure you never walk away without putting the stick in your pocket. So is it worth paying more for this than a plain unencrypted memory stick? (And I feel the same way about the typical commercial software (non)warranty – if they won’t even guarantee it works, doesn’t that tell you something about their own evaluation of their quality?)

There is a simple way for a manufacturer to clearly distinguish their product: Write a warranty that says they will pay out $10,000 (say) to any owner whose data is stolen from the stick, excepting compromised passwords and computers hacked while the stick is plugged in and unlocked. If someone’s willing to put money behind their assertions of security, it probably is good enough to be worth a little more.
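One way to see why such a warranty is a credible signal is a quick back-of-the-envelope calculation. The numbers below (a $20 price premium, the $10,000 payout suggested above) are illustrative assumptions:

```python
# Expected cost of the warranty to the vendor, per stick sold.
payout = 10_000   # promised payment per compromised stick
premium = 20      # extra price charged for the warranted stick

# The warranty is profitable only if payout * p < premium, where p is
# the vendor's honest estimate of its compromise rate.
break_even_rate = premium / payout
print(f"vendor must believe compromise rate < {break_even_rate:.2%}")
# => 0.20%. A snake-oil vendor cannot afford the promise; a vendor
#    with genuinely strong security can.
```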

Aaron April 19, 2007 1:01 PM

@ Beta

Given that Bruce isn’t a complete idiot, I suspect that he could find a simple way to test Kingston’s claims that their thumb drive is secure without having to resort to placing anything important on it, or compromising the security of his working computers.

C’mon folks, let’s give the man some credit.

george April 19, 2007 1:15 PM

It all goes back to the fact that most people are lazy and ill-informed. Ever look at websites where they review cars? Especially those reviewed by consumers?

One of the single biggest gripes by American consumers of foreign automobiles is the lack of a cup-holder! The people who do automotive design at facilities located in other countries laugh at how ridiculous the American consumer is, and rightly so. Think about all the complex inner workings of an automobile, and the Americans are concerned about the freakin’ cup-holders. Other countries have their own little quirks in other areas, no doubt about that. To put it another way, people are deep down just as primitive as our ancestors Homo habilis and Homo erectus.

Now, cars are much simpler for the average Joe to understand than security products, so let’s see how that scales up to products that do public-key encryption or some whizbang software that uses a zero-knowledge proof…

Based on this, I’m pretty certain that Cisco needs to start rolling out their MARS devices sold in America with cup-holders. Those things would sell like hot-cakes.

Benny April 19, 2007 1:31 PM

@ markm:

Good post, with many great points. It’s just that I think you might’ve missed the joke. Look at the URL, or look at the linked page 🙂

old-fashioned girl April 19, 2007 1:32 PM

@markm:

Did you click on Sheila’s link–or notice the company name–before you commented on her deliciously witty post? Take a look. I think you’ll be amused. It gives a whole new meaning to your use of the term “red flag,” anyway.

“I bet 90% of USB data loss is not due to loss or theft, but rather, due to laundry machines,” comments Michael Richardson. Having put Sheila’s kind of stick through the laundry more than once, I can tell you it’s hard not just on the data, but also on the clothing.

@sheila: Loose lips sink ships.

Don Marti April 19, 2007 1:41 PM

Another signaling method is publishing source code. Not all source-available software is high quality, but software for which both source and support contracts have been available for some time tends not to be embarrassingly bad.

jose April 19, 2007 1:57 PM

In 3 days I will post in this blog the crack for the new Kingston secure DataTraveler. It was so easy for my university team. Stay tuned, friends.

Anonymous April 19, 2007 2:00 PM

This correlates well with the political aspects of security, too. “Security theater” corresponds to the lemons in this case.

Since verification by the common citizen is limited, impressive-looking tactics with little impact proliferate, while the hard work of actual security gets short shrift.

Mark in CA April 19, 2007 2:25 PM

“I bet 90% of USB data loss is not due to loss or theft, but rather, due to laundry machines.”

I once put my Samsung cellphone through a wash cycle, and after removing the battery, opening it up and letting it dry out, it still worked (even the battery!). I don’t see why a USB drive shouldn’t survive, too.

Alex April 19, 2007 5:47 PM

When I was working as a security engineer, I thrice asked detailed questions about professional (not home/SOHO) security products:

  • When I asked some security company guy, visibly a techie, who was pimping his firewall-in-a-box at some presentation, for a list of publicly known exploits that the box was tested to stop, he behaved as if personally offended. This was about ten years ago, when firewalls were hot and costly, and the box had an outrageous licensing price per IP number protected, but I don’t remember the vendor name surely enough to ridicule it here.

  • When at a trade show I asked an SSL VPN vendor to describe a threat model supporting his claims that a cert was better than a password because it couldn’t be written down, he was unable to describe the threat model coherently.

  • My favourite one: we were searching for an SSL accelerator that would do more than offload SSL processing from the main web server. The Intel rep was saying that his box would do anything we needed, no matter what we asked for. We actually needed client-side certificate auth, and the box wouldn’t do this in any way usable to us. The best part is that, from what he said, it seemed they bring the boxes into the country with no regard for import laws. To explain: according to the law here, based on the Wassenaar agreement, crypto hardware is treated like weapons; you need an import license, and later you have to track its circulation when you sell it, even if you paint it black instead of khaki and write “e-commerce accelerator” on the front panel. At the time they definitely had no import license for this.

Alton Naur April 19, 2007 8:18 PM

When equalizing information is really important, institutions arise to provide trustworthy evaluations using pooled resources that individual customers may not be able to muster. This is why we have Underwriters Laboratories and Consumer Reports, and why they’re non-profit.

In security, we have ICSA Laboratories and Common Criteria certifications. The fact that customers don’t refuse to buy uncertified products doesn’t mean that certifications are worthless; it means that customers don’t care enough about the properties being certified to pay the price premium that the costs of obtaining certifications impose.

It’s Microsoft’s first immutable law of security administration: “Nobody believes anything bad can happen to them, until it does.” Until you suffer a security incident personally, the value of security for you is $0.00. The problem isn’t lack of a signal concerning quality; it’s lack of a signal concerning value.

Anonymous April 19, 2007 8:20 PM

One of the biggest problems where I work — as far as security goes, at least — is the wrong person(s) selecting security products pimped by some pre-sales gimp. They make all the promises in the world just to get their foot in the door. Then we find out we need to enable guest accounts, downgrade service packs, not worry about cleartext authentication… there’s a myriad of things wrong, simply because we don’t choose solutions based on their security; we [attempt to] secure products based on [poor, uneducated] choices.

Lawrence D'Oliveiro April 19, 2007 9:26 PM

I want to reinforce the point made by Don Marti. Opening the source code is a good way of offering more information to the customer, or at least to parties with no stake in the transaction whom the customer can trust.

Perhaps we can claim that Open Source is already having an effect this way? All the computer-related examples in the item about inferior products driving superior ones out of the market are of closed-source products. Looking at open-source ones, I cannot think of a single example of a product that was superior in any way being driven out of the market. If there was anything worth salvaging, another open-source project would be bound to pick it up.

Jason April 19, 2007 10:42 PM

“Why do you use PGPdisk? Have you personally audited the code? Or is it based on the reputation of PGP?”

Agreed. We know nothing about the innards of PGP these days. I use TrueCrypt. It’s widely used, suitably paranoid, and open source.

Anonymous April 19, 2007 11:26 PM

“Why do you use PGPdisk? Have you personally audited the code? Or is it based on the reputation of PGP?”

Bruce has a way of saying one thing and doing another. He has said that Macs are more secure than PCs, yet uses Winblows. In similar fashion, Bruce might say, “I use PGP, but TrueCrypt is more secure”. According to Bruce, security may not be the most important factor to consider when making a choice.

Anonymous April 19, 2007 11:43 PM

What do you need PGP for? Just use the Bruce Schneier Password Safe to securely store your passwords on any thumbdrive storage device 🙂

harri April 20, 2007 1:41 AM

Good read; the most interesting post I’ve seen here for a while. For a long time Schneier’s blog has been more about American politics/terrorism/squids, while personally I am more interested in the technical side of cryptography/information security, etc.

Anton April 20, 2007 1:44 AM

I think the answer to why USB stick security fails is actually very simple: the USB port was not designed for security, and thus does not support any secure means of transmitting or storing data. All the “secure sticks” are just hacks over standard USB. That is why the “security on the stick” model will continue to fail.

This issue is perfectly comparable to anti-piracy features on standard CDs. Those discs must remain playable on standard devices that support absolutely no anti-piracy features, and therefore the contents will remain accessible to “the pirates” as well, no matter what features the makers build on top.

USB storage can be made secure, but then the security features must reside on the PC, such as a special driver that enforces data encryption before the USB port driver.
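Here is a minimal sketch of that host-side model, assuming Python with the third-party cryptography package and made-up file paths; real code would derive the key from a passphrase rather than generating a fresh one:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # illustrative; derive from a passphrase in practice
cipher = Fernet(key)

with open("notes.txt", "rb") as f:
    plaintext = f.read()

# Only ciphertext ever crosses the USB port, so the stick can stay
# dumb: a lost stick reveals nothing without the key.
with open("/media/usbstick/notes.enc", "wb") as f:
    f.write(cipher.encrypt(plaintext))
```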

supersnail April 20, 2007 2:06 AM

Can we please get over the “Bruce uses Windows” thing?

Bruce uses Windows for the same reason I and thousands of other IT professionals who would rather use something else do: it’s the standard environment at most employers/client companies.

The basic Windows NT security infrastructure is actually very good, and in many ways superior to the basic UNIX/Linux infrastructure. The weaknesses stem from the Office products and browser divisions making perceived ease of use a priority over basic security.

Large corporate IT departments have years of experience securing the Windows environment. A corporate security breach is just as likely to involve Oracle, Sun, or Linux as it is Windows, and in general breaches are proportional to the usage and distribution of the platform.

The only exception to this seems to be ye olde IBM mainframe, which seldom features, in spite of doing most of the IT heavy lifting for the Fortune 500. This may be due more to hackers’ inability to read ancient hieroglyphs than to any inherent superiority.

Shachar Shemesh April 20, 2007 3:09 AM

First, a note from personal experience.

I used to work at Check Point, and was in charge of (i.e., outlined, designed and implemented) their “TCP sequence verifier” feature. The basic idea is that the firewall looks not only at the five things composing a session (protocol, source port, source IP, destination port and destination IP), but also at the TCP sequence number, to determine what should be done with a packet.

Let me just state for the record that I was pleasantly surprised by Check Point’s care for actual (as opposed to perceived) security, and the above is, in no way, trying to say bad things about the company or its products.

The main motivation for the feature was purely marketing. Benchmarks had a check mark next to this feature for some of the competitors, and not for FW-1.

When I sat down to design this feature, I started off by building a model of how TCP works, what each side is likely to see during the session, and when packets of a certain sequence number are likely to be seen.

This was one feature that took quite some time to stabilize. During development I experienced such impossible things as a “RST storm”, which is like an ACK storm (look it up), only with RST packets: supposed to be totally impossible. It was so common for this feature to cause trouble during the internal builds that I habitually received visits from disgruntled colleagues blaming it for breaking something for them, often for FW-1 builds that predated the feature. It was so slow to get right that it went backwards in time to break things!

This story relates to this post in two ways. The first is that by the time the feature was done, I had a chance to check the competition. While we did a very tight tracking of the TCP window, they did something very very very loose. Anything in the general vicinity of the TCP window, in its broadest definition possible, would pass as ok. The reviews were not really able to tell the difference.

The second way this relates to the story is in the feature’s very conception. After having worked on it for almost half a year, I can tell you without pause for hesitation that this feature does absolutely nothing to enhance your clients’ security. Sure, it goes some (unmeasurable) way towards dropping bad packets, but I know of no attack in the history of TCP/IP where any implementation had a security bug based on out-of-window packets. It was a feature for the review checkboxes, and the fact that Check Point did it best had (and has) very little impact on the actual security the firewall provides.

I’ll mention that the feature is not entirely useless. It does give the firewall a better understanding when tracking the connection’s state. Without it, it is slightly easier to cause the firewall’s idea of the connection’s state to fall out of sync with TCP’s idea of that state.
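To make the tight-versus-loose distinction concrete, here is a minimal sketch of both checks. The field names (rcv_nxt, rcv_wnd) and the loose variant’s slack are my illustrations, not Check Point’s or any competitor’s actual code:

```python
SEQ_SPACE = 2 ** 32  # TCP sequence numbers wrap around at 2^32

def in_window_strict(seq: int, rcv_nxt: int, rcv_wnd: int) -> bool:
    """Tight tracking: accept a segment only if its sequence number
    falls inside the receiver's current window, wraparound included."""
    offset = (seq - rcv_nxt) % SEQ_SPACE
    return offset < rcv_wnd

def in_window_loose(seq: int, rcv_nxt: int, slack: int = 2 ** 30) -> bool:
    """The 'general vicinity' check: anything within a huge slack on
    either side of the expected number passes. A review that only
    replays in-window traffic cannot tell the two apart."""
    offset = (seq - rcv_nxt) % SEQ_SPACE
    return offset < slack or offset > SEQ_SPACE - slack
```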

I just want to back the statements made above regarding open source. I, too, have yet to see a bad open source project push out a good one. The community-based participation, combined with the openness of the development internals, seems to weed out the lemons.

As a general rule, unlike their proprietary counterparts, open source project leaders really tend to be people knowledgeable in their project’s field. I am, of course, referring to major projects. Something that only a couple of people ever use does not necessarily follow the above rule, but that bears few negative consequences.

Shachar

Roy April 20, 2007 6:13 AM

A friend suggested for the neither-knows category — following the lemon and lime metaphors — the term ‘coconut’, since nobody can tell if it’s good or bad until it’s opened.

Alex April 20, 2007 11:12 AM

@Roy: that reminds me of a quite insightful witticism I heard about motorcycle helmets: all motorcycle helmets are equally, perfectly good until the crash. After that, some are better.

Brent April 23, 2007 6:50 PM

George Akerlof was no economist — he was, like many of his colleagues, really a mathematician.

Following his logic (and his unrealistic restrictions by assumption), you can make a case that every product should be crappy. For example, do you really know the entire history of that apple or box of cereal you are thinking about buying? How about the quality of any particular television, radio, or other CE gadget in a store?

Dan April 25, 2007 4:36 PM

He was an economist, and the idea behind economic models (with assumptions) is that they allow you to explain certain phenomena – like lemon markets, adverse selection, and so on. Later on, the assumptions underlying the models are relaxed, and they become more realistic (and still hold). But apparently, Brent would not give Akerlof a Nobel Prize…

By the way, as some argued before, reputation plays an important role in overcoming information asymmetries – so when you buy Kellogg’s cereal you more or less know what you’re going to get, as opposed to when you buy “Brent” cereal.

Will April 28, 2007 1:19 PM

Part of the reason you can trust Kellogg’s cereal is the FDA. A century or so ago, there was no regulatory body for pure food or drugs, and adulteration was not uncommon.

One remnant of this is the “mattress tag” that forbids removal except by the customer. Before regulation, mattresses, sofas, etc. were frequently made with reused material that was infested with bedbugs, fleas, or other vermin. Because the filler material is hidden, and can’t be inspected without destroying the object, there was no way for anyone to determine the actual quality of the filling, and without laws to punish offenders, there were no consequences for using the cheapest material obtainable: stuffing reused from old mattresses of any origin.

Oldtimer May 16, 2007 10:05 AM

Bruce,

I loved your research connecting the theory of lemons to security products, and I recognize the process you described.
Back in 1995, I was working closely with many clients as a security analyst. Within a year or two, it became clear that good security products were never going to be market leaders, and the best we could do for our clients (and their customers) was to help get them to “good enough”.

It also became clear that most IT shops didn’t want to hear what was really necessary. You don’t know how many times I heard “That’s never happened to us before….” I fear that web services will be going down the same path.

Ichinin May 16, 2007 4:42 PM

I wrote an encryption program for USB memory sticks that required no installation; it had a VERY simple drag-and-drop interface, and no popups asking questions or anything. When we set off to sell it, everything looked good, but when it appeared in shops it just didn’t sell.

Why? Because encryption apparently isn’t that important to Joe Average. (Sad story, but what can you do?)

From the customer’s point of view, USB memory sticks “already have encryption software”, but they fail to realise that it isn’t always easy to use (lots of clicks here and there, and technobabble that Joe Average doesn’t understand), and that the programs are not always free, don’t always come on a CD/DVD, sometimes can’t be redownloaded, and sometimes aren’t even real encryption (one I tried was just a password-protected virtual partition on the drive).

The only use I have for the project now is to mention it on my CV and to use it as a secure backup device for my source code in case fire eats my hard drives or something.

If I could get back the time I spent writing the code, I would spend it playing “GTA San Andreas” instead.

lordbeau May 29, 2007 2:33 AM

The Kingston DataTraveler Secure USB flash drive has one major problem. I bought one, and find it only accepts passwords up to 16 characters long!
