Media Sanitization and Encryption

Last week NIST released Special Publication 800-88, Guidelines for Media Sanitization.

There is a new paragraph in this document (page 7) that was not in the draft version:

Encryption is not a generally accepted means of sanitization. The increasing power of computers decreases the time needed to crack cipher text and therefore the inability to recover the encrypted data can not be assured.

I have to admit that this doesn’t make any sense to me. If the encryption is done properly, and if the key is properly chosen, then erasing the key—and all copies—is equivalent to erasing the files. And if you’re using full-disk encryption, then erasing the key is equivalent to sanitizing the drive. For that not to be true means that the encryption program isn’t secure.
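The argument can be sketched in a few lines of Python. This is a toy keystream cipher (SHA-256 in counter mode) used purely for illustration — it is not a vetted cipher, and the key handling is symbolic — but it shows the idea: once every copy of the key is gone, the ciphertext is all that remains.

```python
# Toy illustration only: a keystream built from SHA-256 in counter mode.
# NOT a real cipher -- it just sketches why "erase the key" is
# equivalent to "erase the data" when the encryption holds up.
import hashlib
import os

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream of the given length from the key."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt (XOR is its own inverse) under the keystream."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = os.urandom(32)                 # 256-bit random key
plaintext = b"sensitive records"
ciphertext = xor_cipher(key, plaintext)

assert xor_cipher(key, ciphertext) == plaintext  # key present: recoverable
del key                                          # "sanitize": destroy the key
# With the key gone, recovering the plaintext means guessing a
# 256-bit value -- which is the security claim of the cipher itself.
```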

I think NIST is just confused.

Posted on September 11, 2006 at 11:43 AM · 65 Comments


BLP September 11, 2006 11:57 AM

I’m not sure that’s true. (Although, you are the expert, not me).

If the key >can< be brute forced (even assuming a very Hard Problem requiring googolplexes of cpu cycles), then deleting the key simply makes it impractical to recover the data.

Anonymous September 11, 2006 11:58 AM


“I have to admit that this doesn’t make any sense to me. If the encryption is done properly”

When it comes to very large volumes of data (say a DB) when has encryption been done properly?

Likewise how often have you seen a system where key selection has been done properly?

I suspect the NIST viewpoint is pragmatic, based on practical realities.

There are many many times more coders writing bad security code than good as you have pointed out in the odd publication and book 😉

Mike Sherwood September 11, 2006 12:00 PM

Just because an algorithm and key are considered secure today doesn’t mean that vulnerabilities won’t be found in the future. Encryption is a way to buy time. If the attacker needs the data within a short period of time, it prevents that. However, given enough time, every algorithm fails.

FP September 11, 2006 12:01 PM

Or they’re taking the human factor into account. We know that very few people use good passwords.

Not everyone uses Password Safe, and not everyone who uses Password Safe uses a good master password for their database. And some people that use a good master password have it written down on a yellow sticker that is glued to their screen.

It’s probably safer to tell John Q. Public not to depend on encryption for safety.

Still, you’d expect a more balanced evaluation from an institution like NIST.

Carlo Graziani September 11, 2006 12:06 PM

Maybe they know something we don’t know about AES.

Try sending an e-mail that says “Please deliver the nuclear warhead to Baltimore Harbor aboard [Name of Ship] on [Date],” encrypted in 256-bit AES, to a dead-letter drop in Iran. If [Name of Ship] gets intercepted at sea, and/or you go on vacation to Guantanamo around [Date], we’ll know NIST wasn’t confused after all.


REB September 11, 2006 12:13 PM

The NIST policy seems quite sensible to me, and does not imply disbelief in the power of encryption. The key that encrypted the data may have a lifetime all its own — separate from the physical media. There is no guarantee that all copies of the key will be destroyed. If that key becomes known after the encrypted data is discarded, then it takes little effort to recover the data, and the media will not have been sanitized after all.

Jason September 11, 2006 12:20 PM

I agree with NIST on this one. Owning the data, even if it is encrypted, is better than not owning the data at all. Eventually someone is likely to break the encryption, whereas a well-sanitized disk (shredded, surface destroyed, etc.) will be impossible to recover data from.

HowLong September 11, 2006 12:22 PM

Using encryption as a means of data disposal depends heavily on the lifetime of the data that was encrypted.

If the data is credit card numbers, then the useful lifetime of the data is only a few years. So any attacker attempting to get that credit card data from a “disposed” hard drive has only a short time to work in.

However, if the data is SSN and other personal data, that data could have a lifetime of many many years (20-30+ years). So, an attacker has a long time to attempt to crack that “disposed” hard drive with the encrypted personal data.

My biggest concern with seeing data encryption as an acceptable “disposal” method is that many companies will suddenly find it convenient to simply “encrypt the drive” and then sell, or otherwise pass on, that encrypted hard drive to anyone who wants it. Granted, having encrypted data is better than no encryption, but it is not better than physical destruction of a hard drive that contains sensitive data.

Coldguy September 11, 2006 12:25 PM

Honestly, the only way to completely erase files is to put the hard drive under a super-powerful magnet, then throw it into a wood chipper, gather all the parts, bury them under the ocean, and then guard that spot from invaders.

On a realistic level, encryption done with a good algorithm does the job just fine.

RvnPhnx September 11, 2006 12:38 PM

Encryption should not be confused with data disposal. Encryption implies that the data should be recoverable at some future date (and preferably by the desired party); disposal (which is often commingled with the idea of sanitation, thanks to late-1800s tenement housing — and poor regions of the world today) implies the eradication of information.
So yes, NIST seems confused — about what, though, is where we disagree.

Nicholas Weaver September 11, 2006 12:42 PM

Considering all the human factors involved (bad keys, unencrypted data, SWAP SPACE! etc. etc. etc.), “nuke the disk, it’s the only way to be sure” is the only right answer.

Nicholas Weaver September 11, 2006 12:49 PM

It’s also good to note: what they consider a good paper shredder is 1mm x 5mm crosscut! That’s small pieces!

Dave H September 11, 2006 12:55 PM

Another problem is Microsoft built-in encryption, which includes backdoors meant to help people ‘recover’ from mistakes. (Personally, I think you should make mistakes unrecoverable, that would make people pay attention…)

Ron September 11, 2006 1:36 PM

No, encryption is not good enough. From that same NIST report:

“Clearing information is a level of media sanitization that would protect the confidentiality of information against a robust keyboard attack.”

If somebody had a keyboard logger, or shoulder surfed, or equivalent, then your encryption key is compromised. Couple that with handing them a copy of all your data and it’s bonanza time for the bad guys.

If you want data erased, then erase the data. That way, you don’t need to worry about compromised keys or too-easy keys or brute force.

Mike Sax September 11, 2006 1:40 PM

“For that not to be true means that the encryption program isn’t secure.”

It may be secure today, but what about ten years from now? Wouldn’t you agree that properly erasing data is more secure because it provides protection against advances in technology, as well as potential future revelations of vulnerabilities?

Marko September 11, 2006 1:50 PM

Bruce is right. If a given encrypted environment is secure enough to protect the files, then it is definitely secure enough to provide sanitization.

brett September 11, 2006 2:07 PM

I think they mean that
1) data encrypted
2) might withstand being cracked today
3) but it might not survive tomorrow
4) if you dont want data revealed tomorrow
5) don’t rely on today’s encryption tomorrow
6) = do something else to sanitize
7) encrypt not= sanitize

derf September 11, 2006 2:16 PM

Sanitizing ensures the data can’t be recovered…ever.

Encryption ensures the data can’t be recovered for the foreseeable future.

RvnPhnx September 11, 2006 2:24 PM

It should also be noted that in destorying data one can go overboard. The old IRS rule about completely physically destorying the HDDs they used comes to mind.

KB September 11, 2006 2:31 PM

At a storage conference last year, there was a discussion of the merits of encryption and key destruction as sanitization. The storage folks in the room were not generally inclined to accept its merit, because they had visions of, at some point in the future, a major break in AES that would suddenly ‘unsanitize’ massive amounts of data. That could, in fact, leave them in a lot of trouble.

An important counterpoint to that is that it may be an impossible problem to sanitize data with absolute certainty. There are a great many clever ways to recover supposedly destroyed data, and probably a few that we haven’t thought of yet. There is no guarantee that any sanitizing method won’t be defeated in the future.

The probability of 256-bit AES becoming crackable in our lifetime is negligible. While I understand the concerns around the method, I think there is an unwarranted tendency toward mistrust of encryption systems, and overestimation of other data destruction methods.

Alan September 11, 2006 3:28 PM

Looks like they were confused. The paragraph has been deleted in an errata.

Interesting. So, is encryption now considered a “generally accepted” method of sanitization?

Sounds like this was as controversial at NIST as it is here.

HowLong September 11, 2006 3:30 PM

“And if you’re using full-disk encryption, then erasing the key is equivalent to sanitizing the drive. For that not to be true means that the encryption program isn’t secure.”

So, is that to say that Bruce would be willing to publicly post an image of his personal/company hard disk (containing his SSN, complete personal/private history, medical records, financial account data, confidential company documents, etc.), encrypted with a COTS disk encryption program, installed and configured by a “knowledgeable” IT person, who will also delete the encryption key as part of the “disposal” procedure?

Christoph Zurnieden September 11, 2006 3:35 PM

There is no guarantee that any sanitizing method won’t be defeated in the future.

The HDDs produced today are soluble in liquid iron. Only homeopaths would deny a warranty here.

Looks like they were confused. The paragraph has been deleted in an errata.

The wording was a bit too broad and missed the details, as shown in some of the posts here, but I wouldn’t throw it out completely. It needs at least a small hint that every encryption may fail – mostly in implementation and/or use, very rarely in the algorithm itself – and that only careful physical destruction can guarantee 100% success.

Yes, Bruce, NIST seems a little bit confused, and I fear that you are not completely innocent here 😉


Alan September 11, 2006 3:52 PM

The paragraph in question went too far. It communicated that anyone relying on encryption for sanitization is not following best practices. Such a statement ignores the value of what is being protected and the threats that exist against that content. In many (I would venture to say most) applications, destroying the key to an encrypted filesystem is for all practical purposes effective sanitization.

Keep in mind that repeated overwriting of the media might not resist a capable and motivated attacker. Under equal threat models, an encrypted filesystem whose keys have been destroyed is probably safer than one that has been overwritten 3x (perhaps even 32x for that matter) with random data. And it is not realistic to expect the average computer owner to melt their hard drives when they retire their old computers.

Nigel Sedgwick September 11, 2006 4:35 PM

Discs are now cheap.

My favorite sanitizing method is disposal: 4 holes drilled straight through with the Black and Decker; then to the municipal tip. Yes, I actually do this: it’s quite quick.

Now, I don’t have any very serious secrets to hide. If I did:

Overwrite the whole disc with a different low quality pseudo-random bitstream, ten times. Then drill 4 holes with the Black and Decker.
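For what it’s worth, the overwrite step can be done on a Linux box with plain dd. This sketch runs against a scratch file rather than a real drive; substituting the actual device node (e.g. /dev/sdX — verify yours first, this is irreversible) does the real thing.

```shell
# Make an 8 MiB scratch file standing in for the disk, then overwrite
# it in place with pseudo-random data and flush to stable storage.
TARGET=scratch.img
dd if=/dev/zero of="$TARGET" bs=1M count=8 2>/dev/null      # the "disk"
dd if=/dev/urandom of="$TARGET" bs=1M count=8 conv=notrunc 2>/dev/null
sync
```

Repeat the urandom pass for as many overwrites as your paranoia requires; `conv=notrunc` keeps dd writing over the existing blocks instead of truncating the file first.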

Best regards

ATA September 11, 2006 5:41 PM

Very interesting article.
I see that NIST have removed the offending paragraph.
Personally, my first thought that the original version of the document was a bit closer to best advice.
As has been noted already above, there is always a risk of a coding error that may weaken an otherwise strong encryption system, but what about a mathematical crack?
After a bit more research, I came to understand that this probably isn’t going to happen any time soon:
web address ->
web address ->

The NIST document also indicates where to get a free disk eraser program from the University of California:
web address ->

@Nigel Sedgwick
Did you notice the section of the NIST document “Special Publication 800-88” referred to by Bruce?

“Purging information is a media sanitization process that protects the confidentiality of information against a laboratory attack. For some media, clearing media would not suffice for purging. However, for ATA disk drives manufactured after 2001 (over 15 GB) the terms clearing and purging have converged.”

That is a slightly turgid way of saying that modern ATA disks only need to be overwritten once to render the data irrecoverable.

Sorry the “web address ->” is just my way of getting past the site spam filter.

Fab September 11, 2006 6:46 PM

Yes, would it not seem that the NIST statement is fairly obviously correct, in that the length of the key is shorter than the length of the data (unless we’re using one-time pads here)?

Suppose, for example, you encrypt 100GB of private data with a, whatever, 512 bit key. Suppose further that a randomized brute-force attempt at decryption happens to spit something out that contains lots of meaningful personal data.
While this is unlikely, the far greater length of the data (compared to the key) ensures that it is by far more unlikely that this data is a random (monkeys on a typewriter) fluke. In other words, you lose plausible deniability.

George Bailey September 11, 2006 7:06 PM

“It should also be noted that in destorying data one can go overboard.”

Completely destorying is not so easy. After all, there are a million stories in the naked city.

ATA September 11, 2006 7:23 PM

@NIST is right

You’re right. I am a bit confused.
The third line of my post was meant to read:
“Personally, my first thought was that the original version of the document was a bit closer to best advice.”
After a quick look at the web articles I quoted, I backed off and came to the conclusion that properly implemented cryptography should be trusted.
I am not sure if you think the current or amended version of the NIST doc is better going by your comment.

Jacob Davies September 11, 2006 7:38 PM

I think their sentiment was right. I assume the scenario is an organization where all hard disks are routinely encrypted; the theory would be that they could then be discarded without doing anything more than destroying the encryption key.

I wouldn’t want to rely on that for a bunch of reasons:

1) If the encryption program isn’t written correctly or a poor key is used the encrypted data may be accessible.

2) By releasing a hard disk with encrypted data on it, you’ve done the hardest part of removing a lot of data for the hypothetical spy: getting a large-capacity storage device out of the facility. Now all that person has to remove from the facility is an unauthorized copy of the encryption key – which is short and innocuous (“Oh, that’s the WEP key for my home network”) – and they have the large volume of sensitive data on that disk.

3) A small risk that someone would keep the disk and that in the future the encryption could be broken.

4) Not particularly easier to “destroy the encryption key” than “overwrite the hard disk” at a facility that controls disposal of disks. In either case you need to have a process in place. Why not use a process guaranteed to leave the data unreadable?

Which isn’t to say that organizations shouldn’t routinely encrypt hard disks if there is any real sensitivity. They should, because of the risk of unauthorized disposal or loss of the disks. But the formal process for disposal might as well use overwriting.

D September 11, 2006 8:08 PM


I think we’d really like to hear your response to some of these comments. I’m on board with most commenters that the encryption only buys you time, not perfect secrecy - especially if you’re paranoid enough to believe people could ever make mistakes implementing encryption algorithms.


Preston R. September 11, 2006 10:43 PM

As was mentioned by several, and as Bruce himself would likely point out, the real risk is not the ability to brute force or crack the encryption algorithm (i.e. AES), but the potential for vulnerability in the application that uses the encryption.

Can anyone say with certainty that all the vendors selling disk encryption products have flawless code surrounding their encryption algorithm (i.e. AES)?

From my experience looking at several disk encryption products, almost all provide some form of “key recovery” (in case a user forgets their password). Some have central key servers, some use challenge-response, etc. All it takes is for a vulnerability in the disk encryption vendor’s key management code and it doesn’t matter how strong or uncrackable AES is.

The interesting part is that if a disk encryption vendor vulnerability is actually found, and the vendor issues a patch for their disk encryption product, all the current in-house computers can be updated and protected. However, guess what… it’s too late for all those disk drives with sensitive data that have already left the company (into the wild), encrypted with the now-vulnerable disk encryption application.

As one poster pointed out, depending on the life expectancy of the data, the bad guy can warehouse encrypted and disposed disks with known sensitive data on them (i.e. from health insurers, financial institutions, medical facilities, government facilities, etc.) acquired from computer resellers, recyclers, eBay, etc. The bad guy can keep these disks for many years, knowing which vendor/application was used to encrypt the data, and wait until a vulnerability is found. Even worse, the disk encryption vendor could some day go out of business (could be 5-10 years), or be bought, and who knows what could happen to their disk encryption source code.

If some bad guy has a warehouse full of disposed computers/disk drives encrypted with Company X’s disk encryption software, it gives the bad guy even more reason to find creative ways to find vulnerabilities in that vendor’s code.

Also, with typical disk encryption, a company is using the data encryption as a way to mitigate the risk of a very small number of their disk drives with sensitive data that may get into the wild (through loss, theft, etc.). So the risk of data exposure due to a vulnerability found in the disk encryption program of a lost or stolen disk drive is quite small. However, when using data encryption as a disposal method, where a potentially large number of disk drives with sensitive data are intentionally released into the wild, the whole equation changes.

Dale September 11, 2006 10:49 PM

In reference to: “It’s also good to note: what they consider a good paper shredder is 1mm x 5mm crosscut! That’s small pieces!”

I have personally used shredders that I was told shredded paper down to 11 thousandths of an inch. If you took a handful of this dust and threw it into the air, it floated around like a small cloud.

cute little robot boy September 12, 2006 2:23 AM

Haven’t any of you seen Spielberg’s AI? Data is always recoverable … 😉

jk, but really, technology has advanced so quickly that if someone is determined and is recovering data at scale, they can probably efficiently recover stuff you thought was physically destroyed.

After all, we mapped DNA … look at the technologies used to do that, and now some guy reckons he can do it all over again for $1000.

Anonymous September 12, 2006 3:27 AM

@ nigel:

I agree with your general idea, however I prefer my mossberg brand hole-maker to your black and decker.

William September 12, 2006 3:36 AM

I thought the whole point of encrypting “data at rest” was to take it beyond the reach of unauthorised eyes (internal or external). As such, the data was further safeguarded by being kept in a secure environment (eg data centre, perhaps even in a strong room) created by appropriate choice of physical, personnel, networking, and access controls. Never is this stuff hosted in the public car park.

Now, if those same drives were decommissioned and just thrown out, the effect would be to strip out all the layers of protection APART from encryption. If someone was sufficiently motivated, I’m sure they could easily get round (ie break) the encryption.

So, for me, I’d still specify physical destruction method(s) – degauss and smelt for highest secrecy data – and have the nice disposal plant create a nice ingot for use as a paperweight. Then I could keep an eye on it.

Final thought … maybe the data on the disk could be protected by encryption. Then, at end of life, just rewrite the disk device with random numbers (a couple of times for the paranoid). That’d tie up someone’s brute-forcing system for some while. Maybe they’d end up discovering the electronic equivalent of the Bible Code!!

Sam Johnston September 12, 2006 4:39 AM

Bruce Schneier’s post on Media Sanitization and Encryption is aptly timed in that I was only today looking at how Trusted Platform Modules (TPMs) can be used to store encryption keys for disk encryption and (simply by deleting the keys) hardware recycling. We’re constantly hearing about companies and people unintentionally sharing their secrets (and perhaps more importantly, those held in trust on behalf of others) by disposing of hardware without first sanitizing storage media (eg by repeatedly writing random data from a PRNG like Mersenne Twister). Unfortunately this takes time (generally hours), effort and expertise so it is rarely done properly if at all.

To illustrate this point, according to this article scientists at Georgia Tech have been working on the problem after a US spy plane had its secrets recovered after a collision with a Chinese fighter in 2001; they tried burning disks with heat-generating thermite, crushing drives in presses, chemically destroying the media or frying them with microwaves, but the only fast, effective method was building a mechanism to wave a neodymium iron-boron magnet with special pole pieces made of esoteric cobalt alloys past the platters. While I’m still interested in hearing about any work done on high speed software based wiping (eg instead of trawling through sector by sector, targeting the most important stuff like file system metadata and sensitive files first so that while the overall wipe time is marginally more, a reasonably effective wipe is achieved in seconds or minutes), my preference is now for encrypting everything and securely deleting the keys when required. Unfortunately this approach is not necessarily compatible with ill conceived laws like the UK’s RIP Act, where not being able to hand over the keys for encrypted data because you deleted them could land you in jail for 2 years (5 if they think you’re a terrorist).

According to Bruce:

Last week NIST released Special Publication 800-88, Guidelines for Media Sanitization.

There is a new paragraph in this document (page 7) that was not in the draft version:

    Encryption is not a generally accepted means of sanitization. The increasing power of computers decreases the time needed to crack cipher text and therefore the inability to recover the encrypted data can not be assured.

The thing about encryption is that you should always select a key length that is appropriate for the life of the data you are intending to protect – for example 128 bit AES is likely to be overkill for financial figures of a publicly traded company that will be released at year end anyway, but according to the NSA’s Suite B Cryptography, information classified top secret requires 256 bit AES. If you have selected a sensible key length then sanitization of encrypted data by destruction of that key is, in my opinion, a valid approach.

Caveat: with disk encryption your cryptanalyst adversary has access to vast amounts of ciphertext for analysis, much of which is predictable – you would want to be fairly sure not only of the algorithm used (eg AES), but also of the implementation. For a good introduction to some of the issues faced (or just for some light bedtime reading) you should check out this paper on GEOM Based Disk Encryption (GBDE) in FreeBSD.

In summary, if you’re an individual or business wanting to protect your private and/or commercial data, then media sanitization by deletion of encryption keys is for you (especially if they were safely stashed away in a TPM; you should probably be using block level disk encryption anyway, if only because with journalling filesystems all bets are off when it comes to securely erasing files). If you’re a government wanting to protect top secret data or spy planes, then stick with your fancy magnets. And if you’re a criminal or a terrorist, then you should forget about covering your tracks and stop whatever it is you’re doing that has people trying to uncover them in the first place.

averros September 12, 2006 6:25 AM

Actually, if one wants to be reasonably sure, he must both encrypt and erase the data.

These two techniques guard against different threats, and complement each other pretty well: erasure by nearly any method may leave some bits readable, but these bits will be all but useless if the data was encrypted.

bob September 12, 2006 7:05 AM

Given enough time and CPUs, any encryption can be brute force decrypted. A wipedisk can’t be decrypted.

Physical attacks are still feasible, but outside the scope of the issue.

Seems to me the choice is obvious – if you have important information, erase (not just delete) it.

If you want, once you are done, you can still encrypt what’s left as a red herring.

jeremiah johnson September 12, 2006 8:44 AM

You guys are mixing the idea of a keyphrase with the idea of a key.

Keyloggers cannot capture a key unless you use a really weak key and a really weak algorithm.

It takes a long time to break even a 128-bit key. Longer than any of us will live, even given all the computing power available to humanity currently.

I can’t find it now, but not long ago Bruce posted something describing the time it would take to break a key with brute force.

I agree with Bruce. If you have good encryption, and you destroy the keys that allow you to access that data, that data is effectively gone. Brute forcing a key is so unbelievably impractical as to be impossible.

The analogy in the link I can’t find is something like this.

Imagine a computer the size of a grain of sand that can test a key against a ciphertext in the amount of time it takes for light to pass through it. Now imagine that the entire Earth is covered with enough of those computers to be 1 meter deep (including the oceans). In order to brute force a 256 bit key, that system would need something like 200 years to completely exhaust the keyspace.

Damn, I wish I could find that link.

Bassplayer September 12, 2006 8:45 AM

Wouldn’t an overwrite over encryption muddy the waters enough to prevent decryption? My cryptanalysis is rusty, but with a good algorithm, losing one bit in the ciphertext can affect many bits in the cleartext.

Clive Robinson September 12, 2006 9:37 AM

One point,

If you assume the disk was unencrypted when in normal use, and when it started to become a bit dodgy or was no longer large enough you just encrypted the disk…

Then oops, you are in a lot of trouble. It is very likely that with suitable equipment enough plaintext can be recovered from the side areas of the hard drive platter (this is a well known technique used by law enforcement).

So you now have either enough plaintext info to be worth something in its own right, or enough material to start looking for key recovery…

Carlo Graziani September 12, 2006 10:37 AM

@jeremiah johnson

That estimate seems way, way too low.

2^256 = 1E+77

Assuming coarse-ish sand, about 0.1mm in radius, the light-crossing time is about 7E-13 sec. To scan 2^256 keys at a rate of one every 7E-13 sec per sandgrain requires 7E+64 sandgrain-seconds.

The Earth is about 6400 Km in radius. Covering it to a depth of 1 m yields a volume of 5E+14 m^3. The above grain of sand has a volume of about 4E-12 m^3. Ignoring packing correction factors, that makes available about 1E+26 sand grains.

So the required time is 7E+64 sandgrain-sec/ 1E+26 sandgrains = 7E+38 sec = 1E+31 years. By comparison, the estimated current age of the Universe is about 1E+10 years.

The larger point is, as you say, Brute Force Ain’t Gonna Happen.
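For anyone who wants to check the arithmetic, here is the same estimate as a short Python script (same assumed numbers: 0.1 mm grain radius, a 1 m deep layer over the whole Earth, one key tested per light-crossing time per grain):

```python
# Back-of-the-envelope check of the sand-grain brute-force estimate.
import math

C = 3.0e8                     # speed of light, m/s
KEYS = 2**256                 # size of a 256-bit keyspace (~1.2e77)

grain_r = 1.0e-4              # 0.1 mm radius, in metres
crossing_time = 2 * grain_r / C            # ~6.7e-13 s per key per grain

earth_r = 6.4e6               # metres
layer_volume = 4 * math.pi * earth_r**2 * 1.0   # 1 m deep shell, ~5e14 m^3
grain_volume = (4 / 3) * math.pi * grain_r**3   # ~4e-12 m^3
grains = layer_volume / grain_volume            # ~1e26 grains

seconds = KEYS * crossing_time / grains
years = seconds / 3.15e7
print(f"{years:.1e} years")   # on the order of 1e31 years
```

This reproduces the figures above to within rounding: roughly 10^31 years, against a universe about 10^10 years old.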

Jonathan Leffler September 12, 2006 11:39 AM

I downloaded NISTSP800-88_rev1.pdf that is dated 2006-09-11 and it has an erratum (p4) that says that paragraph (p7) is deleted:

(cut’n’paste from PDF, reformatted, columns marked by |):

The following changes have been incorporated into Special Publication 800-88
Date | Version | Change | Page Number
09-11-06 | 10-06 | Deleted “Encryption is not a generally accepted means of sanitization. The increasing power of computers decreases the time needed to crack cipher text and therefore the inability to recover the encrypted data can not be assured.” | 7

-ac- September 12, 2006 12:08 PM

“Given enough time and CPUs, any encryption can be brute force decrypted. A wipedisk cant be decrypted.”
But how will you know that you’ve solved it? Bruce said “If the encryption is done properly,” and, dang it, that’s a lot more thorough than what you all are thinking. 😉 He’s talkin’ ’bout “Schneier Good Privacy”, not PGP. 🙂

David September 12, 2006 1:35 PM

If you are going to assume that the encryption could be done wrong (bad keys, algorithms, etc.), then you have to assume that any other physical disk destruction has similar holes.

If you drill into the disk in the wrong place, the data can be retrieved.

If you have a backup tape of the disk, then destroying the disk won’t help.

If the disk was RAID-1 mirrored and you didn’t destroy it, you’d be in trouble.

Obviously, it depends on how much sanitization you need. If you were to take a disk that’s encrypted, then reformat it and use it elsewhere, the encryption likely was good enough to keep that disk safe.

If you’re going to encrypt it and then leave it out for anybody to try to hack it, then yes, you could have trouble.

Peter September 12, 2006 3:12 PM

Did some British newspaper purchase some laptops when Afghanistan was invaded after 9/11? I remember some tale about decrypting hard drives that had used 40-bit EFS in half a week.

Anonymous September 13, 2006 11:18 AM

Bob wrote:

Given enough time and CPUs, any encryption can be brute force decrypted. A wipedisk can’t be decrypted.

If by wipedisk you mean overwriting with random data, you might be mistaken. There are several techniques that can be used to retrieve data that has been overwritten. Hint: magnetic levels in the 1’s and 0’s are analog. Recording a 1 over a pre-existing 1 results in a different level than recording a 1 over a pre-existing 0. So the level of magnetism in the new 1 reveals what was previously there. This effect continues (in diminishing levels) through the second, third, … nth overwriting.

Then there is the width of the track where the 1’s and 0’s are deposited. The new 1 or 0 generally is not written identically in the same spot or with the same size footprint. So pre-existing data might still exist on the edges of the track.

Based on these artifacts, the work (cost) to retrieve pre-existing data that has been overwritten is likely to be much less than the cost to brute force an encryption key.

Tim September 13, 2006 12:35 PM


Why not just smash the media with a sledgehammer? Modern hard drives with glass/ceramic platters will shatter pleasantly into a million tiny pieces.

And the payoff in fun is easily worth it alone.

Jacob Davies September 13, 2006 3:15 PM

That same document suggests that current large-capacity hard disks are impossible to read after even one overwrite.

Alan September 13, 2006 4:10 PM

Jacob pointed out:

That same document suggests that current
large-capacity hard disks are impossible to
read after even one overwrite.

Well, it didn’t exactly say “impossible.” It said one pass is adequate to protect from a laboratory attack (“purging” sanitization level). OTOH there is also a more stringent category of sanitization, called “destroying”.

Keep in mind that the document in question is addressing only “sensitive but unclassified” data. What is deemed “adequate” would surely be more stringent for data with a higher classification.

Little Buffalo September 15, 2006 7:06 AM

The whole thing seems to hang on the life time of the usefulness of the data.

I have seen CDs for sale in the Middle East that contain copies of the shredded papers taken from the US embassy in Iran. College students spent years pasting the tiny strips together because they had an extreme interest in embarrassing America.

In my experience, files encrypted with one program cannot be decrypted by another even if they use the same algorithm (AES, Blowfish, etc.); PGP and its freeware variant GPG are the exception. So would it be necessary to identify the encrypting program, possibly even the version of that program, before being able to decrypt?
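One common reason for this incompatibility is that two programs using the “same algorithm” still derive the encryption key from the passphrase differently. A minimal sketch, assuming two hypothetical programs A and B (the salts and iteration counts below are invented for illustration):

```python
import hashlib

passphrase = b"correct horse battery staple"

# Hypothetical "program A": PBKDF2-SHA1, its own salt, 1,000 iterations.
key_a = hashlib.pbkdf2_hmac("sha1", passphrase, b"progA-salt", 1000, dklen=32)

# Hypothetical "program B": the same cipher on disk (say AES-256), but
# PBKDF2-SHA256, a different salt, and 10,000 iterations.
key_b = hashlib.pbkdf2_hmac("sha256", passphrase, b"progB-salt", 10000, dklen=32)

# Same passphrase, "same algorithm" -- yet the actual keys differ, so
# neither program can read the other's files without also replicating
# the other's KDF, salt, and container format.
assert key_a != key_b
```

On top of the key derivation, each program has its own file header, IV placement, and cipher mode, so identifying the program (and often the version) really is a prerequisite to decryption.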

anon_uid September 16, 2006 1:34 AM

@Little Buffalo:“The whole thing seems to hang on the life time of the usefulness of the data.”

Agreed in triplicate. Name me one algorithm with a key much smaller than the plaintext that is older than AES and is still considered secure for the amount of data we’re discussing.

FWIW, the storage encryption folks are still working out the requirements and threat model.

Since you need random access, you only get error propagation at the granularity of a sector; within that sector, you should probably use a mode that propagates errors over the entire plaintext, so that destruction of a few bits guarantees a difficult search of the ciphertext space. I guess that has to involve two passes over the sector, or a 512-byte block cipher.
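The “two passes over the sector” idea can be sketched with a wide-block construction in the spirit of Anderson and Biham’s LION, which builds a large block cipher from a hash function and a keystream generator. This is a toy illustration using SHA-256 for both roles, not production crypto; the point is only that flipping a single ciphertext bit scrambles the entire decrypted sector.

```python
import hashlib

def _stream(key: bytes, n: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode (illustration only)."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def lion_encrypt(k1: bytes, k2: bytes, sector: bytes) -> bytes:
    # Split the sector into a short left half and a long right half.
    left, right = sector[:32], sector[32:]
    right = _xor(right, _stream(_xor(left, k1), len(right)))  # pass 1
    left = _xor(left, hashlib.sha256(right).digest())
    right = _xor(right, _stream(_xor(left, k2), len(right)))  # pass 2
    return left + right

def lion_decrypt(k1: bytes, k2: bytes, blob: bytes) -> bytes:
    left, right = blob[:32], blob[32:]
    right = _xor(right, _stream(_xor(left, k2), len(right)))
    left = _xor(left, hashlib.sha256(right).digest())
    right = _xor(right, _stream(_xor(left, k1), len(right)))
    return left + right

k1, k2 = b"\x01" * 32, b"\x02" * 32
sector = bytes(range(256)) * 2          # a 512-byte "sector"
ct = lion_encrypt(k1, k2, sector)
assert lion_decrypt(k1, k2, ct) == sector

# Flip one bit of the ciphertext: the whole sector decrypts to garbage.
damaged = bytes([ct[0] ^ 1]) + ct[1:]
plain = lion_decrypt(k1, k2, damaged)
diff = sum(a != b for a, b in zip(plain, sector))
assert diff > 200  # most of the 512 bytes are scrambled
```

Because every byte of the output depends on every byte of the input, destroying a few bits of such a ciphertext sector forces the attacker into the difficult ciphertext-space search the comment describes.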

And as long as you’re at it, remove the recording surface. I have yet to sanitize a disk made from glass or ceramic platters; I’ve heard of them, but every one I see is metal.

“I have seen CDs for sale in the Middle East that have copies of the shredded papers taken from the US embassy in Iran. College students spent years pasting these tiny strips together because they had an extreme interest to embarrass America.”

The book series is called something like “Inside the Espionage Den,” and to quote the URL cited below, “[t]he political and economic section files included documents going back to the mid-1950s, useful only in a historical context, if that.” The story is told by one of the hostages, who discusses how security doors UL-rated to withstand n minutes of forced entry didn’t, how the shredders were totally inadequate to the document-destruction rate that the doors’ rated value implied, and how the recovered files could be used to compile a list of everyone who had visited the embassy (yikes, not good tradecraft). All I saw were boring geopolitical and economic analyses of oil, and political intelligence on Soviet intentions in the region. None of the pages I saw was shredded in any way. It’s probably the least interesting leak of official documents I’ve seen, and most of it could likely have been declassified, had anyone cared.

@Jeremiah Johnson: “Keyloggers cannot capture a key unless you use a really weak key and a really weak algorithm.”

What are you talking about? Careful programs hash a passphrase to get a key. What algorithm would you “encrypt” a passphrase with to get a key, and where would you get a key to “encrypt” it with? It can’t come from the user; that’s begging the question (i.e., assuming the result and then using it to prove itself). If the key is in the software, then it’s available to everyone. Sorry, you’re the one who is confused: you hash the passphrase to derive a short, highly unpredictable key from a longer but somewhat predictable passphrase.

For that matter, Schneier seems to be begging the question as well; how do we know what encryption software is “done properly” according to next year’s standards?

Anyone who is interested in storage encryption should take a look at ZFS; there’s some interesting stuff being done there.

hhhobbit December 5, 2006 7:17 AM

Now that the fur has flown and people have moved on, I will add that yes, keyloggers can get the passphrase, and if that can be used to get the data back, then recovery is theoretically possible. But what is theoretically possible and what can actually be done are frequently different things. Encryption is better used for protecting data that is in use. I wonder what they say about putting the disk drive under one of those huge electromagnets used at car junkyards (disk out of the case, of course)? That should be a sufficient degaussing, eh? Follow that with judicious use of the sledgehammer. I defy you to get the data off of it! You can!? A trip to the open-hearth furnace will do wonders in making it disappear forever.
With all the myriad ways of losing data unintentionally (disk drive fails, etc.), that seems like more of a problem than someone getting the data back, as long as you did anything at all to obfuscate it. I include encryption with suitably chosen, randomly generated keys (what PGP uses) as something that will make junkyard disk collectors give up on the drive immediately. Why muck with a disk that used encryption when there are easier pickings (clear text) to be had?

Ian January 16, 2007 8:31 AM

I’d love to have all my computers fitted with gigantic hard drives – drives with tiny, but powerful, battery acid sprays. The spraying could involve something much stronger than mere battery acid, I suppose, depending on risk.

I’d suspect that my machine is being accessed and my privacy / encryption are looking a touch compromised and I’d just be able to use a telephone, or maybe just my cellphone, to dial a special number, if I have time in which to do this.

This number dialled would effectively be a phone call to a tiny machine inside the hard drive casing and, once followed by an eight digit additional safety-catch code, the lengthy battery acid spray units would spray a fine mist of acid, once every 20 seconds, onto every drive platter surface – that should slow the authorities down.

If nobody manages to put a signal barrier around my machines in time, my data couldn’t survive seriously concentrated acid being sprayed onto the surface.

This would all begin, I’d guess, when my computer’s casing and drive “telephone” sends a text message to my mobile phone, or pager. It’s complicated and highly unlikely, but…it’s still a nice thought, for those who like their privacy private.

And, with luck, I’d not be prosecuted for destroying evidence because the authorities would have to be able to demonstrate that I knew that my disk contents constituted evidence in the first place.

All I’d know (hopefully for sure?) is that somebody is likely to be in my home and likely to be accessing my data (that copy of it, anyway, on my hard drive/s) and I want them to be incapable of ever reading what I have on the machines.

All this is the stuff pipe dreams are made from, of course, and a person would have to be pretty certain that their machine was compromised, stolen, taken away, etc., or they’d face their data being destroyed because of a false alarm.

God bless well-secreted backups, I say.

It’s time to build savage self-destruction features into the casing and platters of these drives, so that we can add physical destruction to our list of possibilities, giving us greater speed and control over who gets to see what we don’t want them to.

(I wonder what a microwave oven would do to the surfaces of the platters in a drive?)

It’s time for hard drives to be built to help us stay one step ahead of the arrogant authorities in their ever-intensifying drives to take from us our rights to privacy. They’ve certainly already taken from us far too much and it would be nice to hit back, now and then.


Saucer May 9, 2011 10:26 PM

I do like the idea of encrypting the whole drive with TrueCrypt using a 3-pass wipe, then doing a regular format of the drive to wipe out everything. I tried everything to get the encrypted data back as readable after formatting the drive; it was impossible. 🙂 Figured out this method by mistake… lol

