Deniable File System

Some years ago I did some design work on something I called a Deniable File System. The basic idea was the fact that the existence of ciphertext can in itself be incriminating, regardless of whether or not anyone can decrypt it. I wanted to create a file system that was deniable: where encrypted files looked like random noise, and where it was impossible to prove either the existence or non-existence of encrypted files.

This turns out to be a very hard problem for a whole lot of reasons, and I never pursued the project. But I just discovered a file system that seems to meet all of my design criteria—Rubberhose:

Rubberhose transparently and deniably encrypts disk data, minimising the effectiveness of warrants, coercive interrogations and other compulsive mechanisms, such as U.K. RIP legislation. Rubberhose differs from conventional disk encryption systems in that it has an advanced modular architecture, self-test suite, is more secure, portable, utilises information hiding (steganography / deniable cryptography), works with any file system and has source freely available.

The devil really is in the details with something like this, and I would hesitate to use this in places where it really matters without some extensive review. But I’m pleased to see that someone is working on this problem.

Next request: A deniable file system that fits on a USB token, and leaves no trace on the machine it’s plugged into.

Posted on April 18, 2006 at 7:17 AM

Comments

Thomas Downing April 18, 2006 7:35 AM

Using Linux, putting a deniable file system on a USB memory device that leaves no trace on the host should not be tricky as an initial problem. FUSE (the userland file system) would seem to be a likely candidate.

The more difficult problem is the secondary one: how can I be sure that no traces of the data in the deniable filesystem remain on the host after I remove the USB device? There are a few obvious areas of concern: the swap file, automatically written backups (generated by editors, etc.) that might land in the user's home directory on a resident file system, and so on.

One way to start might be to chroot to a normal file system on the USB device, and then mount the deniable system also on that device. This would still leave swap and possibly incriminating entries in log files. Swapping could be disabled, as could logging…
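Something like this, as a rough sketch only (the device name and mount point are assumptions, and it doesn't deal with logging or application-level traces):

    swapoff -a                                  # stop swapping so nothing from the volume leaks to the host's disk
    mkdir -p /mnt/usb
    mount -o noatime,nodev /dev/sdb1 /mnt/usb   # mount the stick without recording access times
    # ... work entirely under /mnt/usb ...
    umount /mnt/usb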

But the list is likely to go on.

Thomas Downing

Thomas Downing April 18, 2006 7:53 AM

I've taken a look at the Rubberhose site, and I think that in one way they are missing a bet.

If the mere existence of encrypted data can in some places and/or circumstances be taken as incriminating evidence, then surely the existence of the Rubberhose kernel modules might also be hazardous.

If you are only going to use the Rubberhose data on a small set of known machines, then the modules could be installed in advance, which would solve the problem.

On the other hand, following up on the USB idea, the advantage there is that one might, for instance, slip into a random internet cafe, do whatever, and leave. That rules out installable kernel modules. It is true that at this time I doubt internet cafes will have Linux – especially Linux with FUSE support – but this may be changing.

Overall, a complete linux running in userland stored on USB may be the more practical approach.

Thomas Downing

Dominic White April 18, 2006 8:01 AM

TrueCrypt is great and appears to do the job. Better still it is cross platform. It is basically a block device mounted via loopback. I am not sure if any serious cryptanalysis has been done on it however.

Stinky April 18, 2006 8:29 AM

Truecrypt uses known ciphers: AES, blowfish, serpent, and the rest, in combinations. Excellent software.

arl April 18, 2006 8:47 AM

Until the popular operating systems put random noise onto empty disk sectors instead of zeros, all of these methods will still point to the use of encryption.
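In the meantime you can create that noise yourself. A rough sketch, assuming a spare partition at /dev/sdb2 and a throwaway path for the filler file; dd simply stops with "No space left on device" when it runs out of room:

    # overwrite an entire partition with random data
    dd if=/dev/urandom of=/dev/sdb2 bs=1M
    # or fill a mounted filesystem's free space with noise, then delete the filler
    dd if=/dev/urandom of=/home/user/filler bs=1M
    rm /home/user/filler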

I have used TrueCrypt and like it for some applications. But I don't think I would recommend it to someone who would be jailed if they were found to be in possession of encrypted data.

Per Hedetun April 18, 2006 9:02 AM

Later versions of FreeBSD provide native support for an encrypted filesystem, along with some additional steganography which, supposedly, can disguise the encrypted filesystem as unencrypted data.

It is all done within the “Geom Based Disk Encryption” (gbde)-framework, described here:

http://www.freebsd.org/cgi/man.cgi?query=gbde&sektion=4

See the section about “Steganography support”.

CG April 18, 2006 9:06 AM

“The basic idea was the fact that the existence of ciphertext can in itself be incriminating, regardless of whether or not anyone can decrypt it.”

Doesn’t the existence of a Deniable File System have the same incriminating property?

James Walker April 18, 2006 9:09 AM

“Overall, a complete linux running in userland stored on USB may be the more practical approach.”

There is a device that kinda does this. It's called a BlackDog. It's about the size of a credit card, 10 cards thick. It has a fingerprint reader and a PPC/FPGA running Linux. That is, processes run outside the OS of the "host PC". It's a weird device that isn't very secure, but it's a good start and a great idea.

I think the company that made it is making a new model.

radiantmatrix April 18, 2006 9:09 AM

Details on the TrueCrypt “Hidden Volume”: http://www.truecrypt.org/hiddenvolume.php

Basically, you create an “outer” encrypted volume (we’ll call it ‘O’). You put a few files in it that aren’t really all that important, and protect O with a passphrase.

Then, using some of the free space on O (which is random bits), you create an “inner” volume (‘I’), which looks like random bits. Nothing about O gives any clue as to the existence of I — you have to know I is there to even attempt to use it.

The idea is that if someone captures your filesystem, you might be forced under duress to give up the passphrase for O, but could still keep the existence of I hidden.

In principle, however, the fact that TrueCrypt has this function is public knowledge, and therefore keeping I a secret isn’t trivial. If detained by “legal” police, you might have deniability — it would be hard for them to prove you have an I volume.

If detained by the secret police, however, one would be questioned about the use of this feature, and such (hopefully hypothetical) illegal organizations are willing and able to "extract" such information (e.g. through the use of sodium pentothal).

Russ April 18, 2006 9:17 AM

Jetico's BestCrypt product also offers hidden containers. I haven't tried using them, though; I've always been afraid of getting my drives confused, writing to the container holding the hidden container, and accidentally damaging/destroying the hidden container and its contents.

Jungsonn April 18, 2006 9:30 AM

Indeed, invisibility is key.

Just like the 'ZoneAlarm' method.
Why protect a 'known computer'? Everything can be compromised
(in the end, with brute force).

Simply make the computer invisible,
because things that do not exist cannot be compromised.

frog51 April 18, 2006 9:46 AM

Rubberhose was always clunky, but the major feature it had, which TrueCrypt doesn't, is the unprovability that there are any further layers of encryption. Physical torture becomes less useful if the victim can plausibly deny any further layers… thus the name Rubberhose.

Bruce Schneier April 18, 2006 9:47 AM

“Doesn’t the existence of a Deniable File System have the same incriminating property?”

Yes. But if it worked easily and well, it could become a standard feature in — say Linux — and only used by the few who need it.

Chris S April 18, 2006 9:58 AM

So …

When everyone has security tools, then having security tools won’t mark you as a criminal.

I like it.

Alex April 18, 2006 9:58 AM

The problem with TrueCrypt is that you need to be an administrator to run it under WinXP. Therefore it’s not terribly handy for travelling.

Prohias April 18, 2006 10:25 AM

“The basic idea was the fact that the existence of ciphertext can in itself be incriminating, regardless of whether or not anyone can decrypt it.”

“Doesn’t the existence of a Deniable File System have the same incriminating property?”

“Yes. But if it worked easily and well, it could become a standard feature in — say Linux — and only used by the few who need it.”

Wouldn’t it be equally effective to have randomly sized chunks of text that cannot be distinguished from cipher text in the file system, and make this a standard property of the distribution?

dhasenan April 18, 2006 10:28 AM

I heard about Rubberhose perhaps two years ago but couldn’t find a copy of it when I looked again a few months afterwards.

The existence of a Rubberhose partition would be quite incriminating. While you could try hiding partitions, that would require the Rubberhose program to save at least some of the partition information in an accessible file in order to recreate the mounts. Also, if there's a predictable format for the partition, then it would be simple enough to search for it, even if you didn't know what was on it.

Now, if the format was to have an encrypted segment at the beginning of the partition with partition information, that might be doable; but in order to be secure, you’d have to fill the entire unpartitioned portion of the disk with Rubberhose-style chaff.

Even so, you’d have to be dealing with people who don’t presume guilt; you’d have to hide the Rubberhose program; and you’d have to hide any data that Rubberhose needs in order to mount that partition. An interesting issue. Hardware solutions, as aforementioned, are probably best; though of course you could simply put all your sensitive information on a Flash drive and keep a brick handy.

Paul Kirteski April 18, 2006 10:54 AM

“Rubberhose was always clunky, but the major feature it had, which TrueCrypt doesn’t, is the unprovability that there are any further layers of encryption.”

Actually, TrueCrypt does have that feature. You cannot prove that a hidden TrueCrypt volume exists (unless you break AES or the mode of operation).

“Physical torture becomes less useful if the victim can plausibly deny any further layers… thus the name Rubberhose”

Infinite nesting could actually cost the person his/her life. TrueCrypt has only one level of nesting, so the torture can be stopped by revealing the second password, if necessary. But with infinite nesting, you could be tortured indefinitely.

The problem with deniable file systems (such as Rubberhose) — TrueCrypt does not have this problem — is that there is no plausible reason to use such a file system except wanting to conceal data. All deniable file systems have worse performance than conventional file systems (overhead). This makes any sensible “deniability” somewhat infeasible.

By the way, the last alpha version of Rubberhose was released in the year 2000. The project never even released a beta version.

Tristant April 18, 2006 10:56 AM

“By the way, the last alpha version of Rubberhose was released in the year 2000. The project never even released a beta version.”

Also, Rubberhose needs Linux kernel 2.2. Now consider the fact that kernel 2.4 is no longer officially supported by Linus Torvalds. Rubberhose is pretty obsolete software, Bruce.

Ian April 18, 2006 11:30 AM

Wouldn’t the existence of Rubberhose on the computer be equally as incriminating as the existence of unknown encrypted data?

False Data April 18, 2006 12:12 PM

For those of us who aren’t doing cloak-and-dagger work, just having a USB drive that leaves no trace of the files’ contents (as opposed to its own existence) on the host machine would be very useful. However, the big challenge I see with USB drives, if you just jack one into your local Internet Cafe, is securing them against virus infection. We used to solve a related problem using a “protected” region of flash, a region of the chip where you can store data and then blow a write protect fuse so you effectively get a bit of ROM on your flash chip. With a feature like that, or something similar activated by a mechanical switch, you might be able to add a security layer, an executable front-end that checksums (and/or encrypts) everything else on the drive. As others have pointed out, this scheme would still leave recognizable portions of the front-end in the host’s swap space, but you might at least be able to put portable apps on the drive that are designed to avoid leaving recognizable sensitive data there.

dimitris April 18, 2006 1:03 PM

The "possession" of deniable filesystem support may indeed be incriminating. "Yes, I have the module but I don't have a hidden volume" won't work when the interrogator is not operating in a presumed-innocent context. Finally, as someone pointed out, an infinitely deniable system may cost the owner their life if the interrogator is free to torture.

I remember reading/watching a book/movie where some "criminal", half-jokingly, said that if caught and questioned by the cops, it's always good to have a lesser offence available to "confess" to after some interrogation, in order to hide the "real thing". If I remember correctly, the perp was committing "treason", but also had some stolen goods handy. Possessing them would explain to the investigator why you were employing whatever stealth you were using when caught: "what were you doing out and about after curfew?" (I believe it's a Terry Pratchett book but I'm not sure.)

Anyway, applied to deniable filesystems, I wouldn’t be surprised if things like pornography, or “light industrial espionage data” were used as smokescreens in a multi-layer deniable filesystem.

Woody April 18, 2006 1:04 PM

My initial thought on this, when using something like a blackdog, is that one could allow it to talk USB mass storage to an external computer, and go through a security mechanism similar to port knocking.

If you ask the drive for a directory listing, it gives you a normal FS to read/write from.

After some sequence of failed directory listings (i.e., drop to a command prompt and:

d:
cd foo
cd bar
cd foo2
cd bar2
and then write to a file:
echo "test" > d:\foo\bar\test.txt

or similar)

Then after seeing all of those commands come down the USB pipe to the black-dog, you unlock a hidden volume on the device (maybe again paired with the biometrics that the device has).

Failure points here are that anything you do on the commandline is probably stored in the commandline buffer. Any files you write to are likely in the recent documents list. And locations you try to browse to in the finder/explorer are likely to get logged.

So you'd want a very high noise-to-signal ratio, so that bogus commands could be included, or so that anything that was found turned out to look completely innocuous. Even better would be to have another USB memstick that DID have the sequence on it, or an FS where the correct sequence was a subset of its operations. The idea is plausible deniability, right?

Bill April 18, 2006 1:14 PM

I’ve been looking at the FAQ from the Truecrypt site and I found these interesting q and a’s that would create even more plausible deniability for a user of encryption.

“Q: Is it possible to use TrueCrypt without leaving any ‘traces’ on Windows?
A: Yes. This can be achieved by running TrueCrypt in traveller mode under BartPE. BartPE stands for “Bart’s Preinstalled Environment”, which is essentially the Windows operating system prepared in a way that it can be entirely stored on and booted from a CD/DVD (registry, temporary files, etc., are stored in RAM – hard disk is not used at all and does not even have to be present) . The freeware Bart’s PE Builder can transform a Windows XP installation CD into BartPE. As of TrueCrypt 3.1, you do not need any TrueCrypt plug-in for BartPE. Simply boot BartPE, download the latest version of TrueCrypt to the RAM disk (which BartPE creates), extract the downloaded archive to the RAM disk, and run the file ‘TrueCrypt.exe’ from the folder ‘Setup Files’ on the RAM disk (the ‘Setup Files’ folder should be created when you unpack the archive containing TrueCrypt).

Q: Can I mount a TrueCrypt volume stored on another TrueCrypt volume?
A: Yes, TrueCrypt volumes can be nested without any limitation.

Q: Can I run TrueCrypt with another on-the-fly disk encryption tool on one system?
A: We are not aware of any on-the-fly encryption tool that would cause problems when run with TrueCrypt, or vice versa.”

Run TrueCrypt as a container inside an encryption program that does not have hidden containers. Or keep a boot CD around that can download TrueCrypt when needed, and keep the container as a 'corrupt' .wav or other type of file. TrueCrypt does not care about the type of file: it will try to decrypt any file you tell it to, and if it cannot it will fail without ever indicating whether it was a TrueCrypt file or not. I think the error message is approximately "Not a TrueCrypt file or the wrong password."

Ale April 18, 2006 1:18 PM

If we assume that the computer and belongings of a given person will be subjected to detailed forensics, down to statistical analysis of free disk space, I think that deniability is extremely difficult to assure. Even if the information itself is not identifiable as such, the software used to access it will be conspicuous as not part of a typical user installation. If, however, a fake password produces a volume containing mildly incriminating documents, and the interrogators are not into heavy duty entropy analysis and such, then the user may escape unscathed with the real bombshell data in a different volume. And the life saver is that both volumes are actually intermixed… It is very difficult to prove the existence of a second volume once one has been found.
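For what it's worth, the entropy analysis itself is trivial. A quick sketch, assuming the ent utility (or anything similar) is installed; the file names are placeholders:

    dd if=/dev/urandom of=reference.bin bs=1M count=10
    ent reference.bin   # reports close to 8 bits of entropy per byte
    ent suspect.img     # a well-encrypted container should report essentially the same figure

The encrypted volumes themselves pass this test by looking exactly like noise; what gives the user away is everything around them.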

aikimark April 18, 2006 1:32 PM

@Bruce,

I looked for a device I’d seen some time ago without success. It is a front-end appliance that is powered by the USB port power and runs a lightweight version of Linux. It is usually bundled with an external hard drive.

Maybe some of your blog posters can point you to this appliance.

Simon McVittie April 18, 2006 2:03 PM

You can avoid leaving traces in swap using one of various mechanisms for encrypted swap under Linux – the main candidates are loop-AES and dm-crypt. Because swap inherently doesn’t need to be preserved between boots, the key comes from /dev/random and only ever gets stored in kernel memory.

On Debian, installing cryptsetup and putting something like this in /etc/crypttab and /etc/fstab will give you encrypted swap:

In /etc/crypttab:

    myswap /dev/hda8 /dev/random cipher=aes-cbc-essiv:sha256,size=256,swap

In /etc/fstab:

    /dev/mapper/myswap none swap defaults 0 0
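After a reboot you can check that swap really is going through the encrypted mapping; assuming the entries above, something like:

    swapon -s                  # or: cat /proc/swaps -- should list /dev/mapper/myswap rather than /dev/hda8
    cryptsetup status myswap   # shows the cipher and key size in use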

Matt Austern April 18, 2006 2:04 PM

I’m sure that the idea of having a lesser offense prepared so you can confess to it if necessary has been used in many books, and I’m sure it has also been used many times in real life. I couldn’t possibly guess who invented that idea. The first time I happened to come across that idea, though, was in one of Heinlein’s juveniles: Revolt in 2100. Possibly that’s where you came across it for the first time too. I’m sure I’m not the only person here who read lots of Heinlein as a kid.

Antonio Varni April 18, 2006 3:40 PM

I’d thought about doing this a long time ago – but never got around to it:

  • use a steganographic filesystem that can provide multiple levels of deniability.
  • create two levels of protection. The first level would be the ‘fake’ hidden content. This could be something like pornographic videos (which is something that would appear plausible to law enforcement / whomever). The second layer is your ‘real’ hidden data.

If evidence is found that you’re using a steg FS – you can unmask the porn content. Your real hidden data is safe, and you won’t be found in contempt of court by refusing to ‘reveal’ what you’re hiding.

1915bond April 18, 2006 3:56 PM

Also seems like a slick way to stealth malware….might be a little too embrangled for quality ops, tho.

Brian April 18, 2006 4:01 PM

One way to check for the presence of a large amount of hidden data would be to try to fill up the disk. If the amount of data you can put on the disk is significantly less than the capacity of the disk, you can be fairly certain there is something hidden there.
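Roughly (note that this is destructive if the hidden data lives in the free space of the visible filesystem, and the paths are placeholders):

    df -h /mnt/suspect                            # capacity and free space the filesystem claims
    dd if=/dev/zero of=/mnt/suspect/filler bs=1M  # runs until "No space left on device"
    ls -l /mnt/suspect/filler                     # compare the bytes actually written with the claimed free space
    rm /mnt/suspect/filler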

Small amounts of hidden data could still evade detection, since things like bad blocks and file system metadata do count for something.

On the other hand, if you only need to hide a small amount of data, why bother with something as elaborate as an encrypted file system?

Tristans April 18, 2006 4:11 PM

“One way to check for the presence of a large amount of hidden data would be to try to fill up the disk.”

I’m not sure if you are referring to TrueCrypt’s hidden volume. But if you are, then you’re wrong. By doing so you would destroy the hidden data.

However, this can be cleverly prevented. From the TrueCrypt website:

“Protection of Hidden Volumes against Damage

As of TrueCrypt 4.0, it is possible to write data to an outer volume without risking that a hidden volume within it will get damaged (overwritten).

When mounting an outer volume, the user can enter two passwords: One for the outer volume, and the other for a hidden volume within it, which he wants to protect. In this mode, TrueCrypt does not actually mount the hidden volume. It only decrypts its header and retrieves information about the size of the hidden volume (from the decrypted header). Then, the outer volume is mounted and any attempt to save data to the area of the hidden volume will be rejected (until the outer volume is dismounted).

Note that TrueCrypt does not modify the file system (information about free space, etc.) within the outer volume in any way. As soon as the volume is dismounted, the protection is lost. The hidden volume protection can be activated only by users who supply the correct password (and/or keyfiles) for the hidden volume (each time they mount the outer volume). For more details, please see the section ‘Protection of Hidden Volumes against Damage’ in the documentation.”

Brian April 18, 2006 4:56 PM

So it looks like you have a choice with TrueCrypt – you can either enter the password whenever you want to write to the outer volume, or you risk destroying the data on the inner volume. It makes sense. If you are really under duress, you are probably hoping that someone writes data to the drive and thus destroys the evidence against you.

But you better not forget to enter that password!

TimH April 18, 2006 5:27 PM

The real key to deniability is the presence of the de/encryption software as standard in the OS. Otherwise, the investigators just have to find that and deniability is gone. That includes Bart PE: Why do you have Bart, sir? Ah, my XP keeps crashing and I use PE to repair it. But sir, your XP logs don’t show a crash for 18 months…

Anonymous April 18, 2006 6:00 PM

Brian: “One way to check for the presence of a large amount of hidden data would be to try to fill up the disk. If the amount of data you can put on the disk is significantly less than the capacity of the disk, you can be fairly certain there is something hidden there.”

There are two possibilities: either the hidden data is within an existing partition or it isn’t.

In the former case, you look for unpartitioned space with junk on it and try decrypting bits of it while torturing the prisoner for the password. If you try filling the disk by creating a partition over the hidden data, that can destroy data.

In the latter case, if it’s a journalling filesystem, you check the journal against what’s in the partition. Otherwise, the best you can do is look for noncontiguous data. At any rate, the filesystem doesn’t know about it and would happily overwrite it if you asked it to.

Bruce Schneier April 18, 2006 8:59 PM

“So it looks like you have a choice with TrueCrypt – you can either enter the password whenever you want to write to the outer volume, or you risk destroying the data on the inner volume. It makes sense. If you are really under duress, you are probably hoping that someone writes data to the drive and thus destroys the evidence against you.”

I could never think of any way around this. For the file system to be truly deniable, it has to be possible to overwrite hidden data without the correct set of passwords.

Devin Binger April 18, 2006 11:47 PM

Several posters have mentioned that Rubberhose will not save you from the secret police. Indeed, it likely will not. Once the secret police have you, nothing short of immense popularity among a major power bloc will save you.

Rubberhose will protect your comrades. What I do is create several contiguous and identical volumes, a few with real data and the rest with dummy data. Now when they take me away and connect the field telephone, they can't ever know that I've given the real password. This takes away my incentive to cooperate: if I tell them the real password, they won't know I've really given them everything and will in all likelihood keep the pressure on.

Given a choice between real torture and betraying your comrades, almost everyone will eventually choose betrayal. Given a choice between torture or betrayal AND torture, the theory goes, there’s a much higher chance that the captive will resist. I don’t know if I really believe that, but if Rubberhose-like multiple-deniable-volume systems attained common use and it became known to the secret police that they could often attain more information by continuing to torture a captive who has already confessed, it could bolster a captive on the edge of breaking, at least for a time.

nbk2000 April 19, 2006 12:43 AM

“Infinite nesting…infinite torture”

Once they know you're using an encryption program that supports nested volumes, is there any way to PROVE to them that there aren't any more hidden volumes?

After all, you may break down and give them the second layer passphrase, in hopes of stopping the torture, but what’s to stop them from thinking there’s more and continuing to attempt a passphrase ‘extraction’ on you?

If the program was really intended to provide deniable encryption, presumably to prevent successful ‘rubberhose’ cryptanalysis, shouldn’t it also provide a provable ‘escape clause’ for you in case they don’t stop at the first or second layer?

I wouldn’t want to be in Room 101 for a moment longer than I could help it.

Paeniteo April 19, 2006 3:02 AM

“is there any way to PROVE to them that there aren't any more hidden volumes”

No, there isn't, and IMHO this is technically impossible.
How could you prove that a certain part of a file is actually purely random and won't decrypt into a useful header for a hidden volume – apart from trying all possible passphrases?

Asteroza April 19, 2006 5:07 AM

The BlackDog is fairly close to what people want in terms of a USB-device-based file system with decent security and a deniable filesystem that can actually be used most everywhere. It's a little cranky getting it to work right under a Linux host PC, but since most internet cafes and businesses are Windows shops, that isn't a big issue. Now that TrueCrypt 4.2 is out today, with proper Linux support, the BlackDog should be capable of meeting people's needs (once loopback support is working properly, which the company has finally committed to).

Of course, this will not protect you if you share the mounted container via Samba to the host PC (in terms of both deniability and, to a lesser degree, infection routes), and there is the ever-present issue of keystroke loggers and screen scrapers on the host PC, though there are potential mitigation schemes, up to and including booting the host PC with a known good Linux distro (an ability currently in development by the company).

The BlackDog websites have had connectivity problems this week due to site upgrades however, so I would suggest reading the current Wikipedia entry, as it lists the specs for the next hardware revision, which is expected to come out in May.

Jungsonn April 19, 2006 5:17 AM

You mean like:

cmd:
cacls "C:\System Volume Information" /E /G <user>:F

Well, everyone knows that data on a disk can only "truly" be erased when the disk is thrown into an acid bath or cut into a billion pieces… or do they? So a "hide" is not very plausible.

Data always resides on disks; once written, it remains there. One should not be so ignorant as to think that when one throws out the garbage, the data is gone. No, too easy.

The only option is to shred the bits; with a good shredder the bits become unreadable. But even then…

The moral is?

I do not believe in storing sensitive data, encrypting it, and using it on the same PC where it was once unencrypted.

Pat Cahalan April 19, 2006 10:55 AM

“For the file system to be truly deniable, it has to be possible to overwrite hidden data without the correct set of passwords.”

Sure, but you could hedge your bets. It stands to reason that you’re most likely not trying to make a large chunk of data hidden, right?

So just use lots of redundancy/parity. If you bloat your 20MB incriminating plan-to-blow-something-up several times over, you can have four or five effective (RAID1)*(RAID5) copies of the data. Sure, you can still possibly lose it if you lose the section of the disk that holds the volume information, but there are always deniable backups 😉
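Crudely, something like this (purely illustrative; the device, offsets and image name are all made up, and you would want the offsets recorded somewhere you can recover them):

    # scatter several copies of a small encrypted image across a raw device
    for off in 8192 1048576 4194304; do
        dd if=secret.img of=/dev/sdb bs=512 seek=$off conv=notrunc
    done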

Neil April 19, 2006 12:08 PM

Concerning TrueCrypt and Room 101: my security needs are limited to wanting to have personal records on a flashdrive on the road. Since possessing encrypted data can be considered incriminating, I would like to be able to prove that I have nothing more than what I can decrypt for the folks in room 101. It seems like you have to pick either no nesting and no deniability or nesting and deniability.

Krunch April 20, 2006 4:16 PM

“is there any way to PROVE to them that there aren't any more hidden volumes”
If the whole disk is full of data, you could prove there is no more hidden data since there is no space left.

Paul April 20, 2006 8:52 PM

Perhaps, I am missing something here, but when I read the article it referred to a laptop being captured by the bad guys who will presumably torture a suspect until they give up a passphrase.

As many have already pointed out, the existence of multi-layer encryption systems must be assumed on the part of the captors. Therefore, even "legal" police would be better served (IMHO) by getting a warrant to install a keylogger on said laptop, then simply releasing it for a few days/weeks and recapturing it complete with all passphrases.

Said keylogger would not have to be software, btw. With the resources of a government agency, I wouldn’t have thought it difficult to create a hardware keylogger that could be hidden readily within a laptop PC, even disguised as some innocuous PC component. How many human rights workers can find and remove a bug (while in the field)?

Much cleaner than torture, and likely with a higher success rate. Of course, a small amount of coercion may need to be applied to the prisoner in order to distract them from the real goal, which is to release them believing that they have gotten away with it.

Just a thought.

Paeniteo April 21, 2006 2:55 AM

“is there any way to PROVE to them that there aren't any more hidden volumes”
“If the whole disk is full of data, you could prove there is no more hidden data since there is no space left.”

But you can’t prove that there is no hidden data embedded somewhere in the visible files. Steganography can hide information in images, sound files, even plain text and who-knows-where.

To prove your innocence beyond doubt, you would have to prove that there does not exist an algorithm which can extract information out of the contents of any file on your hard disk.
Needless to say, most steganographic algorithms accept passphrases to modify their hiding patterns, so you would have to try every existing algorithm with every possible passphrase.
Non-presence of appropriate software is no proof of innocence, either, as you could download such software from the net every time you use it, or even program it from scratch.
It is not even enough to restrict this to known algorithms, as you could have devised your own (think of the "playing cards" in Cryptonomicon).

Conclusion: You are busted.

Paeniteo April 21, 2006 2:58 AM

@Paul: Truecrypt also accepts keyfiles as part/replacement of the passphrase so pure hardware keyloggers would be useless and software would have to be designed to specifically monitor file system access in addition to keystrokes.

Anonymous April 21, 2006 10:58 AM

Pat, stegfs was doing just this: store each block of data many times at different block numbers on the volume. With large volumes and many copies per block, it is very unlikely that one destroys all copies of a given block even when writing with the deeper levels locked.

Then there is always "slack", the unused space (on average, half a block) at the end of any file. If a filesystem is sufficiently old, the probability that the last block has previously been a non-last block of the same or another file is very high, even if the file system was initially created on an all-zero volume.

My answer to the question "can I prove that my deniable-encryption file system has no more hidden layers?" is: I think there is none, for the reasons that Paeniteo stated.

And if there were a way to prove it, it would need to be optional, because if it were mandatory then one could just punish you until you gave that proof, and hence it would no longer be "deniable encryption".

Deniability stems from the fact that there is no way to prove if encrypted data exists or not.

Christoph Zurnieden April 22, 2006 12:33 PM

Every time I hear of "deniability" the old adage "You cannot prove a negative" comes to mind. A perfect DFS (deniable file system) would therefore be proof that you cannot prove a negative; thus all data would automatically be suspect and one would have to react accordingly (assume the worst or whatever), and the only defense would be to have no data at all. Even a couple of pictures of the "grand leader" would be suspect: they will(!) yield the blueprint of a bomb if treated with the correct algorithm.
It is quite similar to the XOR-thingie[1] some days ago: a mathematically sound solution might not be sufficient in a juridical environment; there is rarely a technical solution to a non-technical problem.
But I certainly don't want to dismiss all attempts to build a DFS because, among other things, a good DFS may give the defender some extra time: time to delete the offending material, time to ignite the bomb, time to gather the people needed to start a revolution, and so on; steganography has a lot of advantages.
So, Bruce, no reason to stop pursuing, but to try harder?

CZ

[1] I took a look at the sources and I don’t use the notion “overengineered” lightly, but if that word doesn’t fit to that program … 🙂

hugh crawford April 24, 2006 9:24 AM

An adaptation of one of those Linux-on-an-iPod projects with an encrypted file system etc. would be a little less suspicious than that BlackDog thing. You could probably set it up to self-destruct the incriminating info without the regular application of the super-secret handshake.

Clive Robinson April 24, 2006 10:03 AM

Just a little note about Flash drives and their kin.

1. They use the same technology as the modern aircraft black-box flight recorders used during missile development, and the data on them is extremely difficult to destroy physically (a plus point for black boxes, a significant minus for security).

Some experiments have shown them to still be easily readable after being blown up with conventional explosives, so your house brick, foot, teeth etc. are just not going to hack it. This is true even after the chip casing has been significantly damaged, and even if the chip itself is damaged it can still be nano-probed and some if not all of the data retrieved.

2. Even if erased correctly, Flash chips etc. have residual memory issues, like the old "screen burn-in", that can reveal the data to the appropriate probing technique.

3. Also, if unpowered they are quite fire resistant, needing the actual chip temperature to get above something like 300C for a sustained period of time for data erasure to be guaranteed.

Looks like for reliable data destruction you would need a mixture of aluminium powder and iron oxide (thermite) and an igniter for it, built in and around the chips; not something you would care to carry around in your pocket.

I have not seen any data on how well they survive being cooked in a microwave oven, but then microwaves are not exactly pocket portable either…

geneva convention April 24, 2006 11:23 AM

@Devin Binger: “Once the secret police have you, nothing short of immense popularity among a major power bloc will save you.”

But, I AM NINJA! … and now I have them exactly where I want them Muhahahahaha!

Anonymous April 24, 2006 11:39 AM

@nbk2000 and all the others expressing similar views:

“I wouldn’t want to be in Room 101 for a moment longer than I could help it.”

Oh right, and after you tell them where you put the bombs, they're gonna send you home with cookies and apple pie.

People who are using crypted filesystems for deniability and get caught by someone ‘like’ the secret police, well it doesn’t matter if you have the specific information they are after or not. It’s too late.

For one thing, the mere existence of correctly engineered plausible deniable crypto means your consumer-grade plausible deniable crypto buys you **** all. They just assume you successfully hid the rest from them. The decision to continue your torture is mediated by a number of psychological and political factors, not technical factors such as your preference in operating system. And it doesn’t matter if the guys who got you were the ones you were hiding the stuff from, because to the guys who do have you, you are a sneaky fing crypto smartarse and they are control freaks who don’t particularly like those sneaky fing crypto freaks. Which is only worsened by the fact they are dependent on sneaky fing crypto freaks to help them unravel your sneaky fing computer. They’ll probably deal with that person later in ways that relieve their stress.

If you start using this sort of scheme, either you're stupid or you're taking a risk. And if you are sensible you take the risk only because the benefits outweigh the risk. Sadly we live in a world where this may sometimes be the case.

If you do not have totalitarianism working against you, you don’t use this stuff generally. If you do, you already have room 101 as part of your life, and you’re trying to get rid of it. Or you work in it.

Anonymous April 24, 2006 11:44 AM

“If you do, you already have…”

Sorry, I meant "If you do have totalitarianism working against you, you already have …" … re-read it now.

dinsdale April 26, 2006 9:27 AM

Truecrypt with deniability:
1. encrypt your tax docs in a standard container.
2. encrypt a device (unformatted partition) w/hidden volume for your mp3 files–soon to be illegal if new DMCA passes.
–Nobody can assert that an unformatted partition is a container.

What Truecrypt needs is a special password that nukes the hidden container–or automatically nukes it after N sequential bad password attempts.

PaulM May 15, 2006 5:48 AM

“What Truecrypt needs is a special password that nukes the hidden container–or automatically nukes it after N sequential bad password attempts.”

What makes you think that gov't agencies would use the same version of TrueCrypt that you use? They would have their own version which (a) doesn't nuke data after failed decryption attempts, (b) allows dictionary attacks to crack passwords, and (c) automatically cycles through the different algorithms.

Since the NSA has huge farms of computers, they would have a good chance of cracking the TrueCrypt data.

ReD May 15, 2006 6:36 AM

Even when the filesystem is undiscoverable / undetectable and unreadable, wouldn’t using the files in the filesystem leave traces on your system?

Think about features like 'Recent documents', the indexing service / Google Desktop Search, MRU lists in applications, filenames and configurations stored in the registry or in files, etc. Even file-open dialogs keep track of the documents opened with them. I'm talking mainly about Windows here, but I'd be surprised if Linux machines or applications didn't have similar features as well.

Not much sense in denying the existence of a file (or even a filesystem) whose name is stored somewhere on your PC, is there?

Chris May 15, 2006 11:17 AM

“Next request: A deniable file system that fits on a USB token, and leaves no trace on the machine it’s plugged into.”

I actually designed a small steganography project last year that should fit the bill. It requires a soldering iron and under $100 in parts, though my goals were a little different (I included the stipulation that the device itself should resist detection as a data storage system).

Do-it-yourself steps for anyone who’s interested:

  1. Take an old (and more-importantly: large) USB optical mouse (or any similar USB device with a fair bit of empty space in the chassis) and sever the connection between the cord and the mouse’s PCB.
  2. Obtain and strip down an Adaptec X-Hub 2 or similar compact bus-powered 2-port USB hub, remove the casing and USB connectors from the board.
  3. Wire the upstream port of the hub to the mouse cord.
  4. Wire one hub port to the mouse’s PCB
  5. Connect the data and ground pins of the remaining hub port to a female usb type A connector using three short lengths of wire (1 to 3 inches).

  6. Connect the hub’s remaining +5v pin to a magnetic reed switch or mercury switch, and wire the other side of the switch to the +5v pin on the female type A connector.

  7. Re-assemble the mouse. You should be able to mount the original PCB in its usual place and glue the USB hub and the switch elsewhere in the chassis (I placed a reed switch on the left-hand side of the mouse). It is best to leave the USB type A connector free on an inch or so of wire inside the chassis, and you may want to cover the hub or other exposed parts with electrical tape to prevent accidental shorts.

  8. Connect any small USB storage device to the type A connector and close the case.

  9. Test the mouse, it should still work normally. You can activate the block device (which should then be detected as a hot-plugged device) by placing a magnet next to the mouse (for a reed switch) or by tilting the mouse to the appropriate angle (for a mercury switch).

  10. For software, I used the standard cryptoloop device support in Linux 2.6 kernels (i.e. encrypt the entire USB storage device at the block-device level). Provided that you deactivate the appropriate command-history and logging options (and mount your main filesystems with "noatime"), you only have the standard default encryption software on your system, and there is no evidence that you have ever used it. The USB device itself holds only an encrypted stream of data, which should be very difficult to distinguish from random noise (assuming someone detects its existence in the first place). You can use other software; my only suggestion in that case would be to store the software on an un-encrypted part of the USB device itself.
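A sketch of one possible shape of step 10, using the dm-crypt route (cryptsetup) as a stand-in for cryptoloop, since cryptoloop's losetup options vary between util-linux versions; the device name and mount point are assumptions:

    cryptsetup create hidden /dev/sdb     # plain dm-crypt mapping over the whole stick; prompts for the passphrase
    mkfs.ext2 /dev/mapper/hidden          # first use only -- this wipes whatever was on the device
    mount -o noatime /dev/mapper/hidden /mnt/hidden
    # ... read and write files under /mnt/hidden ...
    umount /mnt/hidden
    cryptsetup remove hidden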

A couple of people have suggested sealing the mouse casing, but personally I think that by the time somebody decides to disassemble your mouse looking for something, it's probably too late to deny the existence of the device. Also, as long as you can still open the mouse chassis, you can always upgrade/replace the USB storage device (e.g. I built and tested mine with a 32MB SanDisk Cruzer Micro, but could easily swap a 2GB version in if I felt the need).

The other good suggestion I’ve had is to build a similar device using a larger USB hub instead of a mouse (this gets rid of the one major vulnerability with my own design: anyone paying close enough attention will notice the hotplug detection of the USB hub inside the mouse).

p.s. No, I’m not trying to smuggle data I just enjoy this sort of thing.

Samiam May 15, 2006 11:35 AM

There is also the m-o-o-t project, started by Peter Fairbrother against the contingency that the UK would begin to enforce Part 3 of the Regulation of Investigatory Powers Act (RIPA). Since that has never happened, the project is currently mothballed. However, there is quite a lot of information about theoretical and practical aspects of successful camouflage and denial of encryption as it applies to email messages. Worth a look.

http://www.m-o-o-t.org/

vedaal May 15, 2006 11:49 AM

] “So it looks like you have a choice with TrueCrypt – you can either enter the password whenever you want to write to the outer volume, or you risk destroying the data on the inner volume. It makes sense. If you are really under duress, you are probably hoping that someone writes data to the drive and thus destroys the evidence against you.”

“I could never think of any way around this. For the file system to be truly deniable, it has to be possible to overwrite hidden data without the correct set of passwords.”

maybe a possible solution would be to have an ‘undo’ option,
where TrueCrypt (or a similar file system)
stores what/how it ‘overwrites’,
so that it can be reversible

it can plausibly be stored as ‘backup’,
either on the hard-drive or on removable media,
and then the ‘hidden volume’ could be reconstructed as necessary

— vedaal

Samiam May 15, 2006 12:26 PM

I should have read the comments pertaining to the “mere presence of encryption is incriminating” problem before posting about m-o-o-t. The main point of the exercise was to use a dictionary approach to allow plausible alternative decryptions. If you can supply the authorities with a key that translates your message as “How about lunch on Thursday?” in place of something more sensitive or compromising, they don’t have many remaining options…

Samiam May 15, 2006 2:47 PM

Anonymous – Room 101:

You are correct in that once a society has disintegrated to the point that the gang with the badges need not even pay lip service to laws, conventions, or unfavorable publicity, there is no help from such mild strategies as encryption, deniable or otherwise. At that juncture only three basic strategies remain: go somewhere else; become a kiss-booty collaborator and hope for survival for yourself while selling out your fellows; or adopt the “Tasty Tidbits of Smoky Turkey Breast” POV. For most purposes we haven’t arrived at that point yet…

Exothermicus May 15, 2006 8:36 PM

Chris,

I had a similar idea to your mouse, USB hub with embedded thumb drive.

My idea was to hide a whole computer acting as a disk server: place the computer and a modified network switch in a hidden location, with two cat5 cables running to the hidden location, each going to a switch near one of two other computers. The hidden switch would require modifications such that if either end of the two cat5 cables going to the hidden location is disconnected, the cable appears to be disconnected at the opposite end.

The only problem is that this would be obvious if the cables were tested with a time-domain reflectometer (TDR). The TDR would show the cable was not the correct length and, depending on how the switch modifications were done, could also show there is a tap in the cable.

Also, there would be the problem of shielding the hidden computer's RFI, which could be detected with radio equipment, as well as ventilation and remote power control of the hidden PC.

Good investigators will probably notice any significant inaccessible space within a building, and look for access to that space.

Exo

Flash00 May 15, 2006 9:56 PM

For a perfectly good reason to encrypt stuff on your computer, you need look no farther than the stolen or lost computers and backup tapes full of personal info that regularly make the news.

Incidentally, there is a very high performance little (~70MB) Linux distro that boots and loads from CD, or USB flash drive, or HDD, and runs entirely in RAM (minimum 128 MB, including swap, for best results), but can save back to a (multisession) CD or DVD, or USB flash drive, or hard drive. It’s Puppy Linux. Check it out here:
http://www.puppyos.com/

Warren May 16, 2006 10:54 AM

Another option is to use a live CD distro. They leave nothing behind, and they can be used on a machine with no hard drive at all.

Mandrake used to have a live CD that came with a USB key where it saved settings, home directory, etc.

SidViscous May 16, 2006 12:12 PM

Just had a thought tangentially related to this while reading this recent article.

http://www.it-observer.com/articles/1136/wireless_security_attacks_defenses/

Besides an invisible encrypted file system, what about a file system that hides certain files? Similar to the hidden attribute, but hiding the files based on a user or a user group.

For instance: years ago I worked at a company where I had some IT responsibilities. One day while perusing the network I found a file on a universal share that was publicly accessible within the company, and it was a spreadsheet with everyone's pay rates on it. After numerous attempts to get it moved onto a password-protected share, with the CFO refusing, I simply printed it to every printer in the company. But that's another story.

My thought is that a file should be marked as hidden to all except those with the proper permissions. So for instance a salary information file would only be visible by someone within the accounting or upper management group.

I know this is a bit of "security through obscurity". But as we all know, being able to identify the file you want to attack makes things a bit easier.

Encrypting that file would be an added bonus. And if encrypting files for more mundane reasons became universal, then the sheer number of encrypted files would provide added protection for the more furtive encrypted files.

Peter May 16, 2006 4:49 PM

As far as I can see the only solution to this is to make random data ubiquitous. Random bits everywhere; preferably every unused bit is random.

Disk formatters should finish with random data, all files should be randomized when deleted, and all CD-ROMs should fill up to the last unused sector with random bytes.

Communications protocols should have a random data channel… and so on.
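For deleted files, at least, existing tools get you part of the way there already. A small sketch (shred is in GNU coreutils; the filename is a placeholder):

    # overwrite the file with random data once, then unlink it
    shred --iterations=1 --remove secret.txt

That still leaves journals, copies in slack space and so on, which is why doing it at the formatter and filesystem level, as suggested above, would be better.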

Until that happens, we are not safe.

I guess we also need USB memory keys with a CPU. This would give us some new possibilities.

Zack Hurst June 5, 2006 5:47 PM

I wrote a program a couple years ago in college that has a similar goal. It is similar to PGP Disk or Jetico’s BestCrypt except that it uses steganography to hide the encrypted file system data in a series of Windows bitmap files. The software (which consists of a device driver and a GUI) must be installed on the system, but the bitmap files could be located on a USB drive. It should be impossible to prove the existence of the file system given just the USB drive with the bitmap files.

You can find my report and full source code at:

http://www.hurstisc.com/recent_projects.htm#secvol

Here is the abstract from my report:

The goal of this project is to create a computing environment that is more secure than that of any currently available product. It creates a volume in Windows (i.e. a drive letter) that the user interacts with like a normal hard drive. This feature is accomplished through the creation of a device driver. The security is implemented in a two-step process. First, the driver encrypts (or decrypts) the data being written to (or read from) the drive. Second, it uses steganography to hide the encrypted data in a series of Windows bitmap files. Therefore, when the secure volume is mounted, the user has easy access to the data through the lettered drive, and the encryption and steganography are handled transparently by the device driver. When the secure volume is not mounted, all of the data for the drive is encrypted and hidden in a series of bitmap files. An attacker viewing the system would not be aware that any sensitive data exists on the machine.

dan July 22, 2006 5:23 PM

Okay, Try this.

I currently use DSL (Damn Small Linux), a 50MB micro live CD (the size of a credit card) that provides a basic functional desktop environment in X Windows. I then throw a file onto my MP3 player: a 50MB file with an inconspicuous name like firmware.bin. The firmware.bin file is an encrypted file system using 256-bit AES and a long passphrase. I can "not know" about the firmware.bin file, as it's supposedly part of the back end of the hardware; DSL comes stock with AES in the kernel, and the whole system can run out of RAM if the PC has 128MB. To minimize writeback I can copy the firmware.bin file to the ramdisk /tmp before I mount it as a loop device.
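Roughly this sequence, assuming a cryptoloop-style kernel; the device node, mount points and cipher name are guesses, and the exact losetup/mount options depend on the kernel's loop crypto support:

    cp /mnt/mp3/firmware.bin /tmp/firmware.bin    # work from the RAM disk, not from the player
    losetup -e aes /dev/loop1 /tmp/firmware.bin   # prompts for the passphrase
    mount /dev/loop1 /mnt/secret
    # ... use the files ...
    umount /mnt/secret
    losetup -d /dev/loop1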

The whole thing could be automated with another tiny encrypted filesystem on the MP3 player (settings.bin?) which holds any needed boot/load script as well as a strong (say, 4K bytes of random bits) password file for the primary partition.

The DSL boot CD and the MP3 player each have an obvious, discrete use.
A half-dozen xterm commands and a brief plug-in of the player give you full access to the files, or let you reload the player with changes.

The encrypted filesystems are effectively random digital data, and would have to be isolated from the functioning of the device in order to become suspect.

I don't see any way to have security on a system without controlling the OS, so just plugging something into a running machine seems like a bad idea for anybody with rubber-hose problems. A bootable USB stick, maybe, but apart from "noisy" areas of the disk which hold the partition, I see no way to keep the data deniable. DSL has a script to install onto a bootable USB stick, so I guess that using a minimal, non-record-keeping file system and a little bit of code (or judicious use of dd), one could keep some data in a noisy block on the back end of the device.

Even with this, hardware keyloggers are a problem no matter which way you slice it.

wdef January 16, 2007 8:53 AM

OK this thread is very old, but it still shows on google.

RE loop-aes on Damnsmalllinux: as I’ve posted several times on the dsl forum, DO NOT use the loop-aes module in Damnsmall – it’s too old.

As the loop-aes author has made clear more times than I can remember, version 1.x of loop-aes is broken ie vulnerable.

Knoppix or DSL-N have loop-aes v3.x – use that until a recent loop-aes driver becomes available for Damnsmall.

the_date February 17, 2007 1:28 AM

FYI, et al… to my knowledge no one is working on rubberhose proper.
the newest release in my mirror [made before rubberhose.org went squatter] has the following properties:

rubberhose-0.8.3.tar.gz 544664 985786087 20010328T082807
MD5 2623c11cc49ad09f99d4bb5e2071d050
SHA1 6fd657aa8aec239bc248e3f6e169ad1f2f0ef57c

perhaps the author/maintainer/archivist would care to comment on its status?

thanks.

RXppZ2h0 August 21, 2008 7:53 PM

Keep your TrueCrypt volume on a FLASH RAM DRIVE!

If your computer falls into the wrong hands they could claim that that volume contains an extra layer of encryption, use that against you by saying you did not give them the right password, and hold you in contempt.
Want to really get rid of it quick?
Hold your TrueCrypt volume on a thumbdrive.
(Hear loud bangs at the door?)
POP IT IN THE MICROWAVE OVEN AND GIVE IT A DOSE OF 1000 WATTS AT 2.4 GHZ.
It works, TRUST ME.

Anti NWO since 2005
