TrueCrypt Security Audit Completed

The security audit of the TrueCrypt code has been completed (see here for the first phase of the audit), and the results are good. Some issues were found, but nothing major.

From Matthew Green, who is leading the project:

The TL;DR is that based on this audit, Truecrypt appears to be a relatively well-designed piece of crypto software. The NCC audit found no evidence of deliberate backdoors, or any severe design flaws that will make the software insecure in most instances.

That doesn't mean Truecrypt is perfect. The auditors did find a few glitches and some incautious programming -- leading to a couple of issues that could, in the right circumstances, cause Truecrypt to give less assurance than we'd like it to.

Nothing that would make me not use the program, though.

Slashdot thread.

Posted on April 3, 2015 at 1:14 PM • 79 Comments


Anonymous 1 • April 3, 2015 1:27 PM

Who cares about TrueCrypt when there are real open source/free software disk encryption programs out there which would be more deserving of an audit?

Anonymous 2 • April 3, 2015 1:48 PM

People who already have computers with TrueCrypt installed care. Though I would not use TrueCrypt on a new computer.

NSAI • April 3, 2015 2:17 PM

"Even software designed specifically for security is never foolproof as recent gigantic data breaches have proven. Hardware key storage, on the other hand, is designed with one important and fundamental purpose: to protect the stored secret key in tamper-proof hardware that employs an array of sophisticated countermeasures against attack. The bottom line is that with hardware key storage, attackers cannot see what is inside the key storage device’s hardened hardware barriers, which matters because attackers cannot attack what they cannot see. It is that simple."

And we have a lot of fools and proof. The insecure stuff has undergone a complex audit proving it is a soft target.

Quod • April 3, 2015 2:21 PM


People who use one of the Truecrypt forks (e.g. Veracrypt). Also, people who have used Truecrypt in the past and want to know whether or not their data could have been compromised.

Anonymous 3 • April 3, 2015 2:49 PM

'Anonymous 1' - TrueCrypt has stood the test of time. Apart from needing a few improvements, such as a higher iteration count, it's a very solid, stable and robust piece of software. Newer forks (and alternatives) may be less reliable as they've received less scrutiny. Until it stops working (e.g. due to operating system incompatibilities), there's no reason it shouldn't continue to be used.
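For context, the "iteration count" issue is about key derivation: TrueCrypt derives header keys with PBKDF2 at iteration counts on the order of 1,000-2,000, which is low by modern standards. A minimal sketch of the difference using Python's standard library (the password, fixed salt, and iteration counts here are illustrative only, not TrueCrypt's actual header format):

```python
import hashlib

password = b"correct horse battery staple"
salt = b"\x00" * 64  # illustrative; a real header stores 64 random salt bytes

# TrueCrypt-era counts were on the order of 1000-2000 iterations; modern
# guidance is orders of magnitude higher, making brute force far slower.
weak_key = hashlib.pbkdf2_hmac("sha512", password, salt, 1000, dklen=64)
strong_key = hashlib.pbkdf2_hmac("sha512", password, salt, 500_000, dklen=64)

print(weak_key.hex()[:16])
print(strong_key.hex()[:16])
```

Both calls produce a valid 64-byte key; the only difference an attacker sees is that each password guess costs roughly 500x more work at the higher count.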

'Pinatex': There have been a few published cases where LUKS can be entirely compromised.

For the Windows platform DiskCryptor is highly regarded and is a lightweight piece of software that supports multiple and cascading algorithms - it's also FOSS. This has been around for a long time but has a very small development team (possibly only one man).

PGP Disk Encryption is used in industry, but now that it's owned by Symantec I don't know if they still publish the source code. Even so, it's not free.

Airhead • April 3, 2015 4:00 PM

Given the first-out-of-the-blocks comments by Anonymous1 & Anonymous2, Truecrypt must be good enough for certain parties to want to restrict its use as much as possible by trashing it (trolling) online.

The problem with LUKS is that if you want to do FDE using the system's installer at install time in, say, Mint, you are restricted to the AES defaults. It seems to be technically possible to partition the disk using LUKS + LVM with the options of your choice and then to do the install on top of that--but it is, to say the least, confusingly complicated, and no one has ever published a straightforward cookbook for how to do it. Wouldn't it be relatively easy for the maintainers of the system installer to add a drop-down menu that lets you choose the encryption options you want, the way TrueCrypt does? (Hint: in such a case they could even allow an option to dispense with swap for those with adequate RAM.)

Who is responsible for locking the vast majority of LUKS - LVM users into the particular defaults by not giving them easy alternatives?

metaschima • April 3, 2015 4:11 PM

Although TrueCrypt is no longer developed, VeraCrypt is a fork that is being actively developed. They are already trying to address the issues found by the audit.

AFAIK, TrueCrypt was the only open-source, multi-platform encryption software that supported encrypted volumes and hidden volumes. GnuPG is also open source and multi-platform, but it doesn't support volumes, and is more geared towards public-key cryptography.

Gweihir • April 3, 2015 4:15 PM


LVM and LUKS are separate projects. LUKS is not aimed at FDE; it is aimed at partition encryption. Sure, it can be used for encrypting a full disk, but then you need LVM to get partitioning again (with all the problems that brings), and you have to use an encryption method that the initrd can handle. Restricting you to the defaults is pretty clearly a limitation of the Mint initrd, and not any limitation of LUKS.

On the other hand, Full Disk Encryption rarely is full disk encryption, and it is not for Mint either, nor for TrueCrypt for that matter. There is still an initial boot-loader, and that is basically just as easy to attack as a full kernel+initrd setup. It is a bit harder to attack than a kernel+root partition setup, but not much. That is why on Linux I use LUKS on the data partitions, and on Windows (where I do not trust the MS-supplied crypto) I use TrueCrypt for the Windows system partition, as it doubles in many senses as a data partition, unlike what you can do on Linux.

But in the end, if a reasonably competent attacker has access to your hardware several times, you are screwed anyways and no amount of disk encryption will help. The scenarios where disk encryption is useful assume that you notice when an attacker has had access once (laptop stolen). An attacker with access several times can, for example, just install a hardware keylogger. With today's microcontrollers I could build one in a weekend and miniaturize it with a week of time or so. And that is at advanced amateur level, not professional level. No way to detect it unless you regularly strip down your keyboard or laptop.

Martin Walsh • April 3, 2015 4:17 PM

It's a dinosaur. It's a clunky app and people who like it seem to have no alternative. The whole Matthew Green expedition strikes me as frivolous...nickel and dime contributions, for what? So some backwoods group can keep living in the 90's? I guess it's all some people know.

Martin Walsh • April 3, 2015 4:26 PM

The first thing I asked myself, years ago, was why so many people are using TC when they don't know who wrote it or anything about it. Really dumb. But now! !! Ooo..Aahhhh..let's do a security audit to kinda undo our stupidity. Then the word comes out - everything is OK, you guys can come out now. So-and-so checked it out and they think it's OK. Where the hell were they before? This is the kind of nutzo "expertise" that continues to dog information security. And just about the time I'm ready to gag, here come the TC fanboys.

65535 • April 3, 2015 5:07 PM

This audit is good news and somewhat of a relief. It shows that open-source projects work and can be audited.

But there are some concerns. Section 1.3, Findings and Summary [page 7], indicates some problems relating to the random number generator [but what's new]:

“The most severe finding relates to the use of the Windows API to generate random numbers for master encryption key material among other things. While CS believes calls will succeed in all normal scenarios, at least one unusual scenario would cause the calls to fail and rely on poor sources of entropy… Additionally, CS identified that volume header decryption relies on improper integrity checks to detect tampering, and that the method of mixing the entropy of key files was not cryptographically sound. Finally, CS identified several included AES implementations that may be vulnerable to cache-timing attacks. The most straight forward way to exploit this would be using native code, potentially delivered through NaCl in chrome… the simplest method of exploitation through that attack vector was recently closed off.[2]”

“Note [2] Specifically, removing access to the CLFUSH instruction as part of the Rowhammer mitigation.”

I have some mixed feelings about the above statement. The code is certainly fixable, but the randomness of the key material is at the heart of cryptographic integrity.
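The keyfile finding quoted above is the easiest one to picture. TrueCrypt folds keyfile contents into the password with a CRC-32-based construction, which the auditors flagged as not cryptographically sound; any collision-resistant hash makes the mixing sound. A hedged sketch of one sound approach (the function name and inputs are invented for illustration, not TrueCrypt's actual code):

```python
import hashlib

def mix_keyfile(password: bytes, keyfile_bytes: bytes) -> bytes:
    # Length-prefix each input so (b"ab", b"c") and (b"a", b"bc") cannot
    # collide, then run both through SHA-512. Unlike a CRC-32 fold, an
    # attacker cannot craft a keyfile that cancels out the password's
    # contribution to the result.
    h = hashlib.sha512()
    h.update(len(password).to_bytes(8, "big") + password)
    h.update(len(keyfile_bytes).to_bytes(8, "big") + keyfile_bytes)
    return h.digest()

combined = mix_keyfile(b"hunter2", b"\x13\x37" * 32)
print(combined.hex()[:16])
```

The output would then feed into the (slow) header-key derivation as usual; the point is only that every bit of both inputs influences the result unpredictably.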

I do suspect that the rumors of the project drying up were indeed correct. The death [end of life] of XP and the arrival of BitLocker in Vista and Windows 7 removed a huge portion of the user pool.

Bruce is right. You can still use TrueCrypt with fairly high confidence. That's pretty good for a free open-source program.

Ben Hutchings • April 3, 2015 5:15 PM


Mint is a Debian derivative and as such it uses the modular initramfs-tools to build the initrd. This means that the cryptsetup package that provides LUKS support in the full system also provides LUKS support in the initramfs, and any encryption mode supported in the full system should also be supported then.

I haven't looked at the Ubuntu/Mint installer, but the Debian installer allows choosing aes, blowfish, serpent or twofish, and I would expect LMDE to do the same.

(I am a maintainer of initramfs-tools, but not cryptsetup.)

Anonymous 1 • April 3, 2015 5:25 PM

metaschima and 65535: TrueCrypt was never open source, please stop calling it that (the OSI never approved the licence and stated that there were outstanding issues with it).

Braben • April 3, 2015 5:27 PM

As has been pointed out, there are a number of options for full disk encryption. But I have yet to find an alternative for creating portable and mountable encrypted container files, which is what I primarily use Truecrypt for. The "mountable" part is important. While you can of course create encrypted archive files (e.g. zip/tar encrypted using GnuPG), you then have to extract the files to unencrypted storage while you work with them, which is not only less convenient, but also less secure.

Hugo • April 3, 2015 5:30 PM

@Pinatex: LUKS doesn't look like a TrueCrypt alternative. Where is the Windows and Mac version?

Airhead • April 3, 2015 5:47 PM

@Gweihir & Ben Hutchings

It is quite clear that forcing the crypto default is a feature of the Mint installer and not a feature of LUKS or cryptsetup.

If you do the disk partitioning and encryption setup (including LVM) separately from the install by using cryptsetup directly from a live disk, then you can use any supported encryption options you want. But it's not at all clear how to install Mint or Debian on top of the resulting disk setup. The problem is that the Mint system installer itself, and I had the impression also the Debian installer (but maybe I'm wrong), only allows you the option to encrypt using LUKS with or without LVM but without giving you any way to set the encryption options apart from the default AES etc. If the Debian installer is allowing selection of crypto options, it would be a pleasant surprise.

Also there's a bit of confusion in the Mint installer because they haven't removed the subsequent option to encrypt /home in the case that you've set up FDE using LUKS and LVM. The /home encryption option seems to be something they should suppress in the case you've set up FDE--unless I suppose there's some reason to have that kind of layered encryption (multi-user?)

Yes it is true that evil maid attacks can circumvent crypto--as can not having your computer air-gapped. But that's a different problem from being locked into a set of crypto defaults.

As for having the boot partition unencrypted, one solution mooted by Arch Linux (and probably others) is to put your boot partition on a usb stick you keep in your pocket.

Again, when I read Martin Walsh's comments, I wonder--who is so anxious to discredit TrueCrypt so that people don't use it? The animosity in such comments sounds to me like an endorsement of TrueCrypt.

Alex L • April 3, 2015 7:02 PM

You're being too hard on TrueCrypt. There are still a lot of lessons to be learned from this audit, regardless of whether you're actively forking it. Diehards can do better than TrueCrypt, no question, but it's still probably the best-known of its breed, and lots of users who aren't fully up to date find it useful.

metaschima • April 3, 2015 7:21 PM

@Anonymous 1
Yeah, you're right, it's not really open source unfortunately, but at least you can view the source and compile it yourself. I personally would never use/trust closed-source crypto software.

Thoth • April 3, 2015 7:44 PM

Regarding hardware cryptography and key management, I still tread very carefully, especially with companies in NSA/GCHQ/EU jurisdiction, namely Atmel, Harris, Freescale and NXP, just to name a few.

Backdoors, deliberately botched fabrication processes on the lines making the chips, and the seemingly lax guidelines for FIPS and CC EAL validations (and probably EMVCo as well) have seen many cryptographic/security processors broken not just by State Actors and Warhawks but also defeated by academics and even homebrew hacking.

The more severe part of modern chip-based security is the backdoors. There are many ways to make the keys leak, but that's too much detail for a post.

Cryptography is a tough business as it combines technical, economic and political elements, which is the worst mix (money, guns, machines ...).

LUKS, TrueCrypt ... whatever encryption software you name it, they are all vulnerable. Previous posts by me, Nick P, Clive Robinson, Wael and RobertT have brought up a bunch of ideas and discussions on security that you can search using the search bar on Bruce's blogs.

The reason we are in this state of not having a properly secured environment would have to be the Nations (National Security means no hiding from State Control, thus limited or no encryption) and also the default options, where people don't like the hassle of security and blindly follow the State Control mechanisms without questioning.

We have come to a state where secure computing might have been so deeply poisoned that whichever crypto or security mechanism you use, you are still vulnerable, be it hardware or software. There have been mentions of rebuilding computers from elementary blocks, and that's just how untrustworthy our computers are, both hardware and software.

Anonymous 4 • April 3, 2015 8:10 PM

"TrueCrypt was never open source"

"Open source" meaning you can obtain and read the source code, so yes it was.

If TrueCrypt is generally robust, then why did the dev team disband in the spectacular way that they did? This still leaves more questions than answers.

Edward • April 3, 2015 9:40 PM


What are some options you would find better compared to the defaults?

Is there a simple GUI tool to format an LVM, maybe using a LiveCD, using custom settings?

(I understand that the latter might be overkill but a GUI is always nice to have)

airhead • April 3, 2015 10:00 PM


One of the possibly state-sponsored professional virus-hacks (forget which one) used Twofish encryption in its internet communications with its controlling server, and after a fashion that's a serious recommendation. So for those who are uneasy about AES, the possibility to select Serpent or Twofish or any of the other cryptsetup algorithm options would be useful. Also there's XTS vs CBC and so on. It would also be nice to be able to select another pseudo-random number generator instead of the default for the salt. There are some other options which I don't have at my fingertips right now--surely a crypto expert who follows this blog would be able to comment on the options supported by cryptsetup. My understanding is that the Mint (and possibly Debian) installer is passing parameters to cryptsetup, so it's a matter of being able to pass any of the supported parameters rather than just the defaults. This seems very easy to code into the installer via a drop-down menu.

As for using a simple GUI to format LVM and/or LUKS--AFAIK no, there isn't, and that's part of the problem. You have to have a fairly high level of expertise to do the disk setup, and then, as I pointed out, it's not at all obvious how to do the system install on top of that.

Nick P • April 3, 2015 11:30 PM

It was a piece of software designed by a team that wanted usable, portable, secure encryption for files and disks. It did that job. Its source code was available for review or compilation the entire time. If anything, people gave them too much crap for being anonymous. I remember only a few decisions they made that made me suspicious, more of quality-impairing stubbornness than malice though. So, it's good to see a review finding very few serious issues.

Then I look at the comments. The Slashdot comments are more useless than usual. The ones here are surprising: more trolls and ignorant rebukes than usual. I guess I'll waste one comment on their points.

@ Anonymous 1

Barely usable, platform limited, FOSS software that's had less security review than Truecrypt? How about we just use and improve the proven thing's own open source code? They required very little for that privilege. Basically just taking the name off it from what I recall. Whatever it was hasn't stopped many forks. Building on what's already been vetted and battle-tested usually has better results in INFOSEC than newer, amateur projects.


Gigantic data breaches almost always occur in companies with poor security practices leveraging systems that aren't designed to be secure. The international (Common) Criteria for system engineering processes says only EAL6-7 even begins to be secure. Most of the market, *especially* the security appliance market, is EAL4 or below. That rating is for "casual or accidental attempts to breach security." As in, the attackers have no sophistication or determination.

*Of course* leveraging such technology on the hostile Internet resulted in many breaches. One must use the stuff that's actually designed for security if they want a chance at surviving increasingly sophisticated attackers. Unfortunately, the market rarely bought it, many of those companies tanked, and now it's mostly done by a few companies for defense sector at enormous unit prices.

@ Anonymous 3

"'Anonymous 1' - TrueCrypt has stood the test of time. Apart from a few improvements such as the iteration count it's a very solid, stable and robust piece of software. "


@ Gweihir

Good points. It's why my own designs for disk encryption started with what high security was doing. They must have noticed the same stuff you did because NSA's solution looked nothing like those they encouraged market to buy. These days it's extra clear why. ;)

@ Martin Walsh

"It's a dinosaur."

So are those IBM mainframes... that have gone 30 years without downtime in some cases. Why don't people use more modern stuff with the extra vulnerabilities, downtime, and risks nobody saw coming? What's wrong with these people who use older stuff that works with little risk? Your questions are all too common from a modern crowd. We'll understand when even third world hackers start sending spam from your box, you lose access to something important, or your vendor cancels functionality you depend on.

"It's a clunky app and people who like it seem to have no alternative."

We have alternatives. It's better than them. So, we use it. An even better app might be published with source and similar vetting by Monday. We'd consider moving our data into it. Meanwhile, we use what works even in the face of NSA's cryptanalysts. That it gives *them* headaches is all I personally need to know to use it.

@ Braben

I used it for the same reasons. I had to create the volumes in Windows to use FAT or NTFS for best portability. Yet, I could throw a bunch of files in an encrypted volume about as easily as doing a zip file. Then, I can move the volume and keys around in so many ways. Once online filesharing sites popped up, I just uploaded truecrypt volumes to them, emailed people the link, and gave them the password out of band. I even used them to securely overwrite drives while leaving people wondering which might have interesting stuff and which were chaff. Fun, fun program.

@ Anonymous 4

"If TrueCrypt is generally robust, then why did the dev team disband in the spectacular way that they did? This still leaves more questions than answers."

That's a good point. Yet, I repeat my mantra on this: verify the published result of a development process instead of worrying whether the authors were malicious. Old A1 (now EAL7) development processes assumed developers might be malicious. They had to produce a system and documentation in a way that was easy to vet. So Green et al's approach of digging into the source to expose or vet it was the proper approach. Even better are designs made with verification in mind: simple constructions; modularity; minimal looping; minimal pointer manipulation; straightforward correspondence between requirements, features, abstract design, and concrete design. Add plenty of testing, static analysis, and so on.

Unfortunately, nobody outside NSA and some defense contractors has done a system that way. The rare few in the commercial sector that did were mostly smartcard ICs. They're far from free and typically not open source. So, auditing and using TrueCrypt until FOSS produces a high-assurance alternative was the best option despite some oddities with them.

Do remember, though, that many of us in INFOSEC community live in police states that actively target us. Expect some to behave erratically, backdoor products due to coercion, shutdown due to coercion or ethics, disappear in a variety of ways, and so on. It's the nature of INFOSEC in police states sadly. Knowing this, I'm going to continue to judge INFOSEC participants by the quality of what they publish or code they submit.

Anonymous 1 • April 4, 2015 12:02 AM

Anonymous 4: I suggest you read up on the definition of open source, you need to be able to do more than merely get and read the source code for something to be open source.

TrueCrypt were improving their licence and had they not stopped may have eventually released a licence that was Open Source (and Free Software).

Nick P:

"FOSS software that's had less security review than Truecrypt?"

Then why not give the programs that don't have licensing issues the audit instead of wasting time on TrueCrypt?

Nick P: All the forks that I've found are still under the TrueCrypt licence as it does not appear to allow it to be re-licenced (even the forks that list another licence still claim to have code under the TrueCrypt licence).

Edward • April 4, 2015 12:21 AM


My idea would be that one could make an encrypted LVM, and then start up some linux installation, select the encrypted LVM as target, provide the password and proceed with the partitioning inside the LVM. At least with some distros that should work (distros that allow installation in an existing LVM).

Ideally, this formatting would be done using a modified Gnome Disks, but maybe a simplified GUI could provide menus and options and cough up a command line that can be pasted in a terminal window.

Nick P • April 4, 2015 12:40 AM

@ Anonymous 1

"I suggest you read up on the definition of open source"

I suggest you read up on the definition of definitions. It might explain to you why *your* definition of "open source" disqualifies projects whose source is open.

"All the forks that I've found are still under the TrueCrypt licence as it does not appear to allow it to be re-licenced (even the forks that list another licence still claim to have code under the TrueCrypt licence)."

Clever misrepresentation. You post my worry about licensing without the comments countering such worries successfully. You would've seen them. Further, you're probably aware of the number of lawsuits by the Truecrypt team (0) trying to prevent the distribution of software based on their work. You might also be aware of the Snowden leaks that said things along the lines of "Thank God they're using Truecrypt instead of all the FOSS and proprietary stuff out there. We now know everything about them." (read: sarcasm)

Great posturing, Anonymous 1. Police states across the world appreciate your actions. You should get the U.S. Medal of Tyranny. I nominate you myself as representing all the qualities our leadership looks for.

Gerard van Vooren • April 4, 2015 1:32 AM

This is great news. I thought the audit had died in silence, but it turned out to be the contrary. And no intentional back doors or severe weaknesses found. The cabal that designed TrueCrypt can be proud. That the audit was funded by the crowd only makes it better. I leave the nagging for later ;-)

Figureitout • April 4, 2015 2:03 AM

First off, Truecrypt is in a league of its own in terms of encryption programs. No "true cryptographers" could actually implement it themselves; excellent programmers are needed. It has features and usability that are again and again ranted and raved about--"we need this, we need that"--but few to no one can actually deliver. It's *solid* as in the failsafes work well, and of course it has support in all the major OSs. It can essentially "beat" OS loading on the major PC architectures; that's just damn impressive to me (you could probably base some nasty attacks on that feature alone, or just tweak it to lock your PC).

All that being said, this is a report done by...3 people (not including Matt Green). I'm just not convinced, w/ its code base and features, that there isn't some killer exploit lurking in's just too good to be true. The f*cking developers that worked on this for years and years knew the weak spots, and they should've noted them in comments instead of leaving everyone hanging w/ some botched warrant canary.

Jacob • April 4, 2015 2:40 AM

The auditors explicitly stated that due to time constraints they covered just a subset of the code.
In my opinion that leaves the question of a possible subversion still open.

For example, they did not delve into the asm code which is security related.

Although they mentioned that "The header volume format and protection schemes were evaluated for design and implementation flaws that could allow an attacker to recover data, execute malicious code, or otherwise compromise the security of the system" they did not address the issue, as reported by prior Truecrypt audits, of the unexplained data in the headers of the WIN program.

In addition, the auditors were explicit by saying that a future study should be carried out looking at the effects of using the program on disks with different sector size - something that becomes acutely relevant with today's disk sizes.

Overall, they raised the "feel relatively secure" level by another notch, but we are not there yet to fully trust the code.

Airhead • April 4, 2015 3:59 AM


Well, something along the lines of what you suggest would be good. The problem is that it is not at all clear which "user-friendly" distro has an installer that would allow you to do the install into the encrypted LVM once you've created it using the live disk. For example, how would you provide the password to the encrypted LVM so that the partitions would be open to the installer? If anyone can explain this, I would be much obliged.

Edward • April 4, 2015 4:49 AM


I think most Mandriva/Fedora derivatives allow this, using an existing encrypted LVM for installation.

bruce mangee • April 4, 2015 5:30 AM

Truecrypt is software you can train your own mother to use. Put away the geek stuff and it comes down to usability. Remember the complex passwords we gave users in the early '90s? They were written down because nobody could remember them. Same thing here: we can only make things more robust if normal people are able to use encryption software. Alternatives to Truecrypt are still rare.

Martin Walsh • April 4, 2015 11:34 AM

@ Nick P
Another thing I find disturbing is why anyone should have to depend in the first place on an application written by some guys hiding in a basement somewhere in, I guess, Eastern Europe.

Notice how much easier it is to just look over code and then say "I don't see anything wrong" than it is to actually create a robust and beautiful application. I'm not fooled by group evaluations - they're nothing more than schemes to distribute and redirect responsibility. Do you know that a really good piece of publicly available, FREE, openly evaluated and tested encryption software could have been designed and built by now? Know why these guys didn't do it? Because they can't. The TC audit reminds me of Cuba.. decades of failure and all they can do to this day is fix up old junk cars. It's pathetic.

Ted • April 4, 2015 12:18 PM

I don't use Truecrypt. Or any disk encryption software. Because I am lucky enough to live in a country that is not a police state, I work at a job and in an industry that the government likes, I have no medical issues that are helped by smoking marijuana, and I own dogs and have long ago learned how to adjust my lifestyle so that my neighbors are easier targets for the common garden variety thieves to steal from.

But, I have the utmost respect for what developers like the ones who wrote Truecrypt are trying to do.

There are things in this world that if you deal with them, you are a target. If your passion is to reveal government law-breaking at the state level then you better be squeaky clean yourself and use products like truecrypt. If your passion is to reveal government law-breaking at the federal level then you might as well not bother with encryption, your only hope of protection is true obscurity - using insecure wifi networks and multiple sets of dirty & clean computers that are airgapped, and an early warning system and escape plan worked out - myself, I'd have an ocean-going vessel registered under an assumed name somewhere.

Because, the only really safe way to handle it is to assume that anything you see, someone else knows you have seen, anything you write, the moment you give it to someone else you are leaving a trail that can be traced back to you, anything you say can be overheard by someone, and that if you have taken precautions, and law enforcement cannot pin anything on you, eventually something will be planted on you and they will get you with a false accusation.

There was a case just a few days ago where a police dash cam caught another cop planting drug evidence, and the defense lawyer was smart enough to pull the dashcam video; the police department's censors screwed up, overlooked the evidence plant, and handed out unredacted video. Now the whole thing has blown up. It happens.

Scratch the surface of even what you feel is the most free and open society on Earth, I don't care what country it is, and you will find uglinesses that some people would kill to keep covered up.

If Truecrypt and software like it has saved a few lives, even though it may have some theoretical vulnerabilities, then that is more than any of its detractors can ever say about their own lives.

Rick • April 4, 2015 1:11 PM

Does this news change (or maybe further support) anyone's opinion of why the TC development team stopped updating the software and took their web site offline?

I'm curious as to others' opinions.

My humble opinion is that the TC team was 1) approached by the NSA/FBI/TLA du jour to either subvert the software or "pay a price for rebellion", or 2) as Steve Gibson hazards to guess, they simply tired of the effort.

Airhead • April 4, 2015 2:03 PM


The article referenced by Anonymous3 outlines an attack against CBC (a method of chaining blocks as they are encrypted). The current default is XTS (which also happens to be the TrueCrypt default). So in this case the default value for LUKS when you ask for LUKS + LVM at system install time is the correct one. Is there an attack against XTS? I haven't heard of one. AFAIK, XTS is the best block cipher mode currently available for disk encryption.
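The practical difference is malleability: in CBC, flipping one bit in ciphertext block i garbles block i but flips exactly the same bit in plaintext block i+1, with no key needed, whereas XTS confines tampering to garbling one whole block unpredictably. A toy demonstration of the CBC property, using a throwaway hash-based Feistel cipher (NOT AES, purely illustrative; the key, IV, and messages are invented):

```python
import hashlib

def _xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def _f(key, rnd, half):
    # Toy round function: hash of key, round number, and right half.
    return hashlib.sha256(key + bytes([rnd]) + half).digest()[:8]

def enc_block(key, block):           # 16-byte block, 4 Feistel rounds
    l, r = block[:8], block[8:]
    for rnd in range(4):
        l, r = r, _xor(l, _f(key, rnd, r))
    return l + r

def dec_block(key, block):           # inverse: rounds in reverse order
    l, r = block[:8], block[8:]
    for rnd in reversed(range(4)):
        l, r = _xor(r, _f(key, rnd, l)), l
    return l + r

def cbc_encrypt(key, iv, blocks):
    out, prev = [], iv
    for p in blocks:
        prev = enc_block(key, _xor(p, prev))
        out.append(prev)
    return out

def cbc_decrypt(key, iv, blocks):
    out, prev = [], iv
    for c in blocks:
        out.append(_xor(dec_block(key, c), prev))
        prev = c
    return out

key, iv = b"toy-key", b"\x00" * 16
pt = [b"pay Alice $00100", b"ref: invoice 007"]
ct = cbc_encrypt(key, iv, pt)

# Flip one bit in ciphertext block 0: block 0 decrypts to garbage, but the
# SAME bit flips predictably in plaintext block 1 -- no key required.
tampered = [bytes([ct[0][0] ^ 0x01]) + ct[0][1:], ct[1]]
dec = cbc_decrypt(key, iv, tampered)
print(dec[1][0] ^ pt[1][0])  # -> 1
```

Neither CBC nor XTS authenticates the data (disk encryption generally can't afford per-sector MACs), but XTS removes this kind of surgical, predictable edit.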

AirheadApril 4, 2015 2:09 PM

@ All

One quirk of LUKS compared to Truecrypt at FDE level is this. Truecrypt will encrypt ALL of the partition, making it impossible to know where the data is. LUKS requires a separate special step to zero-out the partition before installation of LUKS encryption because it does NOT do full encryption of the unused space--only on-the-fly encryption of data as it's written.

The question which arises is: what are the ramifications of this LUKS approach when you're using an SSD, with or without TRIM?
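To make the zero-fill point concrete: writing zeros *through* the crypto layer stores random-looking bytes on disk, which is exactly why the extra step matters. A toy sketch below; the SHA-256 counter keystream is only a stand-in for dm-crypt's real AES-XTS, purely to illustrate the distinguishability issue:

```python
# Why LUKS users should fill the mapped device before use: zeros written
# through the crypto layer come out looking random on disk, matching
# TrueCrypt's behaviour of randomising the whole partition up front.
# (SHA-256 counter keystream below is a stand-in for dm-crypt's AES-XTS,
# not the real on-disk format.)
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    out = bytearray()
    for block in range(0, len(data), 32):
        ks = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block:block + 32]
        out.extend(b ^ k for b, k in zip(chunk, ks))
    return bytes(out)

unused_raw = bytes(1 << 16)                          # raw free space: all zeros
unused_enc = keystream_xor(b"disk key", unused_raw)  # same space, seen through the crypto layer

print(unused_raw.count(0))  # 65536 -- trivially identifiable as empty
print(unused_enc.count(0))  # a few hundred at most -- looks like any other ciphertext
```

With an SSD the wrinkle is that TRIM can re-expose which blocks are unused, defeating the fill; that part depends on firmware, not on the crypto.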

David HendersonApril 4, 2015 2:33 PM

Apple has had trouble updating its systems and keeping them reliable, especially when new Cloud technology is added to an existing system.

Mainly to avoid these glitches and instability, but secondarily to ensure my own privacy, I have begun a transition from OSX to Debian Linux. I'm pretty far along: I have migrated nearly all the files I use and found compatible tools using Debian 7.8.0. I used a shared zfs file system capable of being mounted on both Linux and OSX to seamlessly run in parallel for a while. I find myself running OSX less frequently these days.

I used to think I wanted truecrypt running on Linux. I then changed my mind and thought that I wanted the Debian tcplay package. tcplay is a reimplementation of truecrypt; it's also been ported to FreeBSD. tcplay replaced truecrypt on Torproject's TAILS system.

I have recently settled into simply using a LUKS-encrypted LVM partitioning created by the Debian installer. I used the default AES encryption for simplicity. I may have made a mistake in choosing the default; if so, I can use a different encryption system. I don't particularly need all the features of truecrypt - I just want to keep sensitive data out of the hands of others should my laptop be lost or stolen.

Ben HutchingsApril 4, 2015 5:45 PM

@Airhead: The Debian installer zeroes any LUKS encrypted volume by default, though it is possible to skip this.

AirheadApril 4, 2015 6:21 PM

@Ben Hutchings

What I was trying to point out was that in LUKS it's not as automatic as in TrueCrypt and that by implication you shouldn't skip that step.

MarkHApril 4, 2015 6:46 PM

What a laugh, reading the "theology" in many of the comments. Well, we obviously draw the line between skepticism and paranoia in different places.

I suppose that I can shout "everything out there is totally insecure" and seem like an all-knowing security expert ... unless I have the responsibility to advise people about what they can do to enhance their security in the Real World (TM).

My personal views:

1. Truecrypt appears to be really well done from a security standpoint ... some folks who have studied these kinds of problems very expertly and deeply have decided to place trust in it ... the audit results are consistent with it being a good tool.

2. I'm not aware of any alternative that can do all the things Truecrypt does, with an equal basis for confidence in its security. I think all of us are worried about backdoors by now, but experience so far has taught us that simple f*ckups are much more likely to trash security solutions than malevolent tampering.

3. It never bothered me much that the Truecrypt developer(s) wanted anonymity. I can well understand the reasons for this!

4. The most likely explanation for the sudden "shutdown" of Truecrypt development is the most prosaic one. As far as I can tell, there were probably never more than two or maybe three developers; at some point, there was likely just one remaining, who wanted/needed to go in other directions. [Note: if you haven't been responsible for the continuous upkeep of an intensively-used software package over a period of years, you don't understand what a burden it is.] The last man/woman standing might have acquired family responsibilities, the need to pursue income-generating career directions, etc. Life happens!

Again, if you haven't invested years of your life in a really carefully crafted piece of software (as several people who've studied the Truecrypt source say it to be), you will not understand the natural reluctance to turn it over to the hands of strangers. Announcing that "it's over" was a perfectly understandable human action, even if it doesn't make sense from a project or engineering perspective.

5. Yes, the license is peculiar (or, if you're paranoid, a wicked trick to trap the unwary). My reading of it would NOT discourage me from creating a modified/derived package for free-of-charge distribution. And yes, "open source" has lots of attached meanings nowadays, but the source for Truecrypt is indeed open for inspection, independent building of the executable, and modification. Whether it meets any particular criteria for "free software" is very separate from the question of its integrity.

6. @Ted: right on.

ThothApril 4, 2015 7:19 PM

It seems to be rather easy to criticize a good deal about Truecrypt.

1.) So what encryption software for the public would you propose to replace Truecrypt, along with its usability?

2.) If those crypto programs out there aren't good enough, what do you propose ?

Truecrypt is a venerable piece of software that has stood the test of time. Leaks have shown that Truecrypt has successfully protected its users (unless you want to point out that the leaks are fake). The fact is, Truecrypt laid down and died last year when its developers decided to drop a bombshell for whatever reason, and forks started springing up. Some people started to dig around in the history of the developers and criticize the project based on the people, and some of the more technical/scientific minded started to dig around the source code to find out more about the styles and motives of the developers.

Whatever the case, the loss of Truecrypt really rang a wake-up call in everyone's head. Events are events and we have to move on.

Here's a design for a low-assurance but easy-to-assemble hardware-based encryptor using COTS hardware security products, with the same idea as Truecrypt: easy, usable cryptography.

- The use of hardware security modules to store critical cryptographic and format codes not reliant on the host computer. Dedicated hardware chips like Smartcards, HSMs, TPMs can be used.

- Key storage to be hosted on the hardware security chip.

- Only a single cipher mode available (Serpent-Twofish-AES cascade).

- The security application should allow definition of at least 2 PINs (Normal User and Self-Destruct PINs). When the threshold number of tries has been reached, the chip would simply wipe all the keys from its keystore in the hardware chip and block the chip from use (in Smartcard terms, to block the Smartcard).

- Keys cannot leave the security confines of the hardware module unless wrapped by another wrapping key (FIPS 140-2 Level 3 - NonStrictFIPS). For higher security, have another hardware chip with the same security application (already initialized) and transfer the security world from one module to the other transparently, without user-defined PINs or keys, where both security worlds negotiate their own keys and transfer data securely (FIPS 140-2 Level 3 StrictFIPS). Those who have used the Thales nCipher HSMs will know what I am talking about regarding Security World migration.

- The use of a password as a second factor will be allowed. In essence you can plainly generate a master key in the security chip, or you can specify a password to be used together with a master key. If a password is used, the master key would execute encryption operations on a BCRYPT + SCRYPT stretched password to form the final master key. The point of using a password with a hardware master key is that in the event the hardware is captured, the surprise element would be the password, which has to be coaxed out of the user; the drawback is that you have to remember an additional password.

- Hidden Volumes may not be useful, as once suspicion of encrypted data has been raised, it is likely the user would be coerced into handing over the decryption keys, passwords and PINs. The most favourable action would be either to comply (and then get executed, which makes the executors happy) or to not comply and hand over the Self-Destruct PIN (and still get executed, but with the executors unhappy).

- Due to the algorithms and formats being processed within the security confines of a hardware module, the host machine is free to handle the GUI rendering and a better GUI can be thought up.

- If a USB token is used, the executable for the GUI can be packaged in the USB's flash memory whereas the Hardware Security chip within the USB board (usually a Smartcard chip) would be the security processor otherwise all a person needs to do is carry the hardware module and download or carry along a GUI program.

- The security application for the chip should be obfuscated before loading into the security chip as a recommendation to make life harder for hardware chip attackers.

- To prevent hardware loggers on a host computer, you would need to bring your own trusted computer running off probably an OpenBSD with Smartcard support to handle the hardware chip as a Smartcard.
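The password-plus-hardware-master-key derivation described above can be sketched roughly as follows. Here `hashlib.scrypt` stands in for the BCRYPT+SCRYPT stretch, and an HMAC keyed with a card-held master key stands in for the card's "encrypt the stretched password" step; all parameter values are illustrative, not a vetted recommendation:

```python
# Sketch of the password + hardware-master-key scheme (assumptions: scrypt
# as the stretch, HMAC-SHA256 as the card-side operation, toy parameters).
import hashlib, hmac

def stretch(password: bytes, salt: bytes) -> bytes:
    # Memory-hard stretch of the user's password (done on the host).
    return hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1,
                          maxmem=2**26, dklen=32)

def derive_final_key(card_master_key: bytes, stretched: bytes) -> bytes:
    # On the real device this step runs inside the smartcard, so the
    # master key never leaves tamper-resistant storage.
    return hmac.new(card_master_key, stretched, hashlib.sha256).digest()

stretched = stretch(b"correct horse battery staple", b"per-volume salt!")
final_key = derive_final_key(b"\x13" * 32, stretched)
print(len(final_key))  # 32-byte volume key
```

The point of the split: an attacker who captures only the card still needs the password, and one who phishes only the password still needs the card.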

P.S @Bruce Schneier/@Moderator, please start patrolling the blog comments :) .

Nick PApril 4, 2015 8:32 PM

@ Martin Walsh

Their location and identity are irrelevant. What matters is *what* they produce. People might hide in a basement because they're paranoid, reclusive, for medical reasons, or anything else. People developing crypto a while back also were often pro-privacy and libertarians with a vision of anonymous, untraceable, pure democracy backed by crypto. Very likely for them to use anonymity tools for creds or ideology. If your crypto is popular, you might also have a ton of people willing to use a $5 wrench to get rid of it.

"Notice how much easier it is to just look over code, then say "I don't see anything wrong," than it is to actually create a robust and beautiful application. I'm not fooled by group evaluations - they're nothing more than schemes to distribute and redirect responsibility. "

No, they're division of labor. This is common in markets and social structures. Even more common with human nature leaning toward laziness and selfishness. That's why even companies with developers didn't donate money or reviews. Yet, having people dedicated to reviews makes sense as they'll get more done for a given amount of money. Plus, you can always use more than one with different country, ideology, and so on. The alternative is that you don't use any electronic device or even FOSS because you have to trust someone else at some point. Not practical.

"Do you know what a really good piece of publicly available FREE and openly evaluated and tested encryption software application could have been designed and built by now? Know why these guys didn't do it? Because they can't. "

They can and did. It had better features and assurance than any other piece of free-as-in-beer software. Free-as-in-speech they chose against, to allow integration with non-free software. I can imagine better work, but they did better than the competition. The failure you are describing isn't the Truecrypt team's: it's the FOSS movement's failure to match them. Claiming extra potential and then failing to use it is quite common in that field, especially in high assurance systems.

"The TC audit reminds me of Cuba.. decades of failure and all they can do to this day is fix up old junk cars. It's pathetic."

It reminds me of when you pay mechanics to spend some time on a used car to verify that it works, is in good shape, and had maintenance done on it. A new car would be pretty nice. Many can't afford it so they pick among what they can get their hands on. The smarter ones have a qualified party look at a number of them to find the best one for the money. Whole thing buys them a result that, even with vetting, is still a fraction of the cost of a new car.

@ MarkH

Good points, esp on benevolent shutdown possibilities.

@ Thoth

You basically want this or this from a more trustworthy vendor. A combo of processor, I/O, trng, crypto acceleration, and firmware protection. The closest things in market are pretty much all U.S.-based.

ThothApril 4, 2015 8:47 PM

@Nick P
Most of the major spying occurs in the West. On one hand they (the Western governments) want privacy, and on the other they want to know everything under the Sun. The better option is to look into TPM chips from foundries located in non-Western countries, but as we all know, those governments are doing the same spying as any other, so it comes down to who you are protecting against. I believe the current threat model is protection from Western warhawks, although Asian warhawks are pretty troublesome to handle too.

65535April 4, 2015 9:34 PM

@ Airhead [No one answered you – I’ll take a shot at it - and ymmv]

“…LUKS requires a separate special step to zero-out the partition before installation of LUKS encryption… what are the ramifications of this LUKS approach when you're using an SSD with or without TRIM?” –Airhead

Assuming your question is simply about wear of SSDs, and the wear difference between LUKS and TrueCrypt:

I would guess that without proper wear-leveling the SSD will wear faster with LUKS and become slower at a faster rate [possibly a noticeable degradation in a year or less of heavy use]. With an Intel SSD you probably will not notice the difference with either encryption program - OCZ and others vary.

I don’t use any external SSD's because of theft factor [one theft was enough for me]. I avoid internal SSD's because of the price factor compared to spindles.

Further, OSes are moving toward holding a high proportion of the working data in RAM [RAM is cheap]. If you are not hitting your HDD much because most of the working data is up in RAM, you see fairly good performance after boot even with spindles.

Again, if you have a lot of RAM you will not need to hit the SSD as many times as with a small amount of RAM. Say you have a large amount of RAM and you only use 50% at peak usage: your SSD will not wear out (that is, noticeably slow down) that fast with either encryption program.

The opposite is true if you have, say, 2 to 4 gigs of RAM on a 64-bit gaming machine - excluding shared memory but including encryption for Bitcoin and the like. Stuff a gaming machine as full as possible with RAM to keep the SSD from constantly being hit.

For VMware, I understand SSDs are now a requirement [I think they recommend at least 10% of drive capacity be SSD]. And I cannot go any further, because wear leveling is not my specialty.

AirheadApril 4, 2015 10:59 PM

@ 65535


I wasn't so much worried about wear as about the fact that the SSD firmware is writing the SSD blocks at random across the SSD field. It just wasn't clear to me if there was going to be any significant information leakage with the LUKS approach of only encrypting when actually writing as opposed to the TrueCrypt model of filling the SSD with (pseudo-)random data right at the beginning. Same issue taking into account TRIM and no-TRIM.

FigureitoutApril 5, 2015 3:29 AM

Here's a design for a Low Assurance Security
--Are there any code samples, vendor names, schematics, etc..???

Nick P
Their location and identity are irrelevant.
--You've made entire posts on the opposite. How about "another matter", what if what they produce becomes infected in transit to you? Again, this is an issue that has little to no solution as people all over the world get "secure" infected binaries hosted on servers any script kiddie can MITM in transit b/c the tools are just too developed now w/ no development on the actual f*cking medium.

you might also have a ton of people willing to use a $5 wrench
--As you've noted, there's people w/ knives and some "saturday night specials" itching for an opportunity to put these tools to use in self-defense.

because you have to trust someone else at some point
--Breaks security and all its claims. Pure and simple.

I suppose that I can shout "everything out there is totally insecure" and seem like an all-knowing security expert ...
--Wrong characterization...wrong wrong wrong. Huge software responsibilities are very stressful and lead people to not document for others (it's an employment strategy too: "you need me if you want a problem solved"). All I have to say is security researchers need a "safe zone" where they can push aside all the paranoid worrying about active/real-time attacks. Otherwise you're studying manipulated garbage (which could be from all the hacked-up malware running all over the place) and the research will suck or fail... It affects my personal research *A LOT*, wasting my time on stupid sh*t, not getting the best of my PC's.

ThothApril 5, 2015 5:42 AM

I am currently working on an actual applet implementation of a card-based keystore and encryptor, which conveniently coincides very closely with the specifications above. Currently cutting code and will post the link soon.

If you are asking about the Smartcard vendor: you have to somehow buy yourself a Smartcard that preferably supports 144K EEPROM or more, since I would likely be squeezing in software-based Twofish and Serpent as part of the original specs.

- Algorithm suite: Opportunistic support for all JavaCard cryptographic algorithms as long as the card you use supports it. Only two known constant which are software implementation of Twofish and Serpent.

- Password Stretching: BCRYPT + SCRYPT mode. For File Encryption but can be used whenever needed.

- Two PINs (User PIN and Self-destruct PIN).

- PIN Security: Wipe keys and PINs after too many wrong tries. Software-zeroize all PINs on the card before calling system destruction on the PIN codes. Keys are wiped using JavaCard's clearKey() command.

- Key Size: Supports unlimited symmetric key sizes for now by putting them in AES key slots, as long as your card has enough space. Uses key handles mapped to the related key slots to achieve unlimited-size symmetric keys.

- Cipher Mode: ECB (default), CBC (Hardware & Software Support), PKCS1 / PKCS1-OAEP (Opportunistic Hardware Support)

- Command Ciphering Operation Mode: DO_ENCRYPT - Plain encryption.
> DO_ENCRYPT_SIGN - Encrypts data then hashes encrypted data and finally sign it with a MAC or asymmetric operation.
> DO_DECRYPT - Plain decryption.
> DO_DECRYPT_VERIFY - Uses MAC or asymmetric operation to verify the hash before decrypting the data.
> DO_HASH - Plain data hashing.
> DO_SIGN - Hashes data and signs it with a MAC or asymmetric operation.
> DO_VERIFY - Hashes a data and verifies it with a MAC or asymmetric operation.
> DO_FILE_ENCRYPT - Uses Serpent-Twofish-AES-CMAC-SHA1 to perform file encryption as per specified in (
> DO_FILE_DECRYPT - Uses Serpent-Twofish-AES-CMAC-SHA1 to perform file decryption as per specified in (

That's all I have planned for my card-based system for now. Suggestions for the card-based security module are welcome, but I have to ensure it can fit into a 144K EEPROM card with enough space left over for keys, which might turn out to be an interesting challenge.
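The "unlimited key size via key handles" idea above can be sketched in a few lines of host-side logic. This is an illustration only, not the actual applet code: the slot size, handle scheme, and bookkeeping are all assumptions standing in for the card's fixed AES key slots:

```python
# Sketch: a key larger than one hardware slot is split across several
# fixed-size slots and reassembled by handle (dict stands in for the
# card's secure key slots -- illustrative only).
SLOT_SIZE = 32  # bytes, i.e. one AES-256 key slot

class KeyStore:
    def __init__(self):
        self.slots = {}    # slot index -> 32-byte fragment
        self.handles = {}  # handle -> (slot indices, original key length)
        self.next_slot = 0

    def store(self, handle: str, key: bytes) -> None:
        indices = []
        for off in range(0, len(key), SLOT_SIZE):
            frag = key[off:off + SLOT_SIZE].ljust(SLOT_SIZE, b"\x00")
            self.slots[self.next_slot] = frag
            indices.append(self.next_slot)
            self.next_slot += 1
        self.handles[handle] = (indices, len(key))

    def load(self, handle: str) -> bytes:
        indices, length = self.handles[handle]
        return b"".join(self.slots[i] for i in indices)[:length]

ks = KeyStore()
ks.store("cascade", bytes(range(96)))  # e.g. a 96-byte Serpent+Twofish+AES cascade key
assert ks.load("cascade") == bytes(range(96))
```

On a real card the fragments would never be exported; `load` would exist only as an internal step feeding the crypto engine.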

Nick PApril 5, 2015 9:28 AM

@ Figureitout

"You've made entire posts on the opposite."

Usually for closed source software, or software which many people will trust without their own review. Location was also considered due to laws regulating businesses that might reduce sales or lead to backdoors in closed systems. The software we're discussing is a widely distributed, open source project that a third party reviewed and that is open for more review. The opposite of the context of my other posts.

Be sure to look at context when reviewing information. It's important.

ThothApril 5, 2015 10:11 AM

If a Truecrypt-like cipher algorithm suite were to be used for file encryption, which would you prefer ?

1.) Serpent-Twofish-AES
2.) Serpent-AES
3.) AES-Twofish-Serpent
4.) Serpent-Twofish
5.) Others (specify)

FigureitoutApril 5, 2015 1:51 PM

--So is the card just a 144k EEPROM? Just storing keys on it right? Working on a board for this too or is it an app and plug smartcard into a PC/phone/etc? There's a few easy and open options (basically an backlit LCD screen w/ say 3-5 buttons, and all the initialization that happens every powercycle, a never-ending loop, switch statements for the UI) that would be neat if you want a separate chunk of hardware (I'm not sure if you just want an app or a full-on device, but you have mentioned difficulties outside your control w/ building your own stuff, which sucks...if not here's a cool chip to chew on: like the 128 OTP bytes, don't use RNG, claims some DPA protection).

Cool, curious on some of the wiping functions, and how it handles malicious input securely w/ safe fail states (it's way way harder than it sounds, as we all know). Sounds like you'll be using premade functions so that should help w/ securely erasing and handling memory. You'll probably need some sort of debugger or privileged access to the 'PROM just to check for a clean wipe.

Nick P
Be sure to look at context when reviewing information.
--Oh I do, believe me I do. You too.

Sancho_PApril 5, 2015 6:06 PM

Re authors of TC: ”Their location and identity are irrelevant.” (Nick P)
If I knew Bruce Schneier had written the sw I would probably trust him as a person, but not the sw.
I don’t know (and couldn’t judge) his qualification to code … and also I know that s**t happens, day in, day out.
You have to trust the sw, not the person.
The very first TC advantage is it’s a dinosaur.

@ Airhead, 65535

I don’t know but if LUKS only encrypts when writing this would be pants. An investigation would reveal how much encrypted data you have on your (“zeroed” or fragmented) drive.
The TC model makes it impossible to distinguish between used space (encrypted data) and unused space, the whole file / partition seems to consist of random data. However, access to several historical copies (e.g. backups) may give the attacker some clue regarding size of data.

I’m not an insider of SSD internals, but it would be a horrible failure if the user had any chance to wear out / damage specific memory locations by using a certain piece of software. Imagine a text file containing only one single character “a” on your SSD, and a script that opens the file, changes the character to “b” and closes / saves the file in a loop. It wouldn’t “burn” a specific memory location (as it would with spindles) but would instead wear the whole drive, due to internal counters and wear-avoiding firmware.
You will “damage” the SSD (thumb drive, …) by writing, but quite evenly.

@ Thoth

I didn’t understand your “Hidden Volumes may not be useful as once a suspicion of encrypted data has been sounded, it is more likely the user would be coerced into handing over the decryption keys, passwords and PINs.”
Probably it wasn’t meant as “TC - Hidden Volumes”?
A HV with TC means a second, invisible volume in the encrypted volume. You may be coerced into handing over the pwd for the visible (encrypted) part with your "unsuspicious" data but no one would see there’s another, invisible encrypted part in that same volume, having a different pwd.
This is a huge advantage of TC volume encryption.

ThothApril 5, 2015 7:35 PM

Yes, I meant Truecrypt's Hidden Volume. If you do a filesystem dissection on that stuff, you will start wondering why the volume is bigger (bloated) than the diversion data; it throws question marks all over the place. Usual data encryption would not bloat so badly, but in a TCHV you are putting in a bunch of HVs (Hidden Volumes) and DVs (Diversion Volumes). The best way to handle it is to not know the encryption keys from the start - thus the Smartcard/SE usage, where you prevent key export, or allow key export only to another Smartcard/SE without user involvement except to approve the request. I wouldn't say TC did a bad job: they were designing software encryption, and you can't simply lock an encryption key and deny knowledge of it the way a Smartcard/SE lets users innocently deny knowledge of keys even after honestly handing over the actual user PIN (in regards to my implementation).

It isn't simply for key storage; it will include cryptographic processing capabilities. Put simply, you split the crypto program into two parts. The essential core (crypto and key handling) is the hardware SC/SE applet, which you load "securely" into the SC/SE to do the heavy lifting. The thin client, the GUI frontend, is downloadable and discardable, and can be used on phones or laptops. It's not easy to hack a clean piece of hardware with a card reader slot, so the best option is a laptop running OpenBSD, taken offline from any network access.

For the SC/SE, you are allocated a secure memory cell range to store your binary bits. Space is limited and must be accounted for. If you software-zeroize the key bits (if you are paranoid), you are more than likely to touch the same cell range, because writing onto a different cell range (which would show a lack of integrity in the SC controller) would mean the card is in an unverifiable state and key bits could leak - something most SC/SE makers would avoid. So you can't simply assign secure memory cells dynamically and arbitrarily; otherwise you would mess up the keystore.

Put simply, my approach is to take an SC/SE and load the file crypto engine and key storage engine onto it, with the thin client GUI simply calling the functions and passing data between the GUI and the SC/SE over a Diffie-Hellman established channel between the SC/SE and the host machine.
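A minimal sketch of that Diffie-Hellman channel. The parameters here are toys (a Mersenne prime and generator 5, chosen only so the example runs); a real deployment would use a standardised MODP group or ECDH on the card:

```python
# Sketch: host <-> card Diffie-Hellman key agreement (toy parameters --
# P and G below are for illustration only, not a safe group choice).
import hashlib, secrets

P = 2**127 - 1  # toy prime modulus
G = 5

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

host_priv, host_pub = keypair()  # runs in the GUI thin client
card_priv, card_pub = keypair()  # runs inside the SC/SE applet

# Each side combines its own private key with the other's public value.
host_secret = pow(card_pub, host_priv, P)
card_secret = pow(host_pub, card_priv, P)
assert host_secret == card_secret

# Hash the shared secret down to a symmetric session key for APDU traffic.
session_key = hashlib.sha256(host_secret.to_bytes(16, "big")).digest()
print(len(session_key))  # 32
```

Note that plain DH like this is unauthenticated; the applet would still need to authenticate the host (or vice versa) to stop a man-in-the-middle reader.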

Regarding Atmel, I wouldn't go near it. For now I might consider NXP, although we have to assume NXP and Atmel are both in bed with the warhawks. Atmel is right in the American warhawks' territory of control and services their needs, so I'm not gonna touch that. Plus, the user has to buy their own SC/SE to load the application, so it's the user's decision which kind of SC/SE to use. For now I prefer to go with NXP's SmartMX family of chips with 144K EEPROM whenever possible.

*SE - Secure Element chips like the Atmel stuff which is basically a building block chip for RFID tags, Smartcards and tokens which you implant as the core processor for your stuff.

GodelApril 5, 2015 7:48 PM


I was disappointed to see that hidden volumes was one of the items they didn't get around to auditing.

FigureitoutApril 5, 2015 9:36 PM

--Ok got it (kind of...). Looks like they use Keil, which is a pretty popular tool in the embedded world (the more complicated the chip, the more you'll probably be reliant on their tool chains). So you'll have to securely separate those memories (probably put all the crypto libs and routines in flash ROM, then store encrypted data and keys or maybe "key-encrypting keys" in EEPROM, using RAM when need be). Maybe a "little" security tack-on, if the chip has a tiny chunk of OTP fuse-based memory, is to always check for some OTP serial number (one-time programmed, not one-time pad, but it's kinda the same in this instance) hardcoded at the beginning of the ROM every power-up. Would be easy to test too (also try to screw it up, like negative values, 0000 or FFFF etc. that cause weird glitches sometimes lol).

And then you'd need to look into probably some SD card firmware in laptops (I've never looked at any so got nothing there), lots of proprietary stuff, to maybe at least be aware of attacks on that connection to the card (I'm thinking just make a copy somewhere else on a HDD or even really bad some hidden memory...bah probably too big threat model there). That'd be pretty hard, or rather a lot could go wrong. A large LCD screen would probably be easier (not the TFT touchscreens, even though they "look cooler"); just a consideration. Go for it otherwise (google came out w/ some thing that allows .apk's to run on PC's, so you could get 2 big areas covered there).

On Atmel, yeah but it's kind of hard to avoid such a huge company (I may work there someday, maybe). All the big chip companies are massive, multinational corporations; they're going to be targets (they should be big drivers for actual secure chips and could pay the people and have the facilities for some good OPSEC, but I'll admit there's always a lapse from just being human and not amped up/tweaked out 24/7 and kill your heart). Oh guess what NXP just did, they bought Freescale (American)... Which Freescale had parts of Motorola (man companies get really confusing...). Also Gemalto was headquartered in Netherlands too and they just had a 'little' issue brought to light...

But yeah for me, the hobbyist me, tiny structures encased in a chip is beyond what I can defend against, I'd break the chip 100% sure if I tried to incapacitate it. I can only program it (and assuming everything is what is actually is, which meh never know for sure) right now w/ a toolchain I didn't make, so my security stance is not very strong...Or I need to roll up my sleeves and use more discrete components (rather than it all just in a chip), but they still use basically same DIP packaging and it's a black box too. Grrr so annoying!

Oh btw, if you look into RFID (I dabble, just be a neat project to build my skills), I noticed MIFARE on some SmartMX datasheet, arduino has a module for one of their chips if you want to test that out or mess w/. Pretty cool to see it working.

AnnunakiApril 6, 2015 12:43 AM

Anyone used the TrueCrypt fork VeraCrypt? I am sticking with TrueCrypt. What if NSA is behind VeraCrypt and has backdoors?

ThothApril 6, 2015 1:17 AM

Here is the link to Project PrimeCard. It is a smartcard applet for key management and generic encryption and recently includes goals for file encryption.

I forgot to mention it has a goal to implement DJB's Salsa too.

In the event the chip is suspicious or lacking cipher functionality, the Salsa, Twofish and Serpent might be of use.

It is still half way implemented and only updated in bulk so it will take a while.


Fascist NationApril 6, 2015 4:58 PM

Wonderful news. Congratulations to the auditors for bringing it in on budget.

The mystery of why the Truecrypt authors said 'so long' remains, but their project lives on. And we can rule out an NSA or other plot from the beginning. It was honest encryption software: open and free. And a pretty solid build. Kudos to the authors as well, then.

Dirk PraetApril 6, 2015 6:32 PM

@ Annunaki

Anyone used the TrueCrypt fork VeraCrypt? ... What if NSA is behind VeraCrypt and has backdoors?

I switched to Veracrypt some time ago. It claims to solve all major security issues and weaknesses discovered in part 1 of the Truecrypt audit, offers several enhancements, and can load/convert Truecrypt volumes. It's available for Linux, OS X and Windows, has Raspberry Pi and Armv7 (Chromebook) ports as well as a tc-play fork for BSD. iOS and Android support comes via the 3rd-party apps EDS and Disk Decipher (as of 2.2.1). This pretty much covers most of my OS needs. Well, Solaris would be nice too, but I guess I'll have to talk to one of my former colleagues for that.

Like GnuPG's Werner Koch, it would seem that Veracrypt is maintained by a one-man company, i.e. Mounir Idrissi from France. The FAQ explicitly states that Veracrypt doesn't have built-in government/LE backdoors and that he will never do any. The source code is available and Idrissi claims it is constantly being reviewed by independent researchers and users.

Unless Idrissi's company IDRIX is a Paris-based French NSA front-end (which I doubt), I don't see any reason to discard Veracrypt. Some small donations are probably more in order.

anony moooooseApril 6, 2015 10:46 PM

what about the audit of ur mum?

I'll have u know I'm a navy seal m8 with over 9000 confirmed IRL no-scope-360 long-distance record-breaking sniper kills. Wot did you say about me, you dirty peasant?

AjitApril 7, 2015 4:36 AM

I remember reading an essay by Schneier on this website (I can't find it right now; I don't remember the title or the expert he quoted) in which Schneier quotes an expert who makes the case that it is impossible to find a backdoor if the backdoor is really well designed: that no one can be absolutely certain the software does not contain a backdoor.

How does this TrueCrypt audit fare in the face of such a reality?

Clive RobinsonApril 7, 2015 5:27 AM

@ Ajit,

... in which Schneier quotes some expert who makes the case that it is impossible to find a backdoor if the backdoor is really well designed, that no one can be absolutely certain the software does not contain a backdoor.

You may have it slightly wrong...

There are a couple of ways of looking at such things: "theoretically" and "practically". And as the old saying has it, "It works in theory, but in practice...".

Within reason, the everyday computers the majority of us use are designed to be "fully deterministic", thus in theory every state and transition can be tested and any error, deliberate or otherwise, can be detected.

However, in practice, for even moderately complex machines and the software that runs on them, there is not enough time in the expected life of the universe to go through them all, nor are there enough atoms in the universe to make enough machines to run in parallel...

The solution in EmSec circles is to reduce complexity by various techniques such that it's possible to control interfaces...

So it's not impossible, just very difficult at best, and wildly improbable with the standard ways we currently do things.

Nick PApril 7, 2015 5:29 AM

@ Ajit

That's a hypothetical risk. Yet, it's more appropriate for Truecrypt than most given it was the target of many nation-states. The best example of that sort of thing is the "Obfuscated C Contest." (look it up) TrueCrypt has higher risk here due to dealing with cryptography and being written in a language where the slightest mistake leads to vulnerabilities. There's only so much one can do auditing for backdoors.

The solution is to design a product for vetting from the start. Old standards and my newer ones both suggest making the product very modular, layered, with straightforward control flow, a minimized number of states, a type-safe language, no fancy constructs, well documented, reviewed for common errors, and reviewed by mutually suspicious parties for the rest. There are also assurance activities such as testing, static analysis, formal verification, covert channel analysis, and so on to weed out problems during development.

That we are wondering about TrueCrypt, OpenSSL, and so on just shows those developing security products rarely use development approaches conducive to real, provable security. Many are too complex to even make an assurance case for. Others are designed well enough to at least catch obvious problems during a code review. Hopefully, Truecrypt is in this category. At least we know the nation states had a really hard time with it. So, compiled from source, it's way more assuring than most other solutions.

ThothApril 7, 2015 8:14 AM

@Peter Galbavy
Using XOR ciphers (One-Time Pads) is fine only if:

1.) Truly random keystreams
2.) Never reuse keystreams
3.) Message/keystream must be 128 bits (16 bytes) or longer to prevent brute force.

If any of these three is not met, it's just not going to work.

The best is still a solid stream/block cipher like the Salsa family, or something along the lines of the Serpent or Twofish block ciphers.
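The three conditions above can be sketched in a few lines. This is a minimal illustration, not production crypto, assuming Python's `secrets` module as a stand-in for a truly random source:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # Condition 1: keystream drawn from an unpredictable source.
    # Condition 2: a fresh pad is generated per message, never reused.
    pad = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, pad))
    return ciphertext, pad

def otp_decrypt(ciphertext: bytes, pad: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, pad))

# Condition 3: the message (and hence the pad) is at least 16 bytes.
msg = b"attack at dawn!!"
ct, pad = otp_encrypt(msg)
assert otp_decrypt(ct, pad) == msg
```

The pad must then be delivered over a separate secure channel and destroyed after one use, which is the practical burden that keeps OTPs out of everyday software.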

Nick PApril 7, 2015 5:25 PM

I am Jason Pyeron of the Ciphershed Project: AMA

A good Ask Me Anything session with Pyeron. CipherShed, like VeraCrypt, is one of the successors to the Truecrypt project. His answers show they're putting quite a bit of effort into cleaning up the [unusually sloppy] legacy code. His background is quite diverse in IT and INFOSEC, lending credibility to his contributions. He also uses one of Schneier's blog titles as a response to a question. Haha.

TCSecretApril 9, 2015 1:23 AM

I definitely migrated to VeraCrypt, which has the bug fixes! It is an improved, updated TrueCrypt!

Shane J PearsonAugust 7, 2015 1:52 AM


@Thoth: Point 3, with its reference to keystream size and brute-force attacks, implies re-use of the keystream to encrypt a larger plaintext?

Properly performed, the One Time Pad must be as large as the plaintext itself. If the keystream is smaller and thus repeated for full coverage of the plaintext, then this is NOT the use of a One Time Pad, because the Pad is being used more than One Time!

If this mistake is made and the keystream is cycled multiple times to cover the larger plaintext, then you run the risk of revealing statistically significant patterns when blocks of ciphertext of the same size as the keystream are XOR'ed against other blocks of the same size. When those patterns are found, they can then be compared with the letter frequencies of a given language to find the keystream, or at least enough of the keystream to start to recognize portions of words from the plaintext that can then reveal the portions of keystream which may still be uncertain.
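The core of the leak is easy to demonstrate: XORing two ciphertexts that share a pad cancels the pad entirely, leaving the XOR of the two plaintexts for the analyst to work on. A small sketch (the messages are made up for illustration):

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"meet me at the bridge at noon"
p2 = b"the package is in the garden."
pad = secrets.token_bytes(len(p1))   # the mistake: one pad for both

c1, c2 = xor(p1, pad), xor(p2, pad)
# The pad vanishes: c1 XOR c2 == p1 XOR p2, a pad-free target for
# crib-dragging and letter-frequency analysis.
assert xor(c1, c2) == xor(p1, p2)
```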

With a properly implemented OTP where the pad equals the size of the plaintext, there is nothing to brute force at all, because any other plaintext of the same size could be decrypted with some other Pad of the same size.

Also consider plaintext with long runs of whitespace. An XOR of that whitespace and the keystream will be repeated throughout the ciphertext, such that it will create a pattern that can easily be found and then reconstructed back to the keystream or at least a portion of it. To start with you could find a repeating pattern in the ciphertext, XOR it with the ASCII value for the space character and then XOR the file with a rolling window of that and go from there!

The Pad used in a One Time Pad MUST be as large as the plaintext!

And brute force attacks do not apply to One Time Pad encryption. If you can brute force a ciphertext, that ciphertext cannot be from an OTP, by definition.

Clive RobinsonAugust 7, 2015 7:39 AM

@ Shane J Pearson,

And brute force attacks do not apply to One Time Pad encryption. If you can brute force a ciphertext, that ciphertext cannot be from an OTP, by definition.

Err not true and not true.

You can regard all stream ciphers, of which the OTP is one, as substitution ciphers. One of the problems with substitution ciphers is that sometimes you don't have to decrypt them to understand what they are saying. To prevent this you have to precode the plain text in various ways to address the different issues.

A simple example: if I know that you are responding to a question that requires a yes or no answer, and you do not precode, then the OTP message for "Yes" is of length three and "No" is of length two, giving the game away irrespective of how random the key stream is. Precoding or block lengthening will prevent this.

Further, even if you don't precode, you would be wise to make all messages of a fixed block length that is the same size as the normal block length of the most common block cipher in use (ie AES-128) so your traffic does not stand out in the crowd and call extra attention down on your head.

However, the meaning of a sufficiently short OTP message can be brute forced if you have sufficient time/memory, and of all the candidate plain text messages you will get very few that have any meaning. Look in a dictionary for all the words of three or fewer characters: how many do you find? Compare that to the 2^(8x3) ~16.78 million possible byte-based OTP messages. With suitable precoding there should be no or very few language statistics. It's why, in times past, messages were precoded with a "code book" or other cipher and then superenciphered with the OTP.

It's why you should always precode messages and make them above a certain length. However, the better your precoding, the fewer statistics the attacker has to work with. Thus, within modest limits, you can use a shorter message length. Obviously, traditional "code books" reduced entire sentences down to just 14 or 15 bits equivalent, which is quite a useful shortening of a message.

The big problem with OTPs is that, outside of those who use them professionally or have studied the professional use of OTPs, it is incorrectly assumed that any input is magically transformed into something unbreakable. The idea that they are the ultimate unbreakable paper-and-pencil cipher is wrong, as is the idea that the OTP key stream must be truly random.

As for your second sentence, that is just nonsense. The strength of an OTP is not that you cannot get all the possible messages (you can, relatively trivially); it's that all the messages are equiprobable and you cannot distinguish which is the only correct one.
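The equiprobability point above can be shown directly: for any candidate plaintext of the right length there exists a pad that "decrypts" the ciphertext to it, so enumeration yields every candidate and singles out none. A toy sketch with made-up ciphertext bytes:

```python
def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# An arbitrary intercepted 3-byte OTP ciphertext.
ciphertext = bytes([0x5A, 0x13, 0xC7])

# For every 3-letter candidate, there is a pad under which the
# ciphertext decrypts to exactly that candidate; nothing rules it out.
for candidate in (b"yes", b"cat", b"war", b"dog"):
    pad = xor(ciphertext, candidate)
    assert xor(ciphertext, pad) == candidate
```

All candidates are consistent with the intercept, which is exactly why only external context (message length, traffic patterns, poor precoding) can narrow the field.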

Shane J PearsonAugust 13, 2015 6:38 AM

@ Clive Robinson

Hi Clive,

The issue of not padding a ciphertext up to a larger, fixed block size (or to multiples of that block size for plaintexts larger than the block size) is separate from the points I was making. Sending a two- or three-character ciphertext when your adversary is expecting a yes or no answer is a schoolboy mistake.

The point behind my statement about the pad needing to be as large in size as the plaintext, was to illustrate that the pad should not be shorter than the plaintext and thus not need to be recycled to make up the full ciphertext. Recycling a One Time Pad takes the "One" out of it and history has shown what a mistake that is.

Sending a two- or three-character ciphertext is a separate subject from the pros and cons of a properly implemented One Time Pad. The use of a two- or three-character ciphertext does not illustrate a weakness in the OTP as a form of encryption; it illustrates a weakness in the sender of the message.

The reason that brute force attacks cannot be successfully applied to a properly implemented One Time Pad encrypted ciphertext, is actually for the reason you and I already cited, being that you cannot know which of the multitude of equally plausible brute forced plaintexts actually matches the original.

You said, “all the messages are equiprobable and you can not distinquish which is the only correct message”.

And I said, “any other plaintext of the same size could be decrypted with some other Pad of the same size”.

We are both talking about the same strength of the One Time Pad, which makes it provably secure.

Your assertion that the OTP key stream does not have to be truly random can be a trap, because who or what can declare that any source of random numbers is "truly" random?

However, I wonder if the point you are making is that it is okay to use strong pseudo-random numbers from a deterministic source? Then no, that goes against the very requirements of the One Time Pad.

If you use pseudo-random numbers for your OTP, you now actually have something to brute force, which is exactly why the random numbers for an OTP must come from a non-deterministic source of randomness. If full cycles of all known PRNGs are used to decrypt a ciphertext and out of all of that one plaintext comes out as a perfectly legible message (of sufficient size; do I need to make it clear I am not talking about a three-character ciphertext?), then you have a probability, based on the plaintext size, that (a) it is the original message and (b) it was not encrypted using the proper requirements of an OTP.

This is specifically why an OTP should not come from a deterministic source and it must be random.

One method to cover both weaknesses of pseudo random numbers which can theoretically be found with brute force and natural random numbers which may not be white or otherwise may naturally contain non-obvious patterns, is to use both to whiten each other into a single stream (through XOR for example).

The output of that should also be checked for randomness and if it fails, then it should be discarded for another attempt.
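A minimal sketch of the whitening-plus-check idea above, assuming `os.urandom` as the "natural" source and Python's Mersenne Twister as the deterministic half; the monobit count is a crude stand-in for a proper battery of randomness tests:

```python
import os
import random

def whitened_stream(n: int, seed: int) -> bytes:
    # Deterministic half: a seeded PRNG stream.
    prng = random.Random(seed)
    pseudo = bytes(prng.getrandbits(8) for _ in range(n))
    # Non-deterministic half: OS entropy standing in for a natural source.
    natural = os.urandom(n)
    # XOR the two streams so each whitens the other.
    return bytes(p ^ q for p, q in zip(pseudo, natural))

stream = whitened_stream(4096, seed=42)

# Crude monobit sanity check; a stream failing the check would be
# discarded and regenerated, per the comment above.
ones = sum(bin(b).count("1") for b in stream)
assert abs(ones - 4096 * 4) < 4096  # roughly half of the 32768 bits set
```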

You claim that you can get all plausible messages, but you have not defined the length of the ciphertext. You can get all plausible messages only if the ciphertext is sufficiently small for a realistic amount of compute power and storage. And then even when you can get all plausible messages, you cannot know which is the original.

And thus, a properly implemented OTP is provably secure.

Clive RobinsonAugust 13, 2015 12:26 PM

@ Shane J Pearson,

One method to cover both weaknesses of pseudo random numbers which can theoretically be found with brute force and natural random numbers which may not be white or otherwise may naturally contain non-obvious patterns, is to use both to whiten each other into a single stream (through XOR for example).

That method of "whitening the key" has been suggested in the past for use with book codes. At the time it was felt that mixing four or more different plaintext word streams moved the statistics sufficiently far to make the then-known attacks unworkable, and thus the resulting key stream safe to use as the equivalent of a one time pad. I doubt such assumptions would be considered safe any longer, for a number of reasons.

However, that was not the issue I was referring to. I've mentioned it several times in the past on this blog, but I'll repeat it again.

There is a problem with naturally occurring random streams in that their statistics are neither flat nor bounded. Both have fairly easy solutions, but you have to be aware the problems exist to deal with them sensibly.

Within limits, the lack of flatness is not really an issue; in fact, close in, it's desirable. The problem is the "unbounded run length" that occurs with natural or true random sources. That is, at what length do you decide a run of "zeros" or any other sequence is too long and will thus leak plain text statistics, allowing possible recovery of actual text?

In some circles it has, in times past, been considered to be the equivalent of the average word length, or the number of bits to produce five or six characters in the normal usage of the plain text alphabet. The problem with this is that it's only around 25 bits in hand-written English plain text. Which means you have real issues to deal with that are not that amenable to simple fixes to the keystream.

From a non-professional usage point of view, the solutions in the past have been: (1) remove obvious runs from the key stream; (2) pre-encipher/encode the plain text to flatten the statistics and effectively increase the effective alphabet size, then use the OTP key stream for superenciphering; or, for paper-and-pencil hand ciphering, (3) keep the message length short.

Aside from the likes of emergency key transfer and EOW messages, OTPs are very much out of favour for all but a few ComCens these days, due to KeyMat issues.

However, the idea of "multiple book code" style whitening does not go away in the world where data is not at rest and the standard "communications" assumptions do not apply, and thus either chaining modes or fixed-size cipher blocks are inappropriate. That is where fast, multi-user random access into very large blocks of enciphered material, such as databases or storage, is required, where users have differing security access to individual records and fields that may frequently be of different lengths and change rapidly. Thus you can find the use of multiple mixed CTR modes to make a keystream, where each CTR stream uses a different key: simplistically, one for the database, one for the record offset, one for each field in the record, etc. It is, however, a bit of a minefield; due to the security-vs-efficiency trade-off, it is easy to open up side channels which can allow insider attacks.
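The "multiple mixed CTR" construction described above can be sketched roughly as follows. This is a hypothetical illustration, not any specific deployed scheme: it uses HMAC-SHA256 as the PRF in place of a real block cipher (the Python standard library has no AES), and the layer names, keys and offsets are made up.

```python
import hashlib
import hmac

def ctr_block(key: bytes, counter: int) -> bytes:
    # One 32-byte keystream block: PRF(key, counter), CTR-style.
    return hmac.new(key, counter.to_bytes(16, "big"), hashlib.sha256).digest()

def mixed_keystream(layers, length: int) -> bytes:
    # XOR together one independently keyed CTR stream per layer
    # (database, record, field), each starting at its own offset.
    out = bytearray(length)
    for key, base_counter in layers:
        pos, block_no = 0, 0
        while pos < length:
            for b in ctr_block(key, base_counter + block_no):
                if pos >= length:
                    break
                out[pos] ^= b
                pos += 1
            block_no += 1
    return bytes(out)

layers = [(b"db-key", 0), (b"record-key", 7), (b"field-key", 3)]
ks = mixed_keystream(layers, 40)
field = b"alice@example.com padded to 40 bytes...."
ct = bytes(a ^ b for a, b in zip(field, ks))
# Random access: regenerating the same keystream decrypts just this field.
assert bytes(a ^ b for a, b in zip(ct, mixed_keystream(layers, 40))) == field
```

Because each field's keystream is addressable by (key, counter) alone, any single field can be re-read or re-written without touching its neighbours, which is the random-access property the comment describes; the side-channel risk comes from how counters and keys are assigned and cached.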

jamesNovember 3, 2016 6:30 PM

I am a Microsoft partner, one of the few who looks at their software and finds out how it breaks. I have been a partner since XP. Last year I found a production problem in their software, all of it: Windows 7 through 10, Server 2012 and 2012 R2. They wanted to know what the production problem was; I told them they would have to pay for the information.

My computers and servers started acting oddly. I tracked it down to Microsoft hacking my network, sabotaging my network, stalking me, having live access to my network, and hijacking my accounts and server accounts to log on.

I used a third-party program to encrypt the hard drives, moved my PCs from wired internet to wifi, stopped all updates, and locked out Windows Installer except when needed. In short, I did everything to stop all updates.

Log files have changed. It seemed to work, but then again Microsoft could disable all logging of what they are doing. They went as far as locking all security on the drive so I could not change the settings, not even the hosts file I edited.

There is a third, hidden partition on the hard drive. What it is for, I don't know; it is almost impossible to find. I used a third-party low-level formatter to find it, but what it is for and what it can do, I don't know.

I do have proof Microsoft is using the domain to upload all user accounts, all user passwords, all encryption certificates, and all GPO changes. The real threat is Microsoft and how they get into our computers; no firewall was ever made to stop their kind of attack.

I can't get anyone to look at the information I have, because to do so would force them to take action, so no one will help.

The question is why: what does Microsoft hope to gain? If people knew this and it got out, Microsoft would be gone. But I am one of the few who understands how this works and has been attacked by them.

I have been in computers since DOS and have more hands-on experience than most; I know Microsoft software better than almost all of them. Sadly, I can't do anything about it, as I don't have the money to get a lawyer to fight them. I would go after them two ways at the same time: first, break all user agreements on the grounds of the production problem, which I could do without even getting them into court with me, and get a warrant to keep them off my computers at the same time.

I would start criminal action against the whole board, the CEO, and past CEOs, then start a class-action suit on behalf of all the people and businesses in the USA who bought and run Microsoft.

In short, I would break them unless they gave me the things I want: all uploading of any information would end, collection of DNS information would stop, and many other things. I would force them to remake Windows 10 into a more stable OS.

When does a business go too far before people act? Just because we don't see it does not mean it isn't going on. You do have proof; you just don't have the money or lawyers to stop them.

How can you have security when Microsoft is telling your computers to call their servers and send them information? And VMware: they have broken the keys, they can break VMware and bring down all servers, not just Microsoft's, and any hacker can do this. Microsoft has opened a back door that makes it easy for hackers to get in.


Photo of Bruce Schneier by Per Ervland.

Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Resilient.