Schneier on Security
A blog covering security and security technology.
November 14, 2006
The Doghouse: Skylark Utilities
I'll just quote this bit:
Files are encrypted in place using the 524,288 Bit cipher SCC, better know as the king of ciphers.
For reference, here's my snake oil guide from 1999.
Posted on November 14, 2006 at 1:49 PM
• 62 Comments
I found the following bit much more exciting:
"Encode It is Optimized for today's Hi-Definition monitors!"
Yay for Hi-definition monitor-optimized cryptography!
I'm reminded of the fact that "skylarking" is nautical jargon for "fooling around", deriving from games of chase in the rigging of tall ships. (Hence the name of the excellent album by XTC.)
From the site:
"...the SSF method which exceeds all other shredding methods in speed, accuracy, and security."
Wow! So it's the best shredding method (it even beats other methods on "accuracy" - the most important thing for any shredding method)? Look at the next sentence...
"The DOD secure shred is also included for those that demand military strength shredding."
So is "military strength shredding" less secure than "the SSF method", or did they change their minds between sentences?
@Rrat, you clearly have not considered TEMPEST....
524,288? That's my phone number!
Give 'em a break, after all "it employs 21 layers of hacker proof protection." It's also the "number one choice among professionals!"
"better know as the king of ciphers"
I assume that is supposed to be "better known" rather than "you better know"...
You know it's high quality marketing when it's not even proofread.
Stuff like this should be illegal. Isn't this false advertising or something? Please! Make it so!
Unfortunately for them, I don't trust any less than *twenty-two* hacker-proof layers.
You guys are being too critical!
This Powerful and Accurate software has the "Ability to encrypt a floppy disk and then decode the files with or without Encode It"!!
Finally a way to encrypt all of my floppy disks! woot! I hope it works on all of my SD Single sided 8" floppys as well as it does the new fangled 5.25" double density ones.
Keeping in mind that Budweiser is the King of Beers.
"""Finally a way to encrypt all of my floppy disks! woot! I hope it works on all of my SD Single sided 8" floppys as well as it does the new fangled 5.25" double density ones."""
Just let me be the first (and hopefully last) to say it...
Floppy disks? In my day .....
I dare you to name one other encryption product that is Optimized for today's Hi-Definition monitors!
After looking at the names of the other products they have, I have two words:
Cram It !
Wow: "in addition, it employs 21 layers of hacker proof protection to ensure your data does not fall into the wrong hands."
Unfortunately, the average (l)user who does not read Crypto-gram or Schneier on Security will immediately think that this is the ULTIMATE SECURITY TOOL!
Who wants to buy a copy, pop it in IDA Pro 5.0, and laugh as we see it use 8-bit XOR encryption!
Really? I thought *I* was the king of ciphers.
I emailed the chap, giving him a chance to explain why his crypto system was better than others available:
"When you see programs with BlowFish and the like, it's good to know that the code can be obtained from the internet. Anyone can build a program using it, so is your data really secure - not at all.
SCC is unique in the fact that only I have the key and the code. Compare a 32 Bit, 64 Bit or even 128 Bit to my 524,288 Bit cipher and you will see a huge difference. That's the difference between military trained and kids just playing around on the internet."
He goes on to say how his file eraser is different, because none of the others take account of Windows's file buffering. I don't see how that's relevant. He also makes no mention of dealing with SMART sector relocation, or free-space scrubbing, so I doubt it's actually useful.
I worry about such people. But I worry about people who take them up on their offer more, believing themselves to be safe.
I also worry about anybody who gets certification that their products are adware, spyware and virus free from... themselves.
The screenshot reminds me of File Manager in Windows 3.x.
Rob Kendrick> [quoting dog-boy] "That's the difference between military trained and kids just playing around on the internet."
Ironic how true that comment is in this case, just not in the way he intends.
If you write back to him, ask him if he's considered trying to have it validated under FIPS 140-2.
hmm... a 524,288 Bit cipher.. I guess I'm not the only one assuming that this program simply uses a 64k snippet of the author's /dev/random as xor key.. which probably is even stored as plain"text" somewhere inside the .exe..
Anyone care to xor a few files with their cipherfiles to check for a common pad? The trial version is free.. ;o)
Nonono, I'm sure he was smart enough to use SCC to encode his key. That way they'll never find it, because you need to know the key to decrypt the key.
From the changelog: "The `Esc´ key has been adjusted to only cancel the encoding/decoding process when Encode It is the top most application"
It's sounding better and better!
>He goes on to say how his file eraser is different, because none of the others take account of Windows's file buffering.
What he's trying to imply here is that it's some secret knowledge to flush (or close) the file after each overwrite. DOD 5220 doesn't specify this because it is of course not platform specific. But everybody implementing DOD 5220 on Winders knows to do this already.
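For what it's worth, that "secret knowledge" amounts to a couple of standard calls. A minimal sketch of one overwrite pass in Python (the function name and pass pattern are made up for illustration; this says nothing about sector relocation or free space):

```python
import os

def overwrite_pass(path, pattern=b"\x00"):
    """One overwrite pass: fill the file in place with a byte pattern,
    then flush the userland buffer and force the OS to hit the disk."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write((pattern * size)[:size])
        f.flush()              # flush the userland (stdio) buffer
        os.fsync(f.fileno())   # ask the OS to push it to the device
```

A DOD-5220-style shredder would simply run several such passes with different patterns; the flush/fsync pair is the part the vendor presents as exotic.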
>So is "military strength shredding" less secure than "the SSF method", or did they change their minds between sentences?
Well, of course, it's easy to make the mistake of claiming "military strength" for software. Actually, no claim of "military strength" can be made unless physical security is involved, i.e. some type of armored/secured enclosure is part of the design.
Also, it's simply a rookie mistake to claim that the techniques used are better than those recognized by NIST or some other standards body. Even if one provably has a better method to encrypt or shred or whatever, the method will still be snake oil to the corporate world.
Clearly weak encryption. Encoding solid null bytes with the default password set, you can see clear 32-byte patterns in the output file. Not even precompression.
Encoding a second file with a 1-bit difference produces a file that differs only in the same bit.
Looks pretty clear that it's XOR with a generated keystream.
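Those two observations are exactly the signature of XOR with a password-independent keystream. A toy model in Python (the product's real keystream generator is unknown; a fixed pseudo-random pad stands in for it here, purely as an assumption modelling the observed behaviour):

```python
import hashlib

def keystream(n):
    """Stand-in keystream: fixed, independent of any password."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(b"fixed-seed" + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encode(data):
    """Model of the cipher: plain XOR against the keystream."""
    ks = keystream(len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

zeros = bytes(64)
ct0 = encode(zeros)                  # ciphertext of nulls == the keystream itself

flipped = bytes([0x01]) + bytes(63)  # plaintext differing in a single bit
ct1 = encode(flipped)
diff = [i for i in range(64) if ct0[i] != ct1[i]]
# Exactly one ciphertext byte differs, and in exactly that one bit.
```

Any cipher with that 1-bit-in, 1-bit-out property has no diffusion at all, which is what the solid-pattern output already suggested.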
Even if I knew nothing about cryptography, grammar errors like:
"including complete folders and all it's[sic] sub folders"
"assume responsibility for misplace, lost, or damaged keycodes!"
don't give me the impression of a professional product. We all make typos in email, but if you're trying to sell a product on a website, you'd better put your best foot forward.
Oh, and get this--the crypto doesn't depend on the password.
Looks like it's XOR with a repeating 524288-bit pad, which itself contains many repeats. The pad looks like it's generated by overlaying multiple XOR patterns of different lengths. In any case, you can recover any original file using only 64KB of pad, regardless of the password. Generate the recovery pad by "encoding" 64K null bytes.
Caveat: I'm running this under wine, and it's crashing after writing the output file (maybe trying to invoke some "Windows file buffering" API). Looks like it's done with "encoding" at that point, though.
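The recovery trick is one line once you see it: because encoding null bytes outputs the pad itself, XORing that pad against any ciphertext undoes the "encryption". A sketch with a stand-in pad (the real pad generator is unknown; the point is only the structure of the attack):

```python
def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

# Stand-in for the program's password-independent 64 KB pad (an assumption).
PAD = bytes((i * 131 + 7) % 256 for i in range(65536))

def encode(data):
    """Model of the observed behaviour: plain XOR with the repeating pad."""
    return xor(data, PAD)

recovered_pad = encode(bytes(65536))    # "encoding" 64K of nulls yields the pad
secret = b"attack at dawn"
ciphertext = encode(secret)
plaintext = xor(ciphertext, recovered_pad[:len(ciphertext)])  # == secret
```

No password, no key, no cryptanalysis to speak of: one free trial of the encoder is the entire break.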
OK OK! I get it already! The guy is a charlatan or a fool or both. At least he's not selling election software.
Man, you nerds are ruthless.
At the risk of seeming ruthless, let me clarify that the pad incorporates data that is generated at the first run of the program and stored on disk. So you don't get another user's pad by installing the program on your own system. Of course known plaintext still works. And the program is deeply broken in several other ways, but again, I don't want to be ruthless, so I'll leave it at that... :^)
The amazing thing is that he wants $49.95 for it.
Well, at least he didn't send me (successful) spam about it.
"an unique" is actually grammatically correct despite the way it sounds, since grammar is based on the written form rather than the pronunciation.
Sorry Uniqueness but you're wrong.
"An unique" is incorrect and "a unique" is correct because when the U word begins with a vowel sound (not a vowel letter) we use "an", ie "an umbrella" or "an umbilical cord".
However when the U word is pronounced "yu" then it is correct to say "a unique". Don't take my word for it, look it up.
Found this in the cryptography-digest archive (you need to scroll down to the part about Encode It):
With reply in:
So it turns out that this ridiculous cipher was cracked (surprise, surprise) back in 1999! And yet "Skylark Utilities" continues to sell it... Their flagship product, no less!
Some company. What was that about "kids playing around on the Internet" again?
"In addition, it employs 21 layers of hacker proof protection to ensure your data does not fall into the wrong hands."
If it's hacker-proof, why do you need 21 of them?
This one goes to 524,288.
Military trained? Military strength encryption???
Does this mean that anyone found to have broken it has their location co-ordinates uploaded to the targeting computer on the nearest B-52?
Each time Bruce Schneier doghouses one of these *lame* not-ciphers, a few days later, the domain gets unregistered.
Pattern or coincidence?
Military strength shredding? What ever happened to good ol' thermite?
>"an unique" is actually grammatically correct despite the way it sounds, since grammar is based on the written form rather than the pronunciation.
Actually, the rule for "an" (as I learned it many moons ago) includes specific exceptions for words beginning with a long "u" - thus, an umbrella, but a uniform.
"You must sanitize your drives about once a month to ensure lost file fragments can never be recovered.
Wipe It also caps off remaining sectors and reconditions the drive. The exclusive reconditioning process will extend the life of the drive far beyond it's normal life span - this is a security feature that only Wipe It employs, but is really a cool bonus for the end user.
Wipe It can sanitize a drive with up to 400 GB of free space!"
Personally, I find the ability to (potentially) un-delete my files a benefit.
I'm really trying to understand what this reconditioning process is. Does it give the platters a nice, shiny polish? Does it re-lube the bearings with long-lasting snake oil?
And, what about only working on small drives? You can buy >400GB drives, so I don't really think that a big, bold, "We don't work on big drives until you've filled them up enough!" warning is something to be proud of.
X the Unknown> Actually, the rule for "an" (as I learned it many moons ago) includes specific exceptions for words beginning with a long "u" - thus, an umbrella, but a uniform.
Actually, no exception is needed. Contrary to the assertion by Jo, the choice of a vs. an is made based on pronunciation, not spelling. Consider the following, incorrect examples:
Maybe I'm slow, but...
I finally realized what "Optimized for today's Hi-Definition monitors!" actually means, because every single one of his programs claims that.
It means that the program takes up lots of space on your screen. It's laid out for today's hi-_resolution_ monitors.
@antibozo: "the pad incorporates data that is generated at the first run of the program and stored on disk"
So even the legitimate user has no way of decrypting their own data on a different machine than the one they encrypted it on?
And assuming that the encrypted files are kept on this machine (as they're unusable anywhere else) anyone with access to the encrypted file also has access to the encryption key as it's on the same disk.
Yes, well I could go on, but I don't want to turn Bruce's blog into full-disclosure...
I'm new to Schneider's blog and data security, so please don't bite me! I made this in Excel (yeah, I really did!). Can anyone tell me if they can break it, what I have done wrong, and how I can improve?
sfpt nvok nwzj kisl xlnr pnvf qzsf ares ggrj bspm yqjr rott dclh rcyg hmtq bnkn xlui pvzs ebmp eygj brgt qjgq xley zpoe nsse yqqz rrau cokb aunb vhys iveh htms eqwu gyky elyc rkho fonn rbam yomq nypm pfue qygl ubyl cqtj
if there isnt enough here I can give you more.
Sid in Wales
Sid in Wales> if there isnt enough here I can give you more.
There are good newsgroups for this kind of thing. Offhand, you have a very uneven distribution of characters in your ciphertext, and that's not a good sign, but the sample is very small.
If you really want an analysis, describe the algorithm. If you don't want to do that, you're missing the point of this whole discussion. If it's not a well-known algorithm such as AES, and you're not a serious number theorist, it's almost certainly weak. As Bruce Schneier (not Schneider) says, "anyone can create a cryptography product that he himself cannot break".
Yeah, OK, it isn't orthodox and I'm poor at maths. But the thing works by a series of tables of seven columns each. Each table contains an alternative alphabet. There is a switch controlling which table to use. The column the thing uses depends on the position of the character, so one character may mean one letter somewhere and a different one somewhere else.
uhyg qnve tsjm srqj cahy xiqm yyxa gtux gycr ybtu jbbn mdpl peda upcr ffmg thnc
ntah twja gkuy yyjh vhti yfll rogm jeky zovw lgln mcyi sinh zmgh ucuq bwbt ofhg ikyj arwh grhb uqce fjkw nmqa gkkk uscn rpbm aqye wxjq pttg zaqc klbn gxqj ayjl bjnc fqqm wqwm rrjf bsmv qmlo ioum aomm bnwk ldtm hsxe bfar gmym qytw shno zmgw lhjk ktop yqzo hyyg wlcb spkj
gyzv peop eiwy frxh qnto ltlu tjjh uxqt jakl ywae wplc zutw ecxf jacn jwjm thkn wmgm appo unha gltx eqap zqlp gruq ntmg kxma ghqs hpkx jskg igkq tysd qkny xqxq yctk ohmi xrjj bjri oxbo zulg qzfq iqqq citu ldsk kxay lcer jliq pwtu yuax ljka vzgq menr kwdq cpgn kwxh gbky spmp cdxa rejy kdld fnlr ijwr cooh xxup yrky lenc gush wdlb ontk sckn sxfl afqj kyyt hatw lzzd oacy steh rpkm kuef mrqt qmdm heeh ybhk
The real problem, Sid, is one of how you're going to distribute your series of tables to two different parties and how the two parties will decide on the rules to switch between columns. This is the hard part of cryptography, not generating a cyphered message.
Sid, as nksingh points out, you have a key distribution problem, where we are taking the "key" to be the particular tables you are using. But even if you solve that problem, your cipher is going to be weak. Most modern ciphers use a process similar to yours as one component of a complex operation which is repeated multiple times (rounds); the tables are static, but data from a key is mixed into the ciphertext on one or more rounds. With only one round, and no bit mixing operations, any cipher of this design will be very easy to crack with a known-plaintext attack, and with a little knowledge of the algorithm and a sufficient corpus of ciphertext, it could be cracked just based on character frequencies, since the statistical frequencies of the letters in the output of your cipher can be matched with the statistical frequencies in the plaintext language. The use of position to perturb the table just means that one would analyze characters at a given position in the collection of ciphertexts as a set.
I don't mean to discourage your interest. But very few people have the ability to construct new ciphers that withstand cryptanalytic techniques. A lot more (though still few) have the ability to find weaknesses in ciphers others have designed. If you have interest in the field, you might want to spend some time working on the latter skills before tackling the former.
In addition to the books I referred to earlier, I suggest you read the Wikipedia article on differential cryptanalysis, and the external links cited:
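The per-position analysis described above is easy to demonstrate. The sketch below uses a made-up cipher in the same general spirit as Sid's description (seven random substitution alphabets selected by position mod 7; these are my guess at the shape of the scheme, not his actual tables). Simple frequency counting within each position class already pins down the image of 'e' in every table:

```python
import random
from collections import Counter

random.seed(42)
ALPHA = "abcdefghijklmnopqrstuvwxyz"
# Seven random substitution tables, chosen by character position mod 7.
TABLES = [dict(zip(ALPHA, random.sample(ALPHA, 26))) for _ in range(7)]

def encrypt(pt):
    return "".join(TABLES[i % 7][c] for i, c in enumerate(pt))

# Stand-in corpus: in ordinary English, 'e' dominates every position class.
corpus = ("the quick brown fox jumps over the lazy dog and then "
          "the eager reader sees the pattern emerge here ") * 200
pt = "".join(c for c in corpus if c.isalpha())
ct = encrypt(pt)

guesses = []
for k in range(7):
    counts = Counter(ct[i] for i in range(k, len(ct), 7))
    guesses.append(counts.most_common(1)[0][0])  # likely image of 'e' in table k
```

With a few kilobytes of ciphertext, each class's most frequent letter is exactly where table k sends 'e'; the rest of each alphabet falls to the same counting.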
IOW, if you can compress the resulting final ciphertext, your cipher is weak.
"IOW, if you can compress the resulting final ciphertext, your cipher is weak."
That is an unfortunate case of a rule of thumb becoming a law. As it turns out, it is actually not true. It is based on the unfortunate assumption that redundancy and compressibility are the same thing. They are not: compressibility is a possible consequence of redundancy, but redundancy in no way implies compressibility.
To start off,
1) if the plain text is of normal usage (ie it contains more than one bit of information) it contains (often considerable) redundancy.
2) if the plain text message (Ptext) is of normal usage and suitably long, then for most practical systems the resulting cipher text (Ctext) is of the same length as the original message (or slightly longer).
3) where there is redundancy in the form of repeated fixed-length symbols, it is possible to compress the message by using variable-length symbols and a weighting algorithm.
Therefore simple logic dictates that if the Ptext message can be recovered from a Ctext of the same length, then the Ctext must also contain the same level of redundancy.
By the same logic the inverse holds: as the texts are the same length, if the Ctext does not contain the same level of redundancy, then the Ptext message cannot be recovered.
This is fairly easily seen if you take, say, a large program listing or executable (better, as its instruction size is usually a sub-multiple of the cipher block size) and put it through DES or AES in codebook mode. Simple observation will show you there is redundancy in the output, due to output blocks having the same value. So it is possible to compress the output....
This is a problem that has been well known prior to the original publication of the DES specification.
There are a number of ways you may obfuscate the visible redundancy, but the usual method employed involves a feed-forward / feedback mechanism.
The simple case is that prior to each block (ptext) of the Ptext message being enciphered, it is mixed with the encrypted (ctext) block from the previous ptext block encryption. This gives rise to the problem of how you deal with the first block, where there is no previous block to provide feedback.
This is usually solved with Initialisation Vectors (IVs), where a block of ptext known to both parties (usually a string of zeros or some such) is encrypted under the same key. Therefore both parties can put it through the function to calculate the first feed-forward / feedback block.
Hopefully, when you observe the output of such a system you will not be able to identify the redundancy by simple observation. However it has not gone away, even though there are no repeated symbols that can be compressed.
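The codebook-mode leak and the feed-forward fix are easy to see with any block cipher. The sketch below uses a toy 4-round Feistel cipher of my own (a stand-in, not DES or AES, chosen purely so the example is self-contained) on a highly redundant plaintext:

```python
import hashlib

BS = 8  # toy 64-bit block size

def _round(half, key, i):
    """Feistel round function: keyed hash of one half-block."""
    return hashlib.sha256(key + bytes([i]) + half).digest()[:BS // 2]

def enc_block(block, key):
    L, R = block[:BS // 2], block[BS // 2:]
    for i in range(4):                       # 4-round Feistel network
        L, R = R, bytes(a ^ b for a, b in zip(L, _round(R, key, i)))
    return L + R

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def ecb(pt, key):
    """Codebook mode: identical plaintext blocks give identical output."""
    return b"".join(enc_block(pt[i:i+BS], key) for i in range(0, len(pt), BS))

def cbc(pt, key, iv):
    """Feed-forward mode: each block is mixed with the previous ciphertext."""
    out, prev = [], iv
    for i in range(0, len(pt), BS):
        prev = enc_block(xor(pt[i:i+BS], prev), key)
        out.append(prev)
    return b"".join(out)

key = b"k"
pt = b"ABCDEFGH" * 100                       # 100 identical plaintext blocks
ecb_ct = ecb(pt, key)
cbc_ct = cbc(pt, key, iv=enc_block(bytes(BS), key))
# ECB: one ciphertext block repeated 100 times -- visibly redundant.
# CBC: all 100 ciphertext blocks differ.
```

Compressing the two outputs makes the point numerically: the ECB ciphertext shrinks dramatically, the CBC ciphertext does not, even though both "contain" the same redundancy.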
Then you have to ask yourself a question: is it possible to put an incompressible Ptext into a cipher function and obtain a Ctext output with compressible redundancy?
The answer is yes. Think back to the feedback system above using an IV of all zeros: what happens if you carry on encrypting all zeros?
You should (assuming it's working properly) get an output where the encryption function steps through every possible output in some apparently random order (actually dependent on the key). As there are no repeats of the output blocks there is no redundancy at the block size, so it is incompressible at that level.
Then, as the encryption function is invertible (ie you can decrypt it), you can simply put the non-compressed output back into the encryption block and get an output without information (ie completely redundant).
Therefore there will always be some input to a block cipher in feedback mode that is not compressible but whose output from the function will be fully compressible. In fact there are as many as there are IVs, which means as many as the block size allows: for AES with 128-bit blocks there are 2^128 incompressible messages that will give output blocks that are all the same value (fully compressible).
You can follow the logic down to show that at any point in the Ptext message it is possible to have the start of a string that provides compressible output after it is enciphered.
Finally, just to put the nail in the coffin as it were: what if your compression function does not work at a fixed block size, but at a variable block size, and also has an adaptive algorithm? What is the probability that it will compress a Ctext message from a Ptext message with redundancy?
I think my head just exploded after reading that... =)
Clive is the only smart one here, but it seems like the snake oil is alive and well.
So, has the site been hacked, or does this chap have a sense of humor?
"The Snake Oil prevents clear code and other attacks by producing fake code to trick hackers into false sense of security and from the reviews over the years, my Snake Oil is working like a charm. Although I'm still in the doghouse, I know my next release will be the cat's mellow!"
Is there a dedicated Doghouse blog/site? If not, there should be...
"Is there a dedicated Doghouse blog/site? If not, there should be..."
I don't know of one.
Rob Kendrick is emailing everyone looking for a hacker to mess this guy's site up. I don't know about you Bruce, but Rob sounds like a trouble maker.
His site appears hacked, but not maliciously so.
``Then you have to ask yourself a question: is it possible to put an incompressible Ptext into a cipher function and obtain a Ctext output with compressible redundancy?
The answer is yes. Think back to the feedback system above using an IV of all zeros: what happens if you carry on encrypting all zeros?''
BTW, in this case he means p_n=0 for all n. At first I couldn't tell if he meant this or if he meant that the plaintext was chosen so that the input to the block cipher was 0 each time, but that makes even less sense when you work it out, since the output is always a constant, so the ciphertext is either a repeated (constant) block or 0.
``You should (assuming it's working properly) get an output where the encryption function steps through every possible output in some apparently random order (actually dependent on the key). As there are no repeats of the output blocks there is no redundancy at the block size, so it is incompressible at that level.''
c_1 = e(IV), c_n = e(c_(n-1))
In any mode that uses feedback (CBC, CFB, OFB). Is that really guaranteed to step through every value? If so, it must rely on some property of the block cipher that I'm not thinking of. It seems to me that it could just as easily slip into a short cycle; if e(IV) = x and e(x) = IV, those two blocks repeat over and over. What you're describing sounds a lot like CTR mode, which doesn't use feedback at all. What makes you think iterating the block cipher forms a permutation?
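The short-cycle worry is legitimate for an arbitrary permutation: iterating from a fixed start lands you on whatever cycle contains that point, and for a uniformly random permutation that cycle's length is uniform on 1..N, so short cycles do occur. A quick experiment on small random permutations (a stand-in for the iterated block cipher, not a claim about any particular cipher's cycle structure):

```python
import random

def cycle_length(perm, start):
    """Length of the cycle of the permutation `perm` containing `start`."""
    x, n = perm[start], 1
    while x != start:
        x = perm[x]
        n += 1
    return n

random.seed(1)
N = 2**12
lengths = []
for _ in range(50):
    perm = list(range(N))
    random.shuffle(perm)             # a uniformly random permutation of 0..N-1
    lengths.append(cycle_length(perm, 0))
# The observed cycle lengths vary wildly between 1 and N: nothing forces the
# iteration to step through every value before repeating.
```

So "steps through every possible output" would indeed need some special property of the cipher (as in a maximal-cycle construction); a generic permutation gives no such guarantee.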
``Then, as the encryption function is invertible (ie you can decrypt it), you can simply put the non-compressed output back into the encryption block and get an output without information (ie completely redundant)."
What's an "encryption block"? The block cipher? I'm lost.
In any case, redundancy doesn't go away when you encrypt; it just gets obscured. I assume that's what he is trying to say. If my stream is the concatenation of the Bible encrypted with a counter as the key, you may have difficulty detecting that it is redundant, unless you happen to try decrypting it with a counter.
And compression algorithms test for _one kind_ of redundancy. In the case of n-byte sliding window algorithms, you just make a repeating sequence with length n+1 bytes and no internal repetition, and the compression function won't find it. Similar redundant inputs can be constructed for each individual compression algorithm. So by all means use them to check your sanity, but don't expect them to find every kind of redundancy, and don't expect them to only yield smaller outputs; that is impossible, by the pigeon-hole principle. What makes them useful is that they detect a certain kind of redundancy that's present in the data sets you are concerned with.
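That sliding-window blind spot is easy to show with zlib, whose DEFLATE format limits match distances to 32 KB: repeating a block longer than the window defeats the match finder entirely, while the same amount of redundancy at a shorter period compresses almost completely. (A quick illustration of the principle, not a statement about any particular product.)

```python
import random
import zlib

random.seed(0)
WINDOW = 32 * 1024  # DEFLATE's maximum back-reference distance

short_block = bytes(random.randrange(256) for _ in range(1024))
long_block = bytes(random.randrange(256) for _ in range(WINDOW + 1024))

short_period = short_block * 80   # period 1 KB: every repeat fits in the window
long_period = long_block * 3      # period > 32 KB: repeats are out of reach

r_short = len(zlib.compress(short_period)) / len(short_period)
r_long = len(zlib.compress(long_period)) / len(long_period)
# r_short is tiny (the repeats are found); r_long stays near 1.0 even though
# the input is just as redundant -- that redundancy is invisible to the window.
```

Both inputs are equally redundant in the information-theoretic sense; only one of them looks redundant to this particular detector.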
Also, most compression routines create PREDICTABLE bits and LARGER outputs up to some break-even point. For example, in Huffman coding, the "enter a new symbol" bit is predictable on the first byte (if not assumed) and unless the second input byte is the same as the first, is predictable again (if plaintexts are randomly generated, this happens 255/256ths of the time).
Testing for redundancy is like testing for predictable patterns in the output of a HWRNG; the statement "there are no predictable patterns in this output" is a universal statement (the negation of an existential quantifier) and so cannot be established rigorously by testing without testing every possible prediction algorithm. Usually, the negation of a statement is a universal (because hypotheses tend to be existential), so when people say "you can't prove a negative", they really mean "you can't prove a universal with individual (non-)existence evidence".
Actually, I _can_ prove negatives; I can prove the negation of "all elephants are dead" by finding a live one. If I say "measurable gravity exists everywhere", that's not a negative, but good luck trying to prove it with a gravimeter and a space ship.
Proof by contradiction is often the only hope for proving universals over infinite sets.
What does this all mean for crypto? It means that the claim "there is no shortcut for computing the inverse of this cipher that is more efficient than brute force" is not going to get proven. It means that you can't test for unpredictability in a RNG. It means that you can't test for uncompressability. That's what keeps it interesting.... ;-)
Bob Rider: If you'd like to provide me your mailing address and real name, you'll receive a nice letter from my solicitor.
I would like to make it absolutely clear that the only person I have emailed on the subject discussed on this page is the utilities author, and only asked for information on how his cipher worked, and why it was more secure than other methods - no more.
I did however also relay the URL to this page to a Debian-related channel.
Schneier.com is a personal website. Opinions expressed are not necessarily those of BT.