Friday Squid Blogging: Squid's Beard

It's an acoustic bluegrass band.

As usual, you can also use this squid post to talk about the security stories in the news that I haven't covered.

Posted on February 10, 2012 at 4:04 PM • 39 Comments


Clive Robinson • February 10, 2012 5:04 PM

Hmm, "Bluegrass". I used to sing a bit of that many moons ago in a little pub in South West London.

My "other half", on seeing the words "Squid's Beard", had the insensitivity to remark that my beard looks as long as a "Red Devil's" (Humboldt squid) tentacles, and just as nasty...

Which I'm expected to take as "good advice". However, if I were to, say, remark that her "thatch looked like a birds nest of barbed wire", you can guess where I'd be sleeping tonight and for the foreseeable future ;)

This "equality" stuff is very one way...

htmlfail • February 10, 2012 9:38 PM

A new research paper gives a detailed analysis of malicious protocols and infrastructure used by the attackers behind the 2011 hack on SK Communications (during which the personal details of over 35 million people were stolen). The findings are used to demonstrate strong links between the SK Communications hack, the recent Sykipot DoD attacks, the RSA intrusion, and the Night Dragon series of energy sector attacks. This appears to be the first time that anyone has been able to publicly prove the existence of a globally coordinated campaign of cyber attacks likely sponsored by a nation state.

Daniel • February 10, 2012 10:48 PM

I'm a huge fan of The Civil Wars and the duo's Joy Williams recently gave an interview where she talks about the duo's deliberate use of on-line music piracy as a marketing tool.

The money quote (pun intended) is "In short, we try not to be curmudgeons about piracy, but to embrace this shift in culture."

Check them out at the Grammy Awards on Sunday (shameless plug).

MarkH • February 11, 2012 3:09 AM

I think this blog has had at least one post related to securing laptops that might be subject to (among other intrusions) US customs inspection, and others about attacks against mobile phones. Here's a NY Times piece about traveling abroad to countries that practice intensive economic espionage against the electronic gadgets of some foreign visitors:

Though the specific threat is different, the rather severe security practices described in the article certainly overlap some that have been discussed here.

The article also discusses threats against systems in the home country, including this interesting quote:

“In looking at computer systems of consequence — in government, Congress, at the Department of Defense, aerospace, companies with valuable trade secrets — we’ve not examined one yet that has not been infected by an advanced persistent threat.”

Food for thought...

Clive Robinson • February 11, 2012 4:47 AM

@ MarkH,

Though the specific threat is different, the rather severe security practices described in the article certainly overlap some that have been discussed here.

Sadly, apart from the "travel naked" advice, the rest of it is not sufficient for most mortals due to the "human problem"...

That is, there will always be an emergency of some type that will cause most humans to break the rules, and then bingo: "the enemy is within".

And if you are a target and fail to respond to the "emergency" in the way they want, then you will get pulled for "walking on the cracks in the street" or equivalent, and it will escalate in one way or another till they get what they want.

For instance, there are many nations you cannot enter or travel through with encrypted data, even if you stay "air side", and you can go to their prison (and most are a lot worse than you will find in the US) for as long as they wish...

And the problem is if they look at your phone or laptop or USB key, even if it is brand new, you can be sure that there will be a file that you either don't know about or they will claim contains encrypted data, and you can't prove otherwise (look up "bible codes" to see why).

But the real joke of it is the comment from an FBI droid about how all the US has left is R&D.

But he then fails to make the connection that the most valuable information is on academics' and engineers' electronic devices when they go to conferences, AND that the Bush-instigated behaviour of the likes of the TSA stops engineers and academics coming to conferences in the US. So conferences are better attended outside of the US, causing US academics and engineers to travel abroad for the conferences "that matter"...

It is something that is not lost on some US academic institutions, as some of the commenters to the NYT article you link to are only too aware, and one gave a nice link,

That has one or two other useful links.

But even if you don't take the data with you to conferences or "meat space" meetings the Chinese amongst others have another way.

If a nation has a natural resource that is desirable to other countries, be it cheap labour or difficult-to-obtain mineral resources, then, as has been seen with China amongst others, they can offer short-term, short-sighted execs a deal whereby the execs issue a "make it so" order and the company then moves the entire manufacturing operation, along with all the trade secrets, across to China...

One of the reasons we all buy Far East manufactured TVs is that, back in the very early days of colour television, the Japanese made the CRTs far cheaper than they could be made in the US or Europe. And the European and US TV manufacturers bought them from Japan. The US and European manufacturers of CRTs could not compete and so stopped manufacturing, and in many cases even stopped doing research. So ten years down the line, not only was there no CRT manufacturing in the US and Europe, they could not manufacture the new-style CRTs even if they wanted to, because Japanese companies had patented all the new techniques...

It's not lost on quite a few people that the same issue is happening with micro chip manufacturing and just about all forms of telecommunications manufacturing. The question is what do the US and Europe do to reverse the trend...

Oh, and for those with "imperialistic intentions", just remember why the Indian national flag has a "spinning wheel" in it...

A blog reader • February 11, 2012 9:03 AM

At FreeRangeKids, Lenore Skenazy talked about the meaning of a failed stranger abduction attempt at a Walmart store. Among other things, the ability for a child to make a scene and fight off an attacker provides protection against attacks from strangers, and (perhaps more importantly) attacks from persons who are known to the child. (In addition, there is the effect of video footage.)

Clive Robinson • February 11, 2012 9:42 AM

@ Octopus

It's tentacles, you sucker

Not the way she says it... it's more like "Ten Tickles" (as in 'whatever tickles your fancy'), but with her Scottish accent the "les" sounds like "als" (and yes, before you ask, she does roll her "R's" a lot... ;)

Anon Bruce Poster • February 11, 2012 10:34 AM

I've been thinking about encryption and my right not to incriminate myself lately. (Who hasn't?!?)

Suppose I had an encryption program that took two passwords. Decrypting my hard disk with the "real" password would produce documents that revealed my desire to replace the government of America with an elite corps of six-foot tall blonde Amazons. However, using the "fake" password would produce documents that reveal a collection of public domain God Bless America videos and a massive cache of apple pie recipes.

1) Has Bruce already addressed this? 2) Would this kind of system weaken the strength of the encryption? 3) Could forensic analysts tell the difference?

Thanks for the consideration. Please note the above example was an extreme illustration and I have no desire to forcefully replace the government of America with an elite corps of six-foot tall blonde Amazons.
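The two-password idea can be illustrated with a deliberately unrealistic toy: with a one-time pad, the same ciphertext can be opened to any same-length plaintext by choosing a second key after the fact. This is only the intuition behind deniability; real tools such as TrueCrypt's hidden volumes work very differently, and the message strings below are of course made up:

```python
import os

def otp_encrypt(plaintext):
    """Encrypt with a random pad the same length as the message."""
    key = os.urandom(len(plaintext))
    ct = bytes(p ^ k for p, k in zip(plaintext, key))
    return ct, key

def otp_decrypt(ciphertext, key):
    return bytes(c ^ k for c, k in zip(ciphertext, key))

def duress_key(ciphertext, fake_plaintext):
    """Derive a second key that opens the same ciphertext to a decoy."""
    return bytes(c ^ f for c, f in zip(ciphertext, fake_plaintext))

ct, real_key = otp_encrypt(b"amazon plan")
fake_key = duress_key(ct, b"apple pie!!")   # decoy must be the same length

assert otp_decrypt(ct, real_key) == b"amazon plan"
assert otp_decrypt(ct, fake_key) == b"apple pie!!"
```

Note the catch Clive raises below: practical ciphers are not one-time pads, so this perfect "any plaintext you like" property does not carry over, and real deniable containers must instead hide a second volume inside apparently unused space.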

MarkH • February 12, 2012 3:08 AM

Another NY Times piece on security measures recommended for actual practice — to those in the USA who have decided to leak classified information to the press (leak prosecutions under the Obama administration reportedly exceed the total of all those that occurred previously):

The recommendations will probably seem obvious to regular readers of this blog, but still they can be costly in practice. Interesting quote: “For God’s sake, get off of e-mail, get off of your cellphone. Watch your credit cards. Watch your plane tickets. These guys in the N.S.A. know everything.”

Most people still probably have no idea that their mobile phones are constantly disclosing their location.

Clive Robinson • February 12, 2012 3:22 AM

@ Anon Bruce Poster,

You asked,

Suppose I had an encryption program that took two passwords. Decrypting my hard disk with the "real" password would produce documents that revealed my desire to... However, using the "fake" password would produce documents that reveal...

As Andrew noted, there are programs that appear to do this, but...

1, You would have to put both sets of files in,
2, Many investigators would know how the encryption software you are using can work this way.

There are several consequences of point 1, but the three main ones are: firstly, it is a lot of extra work for you to keep the "fake container" looking real (and trust me, it is hard work). Secondly, you must also make sure you do not "cross contaminate" the fake container with "real" information, which is almost guaranteed to happen more than once with regular use. And thirdly, a detectable reduction in storage space.

The problem with point 2 is that if the investigating officers get your encrypted drive, it won't take them too long to work out, one way or another, that you've been a "naughty boy". As has been seen in the US, LEO-friendly judges are quite happy to strip you of your rights and incarcerate you until you do tell. Remember, at least one man spent 14 years locked up for "contempt" because the judge decided with no evidence (only the word of his ex-wife) that he was lying about his finances... so you don't have to be guilty of anything other than a judge's suspicion (which makes them very partial, which is wrong, but that is another matter).

There are other ways to do it but all have similar problems.

Jon • February 12, 2012 3:17 PM

@ Clive Robinson:

Again, this is why everyone needs large blocks of completely random data lying around on their hard drives. Well-encrypted data is (nearly) indistinguishable from random. And having lots of random data around leads to plausible deniability: "It's random, sir. There is no password, no encryption key, no nothing."


Clive Robinson • February 12, 2012 5:21 PM

@ Jon,

Well-encrypted data is (nearly) indistinguishable from random

It should be, but... sadly the majority of HD crypto software written does not produce "well-encrypted data", even with just a single user.

The reality is that it depends not so much on the crypto algorithm used as on the crypto mode you use the algorithm in, and one or two other things. And most of the best modes are encumbered by patents at present.

But to do HD encryption properly you have to use multilevel encryption based on which ADT (abstract data type) container you are working in.

So you need reliable crypto at the lowest HD level; that is, the LBA (logical block address) provides the key at this level.

You then need encryption at the partition or crypto container level.

So far this is all relatively straightforward; however, at the next layers up it gets awkward. Whilst the user can have their own key for the file contents, the file metadata has to be shared, but also kept hidden from other users.

That is, each user should only be able to see the files that they have permission to see; more importantly, they should not be able to see the metadata for the files they are not supposed to see; and, harder still, ideally they should not even be able to tell how many other files there are belonging to other users.

This is a hard but not insoluble problem, and it needs to be done at either the OS kernel level or at the disk-driver level, where the knowledge of the file system layout for file and user metadata is known.

The downside of course is the number of relationships between users; that is, no user can be assumed to be fully independent of the others, as sometimes files need to be shared. One way to do this is to use a separate key for each relationship.

For just the simple 1-to-1 relationships, the number of additional relationship keys would be 0.5(N^2-N), and these additional shared keys need to be securely managed.

But there is a problem: the actual number of keys to be managed is the individual user keys plus the shared keys, which gives 0.5(N^2+N) keys with only 1-to-1 relationships, and it gets a lot worse with groups. And even with only 1-to-1 relationships the number of keys quickly becomes unmanageable. That is, for five users you would have 15 keys, but ten users need 55 keys, and a hundred users 5050 keys.
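The key-count arithmetic above is just the "handshake" combinatorics, and is easy to check with a few lines (a sketch of the counting, nothing more):

```python
def shared_keys(n):
    """Keys for 1-to-1 relationships: C(n, 2) = 0.5*(n^2 - n)."""
    return n * (n - 1) // 2

def total_keys(n):
    """One key per user plus the shared keys: 0.5*(n^2 + n)."""
    return n + shared_keys(n)

for n in (5, 10, 100):
    print(n, total_keys(n))   # 5 -> 15, 10 -> 55, 100 -> 5050
```

The quadratic growth is the whole point: doubling the user count roughly quadruples the keys to manage, which is why the per-relationship approach collapses.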

Thus you need to go about things in a different way. The simplest way is to have the file stored on the HD under the "owner's key" and for the OS to decrypt as required for the other users. However, this means that the OS has to have access to all the user keys at all times, which is very bad key management practice.

And as many cryptographers have indicated, key management is a very hard problem, and one we have hardly started in on.

sabik • February 12, 2012 5:21 PM

Another problem with deniable encryption (or truly random files) and rubber-hose cryptanalysis is that Alice then has no way to convince Eve to stop hitting her with the rubber hose because there are no (more) secrets to be divulged.

"It's random, sir" may still get you locked up for contempt if you have no way of proving it.

Woofle • February 12, 2012 5:23 PM

@Clive Robinson

"...but the three main ones are ... And thirdly, a detectable reduction in storage space."

A small point, and to indulge my insufferable pedantry:
Last time I read the TrueCrypt manual (and it was a while ago) the "hidden volume" was not detected by TC when you were working in the "safe" volume. There were specific warnings to be very careful to leave enough free space to ensure that safe data did not overwrite the "hidden" volume. So they got that much right at least.

But the rest stands as you've written. As the paper linked to makes clear, "deniable encryption" is very hard, if not actually impossible.


Woofle • February 12, 2012 5:26 PM

@Clive Robinson

Er - maybe you meant that the "random" data at the end of the volume would give the game away, which it certainly would if the unused portion of a TC volume is not random gibberish when decrypted. Only just thought of that...

(There are so many ways that information leaks.....)

Tired and confused,

Daniel • February 12, 2012 8:54 PM

There is an important distinction between the leakage of data and the interpretation of that data. If an interrogator is behaving irrationally there is nothing that crypto can do to solve that problem. 'Rubber hose' crypto is a legal problem; it's not a math problem.

Math cannot solve all the problems in the universe. It cannot even solve most national security issues. Asking crypto to solve problems that it was never intended to solve is inapposite.

Clive Robinson • February 13, 2012 4:39 AM

@ Daniel,

Math cannot solve all the problems in the universe. It cannot even solve most national security issues. Asking crypto to solve problems that it was never intended to solve is inapposite

Whilst true on the face of it, we "walking talking monkeys" are a mischievous lot, and like the "curious feline" we often find that our curiosity about what is presented as an impossible problem makes it not impossible from a different viewpoint (Arthur C. Clarke had an apposite comment about old and venerated scientists).

With that in mind let's look at your first paragraph,

There is an important distinction between the leakage of data and the interpretation of that data.

Yes, and as such it's like gambling on horses. Finding the correct interpretation is the key to placing a bet with a higher than average probability of success (and before betting tax, people used to earn an income from betting on horses).

Mathematics can help a lot in this regard, which is why the NSA, GCHQ et al invest heavily in mathematicians and engineers with certain skills, as it helps lift the signal from the noise. Which brings us nicely to,

If an interrogator is behaving irrationally there is nothing that crypto can do to solve that problem.

Yes and no; to see why, take a big step back and ask why the interrogator is behaving irrationally from your viewpoint and not from theirs.

Basically you are trying to tell them that a collection of bytes that could be just random data or encrypted data is "not encrypted data". What's your back/cover story for having "random data", as opposed to "encrypted data", on your hard drive?

For the majority of people it's far from normal to have random data on their memory devices. So to the interrogator it's a big red battle flag being waved from a hilltop for all to see if they choose to look... That is, the random data story is like a bucket with no bottom: it does not hold water.

So even with a good back/cover story, the balance of probability comes down very, very heavily on the side that says you are not telling the interrogator the truth. So their behaviour is far from irrational, whilst yours is.

But the interrogator can go several steps on from that, because the simple truth is that encrypted data gives itself away. It is either "too random" or provides "tells" when compared to the random data created by "natural processes".

So that big red flag is now accompanied by a Highland Pipe Marching Band giving full vent to a battle anthem.

This happens because all natural processes have bias of one form or another that can be detected. Data that does not have bias is "too random" and thus highly suspect. But the bias of natural processes has certain characteristics which are markedly different from the "tells" of the incorrect use of encryption. These same "tells" often have sufficient characteristics to identify the encryption software used...
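The "too random" observation can be sketched with a crude byte-frequency chi-squared statistic. This is only an illustration of the statistical idea, not forensic practice, and the sample strings are made up:

```python
import os
from collections import Counter

def chi_squared(data):
    """Chi-squared statistic of byte frequencies against a uniform model."""
    expected = len(data) / 256
    counts = Counter(data)
    return sum((counts.get(b, 0) - expected) ** 2 / expected
               for b in range(256))

natural = b"the quick brown fox jumps over the lazy dog " * 200
random_ish = os.urandom(len(natural))  # stand-in for well-encrypted data

# Natural text concentrates on a few byte values, giving a huge statistic;
# uniform random bytes give a value near 255 (the degrees of freedom).
print(chi_squared(natural) > 10 * chi_squared(random_ish))  # True
```

A real examiner has far better tools than this, but even a test this crude makes the point: a file with no detectable bias at all is itself a "tell".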

So mathematically the interrogator knows, with quite high probability, that you are lying no matter what your back/cover story is. So do you really want to ask who is behaving rationally or not, and push the interrogator to the next stage?

Which is why in the past it was better to accept the following facts,

1, You cannot hide encrypted data.
2, Having encrypted data is a "State Crime" in many parts of the world.
3, In many parts of the world "State Crime" is treated the same as "treason" and effectively has no limits on investigation or punishment if required.

And not encrypt any data, and thus not carry sensitive data in any form, nor have anything that even remotely looked like it might be encrypted data.

Whilst none of those points have changed, other things have, and laws now exist for the high-level protection of data, and the fun has started with the likes of HIPAA and Sarbanes-Oxley.

So over the past couple of years companies have come to quite routinely use FDE of various types for compliance reasons, and quite a few major applications will work with encrypted data files.

So when an interrogator asks you, why raise a red flag by trying to pretend it's "random data" instead of encrypted data?

It's better to tell the truth and have done with it, and explain that it's company policy for certain depts, and for those traveling off premises / abroad, to have "loaner laptops" set up that way.

However, there is a catch: as I said, there is a reasonably good chance that the interrogator, who has had the chance to examine the device, knows which encryption software has been used and thus knows its characteristics. And even TrueCrypt etc. are not up to the stage of allowing you to say "I don't know the key" in a fully believable way.

As I said further up the post, encryption has to be multilevel, working from the disk LBA upwards through the various ADT containers. And most importantly for deniability, it has to properly manage multiuser capability with file metadata, and this is an aspect most applications fail miserably on as well.

But importantly, both the HD-level encryption and the file-level encryption need good key management systems, whereby it is normal operation for a user "not to know or even be aware of encryption keys". There are several routes to this, one of which is secure tokens.

But we have a long way to go on this; however, when it gets to the point where all parts of the system are fully multiuser, then plausible deniability on keys becomes the norm, not the exception.

If it is normal for the system to only show the user "their files" and not the files of others then it would be hard to show that the user could have any knowledge of the files that they cannot see and are thus ordinarily unknown to them.

Think of it this way, when you as an ordinary user log into your company system as User A you don't see nor do you expect to see the files of other users. This is standard practice for any multiuser system or OS and even most judges would now accept this, unless you were a system administrator.

We have to move this multiuser concept onto the laptop and into the security tokens. Technically there is no reason why we cannot do it, and actually it would not be that difficult to do.

So yes, if done properly, mathematics can solve the deniability problem, because all you are trying to do is shift the viewpoint from "single-user device" to "multiuser system". And a "multiuser system" already has acceptable "deniability" built in as standard.

Finally we get onto the "$5 wrench" or torture issue,

'Rubber hose' crypto is a legal problem; it's not a math problem.

Actually no, it's not a legal problem either; the reality is that in most places, for soldiers and civilians alike, torture is prohibited by international conventions which the majority of countries have signed up to.

Torture is very much a human perception problem; it's just dressed up to look like a legal problem for the old "I was only following orders" excuse by politicians frightened by the specter of terrorism as drummed up by the various intelligence agencies.

You can see this by the way certain types of person are exempt from the international treaties. Historically, spies and traitors are not covered by the conventions and treaties, which is why you see so much effort in defining "enemy combatants" who don't wear uniforms into either the "spies" or "traitors" categories. This is because some people believe torture works, whilst many others don't believe it works or think it is immoral. Thus the believers have to try and hide the reality of their belief from the rest of society.

And so far the scant evidence is that in reality it obtains no more information than a simple face-to-face chat over a cup of tea with a skilled interrogator. And we have known this since before the Second World War, because the simple fact is that there are those who give up information and those who don't, and 'rubber hose' techniques have little or no effect on this.

Further, the quality of information from torture is usually considerably worse than that produced by skilled interrogation. This appears to be because torture is a feedback process, and the answers given by the victim are a direct consequence of the questions they are asked and the pain inflicted, and thus have no other correlation with reality.

Basically the torturer's questions give the victim direction, but the pain induces the victim to say anything to stop the pain at that point. Thus, as even vaguely related gibberish works to stop the pain at that point, that is what the victim will trot out, irrespective of the long-term consequences.

It does not appear to matter if the pain is physically or mentally induced; the result is the victim telling the torturer what they think the torturer wants to hear, irrespective of whether it is true or not (which is why we get so many false confessions).

In practice it appears that most torturers, by the nature of their questions, give away far more information to the victim than the victim gives back. So if the victim is sufficiently intelligent they can actually stay a couple of steps ahead in the process and "lead the enemy astray", by the time-honoured method of talking a lot whilst saying very little.

Or another way to look at it: for those who believe in torture, the only tool they can conceive of is a hammer, so to them all their problems look like nails to be beaten down out of sight. Oh, and their belief in the effectiveness of torture is most probably based on their own fears, not those of others.

echowit • February 13, 2012 9:01 AM

@Anon Bruce Poster

... and I have no desire to forcefully replace the government of America with an elite corps of six-foot tall blonde Amazons.

D**n!! Had me all excited there for a minute.

karrde • February 13, 2012 9:43 AM


That kind of observation works against the usual suspicion of the Prison-Industrial-Complex.

If they run out of inmates... then they begin closing prisons. Thus, I suspect that some of the forces which led to the increase in imprisonment in the US were social and cultural.

It is easy to show that the PI-Complex found a way to make money off of the increase in imprisonment rates.

But I doubt that they can be blamed as a cause of the increase. Though they have apparently done their best to keep the money-stream flowing after a decline in overall criminal behavior.

PS: the link to the Miami Herald, about superpredators, comes up as a 404.

Daniel • February 13, 2012 12:40 PM


Torture is very much a human perception problem; it's just dressed up to look like a legal problem

Yes, absolutely. This observation applies equally to your entire discussion above about random data. The whole issue boils down legally to what a judge or an interrogator believes, in Bain's definition of belief ("that upon which one is prepared to act"). However, people act on irrational foundations all the time. Even under the best of conditions we can only talk about it in terms of probability. Yet the whole field of mental heuristics teaches us that human beings do an awful job of estimating the objective probabilities in most situations, especially when it comes to evaluating the behavior of their fellow beings.

The key question is the one implied at the end of Bruce's post today: "I doubt that a judge would believe it." The question is why a judge, the great majority of whom have no technical training whatsoever, should get to decide the case in the first place. His perceptions of the truth are no more likely than anyone else's to be accurate.

All of which reinforces my central point, which is that these are not math problems we are discussing. And when it comes to what a judge should or should not believe in any situation, mathematicians and computer security experts can get in line behind legal scholars, psychologists, sociologists, etc.

Clive Robinson • February 13, 2012 1:56 PM

@ Daniel,

The question is why a judge, the great majority of whom have no technical training whatsoever, should get to decide the case in the first place. His perceptions of the truth are no more likely than anyone else's to be accurate.

The answer to this is that they were never intended to do this...

In a court there are two tribunals: that of truth (the jury) and that of justice (the judge). The tribunal of justice is supposed to be "an impartial advisor of the law to the tribunal of truth".

At one point in time the jury had the power to order the judge to carry out certain actions, including the dismissal of the case and the re-presenting of witnesses to be further cross-examined; they also had the right (and still do) to advise the judge on things they might have significant knowledge of as "expert opinion", though this role is now carried out by a clique of "expert witnesses".

The simple fact at the end of the day is that judges are paper shufflers; they only understand their rules and procedures relating to those pieces of paper. The facts of the case are presented as testimony, either as "controlled" pieces of paper or by verbal presentation from the witness stand. It does not matter what the physical, mathematical or other scientific investigation might show: if it is not presented correctly on a piece of paper for the docket, then it's not evidence.

Like any other bureaucrat, judges are effectively untouchable provided they stay within those rules and procedures. This is one of the major reasons why the US court system is broken for appeals etc.

A judge is supposed to ensure a trial is fair, and it is obvious to all who care to think about it that the rules and procedures are so complex that neither judge nor counsel can have an in-depth knowledge of them. Thus a properly organised legal team consists of many people reviewing all the paperwork line by line, word by word and grammatical mark by grammatical mark, looking for fault, wriggle room and nuance. Likewise they have others doing the same with the laws directly concerned, those coincidental to the laws concerned, and even hare-brained, off-the-wall, completely unrelated laws (where the laws concerned are both legislative and from previous court cases and judgments). It is a very vast body of knowledge; thus those with any skill can demand high prices. Which is why it appears to many that "the law can be bought", because few judges have the in-depth knowledge to counter a lot of what a well-paid legal team can come up with...

This imbalance of power has also become institutionalised, in that a poor defendant who cannot afford a legal team and gets a public defender who is overworked and under-resourced is easy meat for a public prosecutor with a smart mouth and a political career ahead of them.

And just to make it even easier for the prosecution there is "plea bargaining" and "state's witness". Whilst these might be advantageous for getting at heads of companies and crime bosses who might otherwise escape conviction, their usual use is to scare the living cr4p out of the defendant so that they either talk and convict themselves or opt to take a short-term sentence.

Thus the logical conclusion of many is that justice is for sale to the rich and smart, while the poor and feeble-minded go to jail irrespective of guilt or innocence...

karrde • February 14, 2012 12:37 PM

Interesting post about physical security and crime, seen on SlashDot:

Classical music used in an attempt to discourage the criminal element from congregating in public areas...

There is lots of discussion as to whether it works.

And there's discussion of why it might work (cultural? social? or is it a relative of the Broken Windows Policing effort?).

Clive Robinson • February 15, 2012 4:55 PM

OFF Topic:

It would appear that we just cannot get it right with random number generators (no surprise to me), and it would appear that there are a number of broken public key certificates that share one or the other of their P/Q primes because of it.

The easy to read NY Times article,

Or if you are using a mobile,

From the researchers' paper, slightly amusingly titled "Ron Was Wrong, Whit Is Right",

This is a concern because, if these owners find out, they may breach each other’s security. It pales, however, compared to the situation with RSA. Of 6.6 million distinct X.509 certificates and PGP keys (cf. [1]) containing RSA moduli, 0.27 million(4%) share their RSA modulus, often involving unrelated parties. Of 6.4 million distinct RSA moduli, 71052 (1.1%) occur more than once, some of them thousands of times. Duplication of keys is thus more frequent in our collection than in the one from [12]. This leads to the same concern as for ElGamal and DSA, but on a wider scale

You can get the PDF from

Or if your PDF view barfs on it try,

Now, as "diceware" gets mentioned so often when random number generation is discussed on this blog, you can read the diceware author's take on it at,

He thinks the research points to a possible signature of "malware" fritzing the calls to the random number generator.

Now this is far from a new idea; Adam Young and Moti Yung proposed this and other ways to fritz P/Q pairs back last century under their ideas for "Cryptovirology",

Whilst I would not rule it out (I've done similar things myself, as I've indicated in the past on this blog), it's more likely to be poor programming in some code library that software engineers are using virtually sight unseen.
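The shared-prime failure the paper describes comes down to a single gcd: if a weak RNG hands two unrelated keyholders the same prime, anyone who collects both public moduli can factor them instantly. A sketch with toy-sized numbers (real RSA primes are hundreds of digits, but gcd is just as fast):

```python
from math import gcd

p  = 1000003          # prime accidentally reused by a weak RNG
q1 = 1000033          # second prime of key 1
q2 = 1000037          # second prime of key 2
n1, n2 = p * q1, p * q2   # two supposedly independent RSA moduli

shared = gcd(n1, n2)      # recovers the common factor instantly
assert shared == p
assert (n1 // shared, n2 // shared) == (q1, q2)   # both keys factored
```

The researchers did essentially this across millions of collected certificates (using batch-gcd techniques to make the all-pairs comparison tractable), which is how the duplicated factors were found.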

Clive Robinson • February 15, 2012 5:24 PM

@ Ollie Jones, NSAstrikes, Christian,

Oops, I should have read all the thread before posting.

I've just noticed there are quite a few of us hitting the same story.

I guess we've covered most of it between us.

@ Moderator,

My apologies for the duplication.

