Comments

squid July 1, 2011 8:10 PM

People keep asking, “Why the squid?” It is a password retrieval tool, as explained in this summary of the story Johnny Mnemonic:

“When Johnny tells Molly the only way to retrieve the password is with a SQUID, she leads him to an amusement park to visit Jones, a retired “navy dolphin”. Jones’ previous job was to locate and then hack enemy mines using sensors implanted in his skull, including a SQUID. To keep them loyal, the Navy addicted all of their “war whales” to heroin, so Molly trades Jones some heroin in exchange for retrieving the password. Armed with the password, Johnny has Molly read it to him in order to enter the retrieval trance, recording everything that is returned.”

This squid data retrieval technique also explains how Bruce knows Alice and Bob’s shared secret.

Clive Robinson July 2, 2011 3:20 PM

From the article, what qualifies as the understatement of the week, if not the year:

“Sure, it might not be quite as cuddly as a panda…”

Hmm, by repute the giant squid has 60ft-long tentacles with suckers the size of dustbin (trash can) lids, hooks the size of your hand, and a beak large enough to tear holes in whales and wooden fishing vessels.

Yup, I’d say it’s not as cute as the panda from a distance, but close up… who cares? You would not want to get within cuddling range of either of them 😉

Roger July 2, 2011 7:56 PM

This is a terrible idea, proposed by people who are too close to their work to see the wood for the trees. Yes, people are fascinated by giant squid: but as hideous, terrifying monsters! Pandas, in contrast, are cute.

An additional difficulty is that they are not known to be endangered. They may be, but there is no evidence of it; indeed, sightings have actually increased. Whatever the true conservation status, there are definitely much clearer-cut examples of endangered species.

They should probably get an agency to develop their icon, but I would recommend monk seals: cute (or at least as cute as marine animals get), critically endangered, and their conservation status is definitely a result of human activities.

Clive Robinson July 3, 2011 4:21 PM

@ Doug,

The Ars article sidesteps the major problem, and that is “reliable measurands”.

Although security is best thought of from the management perspective as a “Quality Process”, it has a tangible-vs-intangible issue.

Quality systems started from a need on the factory floor: tangible raw goods go through a manufacturing process to produce products. At every stage, an item or part goes through a process that has reliable measurands.

These measurands can be put through reliable transformations as they work their way up the management stack to the walnut corridor, where they are presented in a manner the execs understand (i.e. in their language and work domain).

Not so security. Firstly, software is in effect a one-off product that gets replicated n thousand times; as such it is like a watercolour original with thousands of prints. It is inherently intangible and currently has no measurands that are meaningful in security terms.

In reality the only part of the process that can be measured is the duplication or copying of the product that goes out to customers, just like controlling the print process for duplicating the watercolour.

This means that we have to take a step back and look at how quality measures affect the design process, not the manufacturing process. And this is a real issue, because software design is very much more art than science. Thus we have to look at the history of other industries, such as early steam engines, to see how their design was moved, with the help of rational thought, science and law, from art to artisan to engineering.

If you think about the history of engineering, you can best see where software is by the example of the wheelwright making wheels for coaches and waggons. After the initial discovery, in unrecorded history, that round objects roll more easily than other easily made shapes, not a lot really happened to the basic design for hundreds of years. Each improvement was the result of a minimal change and trial by customer.

In Roman and medieval times waggon wheels were constructed of planks of wood cross-placed and bolted or nailed together to form layers that were then cut into an approximately round shape around a central hole, with additional cross bracing. Sturdy, inefficient and with little or no give, they could not be used at more than walking speed. Egyptian and Roman chariots etc. had the sort of spoked wheels that we think of today, but more than a cursory glance shows that the wheel was still rigid in construction. It was not until much later that it was found that putting an angle on the spokes made the wheel go faster and break less. The secret of the angle was passed down from master to apprentice as a “pattern” that worked, without any reason behind it.

Significant change in wheel design did not happen until wire spokes and vulcanised rubber came along in the nineteenth century, and it was not until the late nineteenth and early twentieth centuries that science could explain why the “patterns” worked or how to improve upon them.

If you think about it, software is in the early artisan “pattern” stage of development. The science of software is really about algorithm comparison, where there are the meaningful measures of time and resources, which can be repeatedly and reliably measured.
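As a minimal sketch of what that kind of repeatable measurement looks like (assuming Python and its standard timeit module; the two search functions are purely illustrative, not anything from the article):

```python
import timeit

def linear_search(data, target):
    """O(n): scan every element until the target is found."""
    for i, x in enumerate(data):
        if x == target:
            return i
    return -1

def binary_search(data, target):
    """O(log n): assumes the input list is sorted."""
    lo, hi = 0, len(data)
    while lo < hi:
        mid = (lo + hi) // 2
        if data[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return lo if lo < len(data) and data[lo] == target else -1

data = list(range(100_000))  # the same sorted input for both algorithms
target = 99_999              # worst case for the linear scan

# timeit gives a repeatable measurand: wall-clock time over many runs.
for fn in (linear_search, binary_search):
    t = timeit.timeit(lambda: fn(data, target), number=100)
    print(f"{fn.__name__}: {t:.4f} s for 100 runs")
```

The point is not the particular numbers, but that anyone can repeat the measurement and get roughly the same answer; there is no equivalent measurand for how secure the same code is.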

Not so security, where the only measure you generally get to hear about is the meaningless number of spam or virus detections at some point, trivially compared to some other measure meaningless to security, such as the number of inbound emails.

Currently we talk about “best practice”, which is really a game of “follow the leader” where the leader is the winner of a crap-shoot contest. They can’t tell you why the dice rolled in their favour, only that they did. So we employ “magic umbrella” thinking: it never rains when I carry my umbrella, but it occasionally does when I don’t, therefore my magic umbrella must be what stops it raining…

Thus legislating on carrying magic umbrellas might not be a good idea, but that does not stop people trying…

Have a look at the recently released update to the 2005 e-banking guidelines for banking security examiners,

http://www.fdic.gov/news/news/press/2011/pr11111a.pdf

and tell me how many “magic umbrellas” you can spot?

Do you actually see anything in there that can be measured, appraised, and tested in a reliable fashion?

The one thing you can almost guarantee is that a lot of politicos are going to get fat on the largesse of an industry that wants any kind of legislation like a hole in the head. What will result will be just like Sarb-Ox: a bunch of hoops to make the institutional dog jump through, but little else. And the hoops will be designed such that only the big dogs will be able to jump through them easily.

AC2 July 4, 2011 12:07 AM

@Clive

Completely agreed on security by design, but I’m not sure we will develop this from art to engineering anytime soon…

The problem, I think, is that there are no longer any systems that are built from the ground up. For reasons of economics, every computer system built nowadays is a series of black boxes or layers with certain defined interface protocols.

Trust between these systems is implicit; e.g. I have to trust the Oracle DB that I use for data storage even though no one outside Oracle (and precious few within Oracle) has seen the source code and done a security review. I could say, OK, I’ll build a layer of security around this untrusted component, e.g. by building a layer that will encrypt all data I send to the DB/network and a similar decrypting layer, but that just moves the problem.
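As a minimal sketch of what such an application-side encrypting layer might look like (assuming Python and the third-party cryptography package; the field being stored is purely illustrative), it also shows why the problem only moves: the key now has to be stored and trusted somewhere instead of the DB.

```python
from cryptography.fernet import Fernet  # symmetric, authenticated encryption

# The key has to live somewhere the application trusts; the trust problem
# has moved from the DB vendor to wherever this key is kept.
key = Fernet.generate_key()
fernet = Fernet(key)

def encrypt_field(plaintext: str) -> bytes:
    """Encrypt a value before handing it to the untrusted DB/network layer."""
    return fernet.encrypt(plaintext.encode("utf-8"))

def decrypt_field(ciphertext: bytes) -> str:
    """Decrypt a value read back from the untrusted store."""
    return fernet.decrypt(ciphertext).decode("utf-8")

# Illustrative use: the database only ever sees ciphertext.
stored = encrypt_field("customer account number 12345")
assert decrypt_field(stored) == "customer account number 12345"
```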

The same holds true for the OS, the firmware and the microcode, and in some cases the third-party application code as well.

Given this I don’t see how we can ever get to a point where someone can step up, look at the whole stack and make a meaningful measure of the security it provides.

It’s going to remain based on the number of security incidents observed.

Clive Robinson July 4, 2011 2:09 AM

@ AC2,

“Given this I don’t see how we can ever get to a point where someone can step up, look at the whole stack and make a meaningful measure of the security it provides.”

Which is, with the current “state of the art”, a statement I agree with.

Which in effect means that when you throw the switch to turn it all on, you don’t know if it’s going to blow up in your face now or later, and if so why… (which is the same state as that of early steam engine and boiler making).

The result is that we know people are going to get hurt (although having goods stolen in your name is not as dramatic as having bits of you torn and mutilated by exploding machinery), which will eventually cause an outcry from those hurt of sufficient size that governments will have to act.

The difference between the time of early boiler making and now is that we are not dealing with tangible objects but with intangible information and communications, allowing jurisdiction hopping and outsourcing, whereby organisations can externalise the risk to a third party and thus obfuscate or avoid responsibility.

Which means that “best practice” will become the “shield of defense”, as there is no measurand to say what is secure and what is not. Thus all eggs will end up in the same untested basket made of straw. And as litigation will drive organisations that way, the real shield will be a “due diligence” document/audit.

And this is the problem: litigation almost always starts a race to the bottom.

@ Doug,

On re-reading my reply to you, the last part comes across a bit strong and could possibly be seen as an attack on you. It was not meant to be so, just my frustration with the whole system coming through.

@ All (US citizens and friends)

I hope your “4th of July” celebrations went OK over the weekend and that not too many of you are nursing that “never again” feeling this morning.

As the old saying has it, “The secret to life is moderation, but… don’t overdo it where it matters.”

Richard Steven Hack July 4, 2011 5:55 AM

Clive: “The secret to life is moderation, but… don’t overdo it where it matters.”

Reminds me of a line in the Marvel comic Thanos, where my Main Man, Thanos (who was trying to do some good deeds after having been a supervillain for the last twenty-five or thirty years of Marvel history), had his robots dump one of his associates in the garbage bin on his spaceship, declaring, “One cannot allow oneself to be carried completely away with this doing-the-right-thing fatuousness. There must always be limits.” 🙂

With regard to software development, I grew up on this in the late ’70s and early ’80s, when “software engineering” was considered a real, achievable goal. Most of it was sort of ad hoc “best practices”, but there was some attempt to develop metrics and reliability approaches on semi-logical bases; people like Jean-Dominique Warnier, Michael A. Jackson, and Ken Orr worked on development methodologies which had a high probability of producing “correct” programs because they were based on a logical system of what all programs had to do to function correctly. It wasn’t necessarily “formal”, but it was more so than other systems like “structured programming”, which were based on more ad hoc observed “best practices”.

In my view, “engineering” is simply the taking of materials with known properties and the applying of known transformations to those materials to produce another material, construct, or effect, also with known properties.

The key is “known properties”. The problem with software is always in the specification. It never catches all the properties – especially those such as usability, reliability and security. Most of the time it doesn’t even touch on efficiency or even effectiveness, which one would expect would be high on the list for commercial software.

All the development methodologies in recent years – “agile programming” and the like – shied away from this more formal approach toward more “ad hoc” methods that were intended to speed up programming and reduce bugs by just throwing more eyeballs and bodies at the problem, or “iterating” repeatedly until the bugs sort of “fell out” of the process.

We see how well that worked out every day in Windows and open source software. As Woody Allen once said, “Nothing works and nobody cares.”

Now everything we use is based on a pile of sand. There’s no denying that sooner or later someone is going to have to go back to basics and design a true software engineering system of software tools that can be used to both effectively and efficiently produce provably “correct” software against a number of required metrics such as reliability, usability, security, etc.

That system will probably have to use AI techniques and be heavily computer-aided because humans just aren’t good at remembering things or checking that all the i’s are dotted and the t’s crossed. Only computers do that well.

Without that, nothing will change. Software will continue to be unusable, unreliable, inefficient, insecure and just plain crap.

As I always say: Windows is CRAP. Linux is ALSO CRAP. BUT Linux is FREE CRAP.

That basically could be extended to “All software is CRAP, free or not.” Every new version of most Linux distros is less reliable and efficient than the last as Linux follows Windows down the path of bloat, inefficiency, insecurity and just plain craptitude.

This is why I hate Microsoft. With their money, they could be plowing ahead in the development of software engineering. Instead, they’re one of the biggest obstacles TO software engineering courtesy of the corporate culture fostered by the greed and myopia of Bill Gates and Steve Ballmer.

Richard Steven Hack July 4, 2011 7:36 AM

We can only hope it becomes a handicap to Microsoft. But even then, there’s no reason to believe they will recover and become useful in developing software engineering to a usable level.

More likely, they will either get worse or just collapse, to be replaced by some other company – such as Apple – that does little better.

Someone is going to have to develop a better software engineering process, then use that process to produce software which is so much better than existing products as to drive it and the producers of it to dominate the market.

It could be done. It’s not clear that it must or will be done.

Clive Robinson July 4, 2011 8:57 AM

@ Vles,

“As a Dutch person, I would like to answer ‘Law of the handicap of the head start'”

Agh ha, another “leading edge is the bleeding edge” observation.

I know from long experience that it is easier to “accomplish the known” than it is to “make the unknown possible”.

There are several factors behind this.

Firstly, unless an individual is financially independent beyond the norm, they will need the help of others (investors, workers, etc.) to make a vision reality. Once the vision does become reality it is often stolen from the person with the vision, either directly (see the development of the first metal-type printing press) or indirectly by a number of means. One of the latter is that others now know it works and thus can attract investors more easily.

Secondly, once a technology is established it starts making a return in some way. It is only when the older technology becomes grossly inefficient compared to the new (the separate condenser on steam engines) or is far too dangerous (the Davy miners’ lamp) that it gets replaced. There are a number of reasons for this: the first is that a vested interest fights the newcomer, initially from a position of strength; the second is “sunk investment”; the third is installation cost.

If, as in the gas-vs-electric light issue, you have a “virgin site” and you have decided to make the investment, and thus to spend on the installation cost, you will almost invariably go for the system that offers the lowest long-term operating costs or is safer.

However, if you already have a system in place, those that own and operate it have a vested interest, not least in that they will become redundant (see the definitions of the words Luddite and sabotage), and thus will actively oppose a change. Also, it is a fairly easy argument to say that replacing the system will cost ten times the annual cost of the current system. Thus a small annual cost/profit is seen as far better than a large sunk cost now compared to a marginal improvement in profit some ten years hence.

Then there is “the Law of the Known”, which is that people have a naturally conservative outlook. In London the water supply was originally “piped” in wooden logs with a hole bored down the middle. Fired china clay pipes became available shortly thereafter, and metal pipes were known of. But competing organisations carried on using wooden log pipes simply because they were a known technology, and those experienced in the use of wood were on hand.

There are some other reasons that explain it but I hope that helps.

mcb July 5, 2011 11:22 AM

@ Clive Robinson and Roger

Precisely so!

Our esteemed host’s fascination with them notwithstanding, squids are fascinating because they are alien, like manta rays, sea cucumbers, and the Portuguese Man of War. We left that neighborhood so long ago we’re almost no longer from there.

Dolphins or whales are so much cuddlier…

Mammals Unite!

Clive Robinson July 5, 2011 1:31 PM

OFF Topic

@ Bruce,

I suspect you may be on the lookout for security stories with a “what were they thinking” angle.

If not this still might amuse,

http://www.theregister.co.uk/2011/07/05/suitcase_jailbreak/

However, on a more serious note, the article does not say why “Mr Flexibility” was not discovered missing before his girlfriend even left the building…
