Hacking Attack Causes Physical Damage at German Steel Mill

This sort of thing is still very rare, but I fear it will become more common:

...hackers had struck an unnamed steel mill in Germany. They did so by manipulating and disrupting control systems to such a degree that a blast furnace could not be properly shut down, resulting in "massive" -- though unspecified -- damage.

Posted on January 8, 2015 at 3:11 PM • 59 Comments

Comments

moz • January 8, 2015 3:48 PM

Maybe it would be better if we could title these things something like:

"Failure to secure control systems causes...."

The only way to finally deal with these things is going to be to rebuild all of our IT software infrastructure. If customers don't understand their potential role, they will never demand secure systems.

The security community has been beaten down by the commercial arguments of the 1990s. It's time to start to reassert the responsibility of the software vendors.

Baz • January 8, 2015 3:59 PM

@Toonna - the original report only has a couple of paragraphs on this attack (section 3.3.1 on p. 31), and that SC Magazine report is about the same level of detail; you're not missing anything. The full report is only much longer because it includes a number of examples like this one and a lot of other waffle about the state of cybercrime in Germany. I did some digging around when the report came out, and couldn't find any reports of (for example) plant closures that could be the incident in question. Hard to tell what the impact or sophistication of the attack really was when the report is so vague.

Anura • January 8, 2015 4:05 PM

@Mark A. Hershberger

I'm not so sure this is a typical example of the internet of things (as in, let's hook this up to my network, because I can), so much as industrial systems need a way to control them and monitor them, and it makes a lot of business sense to put a web service on it rather than inventing your own method of controlling it (which also makes it easy to allow employees to work remotely or for you to contract out the monitoring to another company). Unfortunately, making things a lot easier on the developers and the businesses that deploy the technologies also makes things a lot easier for the attackers.

NoSuchAgency • January 8, 2015 4:34 PM

The only reason this happened is because someone made the stupid decision of connecting the equipment to a network that was connected to one or more other networks that were connected to the internet.

@Anura:

"industrial systems need a way to control them and monitor them, and it makes a lot of business sense to put a web service on it rather than inventing your own method of controlling it (which also makes it easy to allow employees to work remotely or for you to contract out the monitoring to another company)."

This mentality has to stop! Hire people INTERNALLY to do the monitoring! This obsession with "outsourcing" every function of a business is ludicrous. At the end of the day, it can't possibly be cheaper to hire out monitoring, since the company you hire is doing it for profit.

Security or convenience. Pick one.

Those that pick convenience (in these cases) deserve to be hacked. None of these systems should be connected directly or indirectly to anything with internet access. Period.

Anura • January 8, 2015 4:51 PM

@NoSuchAgency

It's not going away any time soon, if ever; it's only going to become more and more prevalent. We have to accept that fact and design systems and protocols to minimize the security risk. Unless secure becomes the default, rather than the exception, things have the potential to get very bad in the future.

Toneroo • January 8, 2015 5:18 PM

@NoSuchAgency There's one flaw in your argument. I work in the oil industry in Canada, and for pretty much any company operating dangerous machinery, safety first is not just something that people throw around for fun. Safety really comes first and is THE major deciding factor in decisions around access to ICS. The most dangerous activity in our business? Driving. So your alternatives are: a 100% isolated system with absolutely no way of accessing it from anything else, or one which, through various hoops, holes, and controls, an operator can access remotely. The result of option 1 is that the operator climbs into his truck at 2 in the morning when it's 40 below and blizzarding to drive 3 hours each way to flip a switch from the off to the on position. Option 2 allows him to do this from the comfort and safety of the camp. Which one is most secure, and which one is the business going to opt for every time? Remember, these are businesses, and when the choice is between the possible death of an employee or the possible hacking of their plant, they're going to choose the employee.

There are ways to make this stuff more secure and still usable; it's the battle that most security practitioners fight on a daily basis. Unfortunately, the ICS vendors don't give us much to work with, and therefore we end up having to put in all sorts of add-ons and doodads in order to make it somewhat secure.

Sancho_P • January 8, 2015 5:49 PM

As far as I know, the officially published “information” is blurry, probably even more so than the linked Wired report ;-)
Someone got access to remote log-in data by social engineering (?) and started exploring their network “from inside”.

However, there is no “proper” shutdown of a blast furnace; every “shutdown” results in massive damage, because it is not possible to simply stop the process. A controlled shutdown takes days.
It doesn't take much knowledge to harm such complex production plants; on the contrary. Once on the internal network, deleting just one single file might cause a domino effect which could finally destroy the furnace, probably in less than an hour, and I guess no one could stop it except by luck.

Remote access is crucial, also for recovery; you can't have all the experts at the plant 24/7.

albert • January 8, 2015 6:50 PM

I spent most of my programming career (20+ years) in industrial control. Most machine control was (and still is) handled by PLCs (Programmable Logic Controllers), dedicated, proprietary electronic systems with their own proprietary networks. They are not like the PCs most people think of.
.
The advent of Ethernet and PC technology allowed PLC manufacturers to incorporate Ethernet connectivity and high level language capabilities into their PLCs. This allowed them to compete in areas where computer control was used, like chemical plants, oil refineries, steel mills, and similar systems. These are called 'process' control systems, as opposed to 'machine' control systems. So you can now have the reliability of PLC I/O, and the ease of programming of the PC (the C language for example). Such systems do a lot of real time data collection and processing; things that older PLCs weren't designed to handle.
.
So now you've got advanced PLCs, or even worse, 'industrial' PCs (Running Windows, God help us), on Ethernet networks, with nothing to protect them from the Internet, except perhaps a firewall, and passwords. These systems are _programmable_ via the network.
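albert's point that these systems are "programmable via the network" can be made concrete. Below is a minimal sketch in Python, using Modbus/TCP as one common PLC protocol (an illustration only; the protocols involved in this incident are not public). Note what the request format lacks: there is no field for credentials anywhere.

```python
import struct

def modbus_write_coil_frame(tx_id: int, unit: int, coil: int, on: bool) -> bytes:
    """Build a raw Modbus/TCP 'Write Single Coil' request (function code 0x05)."""
    value = 0xFF00 if on else 0x0000          # protocol-defined on/off encoding
    pdu = struct.pack(">BHH", 0x05, coil, value)
    # MBAP header: transaction id, protocol id (always 0), remaining length, unit id.
    # There is no authentication field: any host that can reach TCP port 502
    # on the PLC can issue writes.
    mbap = struct.pack(">HHHB", tx_id, 0x0000, len(pdu) + 1, unit)
    return mbap + pdu

frame = modbus_write_coil_frame(tx_id=1, unit=1, coil=10, on=True)
print(len(frame))   # 12 bytes: 7-byte MBAP header + 5-byte PDU
```

Anything protecting such a device therefore has to live outside the protocol, which is exactly the firewall-and-passwords situation albert describes.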
.
There is NO excuse for this sort of thing. Using the Internet for plant control is just asking for trouble.
.
Toneroo, why can't the controller at the remote site 'flip the switch'? That's what they're for. Controllers can _send_ data to you, without being remotely programmable.
.
Sancho_P, IIRC, blast furnaces usually need to be 'rebricked' after a shutdown; a shutdown that takes days to accomplish.
.
It is unfortunate that these things happen, but they are preventable. I wish this was a wake up call to industry, but I predict it won't be. Losing money is one thing, but do lives need to be lost before something is done?
.
I gotta go...

Ole Juul • January 8, 2015 8:00 PM

I see this as a problem with closed-source and proprietary software. Companies get tied into something they have no control over, and have no way of gaining control over at this late stage.

They probably have agreements with software vendors that make it impossible to back out now. These same software vendors use network control over the system for convenience (a feature!) and for updates. They likely also started off using MS-Windows a long time ago because they felt they would get more sales that way. Then we have managers who are not educated in the ways of open source and the reason for going that way - which is control. They do not have that control now, and the system rules them.

This situation is a typical problem with corporate culture, and I don't think it can change. They will just have to accept that they made a mistake a long time ago and are stuck with it. Perhaps if insurance companies wise up to what's really going on here, then there is some hope.

Thoth • January 8, 2015 8:38 PM

Probably the best way things could go in a better direction is some form of industry-wide directive by a standards body (just like food and healthcare, or banking and finance) where you have to meet a certain standard certification to be able to operate an industrial machine plant, and where ICS vendors have to certify their machines. I am not sure if there is such an operating rule to mandate secure IT infrastructure in these companies.

It is of course not foolproof (just like hacking EMV cards and ATMs) but it shrinks the surface by a good margin at least.

Next step would be education on how attacks and defenses work.

Bob S. • January 8, 2015 8:45 PM

It's unclear to me why critical control of infrastructure operations is exposed to the notoriously insecure web. I suppose it's all about "convenience" as always.

I agree with Mark H., we face the same kind of attacks and damage with the internet of things.

Virtually every government and corporation in the world is willing to sacrifice security in order to partake in the personal-data orgy.

I would guess in a few years the whole thing will break under its own weight of excess.

Ole Juul • January 8, 2015 11:03 PM

@Bob S.: "It's unclear to me why critical control of infrastructure operations is exposed to the notoriously insecure web."

That's what I was answering above. I'll try again. A factory can't necessarily afford to change software vendor and is likely locked in, either by contract, or by not being able to change the marriage between extremely specialized hardware and software. This middleware will be connected to management software and on-line information allowing them to match production to the market. Without that, they would have to develop their own in-house solution. This cannot be done with closed source provided by a specialty vendor.

Disconnecting the hardware (production facility) from the internet is not going to be easy. Remember, there is a whole ecosystem here. Part of the problem of sleeping in the bed they've made, is that they can't just upgrade this expensive stuff, so the hugely expensive drivers are not going to get rewritten easily, nor is the OS going to be upgraded from some older version. I'd guess XP, but it could be older than that - so security is even more of a nightmare than one would think.

To sum up:
- They can't disconnect from the OS because it's needed for the very expensive drivers
- They cannot disconnect from the software because it is proprietary
- They cannot disconnect from the internet because production information is tied to a world market and accounting ecosystem.
- They probably can't disconnect from the vice grip of vendors

I'd say they're stuck. They could perhaps have developed their own open source system as part of the original engineering plans, instead of doing what everybody does and buy that from specialty vendors. Corporate culture probably wouldn't have allowed that though. In a way, I'd say it's a social problem and not an engineering one.

A Nonny Bunny • January 9, 2015 1:27 AM

It's very easy to say in hindsight that the people at this company made the wrong decision of connecting their machinery to the internet. But without further analysis that's like saying after a plane crashed that the passengers shouldn't have boarded.

This is not simply a matter of "security vs. convenience" or "just because you can doesn't mean you should". It's a matter of what, in the grand scheme of things, makes business sense; it's a matter of cost vs. benefit. And until a lot more companies have their machinery hacked and destroyed (intentionally or inadvertently), the benefits of remote/networked operations are greater than the costs.
Not in the end, perhaps, for this particular company; but averaged over all similar companies? Quite probably.

Changing the cost/benefit picture before a sufficiently large disaster changes it for us will probably have to come from government. And I doubt they'll be quick to implement laws that put their industry at a disadvantage and might drive it abroad.

keiner • January 9, 2015 4:24 AM

"The steel-mill hackers first gained access to the company's office network by spying on the data of individual employees – experts call this method spear phishing. 'From there they worked their way step by step into the production networks,' the BSI reports. Then the attackers sabotaged the factory: control components and entire plant units failed more and more frequently, according to the agency. The case is thus reminiscent of Stuxnet: with that digital pest, presumably the American intelligence services crippled Iranian centrifuges.

Matthias Rosche of NTT Com Security even sees the case as a first sign of the transition from economic crime to economic warfare. The attacker apparently tried to destroy the system and thereby do lasting damage to a competitor. 'An attack at this level costs several hundred thousand euros; there must be massive financial interests behind it,' says the expert on Industrie 4.0."

from
http://www.handelsblatt.com/unternehmen/it-medien/cyberattacke-auf-fabriken-es-muessen-massive-finanzielle-interessen-dahinterstecken/11138786-2.html

Clive Robinson • January 9, 2015 5:12 AM

@ A Nonny Bunny,

Whilst what you say is true from one perspective, it's far from true from other perspectives.

For instance, that of shareholders, who expect a degree of due diligence to prevent both substantial lost profit, of which they rightly expect a share, and loss of share value due to negative publicity.

Some of us who have worked in telecoms and industrial systems have been jumping up and down and waving red flags since before the turn of the century. Thus, calling management to account and showing they have been far from diligent will not be overly onerous.

Yes, it might appear that management are caught between a rock and a hard place, but that's why they get the salaries and bonuses.

The threat to ICT from all forms of attacker is rising, and all companies should be making decisions about how to mitigate it; there are, after all, many choices out there... Doing nothing, or burying your head in the sand, is not a rational one to take.

u38cg • January 9, 2015 8:16 AM

Clive, au contraire, it is extremely rational. Imagine 100 CEOs saving money by running at-risk operations.

You are one of these CEOs. Incur expense, relative to the market, by making your operations secure, and you are out of business, or out of a job.

You do nothing. One out of a hundred of these companies is hit; almost certainly not yours. Now everyone must make the effort to become more secure. You lost no advantage relative to the market, and one of your competitors is gone.

Unless risk × impact is so severe that doing nothing is not an option, not making the effort to secure is not only rational; doing anything else would arguably be a breach of your obligations to shareholders. We haven't even mentioned the principal-agent problem yet.
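The arithmetic behind this argument can be made explicit with illustrative numbers (all of them assumed for the example, none from the incident):

```python
# Toy expected-value version of the 100-CEO argument.
# Every number below is an illustrative assumption.
n_firms       = 100
security_cost = 2_000_000      # assumed annual cost of hardening operations
breach_loss   = 50_000_000     # assumed loss if your plant is the one hit
p_breach      = 1 / n_firms    # "one out of a hundred of these companies is hit"

expected_loss = p_breach * breach_loss
print(expected_loss)                    # 500000.0
print(expected_loss < security_cost)    # True: doing nothing "wins" in expectation
```

With these assumptions the expected loss from doing nothing is a quarter of the cost of securing, which is exactly why the rational-CEO argument keeps winning until either `p_breach` or `breach_loss` visibly rises.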

We've been banging on about economic incentives in security for ten years now, so come on people, can't we do better than "rah, shouldn't be connected to teh interwebz"?

kronos • January 9, 2015 8:34 AM

@ Ole Juul:
"Disconnecting the hardware (production facility) from the internet is not going to be easy. Remember, there is a whole ecosystem here."

My experience has been exactly like the points you brought out. Plant A spends US$50 million on a new system, which includes lots of hardware and the software/control system. Most of that money is for the expensive hardware. Once locked into that system, they must either keep the software that controls the hardware or write their own from scratch. And the latter option requires specialized programmers AND must fit in with insurance and legal-liability issues! And of course the owners or shareholders will have an involvement as well.

That is why so many process control systems have few options as far as software goes and some of the companies doing the software give little or no thought to hardened security. I remember one manager who had the admin password to his process control system printed on a piece of paper that was attached to the wall - in a room with windows on three sides in the middle of a factory! Anybody walking by could see the password and make themselves an administrator to that system! It took a lot of effort in several meetings to get the piece of paper moved from the wall to a desk drawer beside the main computer console in that room.

Nile • January 9, 2015 8:52 AM

If you're wondering what that "massive" -- though unspecified -- damage actually is, I can tell you.

A blast furnace has a refractory lining that will crack if it cools down too quickly.

Cold maintenance is not a routine job: it takes days to run down the heat, under very, very careful control, and sometimes they get it wrong and the liner cracks anyway.

It'll take weeks to remove the lining - longer if the bricks are fused together, and the most economic outcome for *that* is to dismantle the supporting structure and cut up the vessel for burial onsite.

Nile • January 9, 2015 9:05 AM

@kronos

The economic incentives for better security are applied by the insurers; and, in the case of financial institutions in a handful of jurisdictions with effective regulatory authorities, by law.

I doubt Sony will disclose the cost of Op-Risk insurance, and the coming year's rate hike; nor will their insurers. However, their bondholders won't let them bear the risk uninsured, and pressure for that is applied through the brutal medium of a ratings downgrade - Moody's, S&P and Fitch assess operational risk, legal risk, and environmental risk as well as the financials.

Oliver • January 9, 2015 9:13 AM

I call bullshit on that whole report!
How much of that is just simple MS-grade FUD!?!?

I would be very careful about giving any credence to those reports.

albert • January 9, 2015 12:02 PM

Sigh,
.
It's really quite simple: Control systems MUST NOT be programmable, or controllable, via Ethernet*. That's it. It's not difficult or expensive to do. Read-only access from your LAN, NO access from outside your plant.
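albert's "read-only access from your LAN" rule can be sketched as a protocol-level filter: permit only the read function codes and drop anything that writes or reprograms. The function codes below come from the Modbus specification; the gateway itself is hypothetical, sketched only to show the shape of the rule:

```python
# Permit Modbus read requests, drop everything else (writes, diagnostics, etc.).
READ_ONLY_CODES = {0x01, 0x02, 0x03, 0x04}   # read coils/inputs/registers

def permit(modbus_frame: bytes) -> bool:
    """Return True only for read requests (frame = 7-byte MBAP header + PDU)."""
    if len(modbus_frame) < 8:
        return False                          # malformed: no function code byte
    function_code = modbus_frame[7]           # first PDU byte after MBAP header
    return function_code in READ_ONLY_CODES

read_req  = bytes([0, 1, 0, 0, 0, 6, 1, 0x03, 0, 0, 0, 2])   # read holding registers
write_req = bytes([0, 1, 0, 0, 0, 6, 1, 0x06, 0, 0, 0, 1])   # write single register
print(permit(read_req), permit(write_req))   # True False
```

A real deployment would enforce this in a network device between the plant LAN and everything else, so the controllers can still send data out without being remotely programmable, which is albert's point.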
.
These weren't problems until PCs hit the factory floor, and even engineers, who should know better, were taken in by the hype.
.
Given the choice between proprietary systems (which only a few know how to program), and PCs (which thousands of people know how to program**), I'll take proprietary systems every time.
.
I gotta go...
.
* Ethernet isn't a really good network for real time control, but if it has to be used, it must be physically isolated from outside networks.
.
** You don't need programming knowledge to crash the control program, or even access to it; crashing the computer actually works better.

Nick P • January 9, 2015 1:02 PM

I'm surprised nobody has mentioned the existence of both standards and solutions for industrial control system security. This offering represents some of that. I'm not endorsing it, except to say the market is producing stuff. Another article estimated the ICS security market to be potentially worth $8+ billion.

I previously posted on this issue saying trusty old guards and VPNs should be able to reduce a lot of the risk. Convincing various third parties to install one is easier than rewriting whole software stacks, so long as they're cheap, drop-in, and largely maintenance-free. This won't stop attackers who compromise the third party in order to sneak into the VPN. It stops all the rest, which are tremendous in number.
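A minimal sketch of the guard idea: before any traffic reaches the control network, accept only peers from a designated VPN subnet. The subnet and addresses here are illustrative assumptions, and a real guard would enforce this at the network layer rather than in application code:

```python
import ipaddress

# Assumed VPN subnet for authorized remote operators and vendors.
ALLOWED_PEERS = {ipaddress.ip_network("10.8.0.0/24")}

def peer_allowed(src_ip: str) -> bool:
    """Accept a connection only if the source address is inside the VPN subnet."""
    addr = ipaddress.ip_address(src_ip)
    return any(addr in net for net in ALLOWED_PEERS)

print(peer_allowed("10.8.0.17"))     # True: inside the assumed VPN subnet
print(peer_allowed("203.0.113.5"))   # False: arbitrary internet host
```

The attraction of this design is exactly what Nick P describes: it asks nothing of the legacy software stack behind it.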

Bob S. • January 9, 2015 3:50 PM

@Ole Juul

Sure 'they' CAN!

Air gap all the hardware, firmware, data and software from the internet!!!!

Run anything you want IN the building, but keep it off the net.

If they need reports or gauge readings have someone physically put the data on a clean drive, remove it from the building system then upload to the net, with no way to alter the data or access the source at all. Etc.
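The export step described above can be sketched: snapshot the readings to a file, record a digest so the receiving side can detect tampering in transit, and mark the file read-only. The file names, reading names, and layout are illustrative assumptions:

```python
import hashlib, json, os, stat, tempfile

def export_readings(readings: dict, out_path: str) -> str:
    """Write readings plus return their SHA-256 digest; mark the file read-only."""
    payload = json.dumps(readings, sort_keys=True).encode()
    digest = hashlib.sha256(payload).hexdigest()
    with open(out_path, "wb") as f:
        f.write(payload)
    os.chmod(out_path, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)  # r--r--r--
    return digest

# Hypothetical gauge readings snapshotted onto the "clean drive".
path = os.path.join(tempfile.mkdtemp(), "gauges.json")
digest = export_readings({"furnace_temp_c": 1480, "pressure_kpa": 230}, path)
print(len(digest))   # 64 hex characters of SHA-256
```

The one-way property itself comes from the physical procedure (data walks out on a drive, nothing walks back in); the digest only lets the outside consumer verify the data wasn't altered after export.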

"But, gee, Joe across town 'needs' to have access to 'blank', just in case."

Answer: No he doesn't. Back in the olden days, 20 years ago, everything got done without using the convenient crutch of the internet.

So, in response to 'can't', I say 'can'.

carles • January 9, 2015 4:54 PM

@Bob S.

Sure they can, but that would kill an $8+ billion market. Ever wonder why the boring "if it works, why fix it" corps are trading at low P/Es, whereas the mundane risk-taking markets are trading at high P/Es? ;-)

Sancho_P • January 9, 2015 6:40 PM

I'm not happy with some parts of the comments.

1) We should know from our own systems / log files how vulnerable IT is.
We should not assume that others are always / only stupid.

2) A plant running blast furnaces is not simple machine-control logic.
There are probably 4000 people working during the daytime and 1000 during the night. It’s a small town, but highly dynamic and complex, connected in “real time” to suppliers, customers, and internal and external logistics (transport) often hundreds of miles away. Believe me, it is a huge system.

3) To travel 20 years back in time is not realistic for the ordinary human, and it’s not for the production equipment either. Even in 1994 they already had computers (mainframe and PC) connected to the Internet and to ICS.
Wake up, it’s 2014: the average time to make a decision is below one hour, probably milliseconds for both machinery and the stock exchange.

4) Such factories are perpetual construction sites, every day and night optimizing production and improving the site, the machinery, and their logistics + ICS. Both SW and HW consist mostly of patches and workarounds accumulated over the years, as is usual in IT. You’d find more versions of drivers, system software, and hardware than you could count.

5) The report (I’m a bit skeptical about the “official information”) says the attacker had access to (part of?) the production network - we don’t know more; the rest is speculation.
Imagine someone sitting at your admin console and exploring your network. Probably a disgruntled employee, an interested kid, a maintenance “engineer” investigating a problem (does anyone know if all the ICS problems were really related to the “attack”?), a competitor, a foreign spy - we (and probably they) simply don’t know [1].

But the “enemy” was in control of (some of) the machines - we don’t know for how long until it was detected / defeated, or what damage was done while trying to get a grip …


However, there is one point worth considering:
We don’t know much and therefore should be cautious with attribution and reaction.

@Thoth had a very valuable point:

“Next step would be education on how attacks and defenses work.”

Yes, we need knowledge about attacks to prepare and improve.
We are learning by feedback - if we break that loop there is no evolution.

Secrecy is plain stupid.
This would be the lesson to learn.


[1] Personally I’d assume it was Putin himself learning about ICS in Germany ;-)

Nick P • January 9, 2015 7:20 PM

@ Sancho_P

"Secrecy is plain stupid. This would be the lesson to learn."

Secrecy alone as a defense is security by obscurity and stupid. Secrecy alongside good security practices is called obfuscation and OPSEC. That's very valuable against attackers with medium to high strength potential. It's obvious for the mere fact that the first step in compromising networks is understanding what they run and how. Deny them that and watch the NIDS/HIDS for evidence of them exploring the network. It also helps to have packets from certain apps tagged automatically by the networking system so the NIDS can profile them and spot what was hit.
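The "watch the NIDS for evidence of them exploring the network" step can be sketched as a toy scan detector: flag any source that touches an unusual number of distinct host:port targets. The threshold and event format are illustrative assumptions, far simpler than a real NIDS profile:

```python
from collections import defaultdict

SCAN_THRESHOLD = 20   # assumed number of distinct targets before alerting

def find_scanners(events):
    """events: iterable of (src_ip, dst_ip, dst_port); return suspect sources."""
    targets = defaultdict(set)
    for src, dst, port in events:
        targets[src].add((dst, port))
    return {src for src, seen in targets.items() if len(seen) >= SCAN_THRESHOLD}

# A host sweeping ports 1-25 on one machine stands out against normal traffic
# that repeatedly hits a single service port.
probe  = [("10.0.0.9", "10.0.0.50", p) for p in range(1, 26)]
normal = [("10.0.0.2", "10.0.0.50", 502)] * 100
print(find_scanners(probe + normal))   # {'10.0.0.9'}
```

The value of the obscurity Nick P describes is that an attacker who can't learn the network layout in advance is forced into exactly this kind of noisy exploration.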

Thoth • January 9, 2015 9:13 PM

@Nick P
I did recommend looking into standardizations and specifications above, just like the financial sector's PCI-DSS/EMV, the health sector's HIPAA, and all that sort of thing, but without going into details.

Just like with all disasters, we unfortunately need to wait for an incident where a hacked computer network takes away a couple of human lives before someone starts moving. It sounds very ugly and very cruel, but it is realistically how people work. They only move if they feel the pain; otherwise it makes no sense to them.

It took the payment industry and the health-care industry some time to create those security standards and specifications, after bad things had already hit the beach, long since landed, and assaulted their "safe havens".

They might start learning only if it really bites them real hard. That's how we evolved over the centuries ... by pain and blood ...

Figureitout • January 9, 2015 10:10 PM

@Thoth
--While I agree mostly, since I'm inclined to worry about a lot of things, at the same time not reacting until something terrible has happened is basic logic; and likewise an argument can be made that "securing" every aspect of your life will potentially kill you quicker. The "attack" itself could be forcing you to take extremist measures to do anything; it's been like that for me (working towards "near perfect" OPSEC can become a bit obsessive, even with all the holes, and limits making a lot of things, due to constantly sanitizing your workspace).

So for something like "acoustic cryptanalysis" - or not even cryptanalysis, just extracting and/or injecting opcodes (more likely just injecting garbage first; the day they get actual correct bits, I'll sh*t my pants) based off power-supply components leaking - if that kind of attack is "easy" and "prolific", then computing won't ever be the same. Each PC will be housed in a "phone booth" (where someone could lock you in, clog the air holes, and kill you).

All that being said, I believe it is negligent of organizations charged w/ running (which implies protecting) critical infrastructure not to have set up at least one-way remote monitoring and, yes, potentially one-way external control (successful attacks will need good recon on operations). Insider attacks are forgivable, b/c to eliminate those you might as well just shut down.

Thoth • January 9, 2015 11:53 PM

As Bruce Schneier always says, you need to choose the realistic attack vectors you want to defend against. It is up to each person's view and judgment to do so. A bad choice would be terrible, and a good, well-educated choice would be excellent. It is very much dependent on knowledge/education and on a whole ton of other factors.

Robert.Walter • January 10, 2015 9:35 AM

So what company was it that was victimized?

I find it odd no name was released as this surely has to count as a material event and be reported in a quarterly conference call.

Unless it wasn't a material event, not required to be disclosed - and if so, then why did it make the mainstream press?

Sancho_P • January 10, 2015 11:58 AM

@ Nick P

Sorry, I understood your post as leading somewhat off track?

The point is not to publish the mess of a so-far-uncompromised company / organization just to hint to adversaries how to proceed.
The point is to publish in detail what happened and why it went so badly.
It is too late to protect e.g. Target or Sony, but not only would others be alarmed about their own systems; probably a bright guy from India would fix the sources and publish what we are not able to accomplish.

Evolution is about not making the same mistake again.

Paranoia is a serious mental disease; it may end in self-destruction.
National paranoia, fanaticism, or exceptionalism is even more dangerous to our small nutshell in the universe [1].

To be back at the point:
We, the society, should not condemn those who reveal issues,
be it Ed Snowden or any hacker.

On the contrary, we should openly reward a good “hack” [2].

We should encourage hackers to analyze and come forward, to discuss in the open and to improve security.
There must be an incentive to hack and protect.

Secrecy is counterproductive for society.
Boston, Sony, Paris: “We had them on our list, we watched them, we knew …”.
I do not have the right word for this kind of craziness.
In secrecy we knew; we just couldn’t tell, because it was secret?

- What is wrong in our society?
We do not want to face the cause of problems nor do we want to take action if we see the results.


[1] In case it comes to your mind: I do not mean only America.
All nations have a flag, religions have a book, skin has a color.
We are different, but that should be our advantage.
What we need is openness and respect, not blanket distrust.

[2] … and of course punish a bad hack.

Nick P • January 10, 2015 12:31 PM

@ Sancho_P

Now I'm starting to understand what you're saying. I disagree that we should reward hackers unless their leaks are whistleblowing. Hackers just do damage. Actually ensuring security or adherence to ethical standards is best done with good incentives (e.g. liability laws) or regulations plus third-party auditing. This model works well in practice when structured right. The hackers, on the other hand, typically just act for selfish purposes (often ego). Their damage-to-benefit ratio is insanely high on the damage end.

I'll add that people as a whole reject strong security and accountability with their votes and dollars. So neither the law nor the market offers much of it. You have to pay extra to firms that specialize in it. So why blame the companies for ignoring security when the whole market does it and the law [for publicly traded firms] pushes them to maximize profit? I blame the demand side: the people that throw away any personal information to watch a vid, listen to a song, poke a "friend," and so on. They're getting what they demanded and paid for.

It's people that value privacy, security, and integrity who lose out in such a system. They must endlessly fight the voting majority, the markets, and the government. Or move to a country where they fight less.

Sancho_P • January 10, 2015 6:35 PM

@ Nick P

Um, I think there is still a misunderstanding.

Hackers are good. We need them.
Take them out and NOBODY will even think that we are vulnerable until that day when shit hits the fan - not from a hacker (you’ve arrested all of them) but, say, from a newbie who (with the best intent, cough) e.g. sets up his web server and kills the Net / the power / healthcare / whatever for a whole region - or more.
- Let’s not wait for that moment, which could make thousands suffer.

A bit provocative, but try to think of it this way:
Not more laws - but fewer.
Hackers go free - this is the incentive for the corporations, no extra money needed.
“Hack them, beat the hell out of them - the best hacks will be published”, that’s all.
Corporation security would improve dramatically.
- Think capitalistically, think of the free market - not of lobbies and corrupted laws.

Also, I don’t agree with your “people as a whole reject strong security and accountability with their votes and dollars” (assuming by “people” you mean Jane & John Doe).
They don’t know what security is or what they should do with it.
No offense intended, but why should they care - until they or a close relative has a really bad experience?
From the news alone? Which news, the propaganda? The rich and beautiful?
They do not believe it; they know it’s only on TV.

10 years ago on my street it would have been ex-treme-ly suspicious to lock the door during the daytime; the whole village would have been talking about it. In summertime the doors were often left wide open, even during siesta.
Nowadays …

So what do we expect them to learn and assimilate in a couple of years?
And why?
Do people know that they have an “identity”, could they articulate what it is / means?
Do they need it?
They’ve learned to talk, take pictures, play, and use SMS, email and chat on their phones in a couple of years.
Hurray.
Security isn’t an app, they don’t have it.

Therefore they do not reject it. And they do not lose out.
If someone compromises their email address they start another one, that’s it.
Nude pictures? Embarrassing, probably, for some days - but life goes on.

It may be a matter of culture, though. But the ordinary people …

Nick PJanuary 10, 2015 8:35 PM

@ Sancho_P

"Hackers are good. We need them.
Take them out and NOBODY will even think that we are vulnerable until that day when shit hits the fan - "

White hat hackers are good. They show us we're vulnerable constantly without causing unnecessary damage. Yet, this knowledge doesn't change anything at any level past people doing the tiniest things for their security.

"Hackers go free - this is the incentive for the corporations, no extra money needed. “Hack them, beat the hell out of them - the best hacks will be published”, that’s all. Corporation security would improve dramatically."

It's an interesting idea to let the damage hackers do create market incentives. The problem is that the changing nature of business, technology, and knowledge of weaknesses keeps creating more opportunities for attackers than defenders. As Dillinger says in Public Enemies: "They have to be guarding every bank all the time. I can hit any bank any time." The companies would be hit constantly and by many smart people currently on the bench. This would be true even if they followed good security practices. Damage to benefit ratio is once again negative.

Now let's look at real-world examples of my model. The first was the Computer Security Initiative that created standards, evaluations, and a purchasing policy incentivizing meeting the standards. This just applied to DOD purchases. Yet, the market started building stuff secure enough to meet those standards whose claims were evaluated by NSA pentesters and analysts. Academia built on both the prevailing principles and products of the time. Success.

The next example followed a similar path: DO-178B certification for safety. The standard has rigorous requirements for the system lifecycle to ensure reliability. Level of rigor goes up with critical nature of the system. The software must be certified to the standard in order to fly. That creates a financial incentive to pass certification. The result has been a large number of tools, OS's, drivers, etc written in a robust way expecting a payoff. There's also tools like Esterel SCADE and Perfect Developer that automate portions of development. Success again.

So, my strategy has already worked in two industries, past and present. It takes clear criteria, an evaluation process, and a strong financial incentive to use them. Your alternative amounts to a digital version of The Purge happening every day. That's far from the safer online world people want.

Donald DuckJanuary 10, 2015 8:46 PM

@Sancho_P

I guess you're speaking like from a reformed hacker's perspective, and I think we all should applaud you on being on the good side. The less hackers lulzing around our critical infrastructure (e.g. people live their lives on internet, esp. with smart fones, ...) the safer we are in this day and world.

Ole JuulJanuary 11, 2015 1:25 AM

I'm a little miffed at the glib use of the word hacking in some of these comments. Some seem to use it as a matter of convenience, and I can accept that, but it seems like others are confused or actually do believe that there is something bad about hacking. I think that those of us who know what it means need to be more mindful of those that don't and critical of those that think it is bad. I'm not wanting to get into semantics here, but rather to explore the related misconceptions.

Hacking is something I believe we all should be doing. Yes, it is bad for the corporate world and cuts into their profits, but to me that is a good thing. It is how it should be. I'll give a small example. Up until recently, my very old neighbour used to drive a '48 Chevy pickup on a regular basis. Right there we have a big hit on corporate profits, but it gets worse. A look under the hood, and it became apparent that this guy was a hacker. I saw he had added PCV (positive crankcase ventilation - for those who aren't hackers) and it was actually a simple hack to increase performance and lower gas consumption. Is this guy a bad person? He certainly was out to screw the corporations. His only crime is trying to figure out how stuff works. Many, in this day and age, consider that to be hacking and to be a bad thing. They think that to be good citizens one should not learn too much. They may not verbalize it like that, but they fear hackers.

Another example is sitting right beside me here. I've been fighting all day with a bunch of discarded computer parts. Not only have I tweaked my glorious pile of throwaways to get the performance that I need for my project, but I've been having a serious argument with parts of the operating system. Yes, I am a computer hacker. Am I a bad person? Should I give money to some corp to create more landfill, or should I continue my learning experience? Reading the news these days could make one think that in-depth learning about computers is the road to cyber terrorism. I think I know where that is coming from. People with a keen sense of curiosity and motivation to learn have always been the enemy of the state.

And @Donald Duck: "I guess you're speaking like from a reformed hacker's perspective, and I think we all should applaud you on being on the good side. The less hackers lulzing around our critical infrastructure (e.g. people live their lives on internet, esp. with smart fones, ...) the safer we are in this day and world."

With all due respect, I find your statement offensive. What do you mean "reformed hacker"! If we were face to face, I might suggest an apology. :) I have no intention of becoming reformed. For one thing I am an old age pensioner and don't have much income, so need to live this way. Don't worry, if with my recent exploration of IPv6 I find your fridge door open while you're out, I'll give you a call so your milk doesn't go bad. Of course if I was a really bad person, I'd just chuckle, knowing that your breakfast was going to be ruined. Knowledge goes both ways, and dumbing down the population in order to be safe is a very, very, bad thing. But that's just my opinion. Perhaps you'd beg to differ.

I think it is a good thing to go looking around, and if you find something is a vulnerability to others then report and/or help fix it. Do you really think that people should stay at home and keep their eyes closed? I would advocate for the opposite, and say that people should get out and look around. Yes, that includes the internet. Perhaps you yourself know very little of what that world looks like (I would guess yes by your comment) but I personally suggest that whoever has the time and curiosity to see how the internet works should exercise that ability. Travel broadens the mind.


Coyne TibbetsJanuary 11, 2015 5:03 PM

I find myself a little disappointed in the--shallowness isn't really the word I want, but it will have to do--of the analysis.

When I first started work for my current employer they had a network of about 20 machines in the finance department. It was amazing, watching the thinking adapt as the network came to carry more and more of our daily business; as it first networked machines in every department; then as it replaced all of the IBM-3270 terminals with emulators; then as it became the only means of making phone calls. What was amazing exactly? Watching the organization hand-wave disaster recovery for a more and more critical network; a network that was no longer the finance department's toy of 1991, but had become the single most important infrastructure in the organization. (Okay, power might be more important.)

That is an example of what is called a paradigm shift.

I see the same adaptation problems now. Consider the software problem from the perspective of the way software really is today:

  • Software that has to call home to validate its license.
  • Software that has to call home to check for updates.
  • Software that has to be updated from the vendor, by sneakernet if nothing else.
  • Software that must integrate. The accounting system integration above is a case in point: You either integrate the system or you do it by hand...or by sneakernet.
  • Software that must span buildings, campuses, states, nations, the world; so that running parallel and completely independent networks is impractical...
  • ...and what does completely independent mean anyway, when it all has to pass through the same routers? Unless, of course, you plan to lay your own underwater cable between Europe and the U.S.

Knee-jerk rhetoric aside, we have long since passed the day when it was practical to isolate anything. "Well, I'm setting up an isolated Windows network," you say. Then you run into the problems where Windows says it can't call home so its license can't be verified. Then you use sneakernet to update it...and malware travels over sneakernet just as easily as over that "evil" Ethernet everyone is picking on.

As we've discussed in here, even the computer BIOS for the new computers you want to install for an upgrade can carry malware.

You either wind up with a network that doesn't work...running ten-year-old Windows on failing computers because you're afraid to update them or the software...or else that network winds up with some path to the outside world. We're long past the point in time where any network, no matter how critical, can be completely isolated from all the other networks.

Like I talked about above, we can keep hand-waving problems like this, saying, "Well, it's their own fault for providing a path to the outside world." Or we can accept the paradigm shift of today's software system requirements for networking access and try to figure out some way to control malware.

What's wrong with dialup?January 11, 2015 5:16 PM

@Toneroo, January 8, 2015 5:18 PM:

Why does it have to be either the Internet or a dangerous drive? Set up a modem backed up by a program whose only job is to accept a username and password. Throw in a nonce from a security fob. I'd be willing to bet that most programmers could write such a program so as to be unhackable other than by brute force. Throw in Caller ID, so the bad guys would have to spoof the originating phone number.

Only after the simple, stupid login program approves does it hand off to the control program. No random attacks from everywhere on the Internet, against any selection of ports. Just make sure the facility's end is a landline, and disallow calls from cell phones, and you've tightened up security a lot, and your tech can still work in his jammies.

Is this a reasonable improvement, or am I missing an elephant in the room? Remote access doesn't have to mean the Internet.

(Yes, I know state actors could snoop the microwave towers the call goes through, or just put a tap in one or the other Central Office. Hence the nonce. And yes, the tech couldn't call from the airport or his favorite cafe. Wouldn't any operation of this size have a permanently-staffed control center offsite? Or at least someone on duty at corporate headquarters?)
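For what it's worth, the "nonce from a security fob" part of the scheme above doesn't need anything exotic: it could be an off-the-shelf one-time-password check like TOTP (RFC 6238), run by the simple, stupid login program before it ever hands off to the control software. Here's a rough sketch in Python; the shared secret, the six digits, and the 30-second window are illustrative choices on my part, not anything from an actual product:

```python
# Minimal TOTP check (RFC 6238 / RFC 4226 style) for a dial-in login
# program. The fob and the login program share a secret; each computes
# the code for the current 30-second window and the login program
# compares them in constant time.
import hmac, hashlib, struct, time

def totp(secret: bytes, for_time: float, step: int = 30, digits: int = 6) -> str:
    """Derive the expected one-time code for the time window containing for_time."""
    counter = int(for_time) // step
    msg = struct.pack(">Q", counter)               # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret: bytes, submitted: str, now=None) -> bool:
    """Accept the current window plus one either side, to tolerate clock drift."""
    now = time.time() if now is None else now
    return any(
        hmac.compare_digest(totp(secret, now + drift * 30), submitted)
        for drift in (-1, 0, 1)
    )
```

The implementation matches the published test vectors (e.g. the RFC 6238 SHA-1 vector at T=59 yields 94287082 with 8 digits), and `hmac.compare_digest` avoids leaking the correct code through timing. The rest of the scheme — Caller ID, landline-only, hand-off to the control program only after success — sits around a check like this.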

Sancho_PJanuary 11, 2015 5:59 PM

@ Nick P, Donald Duck

Wait !
Think again, try to get some distance from the standards and so.
Please follow me, I’ll start slowly because time is the point!

Have a look at mother nature, beam yourself into the woods, listen to birds and bees, smell the flowers and animals. There is no straight line, no perfect circle, no plain color, no monoculture, instead there is diversity all over the place.
And it has worked since the beginning, without any human governor.
That’s the power of evolution.

Evolution, on the other hand, makes use of the “trial and error” principle.
So there is also error, damage and sadness in our paradise.
There are predators, too, but they are an essential part of the whole system.
And no, there aren’t good or bad predators, no white or black, all have their place and are there to the benefit of others.

Now here comes the hardest part, please bear with me.

So evolution goes on for thousands of years without human brain.
The key for understanding it = [the whole system] + [the time] it needs.

We, mankind, have changed the game by introducing laws
(don’t get me wrong, we need laws, basically I love them) and thus changing both, the “system” and the factor “time”.

Everything’s perfect - until the intelligent genius (the wicked) starts thinking and producing laws to the benefit of some but not all of us, or making laws for all except some.
You know what I mean, but that’s not my basic point as this is going on probably since the first law was written.

Let’s turn to the IT now.
IT came in so suddenly that especially lawmakers didn’t know what it is and what to do.
Time was (and still is, by far) too short to understand, but they had to make laws.
So they asked the experts, those geniuses, to help them.
Bingo.
But they didn’t just get rid of liability, they got rules for monocultures and secrecy.
That changed not only the system but also the factor “time”.
Fascinating business in milliseconds of evolution - no time for predators.

So it grew like cancer. Here we are.

Funny, Nick P wrote:
”White hat hackers are good. They show us we're vulnerable constantly without causing unnecessary damage. Yet, this knowledge doesn't change anything at any level …”
So you see it doesn’t help? And? More corrupt laws?

Donald Duck wrote:
”The less hackers lulzing around our critical infrastructure … the safer we are in this day and world.”
On the contrary.
Balance, robustness and safety need challenge, training, evolution.
- The stronger our critical infrastructure is, the safer we are!


There will be always some idiots, kids, crazies, predators, aggressors - call them enemies or whatever.
You can probably reduce them by law / pesticides [1] but some will always be there.
We have to face that and make use of it.
And we shall trust in the power of evolution.

The basic idea was: Give the predators a chance to clean the system.
Let them work and find the flaws before it gets out of control.

But when you feel by now IT is “too big to fail” then we are already beyond the cliff.

Uh, long posting. Thanks for reading!


[1] Be aware of the consequences - check for the bees … do you still see them?

ApplesJanuary 11, 2015 6:01 PM

Albert said: "There is NO excuse for this sort of thing. Using the Internet for plant control is just asking for trouble."

Yes, and some of the engineers inside the end user companies and their control system suppliers knew it and kept repeating and resisting it ad nauseam during the 80s/90s. There are still many companies where such a thing would never be considered. However, in some, the advent of Windows (even going back to Compaq luggables running Desqview/Topview and DOS) led to higher management just having to have access to plant at 2 am just because "isn't that neat and we don't have to pay callout charges, we can get application guys (et al.) to fix it from their beds". For real. Now many of the earlier plant and control engineers are retired and their knowledge gone with them. Being in the industrial systems business from the 70s, I have no sympathies with management who allow plant vulnerabilities such as these. Not until they are personally held responsible and lose their pensions or go to jail when actual humans are hurt will anything change. Heck, if I could find anyone who'd actually pay me for auditing their control systems, I might come out of semi-retirement. I'm sure many of us have horror stories even today of sloppiness inside big-name public companies, but so long as this quarter's conference call is just poppety-fine, all is good. I know that I couldn't take some of it any more after 30 yrs and just gave up.

Sancho_PJanuary 11, 2015 6:01 PM

@ Ole Juul

Very well said, congrats!
I’m not offended by “reformed hacker”, it probably would be an offense to omit the term “hacker” ;-)

WaelJanuary 11, 2015 6:45 PM

@Sancho_P,

We, mankind, have changed the game by introducing laws
Wouldn't this also be part of evolution? It's going at a faster pace, but who's to say how evolution should "evolve"? And since you stressed that "time is the point", then it could follow that evolution is accelerating and not moving at a constant speed. TLA's are influencing "natural selection" and "survival of the fittest", too. So the interaction becomes more complex as we accelerate towards the inevitable ;)

Nick PJanuary 11, 2015 7:32 PM

I get your points about evolution and how our laws mess with it. You sort of left a critical part of it off: optimized to focus on survival. Unlike your description, predators sometimes serve a good purpose in the ecosystem and sometimes are straight up destructive. To deal with this, evolution also produced organisms like us who could create social and/or physical structures to prevent such destruction. We reduce what we can't prevent. We do this to ensure the very important goals of stability, life, liberty, pursuit of happiness, and other good things.

So, now let's look at the situation. Almost all computer and network architectures were optimized for everything but security. Vast amounts of human and economic activity depend on their continued function. Failure of certain systems, esp backend mainframes or Internet backbone connections, could cause operations to crumble. This is especially true at sites like hospitals, banks, and industrial control systems. The rule you advocate would cause sites like that to be hit in a constant free for all that could rid us of critical things we actually want.

I'm in favor of getting the public's awareness on issues and things that make them consider secure alternatives. Then again, there's been so little market incentive on secure system development that there are almost no alternatives available. They won't be produced either, due to a chicken-and-egg problem. So, whatever change we need should cause minimal disruption to the market, maximal effect on IT purchasing decisions, and creation of a high assurance market.

Your strategy fails this requirement because it's maximally disruptive to what we want to protect, it doesn't give people a trustworthy alternative, and INFOSEC industry will just push garbage as they always have. Rather than evolving things, it just puts the brakes on key drivers of our economy. My strategy has worked twice in producing a measurable impact with minimal disruption to anything except IT acquisition or development. It can also be incrementally adopted by businesses and ISP's. So, I promote that it's the better option.

Clive RobinsonJanuary 11, 2015 11:31 PM

@ Sancho_P,

And what do you say about Natanz? Did they have Internet?

It is said --but not confirmed-- that they did not, and that Stuxnet actually arrived on a USB key of a UN inspector. This was apparently vehemently denied, and thus some unnamed contractor / technician was blamed; only it was also said that they did not allow "sneakernet" use of media at Natanz either...

Thus it must have been either magic or UN inspectors, and it's fairly clear from their actions who the North Koreans thought was the real target, and who they thought was the infection vector, when they dragged a UN inspector in and "shocked the socks off of him"...

The simple fact is maintaining an air gap is difficult. Some time prior to Stuxnet, I independently worked out how to do what Stuxnet is reputed to have done, in a quest to show that voting machines could, contrary to their makers' claims, be got at without physical access. If you search this site you will find my short explanation of "fire and forget" malware designed to attack the voting machine techs' laptops.

Since then, as @Coyne Tibbets notes above, it's just got worse and worse and worse. And to add to that we now have ET "call home" software, the sole real purpose of which is to gather as much user data and activity as possible and send it back to the developers, in return for a "free" Tetris game using "jelly beans" not bricks and other "silly user" diversions (away from the work they are being paid to do but are not...).

A hundred and fifty years ago, clerks were required as part of their terms and conditions of employment to supply their own "coal and candles" to heat and light the workplace they spent twelve hours a day in. I know that that is a bit draconian, but employers paying employees to play games has swung the pendulum more than a little too far in the opposite direction. And I suspect that it's already swinging back, but you have to ask just how hard and how far before it comes back the workers' way again...

As for bees, I have reserved a large chunk of my garden for growing wild flowers etc specifically for bees and other useful bugs. However some of the "landlords" around me object because they think it brings their potential rental values down (it does not, as their tenants are students). However, when I complain to them that the food waste etc their tenants throw out improperly is attracting vermin, I'm told I don't know what I'm talking about even though I have photographs of rats feeding off of it in their front gardens... The message is clear: profit is the only thing that matters, and anything that affects it must be crushed, including the bees...

WaelJanuary 12, 2015 12:09 AM

@Clive Robinson, @Sancho_P,

Thus it must have been either magic or UN inspectors
I remember reading somewhere that the virus was embedded in a printer after it was intercepted before its delivery to Natanz.

tyrJanuary 12, 2015 1:33 AM


The commentary seems to be all over the map on this one.

Probably because the causation chain isn't clearly laid out. Given that industry and infrastructure have been kludged together out of the latest gee-whiz under the august guidance of people who are not only ignorant of system synergy but actively oppose anyone who points out inherent dangers, the major problem is people as the weak link.

A curious hacker can't do anywhere near the damage that an incompetent manager writing specs will do to any profitable business or useful enterprise. They build in total vulnerability by design. Will they be fired for exposure of the backside to a harsh outside world?

CYA is the usual defense, followed closely by lying about how it came to be.

One thing that might fix some of this is to give the security audit guy (risk manager, system operator) the power to fire anyone who violates security, including the CEO or board of directors personnel. Until that day comes, the weak links are human and human-caused.

You can't fix that with procedures or paperwork guidelines or training sessions. Programmers won't fix it either: the instant some dingbat wants to have your critical plant appear on the CEO's home TV, he'll blithely start coding the wonderful idea and remove all of your hard security because it'll slow down the pretty pictures.

That doesn't mean you give up; it just means you look at problems with a wide focus. If SonyPE has been hacked over fifty times, they need a new IT department. They are not alone in needing a new IT department either, because everywhere you look you see the same set of problems. Academe is still playing catch-up over the PC, and the mainframe boys are still living in the smock-coat and "holy computer" mindset. The loon band who builds industrial stuff is narrowly focussed on getting a job done and has no training in any form of security.

The comp got embedded in all this stuff because it was cheaper than the older dedicated controls, not because it was better. Most of the stuff was bought off the shelf and kludged into whatever system needed it. In some cases something that worked was modified for some unrealistic goal that hadn't been clearly thought through.

Moderns suffer from a dumbing-down process that has limited their ability to think. The word drugs covers everything from aspirin to entheogens; the word hacker covers everything from curiosity to malignant spammers and criminals (some of whom work for nation states). If it's the only word you've got, it generates more noise than data.

Still, if this wasn't worth discussing I wouldn't be here.

Sancho_PJanuary 12, 2015 6:32 PM

@ Wael

Of course we (and our actions) are part of evolution. Regarding speed, I guess it’s similar to driving: by the time you’re losing control, it was already too much.
Some can feel that tizzy beforehand in their buttocks; others not.

@ Nick P

Sorry, what you say doesn’t make any sense to me, is it sarcasm or deception (=security)?

“I get your points about evolution and how our laws mess with it. You sort of left a critical part of it off: optimized to focus on survival. Unlike your description, predators sometimes serve a good purpose in the ecosystem and sometimes are straight up destructive. To deal with this, evolution also produced the organisms like us who could create social and/or physical structures to prevent such destruction. We reduce what we can't prevent. We do this to ensure the very important goals of stability, life, liberty, pursuit of happiness, and other good things.” (emphasis added)

- No, Sir, never “destructive”, history since the big bang proved otherwise.
- OK, we “could create” (probably, I’m not sure in this point).
- [cough] - ”stability, life, liberty, …” - Do you sometimes watch the news (without listening to their propaganda)?

“So, now let's look at the situation. Almost all computer and network architectures were optimized for everything but security. Vast amounts of human and economic activity depend on their continued function. Failure of certain systems, esp backend mainframes or Internet backbone connections, could cause operations to crumble. This is especially true at sites like hospitals, banks, and industrial control systems. The rule you advocate would cause sites like that to be hit in a constant free for all that could rid us of critical things we actually want.” (e.a.)

This paragraph clearly is based on your digital background, “0” or “1” ;-)
But my “proposal” (it was less than that) represents a value of 0.1 or so (?).
I’m far from “1”, proposing “do what you want, we’ll never prosecute you”.
Your conclusion seems to ridicule the basic idea of cooperation between adults, like a politician would defend their party line by shouting “terror - terror - we all will die”.

[ BTW "rid us of critical things" would be better NOW than tomorrow ]

There are thousand shades of grey between white and black.
We don’t need laws for that: to steal, to make money off theft, or to intentionally harm someone must be prosecuted, also in IT or on the Internet.

But, just to show the direction:
It would be no crime - on the contrary - to send “hacked” data to an independent gov organization (e.g. NSA) which will check, confirm and finally sue the “victim” in public sight.
The fine must be substantial, increasing, and in particular reduce the bonus payments of the top brass, to be an incentive for improvements in security (give them one year to prepare).
On the other hand, part of the fine should be used to reward the whistle blower / hacker and provide funding for a statewide insurance in case of unintentional harm or loss, but never to cover the “victim”.
- The attempt to “hack” must of course be free of any prosecution / recourse.


Last but not least, you then reiterate your hope of getting the public’s awareness, but at the same time your arguments clearly show that you yourself don’t believe in that fiction.

Here I agree wholeheartedly, and at the same time I’ll confess that I have absolutely no hope that anyone would pick up my hint ;-)


@ Clive Robinson

I was pointing at Natanz because some here seem to propose no or restricted (dialup?) network access, which is of course possible for private home automation (and they’ll do it, just for the increased fun factor) but not for an industrial complex with hundreds of computers and PLCs.


Yes, the very first dedicated cyber terror / attack was always officially denied, but nowadays it is proudly (off the record, needless to say, only conspiratorially hinted at by “former US intelligence officials” or so) credited to the Brave American Nation.

This cowardly hidden action of warfare gives both example and excuse for others to do the same to other states except the US because they will react like kids:
Tit for tat, and the others were first, Sir !

Nick PJanuary 12, 2015 11:11 PM

@ Sancho_P

"Sorry, what you say doesn’t make any sense to me, is it sarcasm or deception (=security)?"

You originally posted comments like these:

"Not more laws - but less. Hackers go free - this is the incentive for the corporations, no extra money needed. “Hack them, beat the hell out of them - the best hacks will be published”, that’s all. Corporation security would improve dramatically. - Think capitalistic, think of the free market - not in lobbies and corrupted laws."

"There are predators, too, but they are an essential part of the whole system.
And no, there aren’t good or bad predators, no white or black, all have their place and are there to the benefit of others."

"The basic idea was: Give the predators a chance to clean the system.
Let them work and find the flaws before it gets out of control."

Hackers go free, beat the hell out of companies, no good or bad predators, and predators clean out the system. Yeah, it seemed like you were pushing in an anarchistic direction that gave hackers blanket immunity. You didn't mention any limits whatsoever. You were also arguing against regulations or government-driven incentives I pushed. So, I started with the premise that you wanted all hacking to be legal and decided it would be a damaging free for all.

Now you're talking about a liability-based scheme where hackers submit their results to a government organization that can fine companies. Quite opposite of what you posted before. Interesting concept, though. Worth exploring further.

gordoJanuary 12, 2015 11:43 PM

@ Nick P to Sancho P
"Now you're talking about a liability-based scheme where hackers submit their results to a government organization that can fine companies."

They better act fast:

Google No Longer Provides Patches for WebView Jelly Bean and Prior

"…it would appear that over 930 million Android phones are now out of official Google security patch support." [emphasis in original]

https://community.rapid7.com/community/metasploit/blog/2015/01/11/google-no-longer-provides-patches-for-webview-jelly-bean-and-prior

Sancho_PJanuary 13, 2015 3:52 PM

@ Nick P

Right, I wanted to provoke with a bang - and at least I hit you ;-)
Thanks for thinking along!


Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Resilient.