Risks of Networked Systems

Interesting research:

Helbing’s publication illustrates how cascade effects and complex dynamics amplify the vulnerability of networked systems. For example, just a few long-distance connections can significantly reduce our ability to mitigate the threats posed by global pandemics. Initially beneficial trends, such as globalization, increasing network densities, higher complexity, and an acceleration of institutional decision processes, may ultimately push human-made or human-influenced systems towards systemic instability, Helbing finds. Systemic instability refers to a system that will get out of control sooner or later, even if everybody involved is highly skilled, highly motivated, and behaving properly. Crowd disasters are a shocking example illustrating that many deaths may occur even when everybody tries hard not to hurt anyone.
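Something like the cascade dynamics Helbing describes can be sketched with a toy load-redistribution model, where one failure overloads neighbours and the overload spreads. The topology, capacities, and numbers below are made up purely for illustration and are not taken from the paper:

```python
# Toy load-redistribution cascade: when one node fails, its load shifts
# to its neighbours, which may push them past capacity and fail in turn.
# All parameters are illustrative only.
import random

def cascade(n_nodes=100, degree=4, capacity=1.2, seed=1):
    random.seed(seed)
    load = [1.0] * n_nodes                 # every node starts with load 1.0
    neighbours = [random.sample([j for j in range(n_nodes) if j != i], degree)
                  for i in range(n_nodes)]
    failed = {0}                           # a single initial failure
    frontier = [0]
    while frontier:
        node = frontier.pop()
        share = load[node] / max(1, len(neighbours[node]))
        for nb in neighbours[node]:        # redistribute the failed node's load
            if nb in failed:
                continue
            load[nb] += share
            if load[nb] > capacity:        # overloaded neighbour fails next
                failed.add(nb)
                frontier.append(nb)
    return len(failed)

print(cascade())   # with tight capacity margins, one failure takes out many nodes
```

With a generous capacity margin the initial failure is absorbed; with a tight one it propagates, which is the amplification effect the paper is about.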

Posted on May 2, 2013 at 1:09 PM

Comments

Steve Powell May 2, 2013 3:08 PM

Greetings. I just wanted to share an example of an automated system that we Tier 1 service providers have played with, which in certain situations can become unstable. There is an automated way to subscribe bandwidth on networks using MPLS and RSVP. Never mind what MPLS and RSVP are. Take it for granted that dynamic subscription exists.

If you are not smart with your variables and get too aggressive, the entire dynamic subscription system becomes unstable and pathologies are introduced. Sometimes it is a cycle where traffic is dynamically moved around, never settling into a steady state. The worst case is an entire collapse.

The solution is to introduce weighting functions that damp the feedback loop and to install limits on how much change can occur per cycle. But without them, KABOOM!!!!
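A toy sketch of the fix Steve describes: weight (damp) the feedback and cap how far the subscription can move each cycle. This is plain illustrative Python, not MPLS/RSVP signalling; every name and number here is made up:

```python
# Hypothetical damped, rate-limited reallocation loop (illustrative only).

def reallocate(current, target, damping=0.3, max_step=50.0):
    """Move the reserved bandwidth toward measured demand, but only by a
    damped fraction of the gap and never by more than max_step per cycle."""
    step = damping * (target - current)          # weighted feedback
    step = max(-max_step, min(max_step, step))   # cap the change per cycle
    return current + step

reserved = 100.0   # Mbps currently reserved on a path
for demand in [400.0, 400.0, 120.0, 400.0, 120.0]:   # oscillating demand
    reserved = reallocate(reserved, demand)
    print(f"demand={demand:6.1f}  reserved={reserved:6.1f}")
```

With damping of 1.0 and no cap, the reservation would swing the full distance every cycle and chase the oscillating demand forever; the damped, capped version moves gradually and stays bounded.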

Nick P May 2, 2013 3:26 PM

It’s not surprising. It’s why nature uses the principles of diversity, redundancy and loose coupling.

David McClain May 2, 2013 3:48 PM

Can you supply an example of where “nature uses the principle of loose coupling”?

I’m reminded strongly of the current stock market, and its susceptibility to small perturbations.

Julien Couvreur May 2, 2013 6:05 PM

The system lacks circuit breakers like the ones we have in our electrical systems.

The problem the author identifies with the financial system seems largely a question of incentives.
If you transposed the incentives that exist in finance (namely, taxpayers “insuring” the reckless risks banks take) to people’s electrical systems (if your house burns down from an electrical fire, the cost goes to your neighbors), then I bet there would not be many circuit breakers or other protections.

Overall, I agree with the author that interconnected systems carry unique risks, and it is possible that many under-estimate such risks.

The solution is to drive more understanding and education about such risks and make sure that incentives are properly aligned on people who can influence solutions (bank owners, network owners, etc.).
If the incentives are there, then banks and other inter-connected systems will seek and likely find mitigations (diversification, loose-coupling, breakers, buffers, backups, etc).
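As a concrete illustration of the “breakers” idea in software terms, here is a minimal sketch of a circuit-breaker wrapper that isolates a failing dependency instead of letting every caller keep hammering it. The class, names, and thresholds are hypothetical, not anything from the comment:

```python
# Minimal, illustrative circuit breaker: after repeated failures, stop
# calling the downstream dependency for a cool-off period so a local
# fault does not propagate through the whole interconnected system.
import time

class CircuitBreaker:
    def __init__(self, max_failures=3, reset_after=30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None              # None means the breaker is closed

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.time() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: dependency isolated")
            self.opened_at = None          # cool-off elapsed, allow a retry
            self.failures = 0
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.time()   # trip the breaker
            raise
        self.failures = 0                  # a success resets the count
        return result
```

Tripping the breaker turns a spreading slowdown into a fast, contained failure, which is the same decoupling role the electrical analogy points at.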

Dirk Praet May 2, 2013 7:50 PM

One man’s disaster is another man’s opportunity. Therefore, many problems can only be successfully addressed with transparency, accountability, awareness, and collective responsibility.

Try selling that to any government. Or big corporation, for that matter.

Frodo May 2, 2013 9:54 PM

Taleb brought this idea up years ago. His new book is essential reading. If you don’t know who he is, you’re not a serious security professional.

Gweihir May 3, 2013 5:39 AM

@Frodo: I am a serious IT security professional and I have no idea who “Taleb” is. Care to elaborate?

Jurgen van der Vlugt May 3, 2013 5:53 AM

There’s a whole book on the subject.
Tim Harford: Adapt.
’nuff said. And re Taleb: an itsec pro on what planet?

paul May 3, 2013 9:13 AM

“One man’s disaster is another man’s opportunity” is a very misleading line when discussing network effects.

Certainly one person’s disaster is another’s opportunity (at least in the kinds of trade where there’s definitely a winner and a loser), but pretty much the whole point of system-wide disasters is that they produce many, many losers and few if any winners, with the total value of losses far exceeding the value of gains (if any).

If you can stand outside a particular network, you may be able to gain by causing it to crash, but the more interconnected and pervasive networks become, the harder that is to do. Another example of how our simpler intuitions, honed in individual or small-group interactions, may mislead us when scales change.

Clive Robinson May 4, 2013 5:06 PM

@ Frodo,

Size makes systems fragile, vulnerable...

Yes, it’s the old 0.5(n^2 – n) problem of the number of relationship links between n entities.

This has been known long, long before Taleb or any other current security writer/guru started out on their careers.

Likewise the solution, which is sub-grouping: instead of one group of ten entities with 45 relationships, you have two groups of five entities with ten relationships each, plus one further relationship between the two groups. So you have better than halved your complexity.
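The arithmetic, worked out in a couple of illustrative lines of Python:

```python
# Relationship links between n fully connected entities: 0.5 * (n^2 - n)
def links(n):
    return n * (n - 1) // 2

flat  = links(10)          # one group of ten entities             -> 45
split = 2 * links(5) + 1   # two groups of five, plus one bridge   -> 21
print(flat, split)         # complexity better than halved
```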

There is a whole load of other techniques for reducing complexity in secure systems that were written about getting on for sixty years ago. For some reason the work carried out in the 1950s and into the late 1970s appears to have been forgotten and is thus currently being “re-discovered” in both academic papers and security books.

In the past I’ve mentioned quite a few of them on this blog when talking about TEMPEST / EmSec design rules.

Maryellen Evans May 7, 2013 4:44 PM

This directly relates to research we have been doing for several years on SOA security and the interconnectivity it provides.

http://evansresourcegroup.com/wp-content/uploads/2012/01/ergtrendscecurityreportbprev2.pdf

We have a product in beta now called MQSentry, an SOA mapping and testing tool that allows you to see your SOA interconnectivity and secure it properly. The only way to secure interconnectivity is to apply the security constructs, test to make sure they are working, remediate what isn’t working, and continue to do this on a routine basis. Interconnectivity security isn’t just about risk; it’s also about liability, since you are connecting business partners to each other.

After years at this, it is very frustrating to herd cats in this area. There are the managers, SAs, and C-level executives who don’t always share information with each other, creating a vacuum of knowledge in this area. There are the vendors who don’t want customers to know about this, afraid they will rip out their software, and there are governments and governance bodies that are influenced by their own internal motives. Put those together and you get a ginormous mess.
