"The Global Cyber Game"

This 127-page report was just published by the UK Defence Academy. I have not read it yet, but it looks really interesting.

Executive Summary: This report presents a systematic way of thinking about cyberpower and its use by a variety of global players. The urgency of addressing cyberpower in this way is a consequence of the very high value of the Internet and the hazards of its current militarization.

Cyberpower and cyber security are conceptualized as a ‘Global Game’ with a novel ‘Cyber Gameboard’ consisting of a nine-cell grid. The horizontal direction on the grid is divided into three columns representing aspects of information (i.e. cyber): connection, computation and cognition. The vertical direction on the grid is divided into three rows representing types of power: coercion, co-option, and cooperation. The nine cells of the grid represent all the possible combinations of power and information, that is, forms of cyberpower.

The Cyber Gameboard itself is also an abstract representation of the surface of cyberspace, or C-space as defined in this report. C-space is understood as a networked medium capable of conveying various combinations of power and information to produce effects in physical or ‘flow space,’ referred to as F-space in this report. Game play is understood as the projection via C-space of a cyberpower capability existing in any one cell of the gameboard to produce an effect in F-space vis-à-vis another player in any other cell of the gameboard. By default, the Cyber Game is played either actively or passively by all those using network-connected computers. The players include states, businesses, NGOs, individuals, non-state political groups, and organized crime, among others. Each player is seen as having a certain level of cyberpower when its capability in each cell is summed across the whole board. In general, states have the most cyberpower.
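As a toy illustration of the gameboard bookkeeping described above (my sketch, not the report’s), a player’s cyberpower can be represented as a score in each of the nine cells and summed across the whole board:

```python
# Toy sketch (not from the report): a player's cyberpower as a 3x3 grid,
# with information aspects as columns and power types as rows.
ASPECTS = ["connection", "computation", "cognition"]
POWER_TYPES = ["coercion", "co-option", "cooperation"]

def total_cyberpower(capability):
    """Sum a player's capability score across all nine cells of the gameboard."""
    return sum(capability[(p, a)] for p in POWER_TYPES for a in ASPECTS)

# Illustrative, made-up scores for a hypothetical state player.
state = {(p, a): 1 for p in POWER_TYPES for a in ASPECTS}
state[("coercion", "connection")] = 5  # e.g. strong network-disruption capability

print(total_cyberpower(state))  # 13: eight cells at 1, one at 5
```

The scores here are invented; the point is only that a player’s overall cyberpower is the aggregate of distinct capabilities, so two players with equal totals can have very different profiles across the board.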

The possible future path of the game is depicted by two scenarios, N-topia and N-crash. These are the stakes for which the Cyber Game is played. N-topia represents the upside potential of the game, in which the full value of a globally connected knowledge society is realized. N-crash represents the downside potential, in which militarization and fragmentation of the Internet cause its value to be substantially destroyed. Which scenario eventuates will be determined largely by the overall pattern of play of the Cyber Game.

States have a high level of responsibility for determining the outcome. The current pattern of play is beginning to resemble traditional state-on-state geopolitical conflict. This puts the civil Internet at risk, and civilian cyber players are already getting caught in the crossfire. As long as the civil Internet remains undefended and easily permeable to cyber attack it will be hard to achieve the N-topia scenario.

Defending the civil Internet in depth, and hardening it by re-architecting will allow its full social and economic value to be realized but will restrict the potential for espionage and surveillance by states. This trade-off is net positive and in accordance with the espoused values of Western-style democracies. It does however call for leadership based on enlightened self-interest by state players.

Posted on May 22, 2013 at 12:05 PM • 11 Comments


name.withheld.for.obvious.reasons May 22, 2013 1:39 PM

Our research team has worked in a similar vein, on a concept that has been described as Know-fare: essentially, knowledge being leveraged in a cyber battlespace. What we wanted to understand is the concept of a militarized battlespace as an extension of thought. It is not pretty; the number of elements of society affected is hard to overstate. We are definitely headed in this direction.

fatbloke May 22, 2013 3:44 PM

Looks and sounds like impenetrable “cyber drivel” to me, written to justify an expensive consultancy project…

Clive Robinson May 22, 2013 3:46 PM

There is one obvious point to make about this, where the authors say,

    Defending the civil Internet in depth, and hardening it by re-architecting will allow its full social and economic value to be realized but will restrict the potential for espionage and surveillance by states.

Neither the “hardening” nor the “re-architecting” would be required if the likes of the major commodity software companies had actually done their jobs properly, rather than rushing half-baked, overly featured software out the door onto unsuspecting customers.

As I’ve noted a number of times before, the only thing unregulated “free markets” are going to produce is a “race to the bottom”, where first quality, then reliability, then profits are sacrificed in order to be perceived as “first to market”.

Oddly enough, perhaps, mandating a regulated minimum actually opens the market up to competing on aspects other than “first to market”. This gives not only consumer choice but also a degree of stability that allows specialisation in certain areas that then become mainstream.

An example of this is safety features in cars etc. When a certain minimum was required by regulation, they had to be built in, not bolted on. This drove design innovation, which produced other benefits and a market for safety features that had previously not existed.

The hard part for legislators is working out the minimum amount of regulation needed to encourage the desired behaviour without stifling innovation.

winter May 23, 2013 5:11 AM

@Clive Robinson
“Neither the “hardening” nor the “re-architecting” would be required if the likes of the major commodity software companies had actually done their jobs properly, rather than rushing half-baked, overly featured software out the door onto unsuspecting customers.”

I have often wondered what the current state of cyber security would have been if Microsoft had kept Windows at a minimum level of security, on a par with what Apple and BSD/Linux had at the time.

For instance, self-propagating computer viruses still seem to be limited to Windows. And much of the rise of cyber crime can be attributed to such viruses.

grk May 23, 2013 6:24 AM

Microsoft is responsible for its poor software design in the 95-2005 period. But worms focused on Windows because it was installed on 98% of end-user computers.
Apple computers have recently come into focus for the same reason: market share.
It might be the case for GNU/Linux at some point, and I really hope that the SDLC mindset will reach major Linux app developers, because that is not yet the case, and no company can incentivise OSS developers to achieve it.

To come back to the initial topic: greed for power (in all its forms) will always battle with greed for freedom. Both extremes are utopias.

winter May 23, 2013 6:36 AM

“But worms focused on Windows because it was installed on 98% of end user computers. ”

Some design “decisions” were damaging beyond mere market share.

The initial surge of cyber malware was mostly carried by viruses that could install and propagate from infected emails. The infected machines became the basis of the botnets of the last decade. The same goes for “infected” web pages.

I have not seen many (or indeed any) such viruses on any OS other than Windows. Even now, with Mac OS X having a sizable market share, I do not see viruses propagating by email.

Another horrible decision by MS was to run internal OS services over external IP ports. That way, a Windows machine could be infected over the network even before its installation was completed. I know of no other OS that does that.

Clive Robinson May 23, 2013 7:58 AM

@ Winter, GRK,

Microsoft are one of many offenders, and I’ve no particular wish to single any one of them out when it comes to security.

Microsoft, however, with their relentless drive to make the world Microsoft via various actual and faux first-to-market techniques, and later other less salubrious techniques, did make themselves target number one with attackers, simply because of their dominance at the time. We now see similar with Adobe and Oracle products: because they are on virtually every user system and quite a few servers, and because the code base is generally platform-agnostic until the build phase, the same or very similar faults appear on all platforms.

I noted on this blog a long time ago, certainly well before Google Chrome was even hinted at, that the attackers’ game was moving from the OS layer to the app layer, simply because many apps had become the equivalent of OSs in their own right in order to get the performance, but without the hard-won security that was going into OSs at the time. The app type I had first seen this start happening to was web browsers, which I discussed back in ’95 on a university Masters course. At the time web browsers were in effect becoming the new “work environment”, as developing for the desktop was considerably harder due to the issues with MS MFC, amongst others. The easy attack vector being exploited at the time was the lack of state in web browsers, an issue that still haunts us today in many ways through what are in effect “session tickets” built around cookies etc.
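A minimal sketch of that cookie-based “session ticket” problem (my illustration, with hypothetical names, not anything from the comment): HTTP is stateless, so servers re-identify users via a token in a cookie. An unsigned cookie such as `user=admin` is trivially forgeable; signing the value with an HMAC prevents forgery, assuming the server-side secret stays secret:

```python
# Sketch: signed session tickets. An unsigned cookie value can be rewritten by
# the client at will; an HMAC over the value binds it to a server-side secret.
import hmac
import hashlib

SECRET = b"server-side-secret"  # hypothetical key; never sent to the client

def make_cookie(user: str) -> str:
    """Issue a session ticket: the username plus an HMAC-SHA256 signature."""
    sig = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    return f"{user}:{sig}"

def verify_cookie(cookie: str) -> bool:
    """Accept the ticket only if the signature matches (constant-time compare)."""
    user, _, sig = cookie.rpartition(":")
    expected = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

cookie = make_cookie("alice")
assert verify_cookie(cookie)                        # genuine ticket accepted
assert not verify_cookie("admin:forged-signature")  # forgery rejected
```

This is only the forgery half of the problem, of course; stolen-but-valid tickets (session hijacking) need separate defences such as expiry and transport encryption.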

One reason MS got badly hit was their building IE into the desktop in the way they did, and because of it they are still paying the penalty.

However, MS had another issue where security problems were rife, still are today, and in some ways are the major cause of our current attack vectors: the support of legacy code across many updates to the OS, apps, protocols and standards.

For instance, I know for a fact there are still people out there using self-signed PK certs that were generated on systems with a broken random number generator (none of which was MS code, but the certs are still used with MS code). The people who generated the certs have long since left their organisations for “bigger and better” things, and those who have replaced them have no idea of the origin of the certs but fear changing them for many customer-facing reasons.
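A toy demonstration of why such certs are dangerous (my sketch; the seeding scheme is invented for illustration, loosely reminiscent of the 2008 Debian OpenSSL weak-key incident, where keys were effectively seeded by the process ID): a broken RNG drawing from a tiny entropy pool makes supposedly independent keys collide and, worse, enumerable:

```python
# Toy demonstration, NOT real key generation: a broken RNG behaves like one
# seeded from a tiny pool, so certs generated on different machines can end
# up with identical private keys that an attacker can simply enumerate.
import random

def broken_keygen(pid: int) -> int:
    """Derive a 128-bit 'key' from an RNG whose only entropy is a process ID."""
    rng = random.Random(pid % 32768)  # entropy pool of just 32768 states
    return rng.getrandbits(128)

key_a = broken_keygen(1042)           # one machine
key_b = broken_keygen(1042 + 32768)   # a different machine, same effective seed
assert key_a == key_b  # only 32768 possible keys: an attacker can try them all
```

The real incident involved RSA key generation rather than a bare `getrandbits` call, but the failure mode is the same: once the seed space is small, every “secret” derived from it is guessable, which is why certs from such systems need replacing rather than renewing.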

As for the various security methodologies for production code cutters on commodity platforms, none of them are sufficient; they are slow to put in place, they require that all code cutters in the organisation adopt them, and all the legacy code still in use has to be re-milled. As Microsoft have admitted, their system is most definitely not for everyone, and their productivity took a massive hit during the initial phases.

It is rumoured that Oracle are fed up to the back teeth with Java due to the bad publicity, but for some reason are unable or unwilling to get a grip on the problem and resolve it. I’ve no relationship with Oracle or the old Sun teams these days so cannot say, nor can I confirm or deny the comments of others who have, or claim to have, a relationship with them. What I can say is that if the rumours are true, it’s not that surprising: the problem has been seen over and over again when one organisation takes over another. As somebody once observed, “We knew there would be teething problems, but we didn’t realise we had to do puberty and stroppy teenager issues as well!”

Security is hard, even for those with many years of experience; they are a rare resource and have a golden value. Most management would rather buy in than train, not realising that all code cutters have to be security aware. So buying in is not working for them the way they thought it would, and the number of developers with the required skills still remains well below what is needed. Consequently code is still…

Further, few realise that most of the code-independent protocols were ad hoc and were not designed with current security requirements in mind. But people still expect revision 0.5 to work with 3+.

Believe it or not, many standards are just the same: take C, with its fairly infamous security issues from K&R days. Many of those got slurped into ANSI C, and subsequently they are still there in many C++ compilers as well.

It’s a mess that is going to take one to two programmer generations to sort out, and legacy issues will still be with us due to embedded systems. As I’ve noted before, electricity and other service meters and implanted medical electronics can have expected lifetimes well in excess of a third of a century. DES died the death in considerably less time, but it still has to be supported due to embedded systems that cannot be upgraded. If the original standards had included upgradability as part of the approvals process, we would be hurting less than we currently are. Which is just one reason I say we need to raise the game by legislation that requires standards to be met, and the standards to be appropriately future-proofed in some way.

name.withheld.for.obvious.reasons May 23, 2013 9:14 AM

@Clive: You’ve about put the issue to bed.

The only thing I’d add, not to give the punch line away, but…

Java can be found in a whole host of embedded systems (ever checked your 3G/4G firmware?).

How about your embedded controllers’ management features? From a web interface, right? Under these systems are JVMs and applets, along with the commensurate presentation coding. I know, this is probably more drama than should be written for one play; I leave the playwriting to the likes of Shakespeare. What is implied is that for any number of the X billion devices Java is installed on, there is probably a JVM injection strategy to access, or deny access to, these billions of pieces of SH – the Internet of Things.

jackson May 23, 2013 9:29 AM

The anti-MS comments are really crooked. A lot of younger people repeat this stuff because they think it makes them sound knowledgeable.

Far and away the biggest undertaking by MS was to increase the usability of computing, to truly reach everyone. How many times have people compared the bare Linux kernel to a Windows machine running all these applications? Oh, see how solid the Linux machine is. Isn’t it weird how people still bring up MS mistakes decades later, and with crystal-clear hindsight? The fact is, this has nothing to do with MS. It’s about THEM.

If it were up to these guys there wouldn’t be a PC, and you’d have to go to a university to use a command line.

Dirk Praet May 23, 2013 8:47 PM

@ Jackson

    Far and away the biggest undertaking by MS was to increase the usability of computing, to truly reach everyone

In which universe was that? Back in the day, MacOS was infinitely more intuitive and user-friendly than DOS/Windows, and from a technological point of view IBM’s OS/2 was superior in every aspect. The only reason Windows became the predominant desktop system was the sheer brilliance of Bill Gates’ marketing strategy: courting software developers, brokering deals with nearly every PC hardware vendor to preload DOS/Windows, and making it so easy for even ordinary users to pirate both the OS and most software available for it.

By the time Linux had developed a GUI that was more or less usable for the average layman, MS had cornered the entire market. The bug-ridden OS/2, with its ridiculous dancing-nuns campaign, was taken to the consumer market way too early with way too few applications available for it, and Apple nearly killed itself through proprietary vendor lock-in and a failure to understand the wants and needs of an entire new generation of computer users. If it weren’t for the return of Steve Jobs, they would have gone belly up.

@ Clive

    I’ve no relationship with Oracle or the old Sun teams these days so cannot say, nor can I confirm or deny the comments of others …

As a former Sun Microsystems engineer, I was a privileged witness to its demise. The company was hit particularly hard by the burst of the internet bubble in 2000, and from then on gradually evolved from an engineering-driven to a beancounter-driven organisation, with a Mexican army of middle management alienating the brass from the people on the shop floor, and its strategy shifting from innovation and long-term vision to quarterly profits. Instead of focusing on its hardware, Solaris and Java flagships, Sun embarked on a mission to try and corner new markets, but its lack of a coherent vision got it defeated on pretty much every front. The acquisition and integration of Cobalt and the iPlanet software stack, to name just two, cost a lot of money but were abysmal failures. The appointment of Jonathan Schwartz as the new CEO didn’t work out either: despite being a great speaker, he totally failed to turn things around and effectively drove the company into the ground.

By the time Oracle bought what was left of Sun, many engineering and support teams had been laid off. The departure of James Gosling was a severe blow to Java, not to mention the many other former Java crew members defecting to Google et al. Those who stayed on found themselves in an entirely different, micro-managed company where they no longer had any say over what they were working on. It’s hardly a surprise that this does not make for a motivating environment, leading to even more defections and leaving Oracle with a product that its creator and many – if not most – of its original developers have abandoned in search of greener pastures. Anyone who has ever been involved in development knows this makes for quite a peculiar situation, especially when the product is ubiquitous and a full rewrite from the ground up is virtually impossible for compatibility reasons. And that is pretty much the conundrum Oracle is facing today.

