Comments

Nick P March 16, 2012 10:15 PM

Re: NSA Datacenters

This seems stupid. I think they’re probably targeting protocol and cryptosystem failures rather than AES itself. If they think they can crack it, they know about critical flaws that every other crypto group missed. Unlikely. They might be able to crack some of these other weak cryptosystems. Personally, I think it will mainly be for collection, analysis, and cracking weakly implemented or used (e.g. password-based with bad passwords) crypto schemes.

I mean, a supercomputer can’t beat a properly implemented symmetric encryption system. That’s why NSA uses them extensively for government. Different, unlikely angle: the organization that often had stuff beyond the “state of the art” in the past has a working quantum computer design & is building it in a huge secret building. This might be capable of cracking certain cryptosystems. Unlikely, though.
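
As a rough back-of-the-envelope sketch of why brute force against a well-designed 128-bit cipher is off the table for any classical machine: assume a wildly optimistic 10^18 key trials per second (the figures below are illustrative assumptions, not anyone's real benchmark).

#include <math.h>
#include <stdio.h>

int main(void) {
    /* Illustrative numbers only: 2^128 possible AES-128 keys,
       an assumed 10^18 key trials per second. */
    double keys      = pow(2.0, 128.0);              /* ~3.4e38 keys      */
    double trials_ps = 1e18;                         /* optimistic rate   */
    double seconds   = keys / trials_ps;
    double years     = seconds / (365.25 * 24.0 * 3600.0);

    printf("Exhaustive key search: about %.2e years\n", years);
    /* Roughly 1e13 years, around 800 times the age of the universe,
       which is why real attacks go after implementations, protocols
       and key handling rather than the cipher itself. */
    return 0;
}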

Jonathan Wilson March 16, 2012 11:52 PM

It's not beyond the realms of possibility that the NSA has classified technology with the power to crack encryption systems. After all, the NSA's decryption mission and work ultimately date back to the World War 2 effort to decode the German and Japanese ciphers, and that work (or more specifically the Bombes used to crack the German Enigma code) was beyond the publicly known state of the art.

And it's known (through declassified materials) that IBM and others built computers and special add-ons for the NSA that were ahead of what was sold to everyone else (the IBM 7950, for one).

Clive Robinson March 17, 2012 9:31 AM

@ Jonathan Wilson,

And it's known (through declassified materials) that IBM and others built computers and special add-ons for the NSA that were ahead of what was sold to everyone else (the IBM 7950, for one).

Historically it's actually a bit more interesting than that 😉

Various manufacturers, including the UK's ICL, wanted to put some of these "add-ons" in as standard but were "persuaded officially" not to do so.

A couple of the add-ons in particular were "set bit count" and "parity bit", both of which are very basic functions for data communications. Thus the "policy" was well known in the industry long before any "declassified documents", because lots of people were asking the manufacturers to add them and were being fobbed off with increasingly ridiculous excuses. Likewise, but to a lesser extent, was hardware for programmable linear feedback shift registers (LFSRs). The silly thing about it was that the high-speed hardware needed to do these functions was added as standard, but not in the ALU: it went in the communications circuits.
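
For readers who haven't met them, the two functions mentioned above are trivial to express in software; here is a minimal C sketch (an illustration only, not any particular machine's instruction set), which also shows why grinding through them a bit at a time is exactly the slow path a dedicated instruction avoids:

#include <stdint.h>
#include <stdio.h>

/* "Set bit count" (population count): number of 1 bits in a word. */
static unsigned popcount32(uint32_t x) {
    unsigned n = 0;
    while (x) {
        x &= x - 1;   /* clears the lowest set bit each pass */
        n++;
    }
    return n;
}

/* "Parity bit": 1 if the number of set bits is odd, 0 otherwise. */
static unsigned parity32(uint32_t x) {
    return popcount32(x) & 1u;
}

int main(void) {
    uint32_t w = 0xDEADBEEF;
    printf("popcount = %u, parity = %u\n", popcount32(w), parity32(w));
    return 0;
}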

Another add-on that was known about was a fast lookup table, and this function did end up in the ALU as part of fast maths functions. The difference being that the ordinary version was "hard coded" with a ROM to do 8-bit multiplication etc., whilst the "confidential" version had fast RAM and thus could be programmed to do all sorts of interesting things.
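
A toy C model of that difference (my own illustration, not the real hardware): a ROM table only ever computes the one function baked into it, whereas a RAM-backed table can be reloaded with whatever two-input byte function is wanted (the XOR here is just a stand-in for "interesting things"):

#include <stdint.h>
#include <stdio.h>

/* A 64K-entry byte table standing in for the ALU's fast lookup memory. */
static uint8_t table[256][256];

/* "ROM" behaviour: loaded once with the low byte of an 8x8 multiply. */
static void load_multiply(void) {
    for (int i = 0; i < 256; i++)
        for (int j = 0; j < 256; j++)
            table[i][j] = (uint8_t)((i * j) & 0xFF);
}

/* "RAM" behaviour: reload it with any two-input byte function you like. */
static void load_custom(uint8_t (*f)(uint8_t, uint8_t)) {
    for (int i = 0; i < 256; i++)
        for (int j = 0; j < 256; j++)
            table[i][j] = f((uint8_t)i, (uint8_t)j);
}

static uint8_t xor_fn(uint8_t a, uint8_t b) { return a ^ b; }

int main(void) {
    load_multiply();
    printf("table[7][9] as multiply = %u\n", table[7][9]);   /* 63 */
    load_custom(xor_fn);
    printf("table[7][9] as XOR      = %u\n", table[7][9]);   /* 14 */
    return 0;
}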

Likewise "fast adders": these are a bit of a black art and are still an area of research. However, it is known that certain types lend themselves very well to certain "other addition" functions used in cryptography (think addition within a modulus for turning LFSRs into NLFSRs etc.).
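
As a hedged illustration of the idea (a toy, not any real classified design): a 16-bit Fibonacci LFSR with a commonly published tap set, with two such registers combined by addition modulo 2^16; the carries introduced by the addition are what break linearity over GF(2). The taps and seeds below are illustrative assumptions.

#include <stdint.h>
#include <stdio.h>

/* One step of a 16-bit Fibonacci LFSR with taps 16,14,13,11
   (a well-known maximal-length tap set). */
static uint16_t lfsr_step(uint16_t s) {
    uint16_t bit = ((s >> 0) ^ (s >> 2) ^ (s >> 3) ^ (s >> 5)) & 1u;
    return (uint16_t)((s >> 1) | (bit << 15));
}

int main(void) {
    uint16_t a = 0xACE1, b = 0x1D2C;   /* arbitrary nonzero seeds */

    for (int i = 0; i < 8; i++) {
        a = lfsr_step(a);
        b = lfsr_step(b);
        /* Combining two linear streams with addition mod 2^16:
           the carries between bit positions are what make the
           combined output nonlinear over GF(2). */
        uint16_t out = (uint16_t)(a + b);
        printf("%04X\n", (unsigned)out);
    }
    return 0;
}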

Another add-on, known to have originated with Alan Turing, was what we would now call a "True Random Number Generator" based around thermal noise. Reference can be seen to it in his papers as being especially useful for certain types of mathematical calculation, with suggestions that it be added to the design then in progress in Manchester.

As far as I'm aware, the only non-"confidential" computer to have such functionality built in at the time was ERNIE, used by the UK Government for its "Premium Bonds" draw.

But the whole "confidential ALU hardware" business got to be almost farcical towards the end. The US Government in particular wanted the add-ons but not to pay the increasingly large costs of "adding them on", so certain manufacturers did the "IBM 'strapping' trick" of actually building in the functions as standard and having them enabled/disabled by a wire-link 'strap' on the CPU PCBs.

Quite a few years ago I was shown the guts of a product (of a well-known supercomputer manufacturer of such "NSA" products) that was being scrapped. It should originally have been "confidential waste" but ended up in a European university, which had finally decided that the electricity bill and leaking artificial blood plasma were not worth the expense any more. I was shown it as they knew I'd been involved with the design of what were the equivalent of supercomputers (but for things based around FFTs). It was not too difficult to spot the "confidential" parts, map out the basic functioning, and work out what the missing instructions were from the microcode instruction set.

So in many respects the whole business of "confidential ALU hardware" add-ons was well known to very many in the industry.

Perhaps what is most amazing is the influence the NSA, GCHQ, et al had on manufacturers in the first thirty years of the development of computers, and in fact just how much better the UK government was at suppressing it than the US (with the result that few people knew what the UK's input to computing during that time was). It needs to be noted that the influence of the US Government on that of the UK was fairly obvious and revolved around the significant debt that arose from before and after "lend-lease", which was just referred to as "war debt".

Figureitout March 17, 2012 1:44 PM

@Bruce

Now we know what type of USB stick you would plug into your PC if there just happened to be a few random ones scattered around your work parking lot.

@Daniel

Well, I guess that means we need another security agency…the STA (Staircase Transportation Agency), body scanners at the top and bottom of each staircase so kiddies won’t be impaled with metal objects when (not if) they fall, stairs can be no taller than 3 inches, and must be padded with 3 inches of foam-padding. Sorry, we’ve gotta do it…if not for you…for the children.

@Diego
Very nice, too funny.

Thought this was interesting…may have some security applications later on.

“A team of scientists have successfully implanted a bioelectronic fuel cell into a living organism and used its blood sugar to charge a battery.”

http://www.theregister.co.uk/2012/03/15/snails_generate_electricity/

Terry Cloth March 17, 2012 3:15 PM

@Clive Robinson: “how much better the UK government was at suppressing it than the US”

I suspect that’s because the UK’s Official Secrets Act is much more stringent than anything that would pass First-Amendment review over here. (At least so far.)

aikimark March 17, 2012 4:17 PM

Pay attention to Section 215 in the latest version of the Patriot Act. A couple of senators (who are in a position to know) are urging the release of Justice Dept documents describing how Section 215 is being interpreted. These senators are worried that the intent of the legislation has been subverted by this interpretation.

Nick P March 18, 2012 12:03 PM

@ guy posting March 18, 10:13AM

Just paranoid lunacy. You're vastly overestimating their capability. They would have practically no enemies in espionage, except insiders, if they had that much actionable information. The truth is that a bunch of governments are worried about backdoors and, while the NSA has failed to backdoor everyone, the DOD has ended up using many counterfeit or subverted parts from China. If they had the control people like you claim, these kinds of issues would be nonexistent & we wouldn't see enemies sucking GBs of data out of our networks.

Nobody needs tin foil, etc. Keeping cell phones & COTS wireless stuff away from sensitive electronics & doing EMSEC on that stuff gets rid of most of those issues if it’s a concern. Mobile phones and devices can’t be trusted b/c there’s only about 6 companies making the chips. Moving the process size up a few levels, sourcing it from many suppliers, and testing samples of them is the best way to reduce subversion risk. And we have high quality open source OS’s and tools.

No need to be unduly paranoid. The real world is scary enough as it is.

Bob March 18, 2012 2:01 PM

From that Wired article:

“They made a big breakthrough,” says another former senior intelligence official, who helped oversee the program. The NSA’s machine was likely similar to the unclassified Jaguar, but it was much faster out of the gate, modified specifically for cryptanalysis and targeted against one or more specific algorithms, like the AES. In other words, they were moving from the research and development phase to actually attacking extremely difficult encryption systems. The code-breaking effort was up and running.

The breakthrough was enormous, says the former official, and soon afterward the agency pulled the shade down tight on the project, even within the intelligence community and Congress. “Only the chairman and vice chairman and the two staff directors of each intelligence committee were told about it,” he says. The reason? “They were thinking that this computing breakthrough was going to give them the ability to crack current public encryption.”

What do you think it is, Bruce?

Keith March 18, 2012 2:18 PM

@Gordon: my favorite new information from the WSJ article: John Pistole thinks there are only seven terrorists in the world!

“We can reduce the size of the haystack when we are looking for that one-in-a-billion terrorist,” said TSA Administrator John Pistole.

MarkH March 19, 2012 12:08 AM

@Keith: For once, Pistole and I are in agreement. Considering terrorists who are a serious danger to the USA (Pistole’s brief, after all), and who might possibly consider attempting to come to the USA … I suppose that 7 is a plausible estimate (or rather, an upper bound).

Brandon March 19, 2012 1:49 AM

+1 to the wired article regarding the NSA’s operations.

Three interesting takeaways:
– all of the detail of what's going on in Utah, Tennessee, and elsewhere
– the amazing amount of unconstitutional activity they're doing, which gets very little attention from CNN, NYT, MSNBC, ABC, CBS, etc. now that Bush is gone
– how they're not interested in trying to be targeted (following the spirit of the law), but instead simply gather everything said by everyone and store it forever.
http://www.wired.com/threatlevel/2012/03/ff_nsadatacenter/all/1

I wonder how their computing power compares with Google, and how much of what the NSA does can be (and already is, in a way) duplicated by Google’s extensive machinery. Of course, whether and how much the NSA and Google work together is up for speculation, no matter what Google claims.

AC2 March 19, 2012 3:12 AM

Anyone know why the power substation would be outside the security fence, as per the schematic in the Wired story (from the link in Brandon's post)?

Clive Robinson March 19, 2012 6:23 AM

@ Brandon,

Of course, whether and how much the NSA and Google work together is up for speculation, no matter what Google claims.

The "work together" might not be known to Google management…

We know the NSA "put employees in place" in major US telcos prior to 9/11 to ease wiretaps etc. In some cases this was unknown to much of the telcos' senior management. This sort of behaviour goes back to before the NSA existed, with the "Cable Companies" prior to US involvement in WWII.

@ AC2,

Anyone know why the power substation would be outside the security fence

Specifically, no, but you can make reasonable guesses.

First off, if you were an adversary who wanted to put the place out of commission using low tech or a low personnel count, you would go for either the power or the data communications well outside of the security perimeter, and well away from roving foot patrols etc.

The comms can be fairly easily hidden or sent via radio, not so the power. Thus the secure compound has its own three-day UPS. Further, I would suspect there are also plans in place to gradually power the place down to critical functions only, to extend that period, and arrangements made to ship in fuel and ship out functionality and personnel to other compounds in other places.

The cost of that tank-proof compound fence is going to be a bit more than a few bucks per foot of chain link, so it would be desirable to keep the perimeter as short as possible. And from the article it is not clear who owns and operates the substation… Which brings up the question of whether it needs to be secure at all; if not, then it most definitely does not belong in the secure compound.

If you think of the substation's function, it is just the terminator to a very vulnerable and insecure power transmission system. Due to the UPS it serves no security function other than as the voltage step-down for the more secure cables into the compound and the UPS. And I suppose it should be said the UPS is a security function, as it will be designed to meet EmSec requirements as a backstop to other EmSec equipment, all of which will consume power one way or another.

So no, the substation and its personnel should not be in the security compound. But that's not to say it should not be protected to a much higher level than normal for a substation of that size, or not run by security-cleared personnel. It is a rather awkward "weak link" in that whilst overhead high-voltage cables can be easily damaged, they are relatively easy and quick to bring back on line (days to weeks), compared to the substation, which whilst equally vulnerable could take ten or more times as long to repair as the overhead cables.

karrde March 19, 2012 10:29 AM

@Alan Kaminsky, a similar event (umbrella causing a security alert and a search for “man with a rifle”) has happened before.

I think last time, it was at a mall in the Phoenix area.

That umbrella may have also had a samurai-sword-style handle…

Clive Robinson March 19, 2012 10:40 AM

@ Jacob,

It appears someone figured out the code for Duqu

I loved the "old school" from "ten years ago". God alone knows what that makes me: I still cut pre-ANSI K&R C from time to time, in a compiler from the DOS 2 days based on the "Small C Compiler"…

I shall now slither back into my Carboniferous-era swamp 😉

jacob March 19, 2012 11:50 AM

@clive. Yep, but we can still teach the young'uns stuff. I thought the DLL trust was a clever move. Holdover from the old days.

I was recently told about a company leaving malware between switch ports. Nasty piece of work, that. Also, UPX packing, but that has been going on for a while.

Here in the US the CIA is talking about using networked devices to spy on targets. I have assumed they already are, via webcams.

You and I are old school. I learned programming on punch cards, and I occasionally get weepy over Kaypro and Windows 2.0 before I slap myself back to reality. There are still municipalities that use COBOL, as well as AS/400s. Doesn't that make you shudder?

Security is still a game of cat and mouse, but never boring. 😉

Clive Robinson March 20, 2012 3:39 AM

OFF Topic:

This is both a little odd and also an object lesson in what can happen when bluff and double bluff from non-attributable sources occur and people have to make choices on very imperfect information for reputational or other reasons.

Basically, there was a project on SourceForge called Anonymous OS, based on a version of Linux and carrying the Anonymous hacktivist logos. The reason it's "was" is that SourceForge pulled it:

http://www.pcworld.com/businesscenter/article/251956/after_review_sourceforge_gives_the_boot_to_anonymousos_project.html

Now I have no idea if it's got security problems or not, but to be honest I would not be surprised if quite a few were found. Not because they have been put in by some shadowy group behind Anonymous OS, be they hacktivists or LEOs, but because that is the nature of large blocks of code.

Like the story about how the US agencies "backdoored BSD", it will be interesting to see how this pans out.

Because if you think about it, a security project's credibility is almost entirely based on "Reputation", because you can't prove a software project is secure.

Now the question is how long before people start talking about "Reputation Attacks" as APTs or cyber-weapons? Anyone want to start a sweepstake?

Bob March 20, 2012 6:44 AM

Tor Browser Bundle for Linux (2.2.35-8) “EVIL bug”

"There is an EVIL bug in at least the Linux (2.2.35-8) Tor Browser Bundle start-tor-browser script. It will log things like domain names to a file in the root of the browser bundle."

https://trac.torproject.org/projects/tor/ticket/5417

Ticket #5417 (new defect)

RelativeLink.sh in Tor browser bundle has small typo causing debug mode to be always turned on

Reported by: cypherpunks
Priority: critical
Component: Tor bundles/installation

Description

TBB starts in debug mode regardless of whether the --debug switch is used or not. This is caused by a small bug on line 208 of RelativeLink.sh, where it says

if [ "${debug}" ];

where it should say

if [ "${debug}" == 1 ];

or

if [ ${debug} -eq 1 ];

http://seclists.org/bugtraq/2012/Mar/85

Clive Robinson March 21, 2012 1:56 AM

OFF Topic:

Some may remember one of the reasons Stuxnet was a bit of a shock: the code was signed with (believed to have been stolen) private keys corresponding to official code-signing certificates.

Well, it appears that malware developers are now doing the same, because using stolen private keys makes it so much easier to get around AV software, due to a defect in the way the AV software works.

Some people may remember I have a very long-standing and significant dislike of code signing as a security measure, because it is weak to the point of being "faux security" and can be got around in a number of ways, some of which are:

1, Work for the organisation upstream of its code-signing process; this is believed to have happened a number of times.
2, Get the organisation to sign your changes/additions to their code; this happens with mobile phone code, hardware driver code etc.
3, Find the private key of the public key; this has happened to a number of hardware/software organisations that have used short or weakly generated random numbers for the keys.
4, Steal the improperly protected private key from the organisation, which is believed to be what happened with Stuxnet.
5, Generate a false certificate by compromising a CA; we've seen this a number of times with browser certificates, so it's a reasonable assumption that code-signing certificates are just as vulnerable.

Well Swiss company Conpavi AG’s private key for their digital certificate (issued by Symantec) appears to have been used by the writers of the “Mediyes trojan malware” according to Kaspersky Lab a few days ago.

This has started an investigation into how the private key was compromised. Symantec, who signed the corresponding public key of the now-compromised private key, have revoked the certificate concerned.

One of the reasons using compromised private keys to sign code is desirable for malware writers is the heuristics in AV software. The manager of Symantec security response, Liam O'Murchu, said,

“We have seen more [malware] being signed, sometimes with stolen certificates,” because “It lends an air of legitimacy to the file.”

He further indicated that, as Symantec has evolved its malware protection method, it has included a risk-based scoring system based on several indicators, as a speed-up to determine if code arriving on a PC is benign or malevolent (i.e. just like any "fast track" process common in "security theater").

The problem is that any code which is "Digitally Signed" gets a very significant advantage in the Symantec risk-based scoring system. What is not clear is whether being "signed" gives it a pass from all further checking or not (Symantec aren't saying).
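
To make the concern concrete, here is a purely hypothetical sketch of such a scoring system (not Symantec's actual one, whose internals are not public); all indicator names, weights and the threshold are made up for illustration. It shows how a large credit for a "valid" signature is exactly what a stolen signing key buys.

#include <stdbool.h>
#include <stdio.h>

/* Hypothetical indicators for a file arriving on a PC. */
struct sample {
    bool has_valid_signature;   /* signature chains to a trusted CA  */
    bool packed;                /* e.g. UPX or a custom packer       */
    bool low_prevalence;        /* rarely seen across the user base  */
    bool writes_to_system_dir;
};

/* Toy risk score: higher means more suspicious.  The big discount for
   a "valid" signature is the weak point a stolen key exploits. */
static int risk_score(const struct sample *s) {
    int score = 0;
    if (s->packed)               score += 30;
    if (s->low_prevalence)       score += 25;
    if (s->writes_to_system_dir) score += 25;
    if (s->has_valid_signature)  score -= 50;   /* signed => big discount */
    return score;
}

int main(void) {
    struct sample trojan = { .has_valid_signature = true,   /* stolen key */
                             .packed = true,
                             .low_prevalence = true,
                             .writes_to_system_dir = true };
    printf("score = %d (block threshold, say, 60)\n", risk_score(&trojan));
    /* 30+25+25-50 = 30: the same behaviour unsigned would score 80 and
       be blocked, but signed it sails under the hypothetical threshold. */
    return 0;
}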

Liam O'Murchu has acknowledged that compromised private keys and the corresponding digital certificates are available in online criminal black markets, such as the "carder sites" where you would expect to find stolen credit card numbers etc.

He further acknowledged that if malware writers are effectively figuring out how to get around Symantec’s AV software scoring system it will need to be recalibrated…

http://www.pcworld.com/businesscenter/article/252099/stolen_encryption_key_compromised_symantec_certificate.html

Clive Robinson March 21, 2012 2:34 AM

The Center for Strategic and International Studies (CSIS) has come up with a list of what they consider "Significant Cyber Incidents" since 2006.

It's a ten-page PDF with 94 incidents in it; it's noticeable that the view on who is behind an attack appears in most cases to be based on that nation's view of who "public enemy number one" is…

http://csis.org/files/publication/120316_Significant_Cyber_Incidents.pdf

Oh, and neither the British nor the Israelis get mentioned as attackers, even though we know they are major players in the game along with the US, Russia, China, France and most other Northern European and WASP nations.

kashmarek March 21, 2012 4:48 PM

Isn’t the new NSA super spy center implementing the same concept that Congress shut down following 9/11, formerly known as Total Information Awareness (or TIA, handled by some retired Admiral)?

Clive Robinson March 21, 2012 10:48 PM

@ Kashmarek,

On the new Samsung privacy invading TV

Bad as the idea of the camera and mic is, it's the licence agreement that is even scarier, basically allowing them to install third-party apps and the third party to do what they want with the data/video/audio collected…

Now, as we know from the US school that thought it was OK to remotely activate the camera etc. on laptops issued to children, and thus capture them in their bedrooms in various states of undress, this does not go down well once it gets to a courtroom…

Also I’m not sure but it might well be in contravention of EU Privacy laws.

And in the UK, if they capture an image, no matter how fleetingly, of a child (i.e. under 18) in a state of undress, then that automatically activates the laws relating to "K1dd4 p0rn", irrespective of the intended purpose of the image. The UK law is written to be so encompassing that you get the "law of unintended consequences" coming into play, as professional photographers hired by the parents of children to take portrait-style photos (like the traditional baby on a fur rug) have been prosecuted. There was also a case of a girl taking a "naughty picture" of herself with her phone and sending it to her boyfriend; she was threatened with prosecution until "cooler heads" decided it "was not in the public interest to prosecute".

Clive Robinson March 22, 2012 8:30 AM

OFF Topic:

Below is another article that basically says that, legally, any "person legal or natural" (you and any and all organisations) "who puts their head in the cloud is mad"…

[Not that the author of the article would necessarily agree, as their income is derived from providing expert advice on cloud agreements]

http://www.computerworld.com/s/article/9225340/In_the_cloud_your_data_can_get_caught_up_in_legal_actions

Put simply, he raises two basic issues.

Firstly, if your data is on your servers it's yours and yours alone to do with as you wish, unlike with a cloud service provider or their backend service providers. Thus, if you receive a legal demand, forceful or otherwise (subpoena, warrant, etc.), you at least know it's happened and you maintain some measure of control over how and what is released.

[Importantly, not mentioned in the article is the technical issue of "in what format" data should be kept, A) on your servers (multi-level encryption using M-of-N key shares etc.), and B) for releasing it, so you can stop pesky metadata becoming meta-evidence against you etc.]
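
As a minimal sketch of the key-splitting idea in that aside: the code below does a simple N-of-N XOR split (all shares required to recover the key), the trivial cousin of the true M-of-N threshold schemes such as Shamir's that the comment alludes to. The rand() call is only a placeholder; a real implementation would use a CSPRNG.

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define KEY_LEN  16
#define N_SHARES 3

/* Split the key into N_SHARES shares: the first N-1 are random pads,
   the last is the key XORed with all of them.  XOR of all shares
   recovers the key; fewer than all N reveal nothing (given a real RNG). */
static void split(const uint8_t key[KEY_LEN],
                  uint8_t shares[N_SHARES][KEY_LEN]) {
    memcpy(shares[N_SHARES - 1], key, KEY_LEN);
    for (int s = 0; s < N_SHARES - 1; s++)
        for (int i = 0; i < KEY_LEN; i++) {
            shares[s][i] = (uint8_t)(rand() & 0xFF);  /* placeholder RNG */
            shares[N_SHARES - 1][i] ^= shares[s][i];
        }
}

static void recover(uint8_t shares[N_SHARES][KEY_LEN],
                    uint8_t key[KEY_LEN]) {
    memset(key, 0, KEY_LEN);
    for (int s = 0; s < N_SHARES; s++)
        for (int i = 0; i < KEY_LEN; i++)
            key[i] ^= shares[s][i];
}

int main(void) {
    uint8_t key[KEY_LEN] = "0123456789abcde";   /* 15 chars + NUL */
    uint8_t shares[N_SHARES][KEY_LEN], out[KEY_LEN];

    split(key, shares);
    recover(shares, out);
    printf("recovered matches original: %s\n",
           memcmp(key, out, KEY_LEN) == 0 ? "yes" : "no");
    return 0;
}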

Secondly, and perhaps more importantly, your data can become "collateral damage" to legal action against others and thus unavailable to you through no direct fault of your own.

The example given in the article is the case of Megaupload: various agencies in the US gov went after it, grabbed all the data and refused access to innocent data owners. The agencies then deliberately gave data owners the runaround, knowing that what they were telling the data owners was a useless fabrication to get them off their backs.

[Like the article's omission on data formats, this is more serious than the article appears to make out. This is because of the underlying technical issue that, in reality, data storage in the cloud is actually "backended" to one of just a very few providers, over which you often have no control, not even the flimsy paper control of a Service Agreement or Contract.]

Further, only tangentially mentioned is that such action by agencies puts your data in their hands, which means it becomes available to them without presenting you with a warrant etc.

[Again unmentioned, there is also the very real possibility the data will just get "sold off" when a court case is over and a judge allows an agency to dispose of the evidence, and it just sells off the media as "second hand" or "scrap". Researchers (Matt Blaze for one) have bought equipment and media from US law enforcement agencies on the likes of eBay and have discovered highly confidential and in some cases life-threatening data.]

Nick P March 22, 2012 12:39 PM

@ Clive Robinson

A much better example than Megaupload actually happened in 2008. I bet you can guess which one is me from what I said. Further hint: at around page 3 or so, someone attacks my hiding the details of my scheme by invoking "open source" or AES-like arguments. I smash him lol.

Original Poster & Discussion
http://www.binrev.com/forums/index.php/topic/40830-does-anyone-have-a-disaster-recovery-plan-for-fbi-raids/

Original Article
http://www.wired.com/threatlevel/2009/04/data-centers-ra/

So, we have been dealing with these issues for at least four years now. The good news is that, since cloud instances are software machines, the authorities might be able to just order that copies be made rather than seizing physical machines. In other words, a seizure might not be so bad if you're using a cloud provider rather than a non-bigtime hosting provider.
