Comments

Anonymous December 4, 2008 2:26 PM

Likely because some executive HAD TO HAVE email functionality and didn’t think best practices applied to him/her.

Or because the best friend of another executive was hired to run the security department. That he could barely turn on a PC didn’t seem to matter at the time.

Eric December 4, 2008 2:49 PM

I’m biting my tongue on this one. Personal experience tells me this story shouldn’t surprise me in the least.

Jason December 4, 2008 2:49 PM

This puts the zeal to prosecute Gary McKinnon in a whole new light, doesn’t it?

They have to go after the “low hanging fruit” of the attackers to send a message to the real bad guys that they will prosecute you if they catch you in an extradition-friendly country.

I really, really hope Gary beats extradition because I sincerely doubt, given these facts, he will be given a truly fair trial.

He will become the example.

Jason December 4, 2008 2:53 PM

Re: Eric

Of course it is a long-standing, elephant-in-the-room sort of problem.

Why would they leak this information now if not to influence the European Court of Human Rights when they hear Gary McKinnon’s new appeal (because he was diagnosed with Asperger Syndrome)?

kangaroo December 4, 2008 2:59 PM

Passwords and firewalls and ssl, oh my!

Why the hell are these people relying on passwords alone to protect access to their systems? I bet their networks aren’t cellularized either.

90% of IT people should be sent to re-education camps with the MBAs and PR people.

Brandioch Conner December 4, 2008 3:10 PM

@kangaroo
“90% of IT people should be sent to re-education camps with the MBAs and PR people.”

Particularly with regard to governmental secrets.

Okay, if you MUST rely upon a username/password for access then why not assign a process to TRACK ALL ACTIVITY by that username? Build up a profile over time. And then alert the staff if there is any activity that is “off profile” for that username.

And why aren’t those computers being checked on a regular basis to verify that the only stuff on them is what is supposed to be on them?

Oh, that’s right:
“A fake e-mail, known as a spearphish, duped several members of the agency’s top brass and their assistants into clicking on the link of a seemingly authentic Web site, according to documents and interviews.”

It isn’t the techs that fail. It’s the executives who believe that their work is more important than following basic security. You’re handling governmental secrets and you’re still running a web browser, hitting outside websites?
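For what it’s worth, the “build up a profile over time” idea is simple enough to sketch. Here is a minimal, hypothetical example (class and names are invented for illustration): record the hour of day of each action per username, then flag any activity at an hour that user has never been active before.

```python
from collections import defaultdict

class ActivityProfiler:
    """Toy per-username activity profile, as suggested above."""

    def __init__(self):
        self.hours = defaultdict(set)   # username -> hours of day seen

    def record(self, user, hour):
        """Fold one observed action into the user's profile."""
        self.hours[user].add(hour)

    def is_off_profile(self, user, hour):
        """True only if a profile exists and this hour is new for it."""
        profile = self.hours[user]
        return bool(profile) and hour not in profile

profiler = ActivityProfiler()
for h in (9, 10, 11, 14, 16):           # normal office-hours activity
    profiler.record("jsmith", h)

print(profiler.is_off_profile("jsmith", 3))    # True: 3 AM is off profile
print(profiler.is_off_profile("jsmith", 10))   # False: within profile
```

A real system would obviously use richer features than hour-of-day (hosts touched, data volumes, command patterns), but even this crude version would flag top brass suddenly pulling files at 3 AM.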

HJohn December 4, 2008 3:19 PM

I remember, several years ago, an incident with a high-level executive at a sensitive agency (I won’t say which one for fear of being wrong). Someone dialed in to his home computer and stole sensitive work files. One problem, but certainly not the only problem, is an improper rewards system. When looking at productivity and candidates for promotions and raises, we tend to give more credit to the one who gets more done. The one who does not take the time to encrypt, who copies sensitive files to portable media or emails the files home in order to get more throughput, usually gets much more credit than the one who spends the necessary time to take due care.

IOW, the executive mentioned above made a habit of careless mistakes, but he may not have been the one promoted to an executive had he made a habit of caution.

While I couldn’t even begin to explain how this particular incident happened (I’ve been auditing government technology for over a decade; not only would I have raised a stink about it, I have seen unbelievable excuses for why it happens), I will say a common denominator is misplaced priorities. I’ve also given findings to great IT managers who had good decisions vetoed by a boss who wanted convenience.

Hopefully, one day the priorities will change.

Former SANSian December 4, 2008 3:40 PM

I wonder if NASA has kept up with its vulnerability reduction program, which was one of the main inspirations for the SANS Top 20:

http://www.sans.org/top20/2004/nasa.pdf

Of course all it takes are a few (or even one) successful vulnerability exploitations to cause a lot of problems.

G-man December 4, 2008 3:51 PM

Back in the ’90s when I worked for NASA, I went down to Property Disposal to look for some furniture and noticed a bunch of PCs sitting on a large shelf. I asked what they did with them. The answer: we auction them off. So I asked what they did with the hard drives. They said the drives were still in them (you can surely see where this is going…). Long story short, my discussion with the property people ended up changing the policy at my NASA Center to make sure that the drives were removed or wiped before they went out the door. Up until that point, the buyer got to see anything left on the PC.

Davi Ottenheimer December 4, 2008 4:00 PM

It should be mentioned not only that intrusions have been a concern in these environments for a while, but also that predictions were ignored and resources reallocated.

One thing that’s always a gotcha in places like NASA is the research community. The guys/gals given a long leash to “play” and “explore” are frequently the hardest to convince about security and they hold a lot of sway. The trick is to get them to shift from seeing everything as a threat to their freedom, like a linear set of roadblocks, and instead value infosec as a form of greater certainty that enhances the chance of success in their work — like having tested and known clean water available for their coffee instead of being left a bucket of unknowns.

Clive Robinson December 4, 2008 4:22 PM

@ Brandioch Conner,

“You’re handling governmental secrets and you’re still running a web browser, hitting outside websites”

You have to remember NASA was set up to stop the power grab between the USAF and the Army, both of whom had rocket programs. It was effectively made a civilian research organisation with the task of getting to the moon.

After Neil Armstrong put his foot down, NASA no longer had a mission to fulfill, the funding started to be cut back, and Joe Public complained that NASA was taking TV time away from re-runs of “I Love Lucy” (or whatever the show was called).

This left a research organisation with wildly ambitious people, with a very, very open attitude to information sharing in the organisation, but with little money to fund their ambitions.

A shortage of cash combined with researchers with a sharing mentality is not going to make for a place where even 0.01 cents on the dollar is going to be spent on system administrators or IT security.

Most people in NASA don’t give a fig for national security, as they have ambitions for mankind. It is only those who are seconded in from the USAF and Army that have a “secrecy” mindset built in.

When I think of NASA and secrecy/security and their overall mission, I’m reminded of why HAL in 2001 went off the rails…

Clive Robinson December 4, 2008 4:29 PM

@ HJohn,

“I remember several years ago about an incident with a high-level executive at a sensitive agency”

Are you thinking of the director of an intelligence agency who, against the rules, used to take his work laptop home so a family member could get onto the Internet?

If so you can easily find the info on the Internet it’s not even an “open secret” any more, it’s used as an example of why no matter who you think you are you are not above the rules.

HJohn December 4, 2008 4:39 PM

@Clive: “Are you thinking of the director of an intelligence agency who, against the rules, used to take his work laptop home so a family member could get onto the Internet?”

Perhaps. I understood it to be sensitive files on a home computer, but it may have been this incident you speak of, although I think it may have been a different one (I didn’t look it up, because the agency name was insignificant to the point). Either way, both your story and mine are examples of the consequences of disregarding rules for personal benefit.

Happy Holidays.

Steve (UK) December 4, 2008 5:25 PM

Ahem.. forgive me…

A bit of computer network security…. I mean, it’s not rocket science is it!

Fris December 4, 2008 6:00 PM

My previous employer was a steel company. The company computers were locked down and security was very tight (mostly because many gullible people would install the virus of the week).

My new employer makes nuclear stuff, and their computers are wide open. The policies are the same, but the execution is miles apart.

kangaroo December 4, 2008 6:06 PM

@brandioch:

Well, what the hell are the computers that are used to access external sites doing with access to the “secret” computers?

Setting up some internal routes, firewalls, and VPNs should make this sort of thing impossible, and you just keep two computers in your office! You don’t even need a second physical network, “if” it’s properly configured.

That’s what I meant by cellularize. IT folks always want to make one big flat network, with high walls around it. That makes it a pita to get anything done, and creates giant internal security holes. Ask the migrant farmworkers about the wall on the border; ask Eastern Europeans about the iron curtain. Does nothing for security — just security theater — while making simple transactions impossible.

There’s a reason why we’re not built out of one giant cell — biological security. You build lots and lots of little internal firewalls and you worry a lot, lot less about the external one.
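The cellular approach above can be approximated with ordinary host firewall rules. A minimal, hypothetical sketch, assuming a Linux host holding sensitive data; the 10.10.5.0/24 “trusted cell” subnet is a made-up example:

```shell
# Default-deny on the sensitive host: nothing gets in unless
# a rule below explicitly allows it, and this box routes nothing.
iptables -P INPUT DROP
iptables -P FORWARD DROP

# Let replies to already-established sessions continue.
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# Permit SSH only from the one trusted internal cell.
iptables -A INPUT -s 10.10.5.0/24 -p tcp --dport 22 -j ACCEPT

# Everything else, including the Web-browsing desktops on the
# same physical LAN, is silently dropped.
```

The point is exactly the biological one: many small internal walls, each cheap and simple, instead of one big perimeter that fails catastrophically when a single spearphished desktop is compromised.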

Pat Cahalan December 4, 2008 6:29 PM

@ kangaroo

you just keep two computers in your office!

Yes. That’s a very good idea. One machine for internal networking and one machine for public work.

It’s not going to happen. Ever. Any organization that attempted to institute such a program (assuming it could bull-rush its way through Appropriations) would be vilified for spending too much money as soon as a politician with an agenda decided it was easy pickins to criticize on Meet The Press.

Not to mention the fact that it is almost impossible to get an NSF grant awarded which includes maintenance and system support…

Inscrewtable December 4, 2008 7:53 PM

So here we go again with vague references to the “Chinese.” Could it be the Chinese? Maybe. Let’s whip up more Wen Ho Lee fun and hang one of the bastards this time, right?

But who has actually been caught, again and again, spying on us? Rooskies and Israelis. But it is easier to attack those that seem different.

Mark Tiefenbruck December 4, 2008 8:30 PM

@Pat Cahalan: The cost of buying personal computers for all the people employed by NASA doesn’t even scratch the surface of what they spend on most things. A politician who criticized it would be laughed off the planet. I once worked for a contractor of the NSA, and they used a similar system. It wasn’t quite two computers for every person, but nobody’s personal computers could access the Internet or any other computer that could access the Internet. They were on completely separate networks. This has to be common practice for any organization that cares about security.

David December 4, 2008 11:08 PM

The quoted article refers to “lost information”, “lost technology”, and “lost intellectual property”.

I’m not saying there aren’t costs to NASA in this sort of hacking; nor even that NASA doesn’t have good reason for keeping some info out of the hands of others. But “lost” is simply not the correct adjective if NASA still has its own copy of the data after someone else has accessed it, with or without authorisation.

Altair bin LaAhad December 5, 2008 1:35 AM

Does “stame.exe” mean it was a Win32 executable?
Shouldn’t NASA (Boeing and Lockheed Martin actually) use some OS more reputable in security?

Greg December 5, 2008 3:01 AM

@HJohn: “One problem, but certainly not the only problem, is an improper rewards system. When looking at productivity and candidates for promotions and raises, we tend to give more credit to the one who gets more done. The one who does not take the time to encrypt, who copies sensitive files to portable media or emails the files home in order to get more throughput, usually gets much more credit than the one who spends the necessary time to take due care.”

Actually, it’s much worse than that in the industry. The ones who tend to be rewarded are the ones who are involved in the cleanup of the mess — whether or not they contributed to it through action or inaction. There is no recognition for those who actually do what they are supposed to and prevent such crises in the first place.

anon-ish December 5, 2008 7:38 AM

NASA was not alone; most governmental institutions had laughable (read: nonexistent) system administration practices. Given the non-competitive and “cushy” culture of governmental tech/eng work, I can’t imagine that that has changed. There are some exceptions (our national labs have some pretty good security people), but NASA?? For over 15 years, “NASA security” has inspired nothing but chuckles and rolling eyes among my friends and colleagues.

I cut my professional teeth pentesting governmental systems, and I have never in my life seen systems set up in such absurd ways. It’s as if, in many cases, the admins went out of their way to make the systems less secure… really.

This article just made me freak out Lewis Black style, I think that I might need to have a cigarette. Thanks, Bruce, for driving me to smoke. 🙂

But hey, at least there is still a client out there that is backwards enough to allow a security professional to dust off his or her rpctoolkit shell archive and let nfsshell stretch its legs.

the other alan December 5, 2008 8:35 AM

@HJohn. Totally agree that people who put out the fires are the ones who are rewarded, not the ones who prevent them in the first place. I’m constantly harping on my management that we need to reward people who prevent the fires, not the “heroes” who put them out.

Everybody hates the fire marshal who writes up tickets for fire-code violations, but everybody is the first to cheer the “heroes” who put out the fires caused by those same violations.

I’ve gotten to the point in my career where I note the fire-code violations silently and prepare myself to put out the fire when it eventually ignites. But I make sure I let it burn awhile, so that everyone knows it could have been much worse.

Adrian December 5, 2008 10:24 AM

Make it all public.

It’s paid for by our tax dollars. Students should be able to study and learn from our designs. “Open sourcing” the information might lead to a hobbyist finding a problem or making a welcome suggestion.

When you share information, you don’t lose it.

Making it all (or mostly) public removes most of the incentive to break into these systems. It removes the incentive to hire others to break into the system.

Sure, secure the 1% that might help a rogue group build a scary weapon and that isn’t already widely known. But the other 99% should be freely shared knowledge.

Jerry Mangiarelli December 5, 2008 11:21 AM

It’s unfortunate, but in some cases it takes an incident for people to realize that the controls/defenses that were implemented 20 years ago just don’t stand up to the sophistication of today’s arsenal of attacks. It goes without saying that experience and knowledge can prevent or limit the damage.

Clive Robinson December 5, 2008 11:44 AM

@ Steve (UK)

“A bit of computer network security…. I mean, it’s not rocket science is it!”

I wonder what they will do when they get (head in the) cloud computing…

Sorry folks bad jokes are a UK thing and they can only get worse 8)

NetLockSmith December 5, 2008 12:10 PM

Perhaps this is what HJohn was referring to, but former CIA Director John Deutch was facing criminal charges for mishandling classified information — until Clinton gave him one of those last-minute pardons.

The NY Times [obscenely long URL, so use this instead: http://preview.tinyurl.com/5l9hel ] reported that Deutch:

  • Refused to have a SCIF or a separate classified computer in his home.
  • Carried classified data home on a memory card, and used it on his home computer.
  • Used the home computer to connect to AOL and surf the Web.

The Inspector General also said that he’d used the computer to visit “high-risk” [IG’s words] Internet sites, “which news leaks report to include adult sites.” [NY Times’ words]

But wait, there’s more. Deutch previously worked at the Pentagon, and investigators found he’d used floppy disks there for classified info — and they were unable to locate some of those disks. ( http://www.washingtonpost.com/wp-srv/aponline/20001009/aponline164533_000.htm or http://preview.tinyurl.com/6ded36 )

What kind of message does it send when the highest-level person can flagrantly violate the law and get off scot-free, but lower-level people are prosecuted?

HJohn December 5, 2008 12:41 PM

@NetLockSmith: “Perhaps this is what HJohn was referring to”

Yes, that’s the one. Ironically, the very carelessness that got him in trouble was probably one of the reasons he was more productive and got promoted to a high level.

The person who breaks the speed limit will usually get places faster than the one who doesn’t, but the one who doesn’t is more likely to survive the trip (as are his passengers).

FormerLockMart December 5, 2008 12:59 PM

Years ago, I worked for LockMart at a NASA center as an IT person. When it came time to convert from one email/productivity package to another, guess who made the “final” decision regarding which package to use?

The secretary for the center’s director. I kid you not. It was purely a decision based on what she liked. Usability was the prime objective, not security… or cost… or any other factor.

So, it doesn’t surprise me that bad decisions regarding IT are being made at NASA by people who aren’t qualified to make them.

JimFive December 5, 2008 1:23 PM

@FormerLockMart
“convert from one email/productivity package to another, guess who made the “final” decision […] The secretary for the center’s director.”

And that is who should make the decision. That secretary is the person who is going to have to use that software day in and day out. They know the needs and requirements of their position. If there were show-stopping security or cost issues, they should have been addressed well before the candidates were presented for a decision. The job of IT is to support the business.

JimFive

Anonymous December 5, 2008 1:39 PM

@HJohn

It was John M. Deutch, the former DCI. Amazingly, in 1995 he was Dep Sec for Defense and was signatory on the NISPOM, a directive that the Gov’t issues to Defense Contractors to explain the measures necessary to protect classified information!

The Federation of American Scientists has a case synopsis posted here:

http://www.fas.org/irp/cia/product/ig_deutch.html

HJohn December 5, 2008 2:04 PM

@1:39PM: “Amazingly, in 1995 he was Dep Sec for Defense and was signatory on the NISPOM, a directive that the Gov’t issues to Defense Contractors to explain the measures necessary to protect classified information!”

I sincerely wish I were amazed. In over a decade of auditing dozens of government entities, I have found that the higher-up officials (the ones most adamant that everyone follow the rules, and the ones least sympathetic if anything, however unpredictable, goes wrong) tend to be the ones who think they are either too busy to bother or too smart to get burned.

I’m no longer an external auditor, and my current boss is fortunately very security-conscious. But in many entities I’ve encountered, if there is a sitting duck on the network, it is likely a boss who never changes his password, shares it with his secretary (read: it’s written down), and/or is granted an exemption to the password parameters.

Happy Holidays

Clive Robinson December 5, 2008 2:21 PM

@ NetLockSmith,

Yes, that is the name and office of the person I was referring to.

He made a public apology on 28th Feb 2000, but it appears he only did it to keep his security clearance with the Pentagon.

With regard to,

“What kind of message does it send when the highest-level person can flagrantly violate the law and get off scot-free, but lower-level people are prosecuted?”

A bad one. At the same time as ex-Director Deutch was getting off (he had signed a plea bargain to pay US$5,000 the day before Clinton gave the pardon) for his theft of documents of the utmost secrecy, another person, 60-year-old Dr. Wen Ho Lee, was being pursued on effectively the same charges (i.e., mishandling classified information).

Dr. Lee was held in maximum security for ten months and was frequently in chains; he was repeatedly threatened with execution by FBI agents.

It appears that all this was because he was born in Taiwan and the Clinton administration was under pressure about being soft on “Chinese spies.” Dr. Lee and his wife were targeted by no less than FBI Director Freeh (and no, there were no other suspects).

Dr. Lee had made the mistake of copying public-domain and non-classified data onto seven backup tapes; this was known by his colleagues and was quite common in the area he worked. He subsequently destroyed them. As the information did not require any of this to be logged, it appears he had done nothing wrong.

The FBI claimed, amongst other things, that the information was the “crown jewels” and that he had had clandestine meetings with Chinese agents.

Only when the FBI case started to unravel in court did the information Dr. Lee copied get classified as secret…

As for the clandestine meeting: he had spoken to other people at a conference and had written up the contacts, and it had been checked out as OK…

So yes, there is a bit of a difference…

Jean-Claude December 5, 2008 2:30 PM

Sadly, this is endemic. In NASA’s defense, they do have dozens of contracting companies, each of which may have its own IT policy, or one that exists purely on paper. Consolidating such a massive workforce purely on a FIPS document is stupid and unfeasible. You’d think NASA could maybe switch to BSD, but then there are the costs of retraining thousands of employees (seriously, how hard is UNIX with a mouse?). And to allay all fears: mission-critical stuff all runs on an RTOS. Since NASA has Marines standing guard in front of Mission Control during operations, you’d think they could do the same with their firewalls.

Jean-Claude December 5, 2008 2:41 PM

I’d also like to question the accuracy of the article’s reference to ROSAT. Irrespective of the intrusion at Goddard, we’re talking about a satellite that had lost two gyros and essentially repeated its slew toward the Sun, which it had done in the early ’90s. Unless pranksters had complete control for the duration of its flight (highly unlikely) and knew anything about stationkeeping (even more unlikely), burning out the PSPC-C and HRI on purpose is almost a Chuck Norris movie.

Jean-Claude December 5, 2008 3:18 PM

“An unfortunate chain of events in combination with these limitations finally led to the event on Sept 20, 1998. One of the reaction wheels came close to its revolution limit during a ROSAT night; the torque demands during a subsequent slew then caused the wheel to exceed its limit. The AMCS was no longer able to completely control the slew and the telescope axis pointed close to the sun. A sharp increase of the HRI countrate to 1000 cts/s, in combination with a sudden change (softening) of the HRI spectrum at 0:47 UT, indicated an anomalous event in the HRI detector. Shortly afterwards the HRI high voltage was switched off automatically.”

I think that puts to bed the whole “terrorists hijacked a satellite” theory.

Jean-Claude December 5, 2008 3:33 PM

Not to let NASA off the hook, but it’s comments like the other alan’s that trouble me most. “Letting fires burn” led directly to Challenger and Columbia. Just my two cents.

FormerLockMart December 5, 2008 4:28 PM

@JimFive

I agree with you completely, IT is to support the business.

The cost and security show stoppers were addressed before showing the end users. But when the end users saw the cheap and secure system, we went with the expensive and insecure system.

That’s not the way to run a pop stand.

Anonymous December 5, 2008 4:41 PM

@JimFive

Unless you want to go back to an abacus and double-entry accounting, IT is the business.

I don’t understand how people who can’t understand basic security protocol can’t be trusted to design a serious security plan, much less implement one.

CM December 6, 2008 2:46 PM

Continuing the comments of @Jean-Claude, the ROSAT claim in the article appears to be totally bogus. ROSAT was not even operated from the US! ROSAT controllers were located in Germany. And, as others already pointed out, the satellite’s pointing systems were very close to failing anyway. The claims in the article appear to be the conflation of perhaps three separate incidents, just to make it sound more sensational. To be sure, the NASA Goddard center (where I have been) has had security incidents, but nothing of this magnitude. And since 1998, security at the center has improved by huge steps.

Clive Robinson December 6, 2008 11:35 PM

@ Gung Ho (Lee),

“The Chinese finally went into space – forty years later.”

Yup, just before it looks like space travel is going to be realistically, commercially viable for things other than communications and spying…

As has often been remarked “the leading edge is the bleeding edge”.

And as is well known in engineering the profit “sweet spot” is rarely found by those “first to market”…

And what’s your local version of that old saying about “following in the footsteps…”?

Clive Robinson December 6, 2008 11:54 PM

@ Anonymous (4:41 PM),

“I don’t understand how people who can’t understand basic security protocol can’t be trusted to design a serious security plan, much less implement one.”

Hmm which negative should be positive?

not litigious December 8, 2008 5:55 AM

@Clive Robinson:
Wen Ho Lee should never have been prosecuted on the very weak evidence that had been collected, and it is quite right & proper that he has received $1.6 million in compensation.

However the current trend, to completely and utterly exonerate him of all suspicion, blows too far the other way. A great deal of what he did was, and remains, suspicious; and some Internet accounts of his snow-white innocence distort the facts to the point of fabrication.

For example, it is not true that the data he copied was public domain and unclassified. It was actually classified “restricted” at the time. Further, the reason it was later reclassified to “secret” was that the “restricted” classification was found to be a serious error; it should always have had a higher classification. Specifically, most of the copied data consisted of software for the simulation of the detonation of nuclear weapons, to enable designs to be tested without conducting actual nuclear explosions. In order to achieve some degree of accuracy, this software had embedded in it a great deal of data from previous nuclear weapon tests and US bomb designs, all of which was classified “secret”. As such, the software itself ought to have been classified secret rather than restricted.

Some activist sites seem to imply that the copying was routine, and could have been done practically in error. In fact, it took over 40 hours across a single two-month period, and involved modifying file metadata to remove security markings.

It may or may not be true that Dr. Lee destroyed the downloaded data; we have only his own word for it. Computer forensics determined the number of tapes he had created, and when his office and home were searched, not all of them were found. When asked about the rest, he claimed to have destroyed them. Other explanations are, of course, possible.

It is not true that suspicion fell on Dr. Lee because he was born in Taiwan. That, of course, would make little sense if he was believed to be spying for the PRC! In fact, he first came under suspicion during a wiretap of a suspect in a completely different espionage investigation in the early 1980s, when he appeared to be assisting that suspect to stymie an FBI investigation. The resulting investigation into Dr. Lee determined that he had not only assisted the suspect, but had himself passed unclassified-but-sensitive documents to a foreign power, in violation of DoE internal policies. The investigation concluded that there was no offense for which he could be prosecuted, but that he was not a fit and proper person to hold a security clearance. However, due to a communications SNAFU, DoE was not advised of this finding until the second investigation years later!

It is also not true that he was completely open about his overseas contacts. At the conferences he attended in Beijing, every one of his US colleagues reported that the Chinese had attempted to elicit classified information, or to recruit them. The one exception who reported no such information was Dr. Lee. He did eventually admit that an elicitation contact had been made, but only years later, during a formal investigation of some of his suspicious behaviour.

That was kicked off by the weirdest incident, which occurred at the height of an FBI investigation into the leaking of US bomb design data to the PRC (for which no one has yet been charged), and which still has not been explained. Dr. Lee walked in, unannounced and unauthorised, to a detente meeting with a top-level delegation of visiting Chinese physicists, and warmly greeted Dr. Hu Side, the head of the PRC’s bomb design bureau. In front of US officials, the two then held a conversation in Mandarin, in which — according to a US translator — Dr. Hu thanked Dr. Lee for his assistance in hydrodynamic modelling (a discipline which Dr. Lee performed for US bomb modelling). Had this conversation been intercepted by wiretap, it would have been taken as strong proof of espionage. That it was openly made in front of US officials (albeit in a foreign language) left them perplexed, but it most certainly stimulated an immediate investigation.

During the course of that investigation, the FBI arrived at the hypothesis that Dr. Lee was not specifically attempting to assist the PRC, but was trying to establish credentials with any agency that might employ him in the event he was laid off from LANL, which he apparently feared. To test this hypothesis, they set up a sting operation. He didn’t bite, but he nibbled pretty vigorously — and did not report the contact from the “foreign agent.”
