Schneier on Security
A blog covering security and security technology.
April 13, 2007
U.S. Government Contractor Injects Malicious Software into Critical Military Computers
This is just a frightening story. Basically, a contractor with a top secret security clearance was able to inject malicious code and sabotage computers used to track Navy submarines.
Yeah, it was annoying to find and fix the problem, but hang on. How is it possible for a single disgruntled idiot to damage a multi-billion-dollar weapons system? Why aren't there any security systems in place to prevent this? I'll bet anything that there was absolutely no control or review over who put what code in where. I'll bet that if this guy had been just a little bit cleverer, he could have done a whole lot more damage without ever getting caught.
One of the ways to deal with the problem of trusted individuals is by making sure they're trustworthy. The clearance process is supposed to handle that. But given the enormous damage that a single person can do here, it makes a lot of sense to add a second security mechanism: limiting the degree to which each individual must be trusted. A decent system of code reviews, or change auditing, would go a long way to reduce the risk of this sort of thing.
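The change-auditing idea can be made concrete. Below is a minimal sketch (not any system the Navy or its contractors actually use) of a tamper-evident audit trail: each entry's hash chains over the previous one, so a silently edited record breaks verification. All names and fields are illustrative.

```python
import hashlib
import json

class AuditLog:
    """Append-only, tamper-evident change log. Each entry's hash
    covers the previous entry's hash, so silent edits are detectable."""

    def __init__(self):
        self.entries = []

    def record(self, author, change):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = json.dumps({"author": author, "change": change,
                           "prev": prev_hash}, sort_keys=True)
        self.entries.append({"author": author, "change": change,
                             "prev": prev_hash,
                             "hash": hashlib.sha256(body.encode()).hexdigest()})

    def verify(self):
        """Recompute the hash chain; any altered entry fails the check."""
        prev_hash = "0" * 64
        for e in self.entries:
            body = json.dumps({"author": e["author"], "change": e["change"],
                               "prev": prev_hash}, sort_keys=True)
            if (e["prev"] != prev_hash or
                    hashlib.sha256(body.encode()).hexdigest() != e["hash"]):
                return False
            prev_hash = e["hash"]
        return True

log = AuditLog()
log.record("alice", "patch sonar tracking module")
log.record("bob", "update maintenance schedule")
assert log.verify()
log.entries[0]["change"] = "something innocuous"  # tamper with history
assert not log.verify()
```

An audit trail like this doesn't prevent a malicious change, but it removes the saboteur's ability to cover his tracks afterward, which changes the calculus considerably.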
I'll also bet you anything that Microsoft has more security around its critical code than the U.S. military does.
Posted on April 13, 2007 at 12:33 PM
"The sabotage also forced the Navy to impose additional layers of security to prevent similar security breaches."
Poor Navy. I hope they were sufficiently compensated for the burden.
There are countless guidelines that could be followed to prevent this type of situation. However, it all comes down to having multiple trusted people responsible for different, but critical parts of the process.
This is all easy to do in theory, but given the endless downward spiral of cost cutting, it's almost guaranteed that the redundant people will go away over time and leave one ultimately trusted person.
Separating privileges into roles is a good way to keep a process secure, until roles get combined and you end up with the Unix root security model: someone is ultimately capable of anything, so you'd better hope that person is on your side.
There's any number of things that could have been done differently to avoid this situation, but every one of them would fall apart in the face of budget pressures. While they are working, the measures do not demonstrate their value. Only through failure can you see in no uncertain terms what has been lost.
While not a solution in itself, it seems likely that people who feel they belong to an organization are less likely to sabotage it. Using contractors, rather than paying competitive wages to attract and retain such skills in the Navy, might have contributed here. Motivation matters; I find people who do work just for the money disturbing, but I can't really blame contractors either, since their employer has said up front that they don't belong...
Contractors are not necessarily de facto doing something just for the money, but are certainly more likely to be.
Having worked on AIX at IBM, no there isn't any real security but there is a very tight audit trail. The place to look for similar situations is in the banking industry. It takes at least 2 bank employees to change anything and possibly more.
Bruce, are you forgetting about those Russian hackers who had read/write access to the entire MS source repository for six months? It wasn't that long ago, was it?
Two words: "Falcon" and "Snowman".
I agree with Bruce's suggestion that another layer of security would be good. My $0.02 is that processes requiring oversight/supervision by Navy personnel might be worth considering. A sailor has a somewhat different motivation from a civilian contractor and lives with the threat of harsh military justice for serious mistakes, which leads me to my next point...
Am I the only one who is amazed by the leniency of the sentence? Perhaps the mental illnesses referred to by the defence are partly why. If I had been caught doing something like this, I would expect to be sent to a Fed Pen for a very long time.
I once had a vendor describe the processes for developing/maintaining one version of State Lottery software.
The development team writes the code and commits changes to a repository.
The secondary team, located in a different city and with no contact with the first team, examines and commits these individual changes to the real codebase. They are not allowed to make changes, just approve or disapprove changes from the first team.
It seems like military software should have a similarly secure process.
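The two-team rule described above boils down to separation of duties: no change reaches the real codebase without an independent approver. A minimal sketch of that gate (all names and fields are invented for illustration, not taken from any actual lottery or military system):

```python
# Two-person control: a change is promoted only if someone other
# than its author has approved it. Self-approval is rejected.

def promote(change, approvals, codebase):
    """Apply `change` to `codebase` only under two-person control."""
    independent = [a for a in approvals if a != change["author"]]
    if not independent:
        raise PermissionError("no independent approval; change rejected")
    codebase.append(change)

codebase = []
change = {"author": "dev1", "diff": "fix tracking table"}

try:
    promote(change, approvals=["dev1"], codebase=codebase)  # self-approval
except PermissionError:
    pass
assert codebase == []  # nothing was promoted

promote(change, approvals=["reviewer7"], codebase=codebase)
assert len(codebase) == 1
```

The point of the geographic and organizational separation is to make `approvals` hard to forge: the second team has nothing to gain from rubber-stamping the first.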
"Trust - but verify" --Ronald Reagan.
I just read the article, and I too was "amazed by the leniency of the sentence". Something that "could result in loss of life" seems to call for a much harsher sentence than, say, merely losing a pile of money. What would cause someone to think twice about taking payment to carry out such a plan when the only consequence is a mere slap on the wrist?
Once one passes a single subject background investigation there is indeed a lot of trust placed in the individual.
I think the people in this guy's company need a refresher in OpSec and reporting requirements, if the guy was actually depressed and bipolar and no one told anybody.
Having worked at MS for a couple of years, I can tell you that just about any developer can get write access to the source repository. All that is required is a signoff from the responsible PM. Some of the reasons I used were tangential at best, yet all of them were approved.
As someone who has written software for government contracts, I am quite shocked that something like this wasn't caught. We had procedures somewhat like what Darron described.
This is why, during my brief 21-year career in the Air Force, we had what was called SCI clearance and access. Each part of the project was unique and carried a (hidden) signature to show who did the work and the review.
Ten years ago I worked for a company that made electronic test equipment.
You did some code, it got reviewed by the Design Authority for the code in question - an expert in that code, and put in the development baseline.
Come release time, the code, if required, got pulled into the release, built and tested by the test team.
Each stage was done by a separate person/team to ensure quality.
When I left they were integrating mandatory code quality metrics before code would be accepted back into 'dev'....
Even today, I've never seen anywhere with as good a quality regime. Mind you, some of the electronics were inside live air-to-air missiles :-)
Counterpoint to the contractor argument: as a contractor, I know that I'm being paid to do a particular job, and that I'm only as good as my last project. For a truly disgruntled individual, try the 6 year employee who has watched her bonus, benefits, and perks slowly leached away in the name of cost-cutting...
I'm guessing role-based access control is the best solution so far for these kinds of critical systems, because it enforces the principle of least privilege.
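For what it's worth, the RBAC idea can be sketched in a few lines. The roles and permission names below are invented for illustration; the point is only that every action is denied unless an explicitly assigned role grants it.

```python
# Minimal role-based access control sketch (roles and permissions
# are hypothetical). Default-deny implements least privilege.

ROLES = {
    "developer": {"commit_to_dev"},
    "reviewer":  {"approve_change"},
    "operator":  {"deploy_release"},
}

def allowed(user_roles, action):
    """A user may perform only actions granted by an assigned role;
    anything not explicitly granted is denied."""
    return any(action in ROLES.get(r, set()) for r in user_roles)

assert allowed({"developer"}, "commit_to_dev")
assert not allowed({"developer"}, "deploy_release")  # denied by default
```

The failure mode the earlier commenter mentions, combining roles until one person holds them all, shows up here as a user whose `user_roles` set covers every entry in `ROLES`, which is the root model all over again.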
I agree completely with DB. The employees are fed a line about being part of the family and how their jobs are more secure because of that. They are the ones who have a reason to be upset when they are deemed to no longer be cost effective and they're shown the door. Contractors know they are expendable resources.
I've been a contractor with the same primary client for over 4 years. In that time I've seen a lot of people come and go and the bad apples aren't determined by whether they get a W2 or 1099 at the end of the year. Even the employees are all short term resources in a big company, since the way to get a promotion is to get hired in a better position in another part of the company.
Interesting that this wasn't escalated to being considered a "terrorist" act.
> It takes at least 2 bank employees to change anything
For a handful of changes that's so. Everything else is a free-for-all. In a security department it's really a case of give up and try not to let the lunacy kill you.
Increased empty-head-count doesn't help either.
During Operation Desert Storm, software piracy was rampant. People on 12 hour exercise shifts with nothing to do started playing games on duty. Some copies were pirated, and subsequently an entire facility's 386 intelligence workstations at a TS/SCI location (which I shall refrain from naming) became infected with the Jerusalem B virus.
As the organization's liaison, I was begged to fix the problem. I told them they were going to have to reload the workstation operating systems, and that procedure required their commanding officer to inform the AFOSI (who happened to be across the hall from where this happened). I was again strongly requested to clean up, cover up, or get rid of the problem without notifying the OSI myself -- and if possible clean the viruses off the gaming disks! That was their biggest priority!
When I arrived at the SCIF I asked for the games and they took a Hefty bag full of them out from behind a tile in the ceiling.
The classified environment is extremely incestuous and there is a lot of tolerance for "blowing off steam" type behavior that would be inappropriate anywhere else. "What happens in the SCIF stays in the SCIF." Even 17 years ago there was no protection for the whistleblower.
Then there was the sweet young thang who was boinking her NCOIC. They were on duty together one night in the SCIF and she dropped a disk pack. It was the backup pack, so it was constantly being rotated in and was used on every system in the SCIF. Since he didn't report the incident in order to protect her, they ended up having to repeatedly repair nine of the eleven drives and had to replace 72 other disk packs that went out of alignment before the truth of the matter was discovered.
Knowing what I know about how things were then, I can only imagine what goes on now. There is a lot we are not hearing about this time around. Abu Ghraib is just the tip of the iceberg, I guarantee it.
We need back our checks and balances - and we won't have that so long as the people at the top continue to treat the intelligence community like a good ol' boy's club. They stopped doing their own dirty laundry a long time ago. Too busy nosing through other people's. Where ARE those emails the Senate Judiciary Committee is looking for anyway, hmmm?
There are two kinds of "contractors" in the defense industry
1) Entire companies which accept contracts like cost-plus or fixed-price directly from parts of the DoD or from prime contractors who have themselves taken on a contract either from the DoD or from a more-prime company up the food-chain
2) Individuals who almost universally bill an hourly rate to a contractor of type 1 and do "employee-like" jobs - some are specialists that the company can't find or otherwise keep in house and some are just more-temporary versions of regular employees.
When people talk about 'loyalty' vs 'money' issues they are usually talking about the type 2.
However, this case appears to be about an individual who owned a company (Ares Systems International) of type 1. He himself was an employee of the company; it may have been a sole proprietorship or a limited partnership. He evidently took very poorly to losing the bid to renew(?) the contract the company held and retaliated in a way that got him in hot water.
Comparing security at the U.S. military with that of Microsoft, and finding the military has the short end of the stick, is a low blow.
Actually, I think the military uses Linux and OpenBSD quite a bit...both considerably more secure than Windows.
I think Bruce meant that Microsoft protects its source-code better than the military does...even though the Windows source-code is crapola. I don't believe he means to say that the operating systems used by the U.S. military are less secure than Windows Vista!
@Anonymous: Linux and OpenBSD are no more secure than Windows if you are allowing a random contractor to update code at will. And "considerably more secure" is very relative - all depends on who is doing the securing.
I've done some programming work in that part of the economy, and, believe me, they have plenty of review and audit procedures. The problem, though, is that the procedures frequently become a pro forma nuisance just to be gotten out of the way. As long as all the right forms get filled out and properly filed, everyone is happy. That's the trouble with reliance on formalisms. They can't obviate competence and care.
Pentagon's software development is more than broken.
It uses sub^n-contractors .. and worthless pm (also subcontracted in most cases).
30 years ago Military was rigorous, now it's a joke .. I am not surprised .. actually surprised that we heard of only one ..
There are systems that are supposed to track some object in milliseconds but actually take tens of seconds .. by that time the bird has flown away .. and the Pentagon is glad to get a system that just doesn't crash .. This, I believe, is worse than injecting malicious code .. it's the invasion of the stupid, at enormous cost.
My first (and current) job out of college is working for a Navy sub-contractor that mostly does sonar systems for Naval surface ships. The thing that really surprised me was the seeming autonomy that I was given after about a week. Now granted the system that I first started on was not all that complex, nor critical, but I was still quite surprised. As for Another Contractor, I guess I'm technically type 2, cause we all bill an hourly rate, however my heart is very much into the work. My dad did 20 years in the Navy and a good deal of that was ASW work, so I usually keep that in mind. As for the OS questions, all the systems that I have ever come across have all used LINUX, mostly RedHat variants.
You are right on the mark. Too much "checkmark security", where "security" consists of some lowest-bidder type following a procedure just to get to the end so they can make a checkmark on the list of required procedures, with absolutely no understanding of what the procedure is meant to produce.
All the processes in the world can't make up for ignorance and incompetence. Similarly, if you have competent people doing the work, process is only of marginal benefit and often a serious "red-tape" cost that hinders productivity.
Since most of these operations have the mantra that "no one is indispensable" there is little reason to invest in the people to make them competent either.
Article: "If we can't trust people with top-security clearance, where are we?" the judge said toward the end of a lecture to Sylvestre.
This shouldn't take the judge by surprise. Trusted people commonly abuse their powers, and have for millennia. Just because you're trusted doesn't mean you're any less fallible than anyone else; it just means that (hopefully) you're statistically less likely to do something abusive. Usually it's a good gamble, but it doesn't always pay off.
The article in The Washington Times was better.
This guy was _not_ developing software; he was a contract sysadmin. He was mad because he lost the contract to manage the systems. He apparently did not add malware, but simply set up some cron jobs to delete files and make the system fail after he was gone. The stated motive was to make the new contractors look bad.
This guy is getting some well-deserved time in the slammer. Perhaps a picture of him in an orange jumpsuit, slow-dancing with Bubba in the Pen, should be the wallpaper image on the systems. This to serve as a reminder to all the other contractors.
Contract changeovers happen all the time. They are not normally viewed as being an adverse dismissal situation. Perhaps that will change now.
The guy was root. It would be almost impossible to prevent someone with this level of access from sabotaging a complicated production system in some way. If you do not trust him, the only real recourse is to rebuild the systems from clean media. This is painful and risks causing the same loss of service in a 24x7 system. Maybe future contractor changeovers will require such rebuilds, despite the pain and risk. You can bet they will at least check cron for any little surprises...
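Checking cron for surprises can be partially automated. Here is a deliberately naive sketch of such an audit: scan crontab text for entries matching a few suspicious patterns. The patterns and the sample crontab are illustrative only; a real sweep would compare against a known-good baseline rather than pattern-match.

```python
# Naive crontab audit: flag entries matching suspicious patterns
# (recursive deletes, scripts run out of /tmp, silenced output).
# Patterns are examples, not a real detection rule set.
import re

SUSPICIOUS = [r"\brm\s+-rf\b", r"/tmp/\S+\.sh", r">\s*/dev/null\s+2>&1\s*$"]

def audit_crontab(text):
    """Return (line_number, line) pairs that match a suspicious pattern."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), 1):
        if line.startswith("#") or not line.strip():
            continue  # skip comments and blank lines
        if any(re.search(p, line) for p in SUSPICIOUS):
            hits.append((lineno, line))
    return hits

crontab = """# m h dom mon dow command
0 3 * * * /usr/local/bin/backup.sh
30 2 * * 0 rm -rf /var/data/tracking
"""
assert audit_crontab(crontab) == [(3, "30 2 * * 0 rm -rf /var/data/tracking")]
```

Of course, a saboteur with root can hide a time bomb in plenty of places a crontab scan will never find, which is the commenter's larger point: rebuilding from clean media is the only real remedy.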
The ships crew should control the computers. The computers shouldn't control the ship and crew. You should be able to shut all the computers down and still operate the ship and the weapons systems.
You should be able to look at the code, so if it changes you know it has changed and you know what the changes are and who made the changes and why. It sounds like they are putting keys into locks without knowing what the lock should look like or how it should work. You can't just guess at security.
"Trust is good, but control is better ..."
I think the last paragraph is the scariest one.
This scares me. What scares me even more is the punishment for his crime: this guy got only a year in prison and a $10,000 fine. I'm sure his contract was worth a lot more. By comparison, a city councilwoman in Pittsburgh accused of embezzling $40,000 of taxpayer money may get a much worse sentence. His punishment should have been more severe.
Thinking of the issue of audit trails on changes made reminds me of this item I read about on comp.risks years ago: a few years back, there was a big flap in the media about a change made to the copyright law that changed the details of "work-for-hire" copyright on sound recordings, which basically had the effect of advantaging the big record companies at the expense of musicians. The musicians were (naturally) Seriously Annoyed about the whole thing. The deal is, nobody is entirely sure who slipped that provision into the bill. (Suspicion fell heavily on one staffer for Orrin Hatch who, by an amazing coincidence, took a job with the RIAA a few months after the bill was passed, but nothing could be proved.)
Think about that for a minute. Forget about DoD software repositories or the master copy of Microsoft Vista, there isn't even a decent audit trail telling who came up with changes to our nation's LAWS!
Surely the military has the ability to check, check, check and review what is going on. At least it did when I was in the Army. If sabotaging a deadly weapons system is really that easy, perhaps we need to look inward, far short of stupid mid-east wars, to safeguard our troops and systems.
Um, you seem to be missing the most important point.
Someone, somewhere, knows where our submarines are located.
This is a single point of failure for our retaliatory strike capability.
I interviewed for a job in communications. The company was a military contractor. They told me in the interview there was mandatory code review on all code.
They let it slip that they were running behind schedule and were somewhat desperate for people. I turned down the job offer, so I don't know whether they really reviewed all the code.
At some level you need to trust your employees. No computer system can withstand the abuse of trust. Were procedures lacking? Yes. No system is perfect.
I'm not defending the fact that so much data was sitting in one place. That was probably foolish. However, there is also security when the right hand knows what the left is doing and they coordinate activities. What we're talking about here is a balance.
Bruce, you're usually on the mark, but I think you missed it this time.
Having worked on AIX at IBM and as a contractor on numerous highly classified programs for both military and civilian agencies, I can state without reservation that the problem with this Navy program was the lack of Configuration Management (CM). CM includes not only audit trails, but limited access to source code, independent third-party testing, and promotion to production only when all milestones to reach that point have been achieved. When I worked on AIX, IBM had all of these steps, and all government projects I have worked on have had them. None of our programmers or administrators could mindlessly insert damaging code into production. Any project that would permit this kind of thing to happen -- sabotage code to be inserted -- seriously needs to have its management changed.
Interesting to see the way that an insider 'hack' is punished by the US military legal system.
Compare the extradition to the USA of a UK hacker who is being threatened with 75 years in a US jail (He is alleged to have gained access to hundreds of unsecured Pentagon systems while looking for aliens!! They priced the cost of 'securing' the systems - e.g. using passwords - at £700,000)
Judge's quote from the article: "I think the severity of the crime is overwhelming"
"former government contractor ... was sentenced Wednesday to a year in prison."
@ TED Vinson
> This guy is getting some well-deserved time in the slammer.
I'd disagree. Deliberately causing malfunctions on military computing devices deserves a hell of a lot more than a year in the clink, to my way of thinking.
A whistleblower report to the NY Times that you had unreasonably elevated access would have produced plenty of bad press for the Navy, and seems a much more socially responsible (actually, constructive) form of revenge.