Schneier on Security
A blog covering security and security technology.
February 17, 2006
Database Error Causes Unbalanced Budget
This story of a database error cascading into a major failure has some interesting security morals:
A house erroneously valued at $400 million is being blamed for budget shortfalls and possible layoffs in municipalities and school districts in northwest Indiana.
County Treasurer Jim Murphy said the home usually carried about $1,500 in property taxes; this year, it was billed $8 million.
Most local officials did not learn about the mistake until Tuesday, when 18 government taxing units were asked to return a total of $3.1 million of tax money. The city of Valparaiso and the Valparaiso Community School Corp. were asked to return $2.7 million. As a result, the school system has a $200,000 budget shortfall, and the city loses $900,000.
User error is being blamed for the problem:
An outside user of Porter County's computer system may have triggered the mess by accidentally changing the value of the Valparaiso house, said Sharon Lippens, director of the county's information technologies and service department.
Lippens said the outside user changed the property value, most likely while trying to access another program while using the county's enhanced access system, which charges users a fee for access to public records that are not otherwise available on the Internet.
Lippens said the user probably tried to access a real estate record display by pressing R-E-D, but accidentally typed R-E-R, which brought up an assessment program written in 1995. The program is no longer in use, and technology officials did not know it could be accessed.
Three things immediately spring to mind:
One, the system did not fail safely. This one error seems to have cascaded into multiple errors, as the new tax total immediately changed budgets of "18 government taxing units."
Two, there were no sanity checks on the system. "The city of Valparaiso and the Valparaiso Community School Corp. were asked to return $2.7 million." Didn't the city wonder where all that extra money came from in the first place?
Three, the access-control mechanisms on the computer system were too broad. When a user is authenticated to use the "R-E-D" program, he shouldn't automatically have permission to use the "R-E-R" program as well. Authentication isn't all or nothing; it should be granular to the operation.
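To make the third point concrete, here is a minimal sketch of per-operation authorization, as opposed to all-or-nothing access after login. The role names and program codes are invented for illustration, loosely following the R-E-D/R-E-R programs in the story:

```python
# Hypothetical sketch: each role is granted an explicit set of
# operations, and anything not enumerated is denied by default.

AUTHORIZED_OPERATIONS = {
    "outside-user": {"RED"},          # real estate record display only
    "assessor":     {"RED", "RER"},   # may also run the assessment program
}

def authorize(role, operation):
    """Allow an operation only if it is explicitly granted to the role."""
    return operation in AUTHORIZED_OPERATIONS.get(role, set())

# A fee-paying outside user can view records, but a typo cannot
# land them in the retired assessment program:
assert authorize("outside-user", "RED") is True
assert authorize("outside-user", "RER") is False
```

Under a scheme like this, the mistyped R-E-R would have produced a permission error instead of a $400 million valuation.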
Posted on February 17, 2006 at 7:29 AM
I bet there is a pile of kluged code behind this one. And, I also bet the contract to write this program code was outsourced.
Did they send the tax bill to the owner of the house? :)
I bet they did. And when he sent it back saying "what are you people smoking?" their first response was "that's what the computer says, so you have to pay it."
I have seen this sort of transverse failure in a web-accessible government database before.
Using the NSF Fastlane grant submission system recently, I accidentally performed a set of operations which I don't think I could reproduce, but as best as I can reconstruct involved a web "previous page" or "refresh" operation while the database was updating with info I had just submitted.
Suddenly, I found myself staring at someone else's grant application, from a totally different and unrelated field. This was very alarming, since grant proposal information is highly proprietary, as it generally divulges research results and ideas that have not yet seen publication. I quickly logged out and restarted my session, so as to avoid any danger of improperly modifying data, and to avoid any appearance of tampering.
But I have reflected since on the extreme fragility of a system with authorization boundaries that can be transgressed so easily, and wondered how many other web-accessible databases containing much more critical records (medical, fiscal, financial, etc.) are subject to the same sort of weakness. It's certainly caused me to wonder about the security of Internet Banking back ends -- I doubt this sort of thing is a government monopoly.
I feel your Fastlane pain. I'm hoping that grants.gov will be better.
FYI, _all_ US government research proposals will be filed through grants.gov sometime in the future, which I think is a great idea. It's way better than having one system for each agency (DoE, DoD, NASA, NSF, NIH, et al.).
I actually sort of liked Fastlane, I thought it compared well to the trashware that NASA uses.
I'm somewhat worried about grants.gov -- initial word is that it only supports IE-based browsing (how's that for security?). We run a Linux-Unix shop, so if this isn't fixed soon we're going to have major trouble with grant submission.
But we digress...
Is this the slightest bit surprising? I've done web interfaces for organizations much larger than Porter County and had to do everything: database, security, application design, interface design, testing, all as one person. I don't believe I'm an expert in all of those fields, but I try to cover them all as well as I can. I've run into a lot of people who do the same thing with far less experience who believe they are an expert in all of those fields.
The majority of web interfaces are piecemeal systems that are put together by the lowest bidder to achieve the minimum needed functionality. As we all here should know, security is not a minimum feature in most applications. In this case, access was granted to everyone who could access a program. This is not surprising considering how insecure some of the applications are that I've seen.
One trait that I see in most systems these days is a lack of a knowledgeable person or group responsible for the data. Responsibility is one of the first traits that gets cut in the effort to cut costs or outsource functions. People are expected to follow rules and perform certain tasks. They are not expected to understand why they are doing those tasks or why the rules exist.
In this case, if the database that cascaded into budgets and other places were controlled better, some outside user could not have created this problem. An internal user probably could, but they are less likely to try something they don't understand than a random web user. Also, a well configured system would leave an audit trail of which user made the change and when.
I hadn't noticed that, nor had I noticed that Windows is required for the PureEdge Application Viewer. :(
I'm a Linux user as well, so this is a bit troubling.
There are two principal ways of disabling functions in code -- passive and active.
Passive, which I think is illustrated here, is simply "Don't use it anymore." This is the cross-your-fingers-and-wish-as-hard-as-you-can option.
Active means switching off the function. This requires that someone was smart enough to design-in the switches, and that the code was thoroughly tested for all switch positions.
Unfortunately, bad design, coupled with bad coding and incomplete testing, will result in switches not working right (or not being there) or having unexpected effects.
Hence the preference for passive disabling.
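A rough sketch of what "active" disabling could look like, assuming a simple dispatch table for programs; the codes and handlers are made up, echoing the story's R-E-D/R-E-R programs:

```python
# Illustrative sketch: retired programs stay in the dispatch table but
# are explicitly switched off, instead of hoping nobody types the
# right three letters. All names here are hypothetical.

PROGRAMS = {
    "RED": {"handler": lambda: "real estate display", "enabled": True},
    "RER": {"handler": lambda: "1995 assessment program", "enabled": False},
}

def dispatch(code):
    """Run a program only if it exists and is actively enabled."""
    entry = PROGRAMS.get(code)
    if entry is None or not entry["enabled"]:
        raise PermissionError(f"program {code!r} is not available")
    return entry["handler"]()

assert dispatch("RED") == "real estate display"
# dispatch("RER") now raises PermissionError instead of silently
# running a program nobody knew was still reachable.
```

The switch only helps, of course, if someone actually tests both positions before shipping.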
Is there a house that is worth 400 million dollars anywhere in the world? How would one prevent these kinds of errors by authorised personnel where a large number of high value homes exist? Audits?
> Is there a house that is worth 400 million dollars anywhere in the world?
With all the embedded tech, Bill Gates may have spent that much on his uber-pad. Of course, "worth" depends on how much you're willing to pay for something, so I doubt he could get a return on investment on his house :)
I doubt that the City of Valparaiso and Valparaiso Community School Corp. got an extra 2.7 million. In the jurisdiction I live in, a budget is developed, based on the previous year's budget, and then the budget number is divided into the total assessment for the jurisdiction, resulting in the mill rate. If everyone's house doubles in value, but the cost of snow removal stays the same, taxes are unchanged and the mill rate is halved.
If 400 million is accidentally added to the total assessment of the city, then the mill rate is too low and every taxing unit will have less money than budgeted for.
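A worked example of that arithmetic, with entirely made-up numbers (the mill-rate formula is as the comment describes: budget divided by total assessed value):

```python
# Hypothetical jurisdiction: the budget is fixed, and the tax rate is
# derived from it, so a phantom $400M in assessment lowers the rate
# and shorts every taxing unit.

budget = 10_000_000              # cost of services stays the same
assessment = 2_000_000_000       # correct total assessed value
phantom = 400_000_000            # the erroneous house

rate = budget / assessment                    # 0.005
rate_with_error = budget / (assessment + phantom)  # too low

tax_on_150k_house = 150_000 * rate            # 750.0
tax_with_error = 150_000 * rate_with_error    # 625.0 — everyone pays less,
                                              # so every unit comes up short
```

The shortfall only surfaces when the phantom taxpayer's $8 million bill goes uncollected.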
Not surprising. The mass appraisal systems in use by local government today are an absolute mess of code, pieced together by a bunch of COBOL programmers who don't get OOD. No front-end or back-end validation being performed, and the data is not reviewed regularly for errors. The industry leaders are JUST NOW starting to roll out CAMA products w/ SQL or Oracle back-ends, and the appraisal systems are generally a half mil to a mil in cost. Ridiculous.
This story seems more like urban legend than reality, ergo, the fascination with it.
If someplace has a system this bad, they deserve everything bad that goes with it.
But in my lifetime, I know of no local government that immediately disburses funds to budgets based on changes to the tax base. Those funds don't go out until AFTER they are collected.
The first point, that the system didn't fail safely, seems like an instance of fault propagation, one of the really big issues in software design specifically and information systems generally. The suspension cables for the Golden Gate Bridge consist of multiple strands bound together, so if one strand breaks, that break won't propagate throughout the rest of the cable. How to do that same thing with information in a system of communicating entities, in a way that's practical, is still an open, and very difficult, engineering problem.
There used to be a lot of active research in the area, but I've lost track of it. Does anyone have links to research projects looking at this problem?
This article seems to argue for capability-based systems, rather than access-control-list based systems.
Of course a basic sanity check would have been to compare the valuation with others in the local market. Any idiot would see that nothing else was even valued anywhere near that price. Exactly what kind of house would sell for $400,000,000?
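That check fits in a few lines. Here's a rough sketch; the threshold factor and the figures are invented for illustration:

```python
# Hypothetical sanity check: flag any new valuation wildly out of
# line with comparable homes before it enters the tax roll.

def plausible(new_value, neighborhood_median, factor=20):
    """Reject valuations more than `factor` times the local median."""
    return new_value <= neighborhood_median * factor

# An expensive house still passes:
assert plausible(400_000, neighborhood_median=150_000)

# A $400,000,000 house in a $150,000 neighborhood gets flagged
# for human review instead of silently rebalancing 18 budgets:
assert not plausible(400_000_000, neighborhood_median=150_000)
```

Even a crude threshold like this would have stopped the cascade at the point of entry.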
"The program is no longer in use, and technology officials did not know it could be accessed."
This is at least as big of a problem as the lack of a capabilities system. After all, how can you even implement a capability system for functions you don't know exist?
Although this problem is especially striking in custom code, like in this case, it's just as much of a problem for commercial software. I'm sure many people remember how difficult it was to secure a Windows 9x workstation just because of how many ways there were to get at any particular subsystem.
I don't know if there's a magic bullet for this sort of problem, but whatever the solution is, I think it must be deeper than just "improve the software"---not that we don't need to do that, but it's not enough. I believe the most promising avenue is security features at the programming-language level. This isn't just a case of the carpenter blaming his tools, though: we are the tool *makers* as well as the tool users.
Whenever I hear this sort of story, I can't help but think of Edsger Dijkstra:
"[the average computer user] has been served so poorly that he expects his system to crash all the time, and we witness a massive worldwide distribution of bug-ridden software for which we should be deeply ashamed."
Well, look on the bright side: at least it wasn't another voting system that screwed up the government this time.
Spot Bruce's error!
"Authentication isn't all or nothing; it should be granular to the operation."
Are you all asleep? Was this minor imperfection lost in the blinding glare of Bruce's fame and notoriety?
The word he should have used was "authorisation" (or "authorization" if you insist on having your own dialect). That's what is granular to the level of operation. Authentication is determining subject identity and you commonly either know it or you don't, but authorisation is permission for a subject to do a specific action (like R-E-R).
Fine-grained authentication, if there was such a thing, would let you do most authorized operations with just a login/password credential, but would ask for more certainty about your identity (like a biometric or a vote from a friend) if you requested a potentially dangerous operation. But you would be the same authenticated subject all the way through. I don't think that scheme would fit under the banner of an ACL - well not any ACL I've ever seen.
This correction has been a free service from the Expert Reality Check Committee.
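To make that step-up idea concrete, here's a rough sketch of how one authenticated subject could be asked for a stronger credential before a risky operation; the operation names, credential tiers, and scores are all invented for illustration:

```python
# Hypothetical step-up scheme: the subject's identity is fixed, but
# riskier operations demand a higher-assurance credential.

RISK = {
    "view-record": 1,          # routine lookup
    "change-assessment": 3,    # potentially dangerous
}

STRENGTH = {
    "password": 1,
    "password+biometric": 3,
}

def may_proceed(operation, credential):
    """Allow the operation only if credential strength meets its risk."""
    return STRENGTH.get(credential, 0) >= RISK.get(operation, 99)

assert may_proceed("view-record", "password")
assert not may_proceed("change-assessment", "password")
assert may_proceed("change-assessment", "password+biometric")
```

Note this is still authorization doing the gating; the step-up merely raises how certain the system must be about who you are first.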
The beauty of a capability-based system is that the ability to access something must be enumerated. For example, enumerating the right to access the X, Y, and Z systems implicitly means you do not have the right to access the W system. So, if you don't know the W system exists, you won't be allowing access to it.
Agreed, not a perfect solution. It's pretty easy to shortcut. I can imagine the following thought process:
"All the systems are public, so I'll just allow the capability to use all of them."
If the system in question had orthogonal capability systems, like:
* The right to write
* The right to read
* The right to use the W system
* The right to use the X system
Then, the login used could have been restricted in a pretty simple manner: eliminating the right to write works across all systems, and they should only be able to view the X system. Security in depth, yes?
I think that user-rights assignment, program-rights assignment, and intentional rights assignment (that is, intent-based) combined with ACLs can be a very powerful security mechanism, especially when used in the "Default Deny" model that many here advocate.
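The orthogonal-rights idea above can be sketched in a few lines; the capability names are invented for illustration:

```python
# Hypothetical capability check: a session carries an enumerated set
# of rights, and an action needs every required capability present.
# Anything not enumerated (like the forgotten W system) is denied.

def can(capabilities, *required):
    """Allow an action only if all required capabilities are held."""
    return all(cap in capabilities for cap in required)

# An outside user gets read access to the X system, nothing more:
outside_user = frozenset({"read", "use:X"})

assert can(outside_user, "read", "use:X")        # viewing X: allowed
assert not can(outside_user, "write", "use:X")   # no write right anywhere
assert not can(outside_user, "read", "use:W")    # W was never enumerated
```

The last line is the point: a system nobody remembers can't be reached, because reaching it would require a capability nobody granted.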
"The word he should have used was "authorisation" (or "authorization" if you insist on having your own dialect)."
Yep. Good catch.
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.