Back Door Security Threat in Interbase Teaches Broader Lessons

  • Bruce Schneier
  • InternetWeek
  • March 12, 2001

When a hacker adds a back door to your computer systems for later unauthorized access, that’s a serious threat. But it’s an even bigger problem if you created the back door yourself.

It seems that Borland did just that with its Interbase database. All versions released for the past seven years (versions 4.x through 6.01) have a back door. And, by extension, so do all their customers. How it came about and how it was discovered should serve as a lesson to all IT managers.

Versions of Interbase before 1994 didn’t have any access-control mechanisms. When the company added access control in version 4.0, it used a peculiar system. The engineers created a special database within Interbase for account names and encrypted passwords. This solution created a new problem: In order to authenticate a user, the program had to access the database; but before the program could access the database, it had to authenticate a user.

Now there are many ways to solve this problem, but the approach used with Interbase, hard-coding the user name “politically” and the password “correct,” is not the one I would have chosen. This is the back door; anyone using that user name and password can access any Interbase database.
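To make the flaw concrete, here is a minimal sketch of what such a check amounts to. This is illustrative Python, not Borland's actual code; the function name and the user-database lookup are my assumptions, not the product's real internals.

    # A minimal sketch (not Borland's actual code) of how a hard-coded
    # back door short-circuits authentication. The names here are
    # hypothetical illustrations.

    BACKDOOR_USER = "politically"  # compiled into every shipped copy
    BACKDOOR_PASS = "correct"

    def authenticate(username: str, password: str, user_db: dict) -> bool:
        # The back door: this pair bypasses the security database
        # entirely, so it works on every installation, and no
        # administrator can change or revoke it.
        if username == BACKDOOR_USER and password == BACKDOOR_PASS:
            return True
        # Normal path: check the account against the security database.
        # (The real product stored encrypted passwords; a plaintext
        # dictionary keeps this sketch short.)
        return user_db.get(username) == password

The dangerous property is visible at a glance: the bypass is identical in every copy of the software, so a secret learned once opens every installation, everywhere.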

Lesson one: Deliberately adding a back door creates a security problem of unknown and changing magnitude. I call this the “window of exposure.” The moment this product was shipped, the vulnerability existed. But as long as no one knew about it, it wasn’t a problem. At this point, we have no idea if anyone in the hacking underground knew about the vulnerability, or if any criminals took advantage of it. Certainly, the programmers who coded the “feature” knew about it. They could have told others. Word could have gotten around.

Now the vulnerability is public. Within days of the announcement, there were reports of scans looking for the vulnerability. Borland issued a patch, but who knows what percentage of users will patch their systems? The vulnerability will remain in many systems for years.

Lesson two: Open source helps security, sometimes. Interbase was made open source in July 2000. The vulnerability was discovered six months later by a German software developer. If he hadn't discovered the vulnerability, we might still not know about it. If someone had looked at the code sooner, we might have known sooner. Open source means that more people examine the source code, but it is no guarantee that vulnerabilities will be found or, even if found, fixed properly. And if the person who discovered the vulnerability had been intent on breaking into systems, this would have played out much differently.

Lesson three: Don’t assume that just because you’ve taken proper security measures, your vendors have too. Company X’s customers trust Company X. If Company X was using Interbase, those customers were also trusting Borland, whether they knew it or not; that trust gave them only a false sense of security.

Back doors have the unfortunate property of being all or nothing. It’s like leaving your house key under the mat. If no one knows about it, it’s pretty safe. If everyone knows about it, it makes your door lock useless. Borland certainly belongs in the doghouse for this one.

Flaws like the one in Interbase are more common than you might think. Many products built with security shortcuts (hard-coded passwords, obfuscated access mechanisms, default logins) are only as secure as their secrecy. This is very fragile security: it fails as soon as the secret is revealed. Good security is resilient; it remains secure even in the face of full disclosure, of open source, and of programmers leaving the company and revealing all they know. This is why smart security engineers insist on public peer review for all security systems. They know that this can only improve security.
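By contrast, here is a hedged sketch of what resilient checking looks like: a design whose only secret is the user's own password, so publishing the source code costs it nothing. It uses only the Python standard library; the function names and parameter choices are mine, not drawn from any particular product.

    import hashlib
    import hmac
    import os

    # Resilient by design: the code, the salt, and even the stored
    # digest can all be made public without handing an attacker a way
    # in. The only secret is the password itself.

    def make_record(password: str) -> tuple[bytes, bytes]:
        """Derive a (salt, digest) record to store; the password itself
        is never stored."""
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return salt, digest

    def verify(password: str, salt: bytes, digest: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        # Compare in constant time so the check leaks nothing through timing.
        return hmac.compare_digest(candidate, digest)

Note that there is no magic account anywhere: an attacker who reads every line of this code still has to guess a password, which is exactly the property the hard-coded Interbase credentials lacked.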

