RuggedCom Inserts Backdoor into Its Products

All RuggedCom equipment comes with a built-in backdoor:

The backdoor, which cannot be disabled, is found in all versions of the Rugged Operating System made by RuggedCom, according to independent researcher Justin W. Clarke, who works in the energy sector. The login credentials for the backdoor include a static username, “factory,” that was assigned by the vendor and can’t be changed by customers, and a dynamically generated password that is based on the individual MAC address, or media access control address, for any specific device.

This seems like a really bad idea.

No word from the company about whether they’re going to replace customer units.

EDITED TO ADD (5/11): RuggedCom’s response.

Posted on May 9, 2012 at 6:24 AM • 37 Comments

Comments

wiredog May 9, 2012 6:57 AM

Correct me if I’m wrong, but these devices are intended to only have remote access for debugging purposes. When not debugging, you physically unplug them from the modem or internet connection or you keep them inside the firewall.

I’ve programmed systems like that. They get deployed to companies that don’t have IT staffs that can administer the devices themselves. So when something goes wrong you say “Plug it into a phone line and give us the number” and call in to the device. Once it’s fixed, they unplug it.

Not being plugged in to a public network is an extremely good security policy. Better than any passphrase, however strong.

Clive Robinson May 9, 2012 7:01 AM

Now that it’s been discovered, the question boils down to “Were they idiots or malicious?”

However, there is a long history of this sort of thing, it keeps recurring, and it would be more helpful to ask why.

The problem originates from a viewpoint or perspective on “test software”. Test software, to be effective, has to be able to get data from all parts of the system irrespective of where it might be. The flip side of this is that such test software is also an almost perfect security-breaching tool.

Now if “implementing” test software on an end-user product were just left to experienced techs to make choices about, then chances are it would be left fully disabled by default.

However, the less experienced the techs, the more of it will get left on.

Unfortunately, leaving test software on gives a quick payback on having it developed and on the extra product resources to support it, by saving support costs and producing faster “bug fixes”.

So when non-technical management look at the choice it’s virtually a “no brainer” for them: it gets left on and fully enabled by default…

Non-technical management, especially “marketing” management, are very, very bad at security because, as they will cheerfully tell you, “security does not sell”. And it’s true from the market’s perspective: the users want lots of shiny features and quick fixes to bugs, and the company that does this best will beat the more security-conscious company out of business very quickly.

So the question of who’s to blame is probably best answered “it’s us”, for not buying secure products in preference.

Which then raises another question, which is “are the users actually capable of making security decisions correctly?”, to which the answer, we know, is no…

Thus I fully expect this sort of nonsense to continue until there is a major change in the market to shift the tipping point. And no, I’m not going to hold my breath on this. And please don’t suggest legislation; it’s been so long since anyone produced legislation that was not twisted to suit the suppliers and their associations that I hold no hope of it ever becoming untwisted again.

Clive Robinson May 9, 2012 7:13 AM

Am I the only one to notice that RuggedCom is a “Siemens Business”? Those with even relatively short memories will recall that Siemens has significant “previous” in this behaviour…

For those with really short memories I’ll just say Stuxnet.

Bjorn May 9, 2012 7:50 AM

Well, Siemens acquired RuggedCom this year (2012), so there is a connection to Siemens beyond them simply being in the same business. But it might say something about security in that business.
I tried the backdoor on a device bought in late 2011 and it did not seem to work.

aixwiz May 9, 2012 8:05 AM

“This seems like a really bad idea.”
We’ll have to submit this to the “Computer Security Understatement of the Year” award contest.

When a company puts a “backdoor” into their products, they are circumventing the security system and leaving their customers vulnerable. When will vendors learn that they can’t keep adding backdoors or other features that negate security?

Harald May 9, 2012 8:25 AM

We put back doors into your products because you want us to. You have more problems with locking yourself out of a device, or wanting quick remote customer support, than you do with security breaches. (Especially when, as described above, your devices are protected in other ways).

Put another way – the incentives are in the wrong place. Companies are encouraged to create these back doors because they want to reduce customer support costs. The only incentive against them is the potential embarrassment, and we’re all so jaded these days that the cost of embarrassment is rather low.

Adam May 9, 2012 8:39 AM

People put backdoors in these kinds of devices so they can service them. It’s understandable, albeit pretty stupid if they can be remotely compromised.

I think it would have been better if a person had to physically hold a button down or use some kind of dongle in order for the factory login to succeed. It may be, of course, that these things get bolted to ceilings, crawlspaces and other inaccessible locations, so a soft token might be required instead. But that would still be better than a backdoor.

jacob May 9, 2012 8:45 AM

I am just so disappointed. It seems everybody from the NSA, FBI, Homeland Security, Siemens, RuggedCom, who knows maybe my mac camera, and facebook all want to backdoor me…who needs aliens and their probes when all this is going on… Hey guys!! No means No!! 😉

On a more serious note, I wish I could find a study on why all this crap happens. Is it:
1. Money (customer service, recurring revenue, ads, etc)
2. Information (prevent terrorism)
3. Political power
Just a thought..

Chris W May 9, 2012 9:01 AM

Factory backdoor accounts aren’t inherently “a really bad idea”; they simply should have added a small DIP switch on the device to enable or disable the account. That would allow easy access in certain situations, such as closed networks or a service call, while requiring physical access to enable.
It wouldn’t have been an undocumented backdoor but an enhanced service feature.
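
A minimal sketch of that gating logic, in Python for brevity; the GPIO path, account name and credential check are illustrative assumptions, not anything RuggedCom actually ships:

```python
# Hypothetical sketch: the factory/service account only works while a
# physical service switch is set. Path and names are assumptions.
SERVICE_SWITCH_PATH = "/sys/class/gpio/gpio17/value"  # assumed switch wiring

def service_switch_enabled(path: str = SERVICE_SWITCH_PATH) -> bool:
    """Return True only if the physical service switch reads '1'."""
    try:
        with open(path) as f:
            return f.read().strip() == "1"
    except OSError:
        return False  # fail closed: no readable switch, no factory access

def check_password(username: str, password: str) -> bool:
    """Placeholder; a real device would verify against a stored password hash."""
    return False

def authenticate(username: str, password: str) -> bool:
    if username == "factory":
        # Factory login requires both the switch and valid credentials.
        return service_switch_enabled() and check_password(username, password)
    return check_password(username, password)
```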

Dr Zero May 9, 2012 9:31 AM

jacob, if I had to guess, I would say “convenience”.
Which I guess you could put under the “money” heading.

jacob May 9, 2012 10:10 AM

@chris. If I remember the article correctly, it was undocumented. That is the major heartburn and why they issued the advisory.

If IT didn’t know about it, that’s a problem for network security. The only way you’d find out was if something went wrong and the vendor then told you they could come in and fix it…

IT can turn off access if they know about it. Maybe traffic analysis would point out the problem, but knowing penetration testing, probably not.

B. D. Johnson May 9, 2012 10:23 AM

“Correct me if I’m wrong, but these devices are intended to only have remote access for debugging purposes. When not debugging, you physically unplug them from the modem or internet connection or you keep them inside the firewall.”

Yes, most industrial systems should be airgapped or thoroughly firewalled, but how many times would someone have to wake up in the middle of the night to plug it in for some remote repair before they just said “Fark it” and left it plugged in?

True, it’s horrible security behavior on the user end, but you have to plan for it on the design end. It goes back to “The trouble with designing something completely idiot-proof is underestimating the ingenuity of the complete idiot.”

Arclight May 9, 2012 11:42 AM

Large ISPs have a pretty simple solution for this. They have a unique password for each device, and they print that password on the bottom of the unit. This password is required for initial configuration and/or privilege escalation to change settings. This device password can be used, sometimes in concert with pushing a hardware reset button, to reset the user credentials and settings if they get corrupted.
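
A minimal sketch of that manufacture-time step, assuming a Python provisioning script; the alphabet and length are arbitrary illustrative choices:

```python
# Sketch of the ISP-style approach: each unit gets its own random password
# at manufacture, printed on the label, not derived from anything public
# like the MAC address.
import secrets
import string

ALPHABET = string.ascii_uppercase + string.digits  # easy to read off a label

def make_device_password(length: int = 12) -> str:
    """Generate an independent random per-device password."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# At manufacture: store only a salted hash on the device and print the
# cleartext on the unit's label; avoid keeping a central cleartext copy.
label_password = make_device_password()
print(f"Print on label: {label_password}")
```

Because the password is random per unit, learning one device’s label tells an attacker nothing about any other device.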

Requiring physical access but still allowing “worst case” troubleshooting seems pretty sensible to me.

Arclight

Doug Coulter May 9, 2012 12:33 PM

I’d have to go with wiredog and Adam on this. I once did product development for a company that made things like paging and PBX systems, with sales in small volumes – but all over the planet – to people for whom crawling into the ceiling to hit a reset button would have been too much. And flying one of their guys halfway around the planet – or waiting for across-the-world shipping – would have been a problem.

Same thing – an interface we could reach from the home plant, but only connected when there was trouble. Even then, the interface was so arcane that customers often left it connected, but there was never a breach. Back then almost no hacker had the gear to send high-speed DTMF or Bell 103 modem signals anyway. Nothing really critical got controlled by any of this either, but I can easily think of things where it would have been a horrible idea.

But that’s really my point here – the same idea can be either good or horrible depending on the context – and people here tend to lean so far one way they forget that.

Jim May 9, 2012 1:39 PM

Malvin: I can’t believe it, Jim. That girl’s standing over there listening and you’re telling him about our back doors?

Jim Sting: [yelling] Mister Potato Head! Mister Potato Head! Back doors are not secrets!

Malvin: Yeah, but Jim, you’re giving away all our best tricks!

Jim Sting: They’re not tricks.

Tom May 9, 2012 3:26 PM

Just because a device is firewalled or air-gapped doesn’t justify the existence of an undocumented backdoor. At the very least, companies need the option to enable or disable the feature, with disabled being the default. I happen to work in the energy sector, and this sort of thing creates a ton of issues around CIP standards, including exposing customers to fines. Also, just because something is on what should be a secure network doesn’t mean you shouldn’t do everything reasonable to secure it further. This opens you up to insider threats and allows things to happen outside the normal channels that are subject to change control and audit. Relying on a firewall is known as M&M security, because once you penetrate the outer shell it’s very soft on the inside.

Gweihir May 9, 2012 3:41 PM

This kind of stupidity will not die out until manufacturers become liable for any and all damages resulting from this and face criminal liability in addition.

Sam May 9, 2012 5:40 PM

Working in industrial automation, I’m sympathetic to the reasons they would have for doing this – as others have mentioned, your customers will, 100% of the time, view it as more important to be able to fix a “line down” issue quickly by logging in remotely than to ensure unauthorized entities don’t gain access to the equipment. But with the advent of Stuxnet and the growing awareness of the vulnerable nature of these systems, something’s gonna have to give.

The only way forward I can see is to give the customers the choice of how to secure these units (or not). One way is to have the units shipped out with a default password, which the customer will be encouraged to change, but then they are responsible for telling the vendor that new password when calling in for support. I guarantee you most users will never change that password or do anything else to proactively secure their systems, but this is becoming a CYA issue for vendors.

Dirk Praet May 9, 2012 7:16 PM

From the company bulletin quoted by Frank, Ch. Eigler: “Please note that RuggedRouter(RX1000,RX1100) & RuggedBackBone(RX15xx, RX5000) products are not affected by this vulnerability. These products are designed to protect and secure operations networks that must be directly connected to the Internet or other untrusted systems.”

If their products are designed to protect and secure operations networks, then how thorough is their code review process if their engineers can’t even spot a hardcoded backdoor in the code? This is not a vulnerability; it’s a deliberate backdoor they hoped nobody would find out about. This not only makes them idiots AND malicious, but pretty darn good liars too. Their entire story is just as credible as that of the PR guy of a chemical plant that has just blown up, claiming in front of the camera that there is zero danger to public health while in the background there’s a 30-meter-high pink flame, orange clouds and a hazmat team trying to extinguish a colleague who looks like the Toxic Avenger.

Bill Minuke May 9, 2012 11:13 PM

It’s based on Debian and has ssh. Rather than an “old school” backdoor with a username/password, did nobody at RuggedCom look at the ssh manual and realize they could put a public key in the authorized_keys file and log in securely? Then the company wouldn’t need a backdoor; they could actually be ethical and document what they did. If I had a product with a secret backdoor in my organization, I would rip all of that manufacturer’s products out of production and pass a policy banning the use of all products from unethical companies. Having crappy software is one thing; we’re used to that. Having the manufacturer build insecurity into the product on purpose, because they are unwilling to hire software engineers who follow good security practices, is inexcusable.
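
A rough sketch of what that documented alternative could look like on an OpenSSH-based device; the key, address range and forced command below are placeholder assumptions, not RuggedCom’s configuration:

```python
# Build a restricted authorized_keys entry for a documented vendor-support key.
# The public key, source network and command are placeholders for illustration.
VENDOR_PUBKEY = "ssh-rsa AAAA...vendor-support-key..."  # placeholder, not a real key

options = ",".join([
    'from="192.0.2.0/24"',                 # only from the vendor's published support range
    'command="/usr/sbin/support-shell"',   # hypothetical restricted support shell
    "no-port-forwarding",
    "no-agent-forwarding",
    "no-X11-forwarding",
])

# This line would be appended to the support account's ~/.ssh/authorized_keys.
entry = f"{options} {VENDOR_PUBKEY} vendor-support"
print(entry)
```

Whether such access should exist at all is a separate question; the point is that it can be built from standard, auditable mechanisms instead of a hidden account.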

Jack May 10, 2012 12:11 AM

You’d be surprised how much hardware and software have back doors built into them, much of it legally.

GOOGLE: Cisco routers back doors

and you’ll find hours of reading material for just that one company alone.

WIKILEAKS: published information on dozens of companies making spyware for hardware and software and selling it to governments.

When is the last time you checked the firmware on your PCI devices and network card?

Your router?

Dumped and checksummed/debugged your BIOS lately?

Why aren’t the anti-malware companies like Symantec and others climbing over each other in an effort to invent the technology, and use it via the cloud to create GIANT databases of legitimate firmware, in the fight against the most serious of rootkits? Are they in bed with big bro?

How many so-called remote exploits were patched this week in Windows? This month? This year? Since its release? Start from the initial release of each Windows version, count all of the remote exploits up to the present day, and compare that to OpenBSD, for example.

Ferris Bueller May 10, 2012 12:22 AM

U.S. gov’t wiretapping laws and your network
https://www.networkworld.com/news/2007/012307-us-govt-wiretapping-laws-and.html

“Activists have long grumbled about the privacy implications of the legal “backdoors” that networking companies like Cisco build into their equipment–functions that let law enforcement quietly track the Internet activities of criminal suspects. Now an IBM researcher has revealed a more serious problem with those backdoors: They don’t have particularly strong locks, and consumers are at risk.”
http://www.forbes.com/2010/02/03/hackers-networking-equipment-technology-security-cisco.html

Danny Moules May 10, 2012 6:06 AM

“Correct me if I’m wrong, but these devices are intended to only have remote access for debugging purposes. When not debugging, you physically unplug them from the modem or internet connection or you keep them inside the firewall.”

‘Hi, my name is Joe Bloggs and I’m calling from Maintenance. How’s the night shift going? Everything’s going crazy here and I need you to do me a favour. You see the black box labelled ‘InsecureCo’? Yes, that’s the one. Can you just plug that into the wall socket? Just for five minutes so I can run some diagnostics. Mmmhmm…. yep, it’s working! [two minutes later] That’s all done, you can unplug it now. Thanks again!’

Of course, ‘Joe Bloggs’ is actually me – who simply searched for his name on LinkedIn – and I’m using a disposable phone.

“Not being plugged in to a public network is an extremely good security policy. Better than any passphrase, however strong.”

You’re assuming attackers target only technical systems and not human systems. This assumption is incorrect.

Dirk Praet May 10, 2012 8:13 AM

@ Danny Moules

And for all the others pulling the “convenience” card, I’d like to add that their argument is equally flawed. You can easily compare it to consistently using root in a non-RBAC environment. That’s poor security, full stop. The question is not if it’s going to bite you in the ass, but when. Anyone here who has never accidentally erased an entire file system by executing rm -rf * in the wrong place, or had a system compromised through some stupid service/application running as superuser?

LinkTheValiant May 10, 2012 9:02 AM

Systems designers should not be leaving security to the lowest common denominator. I sympathize with those of you who believe in the “remote convenience” concept. But there is NO excuse for having an undocumented deliberate backdoor.

By all means, satisfy your non-security-conscious customers and put in remote access, but at the very least let IT know about it. Otherwise you are no better than the inside man at the museum who lets the rest of the thieves in.

Roger May 10, 2012 9:29 AM

Apart from the arguments about whether or not they should have a “field service” account, what they really screwed up is their key diversification scheme.

They actually have diversified keys for every device, created as a hash of the MAC. If they had used a proven KDF to do this, it would not have been such a serious problem. Unfortunately, they actually used a really weak home-brewed hash.
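
For illustration, one common diversification pattern runs the MAC through a standard keyed primitive together with a vendor secret; this is a hypothetical sketch of that pattern, not RuggedCom’s actual scheme:

```python
# Hypothetical per-device credential diversification using HMAC-SHA-256
# instead of a home-brewed hash. Names, labels and parameters are illustrative.
import base64
import hashlib
import hmac

VENDOR_MASTER_SECRET = b"keep-this-in-an-HSM-not-in-firmware"  # assumption

def derive_service_password(mac_address: str) -> str:
    """Derive a per-device service password from the device MAC and a vendor secret."""
    mac = mac_address.lower().replace(":", "").encode()
    digest = hmac.new(VENDOR_MASTER_SECRET,
                      b"service-password:" + mac,
                      hashlib.sha256).digest()
    # Truncate and encode for human entry; the length is a design choice.
    return base64.b32encode(digest[:10]).decode()

print(derive_service_password("00:0a:dc:12:34:56"))
```

Even with a sound primitive, everything still hinges on the vendor master secret; a derivation with no secret at all, i.e. a function of the public MAC alone, can be recomputed by anyone who learns the algorithm, which is essentially what happened here.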

Gweihir May 10, 2012 9:57 AM

@Bill Minuke: Exactly right. The problem is that the people messing this up have no clue about security. They probably do not even know that something like public-key cryptography exists.

I am convinced by now that for most practical purposes, systems can be very well secured and that attacker effort can be driven through the roof. But this requires people who understand security (and you can still get a degree in any IT area without mandatory lectures in software security), and people who understand the limits of their security skills. The Dunning–Kruger effect is strong in the practice of IT security and in software engineering in general.

Clive Robinson May 10, 2012 11:07 AM

@ Gweihir,

A friend of mine who currently works in a “psychocognition role” (nope, I don’t know either, before you ask 😉) refers to the Dunning–Kruger effect as “a right-handed North American detector”…

She has told me that the effect really only holds in the US, as the reverse of the effect is seen in Asian societies and it is marginal at best in most Northern European societies.

Also, apparently the effect is not very prevalent in those who are either (or both) left-handed or on the high-functioning autistic spectrum (Asperger’s, dyslexia and even ADHD). But as these people are almost universally excluded from research work, there are no “academic papers”.

Also it appears to be “work related” and is seen extensively, almost universally, in certain professions (apparently marketers, advertisers, accountants and lawyers are in the group, along with the “Masters of the Universe” of banking and the finance industries).

Now as many HFAS people are “engineers’ engineers” (or is it the other way around 😉) this would add further weight to the idea that this “glaring error in judgment” is management/marketing inspired/driven. Management/marketing people tend to be “social communicators” (i.e. they waste most of their overpaid work time “networking” their image up in other people’s heads) and are generally significant self-interested risk takers, not impartial and reasoned thinkers capable of making sound judgments on long-term issues…

g May 10, 2012 1:25 PM

@ Dirk, Bill:

You’re both parsing that wrongly, although in slightly different ways.

The products ‘designed to protect & secure’* use a Debian-based OS that does not contain the vulnerability/backdoor under discussion.

The ‘old school’ backdoor exists in their older/simpler/lower-cost?* devices, which use an OS they rolled themselves.

*Does the ‘protect & secure’ phrase mean they recommend the Linux-based boxes as edge/gateway to their other products in standard configs? That would mitigate the threat somewhat, despite not exactly being best practice. Also, is it just that the Debian ones are newer, or come out of a different R&D lab or something, and that they’ve changed over or are maintaining two OSes? Their website isn’t playing too well with my phone so I can’t find out.

Ed Franks May 10, 2012 2:07 PM

Firmware-driven PCI cards and network cards can be spoofed in software by a MAC change in a software firewall (e.g. pfSense, iptables). If this practice is followed at the gateway, and better yet at every NIC, there will be no easy way for a three-letter agency to figure out the make/model of your NIC from its address.

Klaus May 11, 2012 1:46 AM

To elucidate what Franks was saying, one can change the MAC address of a network card IF the router is a computer running some Linux or BSD; I do not think this can be done easily in an appliance. Further, if one gives the changed MAC a different OUI than the original vendor’s interface had, it would obfuscate attempts to try some firmware backdoor entry for that brand of NIC. And besides doing this spoof on the gateway router’s external NIC, if the MAC spoof is performed on all the computers’ NICs facing that gateway router, so much the better.
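
A rough sketch of that spoof on a Linux-based gateway; the interface name is an assumption, and pfSense, being FreeBSD-based, would use its own tools instead of iproute2:

```python
# Generate a random locally administered MAC (so the OUI no longer identifies
# the NIC vendor) and print the iproute2 commands that would apply it.
import secrets

def random_local_mac() -> str:
    octets = [secrets.randbits(8) for _ in range(6)]
    octets[0] = (octets[0] | 0x02) & 0xFE  # set locally-administered bit, clear multicast bit
    return ":".join(f"{o:02x}" for o in octets)

iface = "eth0"  # assumed external interface of the gateway
mac = random_local_mac()
for cmd in (f"ip link set dev {iface} down",
            f"ip link set dev {iface} address {mac}",
            f"ip link set dev {iface} up"):
    print(cmd)  # run these as root on the gateway
```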

Terry Cloth May 13, 2012 6:36 AM

Export controls?

From the RuggedCom statement:

There are two classifications of ROS firmware:
1) Export controlled that includes cryptography keys greater than 56 bits.

Does the U.S. still have crypto export laws on the books? Last I heard, they’d lifted the ban, though they said they still wanted you to tell them you were shipping it out. (That’s from when Debian decommissioned the non-US repository.)
