FDA Recommendations on Medical-Device Cybersecurity

The FDA has issued a report giving medical devices guidance on computer and network security. There's nothing particularly new or interesting; it reads like standard security advice: write secure software, patch bugs, and so on.

Note that these are "non-binding recommendations," so I'm really not sure why they bothered.

EDITED TO ADD (1/13): Why they bothered.

Posted on January 10, 2017 at 7:15 AM


Frank Scavo • January 9, 2017 4:38 PM

It is for legal reasons. In FDA parlance, "non-binding recommendations" is to differentiate them from regulations that are promulgated and, in fact, are binding on organizations regulated by FDA. Interestingly, when FDA issues industry guidance, FDA itself is not bound to follow its own guidance.

Jason • January 9, 2017 5:35 PM

There was a perception among medical device manufacturers that altering the software in any way required the device to be recertified. These guidelines counter that economically convenient interpretation.

Frank • January 9, 2017 6:30 PM

FDA does not certify devices, or device software. The new guidelines don't change that. They merely spell out FDA recommendations on security of embedded systems. I think the issue is that older device software was designed without the view that these devices might be on a network that has Internet connectivity.

Doug • January 9, 2017 7:00 PM

'Not sure why they bothered'.

I worked in the device industry for over 30 years so guidance docs are kind of important.

The law that regulates the device industry is pretty sparse on details (21 CFR Part 820 is the main reg and it's a tiny little handbook) and can be interpreted in a lot of ways. The FDA tries to work with the industry to help it understand how the FDA is thinking today. This is important because what was acceptable yesterday may not be acceptable today. The law won't change, but the interpretation and enforcement will. When 820 was written, there was no network and computers were crude stand-alone devices. The law hasn't changed, but the FDA is now driving design processes for devices that use these technologies.

The FDA works with the industry and topic experts and generates guidance documents that help industry understand how the FDA is thinking. By knowing what the FDA is thinking, we can adapt our design, validation, and manufacturing efforts to meet these expectations. Guidance documents drive much of what we do.

The concepts associated with cybersecurity are well known, but implementing those concepts within an industry that has 5+ year development cycles and devices that are sold and used for 10-20 years is far from easy. Think about that. What OS would you have picked to use on your device 5 years ago? What about a database, or even monitors? How the hell are you going to maintain a secure system designed 5 years ago when it's still in use in 2036? How would you approach design? How would you approach long-term maintenance? What about just the hardware? Every change must be validated to assure you didn't break something.

If you're like most folks, you'd be tempted to just put it off for future discussions. Complicate that by knowing the actual designer of the system may be a 3rd party design house rather than the company selling the device.

It's not uncommon for companies to argue about how much they must do and when to do it and the decision isn't always 'doing it early is better than doing it later'. People spend tons of time arguing over what the FDA is likely to do in any given situation. This type of guidance document can stop those discussions and get companies to do the right thing at the right time.

It's important to realize that these guidance docs also get translated into inspection guidelines for the FDA and change the way it does audits.

So, a long answer but that's why this is important. It will drive behavior. That's why they bothered.

Clive Robinson • January 9, 2017 7:43 PM

@ Bruce,

Note that these are "non-binding recommendations," so I'm really not sure why they bothered.

Think of them as "Best Practice" recommendations from an industry-specific group.

Then chat to a lawyer about their implications in a civil suit for damages.

That's the down side for those daft enough not to heed them.

The upside for customers is that they require no long, slow legislative process that would be out of date before it became law.

As I've said quite a few times before, the way the US tends to go about technical legislation is possibly the worst way you can do it. The EU, whilst by no means perfect, has a framework system, which can be seen in how you get a CE mark on items that are "to be placed on the market" (see the "Blue Guide"). It tends to be much faster in response to necessary changes.

One area where the EU method is not good is the introduction of a fundamentally new disruptive technology in a well-established market segment (see the SDR and R&TTE requirements). These tend to be black swan events, but the US FCC fares even worse (see the debacle with SDR-based wireless networking, and the even earlier failure of "Reverse-SMA" connectors).

r • January 9, 2017 7:59 PM


You _really_ don't like those SMA connectors do you?

My buddy got a pacemaker, it stopped.

It was under recall, but they installed it anyways - good for them - bad for us.

It might've just been defective(?), but they did remove both it and its little tail too.

r • January 9, 2017 8:02 PM

Stuff that goes into your body that is essentially an RTOS should not be left to IoT companies; it should be done in the way of industrial control systems, if not NASA-quality hardware with redundancy and shielding.

r • January 9, 2017 8:10 PM

Can you imagine the dangers of an insulin pump with only one sensor?

Can they jam open?

There should be a moratorium on companies involved in this stuff, my buddy made it one week before his pacemaker outright stopped on him and his wife.

Ted • January 9, 2017 9:50 PM

The FDA’s Center for Devices and Radiological Health (CDRH) will be hosting a webinar on the above document this Thursday, January 12, 2017. [1] A transcript, audio recording, and slides from the presentation will be available afterwards. [2]

The FDA has both mandatory medical device reporting (MDR) requirements (for manufacturers, importers, and device user facilities) as well as voluntary reporting protocols (for healthcare professionals, patients, caregivers and consumers.) [3]

Final guidance for manufacturer reporting was issued on November 7, 2016, and presented via a November 30 webinar. [4] From what I understand, the MDR requirements apply to events that have actually occurred, and are managed under title 21 of the Code of Federal Regulations part 803 “Medical Device Reporting.”

[1] Webinar "Postmarket Management of Cybersecurity in Medical Devices Final Guidance"
[2] Medical Device Webinars and Stakeholder Calls
[3] Medical Device Reporting (MDR)
[4] Webinar "Final Guidance on Medical Device Reporting for Manufacturers"

keiner • January 10, 2017 1:34 AM

"Note that these are "non-binding recommendations," so I'm really not sure why they bothered."

...because you have NO idea how business goes in health care, or?

Marco • January 10, 2017 1:59 AM

Why expect particularly new or interesting guidance?
Not only is the medical-device industry (like the security industry) failing at the basics, it is continuously introducing new kung-fu magic.

Doctor 100 Miles • January 10, 2017 3:37 AM

AU$ 60 Billion White Elephant

The Australian Conservative Coalition Government's e-health push is unavailable to most due to slow internet speeds. The Liberal/National Party's "cheaper" National Broadband Network was pitched as bringing affordable high-speed internet to 97% of Australia. Potential NBN customers are sticking with ADSL 1/2/2+, or paying to move back to ADSL2+ services, as FTTN fails to impress. Some are even paying thousands to have fibre run to their homes by other providers.


My Info • January 10, 2017 9:14 AM

Re: original post

Note that these are "non-binding recommendations," so I'm really not sure why they bothered.

I'm not sure either, but that's generally what happens when construction workers become computer experts overnight.

Construction workers, heavy equipment operators, etc. mind you. Not only sexist and exclusively male, but let's just say their Internet access needs to be *ahem* filtered. Just ask the homeboys from Battle Ground, Washington.

They were the ones who installed that traffic light that allegedly caused a traffic accident with alleged injuries because the lights were allegedly green all four ways at the same time, according to both parties and bystanders....

keiner • January 10, 2017 11:07 AM

...it will take some (more) deaths before drug agencies start beefing up their computer skills. It always happens in the aftermath of some laaarge screw-up. Before: No way, as the industry says no to additional costs and nobody can enforce against the industry.

Good luck for the FDA with Donald Duckbrain...!

Fred P • January 10, 2017 4:00 PM

"I'm really not sure why they bothered." - because it's guidance from the FDA; following it can increase the chance of approval. Furthermore, it can reduce the pain and stress of an audit. Following these are cheap compared to the potential repercussions (ex: not being able to sell in the U.S.A. for an extended period of time).

I suspect that well over 80% of the projects from my company will be following this 3 years from now; most that won't aren't medical devices.

Peter Quirk • January 10, 2017 4:57 PM

As I understand these guidelines, they do not address recently discovered battery attacks on implantable wireless devices. Pacemaker batteries can be run down in a matter of days by an external device constantly pinging the pacemaker. No vulnerabilities required.

Who? • January 11, 2017 3:17 AM

@ Bruce

[...] it reads like standard security advice: write secure software, patch bugs, and so on.

It may be standard security advice, but it is not standard security practice. I am not asking developers and corporations to write secure software (they should, however, try as hard as possible); I am asking developers and corporations to support their own devices and software. Two years after release, an expensive device goes unsupported: no more patches, no more bug fixes, no more security fixes either, no support at all. Corporations should be required by law to support their devices for at least ten years!

keiner • January 11, 2017 4:16 AM

... Make America safe again, we need more weapons!


And to protect the innocent, some silencers


“It’s about safety,” Trump Jr. explained in a September video interview with the founder of SilencerCo, a Utah silencer manufacturer. “It’s a health issue, frankly.”

In the video, after he’s shown shooting several guns with silencers, Trump Jr. says they can help with getting “little kids into the game.”

albert • January 11, 2017 1:43 PM

I'm guessing, but I don't think IoT device makers make medical devices. Both medical device manufacturers and ICS manufacturers can face huge legal challenges if they screw up, even without bodily injuries involved.

PLC manufacturers have it a little easier, since those devices are user-programmable. BTW, PLC designs date back to pre-Internet days, so security is an issue. You can't just slap on an Ethernet board and call it a day.

Requiring extended support for medical devices (MDs) is a good idea, but I would also require US-based entities to remain legally responsible for the products as long as they are in use. Companies respond to you when you hold the purse strings.

I hate to say this, but as long as generic PCs are gateways to a device, good security will be unattainable.

The FDA can be thought of as akin to the FAA, only much more corrupt. In fairness, their medical device people may actually do a decent job, but the politics remain. And politics trumps everything.

I'll skip my usual rant about IoT security.

. .. . .. --- ....

My Info • January 11, 2017 2:02 PM

"Non-binding recommendations..."
"Following these are cheap compared to the potential repercussions"

No. Computer security is like this.

Imagine you "own" a house — one of a row of houses with white picket fences in a nice subdivision in southern California — and you decide to improve your security by installing a nice chain-link cyclone fence.

First of all, your neighbors will call a neighborhood meeting and place an enforcement lien against your property for failing to comply with the neighborhood ordinances and the Realtor's conditions, covenants, and restrictions.

Second, they will come by night with bolt-cutters and cut holes in your fence.

Third, they will tip off some gangsters from L.A., who will start casing your house for a burglary because they will be wondering what you have in your house that is so valuable that you need a chain-link cyclone fence around the property, and how you had the money to put up such a fence in the first place.

Drone • January 11, 2017 7:29 PM

"The FDA has issued a report giving medical devices guidance on computer and network security."

So medical devices are getting smart enough to follow instructions? I hope so, it's more than humans can do.

Nile Heffernan • January 12, 2017 10:13 AM

Clive Robinson's got most of it with

"Think of them as "Best Practice" recomendations from an industry specific group.

Then chat to a lawyer about their implications in a civil suit for damages."

Most IT professionals view the law as 'That legal stuff which doesn't apply to us because of the disclaimer in our small print', and the imbalance of power between software vendors and users is such that this view is usually correct; but healthcare is a very different legal environment.

derpveloper • January 12, 2017 8:01 PM

Large companies often use a bunch of small 3rd-party developers to provide them with different software components for various tasks. One team might be building network drivers; another bunch somewhere else, in another country, will be working on audio. Sometimes the large company will not want to pay for your software but will put it in their product anyway, and then you can decide whether you want to spend the next ten years in a legal nightmare or not.

Meanwhile, because they insist on closed-source software, nothing is getting patched. Ten years later, after the legal dispute is finalized, perhaps a patch is issued, perhaps not. It may be that the gaping security hole is intentional: to give easy access to the software, or to ensure its widespread distribution with easy access to push out whatever updates at any time using some online interface with well-known credentials, plus an FTP server with a well-known password that updates can be sent to from various developers who might share that password with you if you ask kindly.

The board and CEO have other things on their minds and couldn't code cement into a cement mixer or build a good foundation. Where will we have lunch, business or first class, which state, and what are the hotels like there? Occasionally they will have a CEO who is a proxy manager, and the company is actually run by a bunch of shareholders with less qualifications or experience than my little finger (basically you could chop it off, it's that useful, apart from getting small screws out from under circuit boards when no tweezers of any kind can be found).

The shareholders may not all be entirely useless; one may actually have been the fellow who designed the product in the first place, but he'll soon be sued and shown the door for knowing what he is talking about and pointing out important things like "security" which will impact the profit margin.

Somewhere in that mess are people trying to do their actual jobs, and also a bunch of your typical freeloaders, and at least a couple of rogues, who will be fired at some point, but likely only God knows what they did in the company. So the product is selling anyway, and whoever wrote the software has long since been replaced by some kid straight out of university who is qualified with the latest certificates and has no practical experience, and maybe messed around with a router once if you are lucky.

Clive Robinson • January 13, 2017 3:44 AM

@ derpveloper,

Yup, that about describes the better companies on the investor list (but among the worst on the patient list).

It's one of the reasons I think NIST has really been falling down on the job. As I've mentioned several times on this blog, I think there needs to be a framework standard for all infrastructure and medical implants etc. where the safety cost etc. of replacing items is high.

The reason is quite simple: these products currently have an expected hardware operating life of 10-20 years due to limited energy-storage issues. Given that human life expectancy is upwards of a century in some groups of people, and steadily rising in others, you do not want a device put in your body in your early-to-mid adult (20-45) years that's going to have to be replaced by major surgery at least a couple of times in your remaining expected life. Ignoring the safety implications, the cost of such surgery is measured in three to four years of average adult income.

Likewise, the hardware side of smart meters etc. is not good. The old mechanical meters were good for thirty-plus years; modern smart meters maybe fifteen to twenty years for the sampling head end, and the display end maybe five years or less. The cost of replacing the head end is around a "man-day" of semi-skilled labour, but rising due to technology and lifestyle changes.

But those hardware lifetimes and replacement costs are outliers when you consider software, communications and security. There are various metrics when it comes to faults in the specification, design, coding, and embedding of software into hardware; what it really gets down to is that the fault rate is usually under 10% before you go into maintenance, which can mean adding further faults whilst endeavouring to fix other faults. To see this, imagine that a fault in the specification leaves out functionality that is then found to be needed. This involves punching holes in existing code and adding new code, often by people who don't have "the feel of the code" the original coders did. In embedded systems, "knowing the code" can be insufficient, especially if there is any kind of real-time component in the system, which for these types of device is almost a certainty.

Doing software maintenance in fielded systems requires communications and an update process, both of which need to be reliable in operation (unless you want bricks). But the update also needs to be done on systems whilst they are in operation, without interrupting that operation. This is actually a difficult ask, full of hidden issues (such as data retention, updating, zeroing etc. before, during and after the update). There are ways to do this that are known to work, but you have to have reliable knowledge and experience that is rarer than hens' teeth in the industry (where many code cutters don't know enough about ADTs to properly insert into a doubly linked list, or why you might need one).
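The doubly-linked-list point is concrete enough to sketch. As a minimal, hypothetical illustration (the names are mine, not from any device codebase): a correct insertion has to update all four neighbouring links, and forgetting any one of them silently corrupts the list — exactly the kind of mistake meant above.

```python
class Node:
    """A node in a doubly linked list."""
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

def insert_after(node, new):
    """Insert `new` immediately after `node`, fixing all four links.
    Omitting any one of these assignments corrupts the list."""
    new.prev = node
    new.next = node.next
    if node.next is not None:      # patch the old successor's back-link
        node.next.prev = new
    node.next = new

# Build a -> c, then insert b between them.
a, b, c = Node("a"), Node("b"), Node("c")
insert_after(a, c)
insert_after(a, b)
assert a.next is b and b.next is c      # forward links intact
assert c.prev is b and b.prev is a      # backward links intact
```

The trap is the conditional back-link fix-up: a singly-linked mindset gets the forward pointers right and leaves `prev` pointers dangling, which only surfaces later when the list is walked in reverse.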

Thus you need security on the update process, so that only people who are authorised to do so can do so. This gives us a problem, in that so far we've not had a security protocol that, once implemented, has lasted a decade without needing to be updated. Similarly with the underlying algorithms.
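To make the update-authorisation point concrete, here is a minimal sketch using Python's standard library, with a hypothetical shared key (a real implant would more likely verify an asymmetric signature, so the device holds no key capable of signing updates): the device refuses any image whose authentication tag does not verify.

```python
import hashlib
import hmac

# Hypothetical shared secret, provisioned at manufacture time.
DEVICE_KEY = b"example-key-provisioned-in-factory"

def sign_update(key: bytes, image: bytes) -> bytes:
    """Manufacturer side: compute an HMAC-SHA256 tag over the image."""
    return hmac.new(key, image, hashlib.sha256).digest()

def verify_update(key: bytes, image: bytes, tag: bytes) -> bool:
    """Device side: constant-time check before applying the update."""
    expected = hmac.new(key, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

image = b"\x7fFIRMWARE v2.1 ..."
tag = sign_update(DEVICE_KEY, image)
assert verify_update(DEVICE_KEY, image, tag)                    # genuine update accepted
assert not verify_update(DEVICE_KEY, image + b"tamper", tag)    # altered image rejected
```

Note how the decade problem bites even here: if SHA-256 or the surrounding protocol is ever weakened, every fielded device needs a new verification routine, and that fix is itself an update that must pass through the old mechanism.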

Thus the code has to be written in a very plug-and-play modular style. I won't go into a list of what that actually means, because it would approach the length of the contents list of a large in-depth technical work.

Importantly, though, there is another issue, which is "standard operating". Medical implants are becoming more common, and thus the numbers seen in accident/emergency/trauma centers and by first responders are on the increase. Further, there is a trend to take that sort of work to the victim/patient to reduce time and increase the survivability figures. Which means doctors etc. need a "one box talks to all implants", and it needs to be small, portable, rugged and highly reliable.

There is nothing that currently meets those requirements, even for a single manufacturer of implants...

We need standards; they need to be updatable quickly and effectively, and this needs to be reflected in the embedded devices, and all manufacturers have to comply without management "having deferment or options on implementation".

rjh • January 13, 2017 3:12 PM

Many of these comments reveal a typical ignorance of medical device regulation by the software community. You can get into really serious trouble if you actually act based on this ignorance. The site http://www.fda.gov/Training/CDRHLearn/ is specifically educational material, and anyone who is serious about medical devices should understand most of what is found on http://www.fda.gov/MedicalDevices/default.htm

Failure to comply with FDA regulations is a felony. Knowing, willful violations are on a par with bank robbery (another federal crime). Ignorance is generally treated very kindly, with substantial efforts being made by the FDA to provide good educational materials and guidance. This document is part of that guidance.

That's not understood by most software developers. They point to contracts with fine print. They talk about insurance. They talk about marketing tradeoffs. Think bank robbery. None of those would get you out of a bank-robbery charge, and none of them gets you out of trouble with the FDA.

The core FDA rule is that you must have a valid and meaningful risk-management process, and an appropriate mitigation, repair, and recovery process. That's the law. This risk-management process must cover all safety and efficacy risks. Cybersecurity was only recently added as a specific risk. The original processes worried about things like sterilization, unnecessary surgery, and other injuries. All those remain part of the risk-management process.

The guidance is called guidance because it is your responsibility as the device manufacturer to assess the risks and act appropriately when deciding whether to follow it. The FDA rules apply to tongue depressors, examination tables, CT scanners, pacemakers, etc. Rather than have a bureaucracy 100 times larger than it is, the FDA delegates to the manufacturers. The manufacturer decides what cybersecurity protections are appropriate for tongue depressors. The manufacturer decides what sterilization processes are appropriate for a physician's workstation. The FDA gave guidance, and you can deviate from that guidance if it is appropriate — but if you do, you must be able to show a responsible risk analysis for the specific situation that justifies the deviation.

Failure to follow responsible risk processes can get you a jail term. Making ordinary mistakes is expected. Technology and environmental risk changes are expected. A responsible risk process will include improvement processes to correct for problems and adapt to changes. Mistakes and change do not normally trigger penalties.

