
When the Security Researchers Come Knocking, Don’t Shoot the Messenger

Written by Eric Byres | Nov 9, 2018 2:46:44 AM

Our own Jonathan Butts and Billy Rios were interviewed this month on the CBS Morning News about their research showing that medical devices like pacemakers and insulin pumps can be hacked by… basically anybody.  These devices all contain embedded controllers, but unlike most modern computer technologies, they haven’t been designed with security in mind.

“We’ve yet to find a device that we’ve looked at that we haven’t been able to hack”, said Jonathan.

Billy also speaks to the irreversible nature of medical equipment exploits, noting that when bad guys take advantage of a flaw, it's not just a matter of issuing a new credit card or changing a password. Victims of these kinds of attacks can end up dead.

You can see the full interview here:

http://www.cbsnews.com/video/how-medical-devices-like-pacemakers-insulin-pumps-can-be-hacked/

The Washington Post did a story on the same subject, featuring Billy and Jonathan back in October.

Poor security design is clearly widespread throughout the medical device industry. As readers of our blog know, devices with embedded controllers are found in electric power, oil & gas, manufacturing, aerospace, defense, and a host of other critical infrastructure sectors. And many of those devices have had serious security vulnerabilities exposed in the past decade. But what makes this story concerning is that the medical industry seems especially behind in its approach to vulnerability management.

Billy and Jonathan uncovered the vulnerabilities associated with a Medtronic pacemaker way back in January last year. They then disclosed their findings in a detailed report to the vendor. Unfortunately, Medtronic denied that action was necessary and did nothing to address the problem or warn users. It took a live, very public demonstration at Black Hat USA 2018 to capture the attention of the FDA and the vendor.

That isn’t the way responsible vulnerability disclosure is supposed to work. When researchers discover a vulnerability and privately share it with the vendor (and/or appropriate government agencies), the vendor needs to take that vulnerability seriously. That way the users of its products get a chance to patch before the dark side of the cyber world starts to exploit the weakness. Requiring researchers to broadcast the news to the world to get action is simply terrible security practice.

As a former CTO of a large industrial device manufacturer, I have faced my share of researchers bringing news of vulnerabilities in my company’s products. Some of the vulnerabilities proved to be very serious, while others were simply misunderstandings of how the product would be deployed in the field. Regardless, we took every vulnerability report seriously, immediately engaging the researchers so we could learn as much as possible about their testing techniques and findings. Sometimes, when we thought a researcher was onto a particularly serious or complex problem, we flew them into our development center so we could start addressing the issues as quickly and completely as possible.

The bottom line is that device manufacturers need to start seeing security researchers as partners, not annoyances. When a researcher finds a vulnerability, they are basically doing free QA testing that the quality and security teams should have done before the product ever shipped. It’s time that companies like Medtronic started working with security researchers, not fighting them. Instead, we should all be fighting the bad guys together.  It is the only way our critical systems will become more secure.
