Acoustic Attack Against Accelerometers

Interesting acoustic attack against the MEMS accelerometers in devices like FitBits.

Millions of accelerometers reside inside smartphones, automobiles, medical devices, anti-theft devices, drones, IoT devices, and many other industrial and consumer applications. Our work investigates how analog acoustic injection attacks can damage the digital integrity of the capacitive MEMS accelerometer. Spoofing such sensors with intentional acoustic interference enables an out-of-spec pathway for attackers to deliver chosen digital values to microprocessors and embedded systems that blindly trust the unvalidated integrity of sensor outputs. Our contributions include (1) modeling the physics of malicious acoustic interference on MEMS accelerometers, (2) discovering the circuit-level security flaws that cause the vulnerabilities by measuring acoustic injection attacks on MEMS accelerometers as well as systems that employ these sensors, and (3) two software-only defenses that mitigate many of the risks to the integrity of MEMS accelerometer outputs.

This is not that big a deal with things like FitBits, but as IoT devices get more autonomous—and start making decisions and then putting them into effect automatically—these vulnerabilities will become critical.
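To make the "blindly trust the unvalidated integrity of sensor outputs" point concrete, here is a minimal sketch of the kind of sanity check a consumer of accelerometer data could run. It is not one of the paper's defenses; the sampling rate, full-scale range, and thresholds are all invented for illustration.

```python
# Illustrative sketch only -- not the paper's defenses. Flag windows of samples
# that sit at the sensor's rails (saturation) or whose energy is dominated by
# oscillation far above anything the host device could plausibly experience.
import numpy as np

FS = 1000.0              # assumed sampling rate, Hz
RAIL = 16.0              # assumed full-scale range, in g
MAX_PLAUSIBLE_HZ = 50.0  # assumed mechanical bandwidth of the host device

def window_is_suspect(samples):
    x = np.asarray(samples, dtype=float)
    # 1. Saturation: acoustic resonance can pin the output near the rails.
    if np.mean(np.abs(x) > 0.95 * RAIL) > 0.1:
        return True
    # 2. Spectral check: most of the energy lies above the plausible band.
    spectrum = np.abs(np.fft.rfft(x - x.mean())) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / FS)
    return spectrum[freqs > MAX_PLAUSIBLE_HZ].sum() > 0.9 * spectrum.sum()

t = np.arange(0, 1.0, 1.0 / FS)
print(window_is_suspect(2.0 * np.sin(2 * np.pi * 400 * t)))  # True: 400 Hz tone
print(window_is_suspect(0.5 * np.sin(2 * np.pi * 2 * t)))    # False: gentle 2 Hz motion
```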

Academic paper.

Posted on April 4, 2017 at 6:23 AM

Comments

Brian April 4, 2017 8:04 AM

I can see two obvious malicious uses for this technology:

1) Controlling Autonomous vehicles (self-driving cars, drones, etc.)
2) Breaching air gaps.

Andy April 4, 2017 8:08 AM

Preface: Did not read paper.

How do you design your system poorly enough to allow a sensor to inject code into your runtime? Are we operating at too high a level of abstraction these days, or has the market's need to fill millions of dev jobs just inevitably led to lowered standards? Maybe there's some other explanation?

Tino April 4, 2017 8:15 AM

Andy, as the excerpt posted here states:
“Spoofing such sensors with intentional acoustic interference enables an out-of-spec pathway for attackers to deliver chosen digital values to microprocessors and embedded systems that blindly trust the unvalidated integrity of sensor outputs.”

It’s not about code injection.

But I wonder what my MacBook does when it thinks it’s constantly falling…

Clive Robinson April 4, 2017 8:59 AM

Nice to see the academics are finally getting around to "Active EmSec Fault Injection Attacks" again (the last was a couple of grads over at the UK Cambridge Computer labs who "lit up" a TRNG and took its output from one in four billion down to one in a hundred).

This sort of thing has been known about since the 1980s to my knowledge, when I started independently investigating them, and as I'm an engineer not a genius I can only assume other engineers etc likewise did their own investigations both before and after that.

As @Bruce notes,

    but as IoT devices get more autonomous — and start making decisions and then putting them into effect automatically — these vulnerabilities will become critical.

But he forgot to mention "and very much easier with improving technology".

Just a little reminder… Who remembers BadBIOS and how the person got "bad mouthed" and ridiculed? Then the question of using sound came up; again it was ridiculed by many, but not by engineers who knew that not only could you use sound as a data path, but who had actually done so. Further, some of us mentioned the problem with the BIOS and loading "driver code". Again a certain degree of ridicule, or ignoring, as it did not echo in the chamber.

But finally some academics do the sound channel with a couple of laptops in a corridor, probably spending more time on writing the paper than the code. And then, shazam, the next thing you know "advertising malware" is using the technique…

As for the BIOS loading code at boot time, it was documented from the 1980s but people could not be bothered to read it… Except for some engineers at a rapacious Chinese laptop manufacturer that used it to embed malware into people's PCs for extra profit. When Lenovo got caught they were all apologetic etc, but finally people got to take on board the BIOS issue…

So just a word to the wise: "Wake up and smell the coffee" on this one, as it's very probably grown legs this time, and will come running into your life and give you a rude awakening if you carry on sleepwalking about EmSec Active Fault Injection Attacks… After all, there is nothing new about these MEMS in this respect: "All Transducers" have similar problems (see the history of TEMPEST to see some of it).

Then ask yourself a question: do you really want to be a piece of road kill wrapped around a street sign etc?

Thinking about Factory April 4, 2017 9:27 AM

All the comments here are about trivial consumer devices, and possibly cars.

What worries me about this flaw is the use of these devices in manufacturing and industrial devices. Accelerometers are used for determining orientation (which way is up/level), as well as movement and vibration sensing.

At the least, this flaw could cause a DoS on physical manufacturing devices – possibly even to the level of physical destruction. Imagine large moving parts with sensors to make sure they don't collide – and the sensors are lying to you.
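One partial mitigation, sketched very roughly below with invented names and limits, is to never let a single sensor gate a safety decision: cross-check two independently mounted accelerometers and fail safe when they disagree.

```python
# Very rough sketch (names and limits invented): don't let one accelerometer
# alone gate a safety decision; cross-check two independently mounted sensors
# and stop motion when they disagree.
DISAGREEMENT_LIMIT_G = 0.5   # assumed tolerance between the two sensors

def safe_to_move(sensor_a_g, sensor_b_g):
    """Each argument is one sensor's acceleration magnitude, in g."""
    # If the readings diverge, assume one of them is lying and fail safe.
    return abs(sensor_a_g - sensor_b_g) <= DISAGREEMENT_LIMIT_G

print(safe_to_move(1.0, 1.1))  # True: agreement within tolerance
print(safe_to_move(1.0, 9.0))  # False: one sensor is probably being spoofed
```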

The paper (best I can tell) leaves out the actual frequencies that cause the sensor to mis-read. Makes me wonder if loud music from employees in loud shop floors could inadvertently cause problems. Would be scary to find a random riff from a Led Zeppelin hit could cause physical destruction.

Mike Barno April 4, 2017 12:49 PM

@ Thinking about Factory,

Would be scary to find a random riff from a Led Zeppelin hit could cause physical destruction.

Those riffs have been causing fried amps, blown speakers, damaged eardrums, and rioting audiences for decades. Acoustic attack, indeed.

A Mac User April 4, 2017 1:07 PM

OK, let's see … driving happily along a mountain road with the stereo blaring very loud, and suddenly the car's motion sensors generate invalid data which the traction-control system interprets as requiring the brakes to be fully applied to both wheels on the "drop-off side" of the road (yes, some systems can do this), and the next minute involves testing the car's roll-survival and maybe tree-crunching capabilities. Of course, depending upon the kind of "music" that was being played, some may just consider all that to be "thinning out the herd!" #;-)

Ian Mason April 4, 2017 1:28 PM

I can think of one safety critical use of accelerometers that should give anybody pause for thought if there is a possibility of tampering with them: Permissive Action Links (PALs) in nuclear warheads.

PALs are devices that stop nuclear weapons being armed and detonated. One of the techniques used to inhibit firing is to check that the weapon is seeing or has seen the environmental conditions that ought to be seen in actual deployment before firing. For bombs and missile warheads this includes sensing a suitable period of free fall using an accelerometer.
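For anyone unfamiliar with the idea, here is what "sensing a suitable period of free fall" roughly amounts to. A purely illustrative sketch, obviously nothing like a real PAL algorithm; the rate, threshold, and duration are invented.

```python
# Illustrative only -- not a real PAL algorithm. "Has seen free fall" reduces
# to: the sensed acceleration magnitude stayed near 0 g for a sustained stretch.
import math

FS = 100.0               # assumed sample rate, Hz
FREEFALL_G = 0.2         # magnitude below this counts as free fall
REQUIRED_SECONDS = 3.0   # how long the condition must hold

def saw_free_fall(samples_xyz):
    """samples_xyz: iterable of (ax, ay, az) tuples, in g."""
    needed = int(REQUIRED_SECONDS * FS)
    run = 0
    for ax, ay, az in samples_xyz:
        if math.sqrt(ax * ax + ay * ay + az * az) < FREEFALL_G:
            run += 1
            if run >= needed:
                return True
        else:
            run = 0   # interrupted: start counting again
    return False

print(saw_free_fall([(0.0, 0.0, 0.01)] * 400))  # True: 4 s of ~0 g
print(saw_free_fall([(0.0, 0.0, 1.0)] * 400))   # False: sitting on the ground
```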

PINBACK: “Okay bomb, arm yourself.”
Bomb #20: “Armed.”
… later …
Bomb #20: “Let there be light.”

Tatütata April 4, 2017 3:44 PM

The academics used the same disclosure strategy as for software vulnerabilities, i.e., contacting the suppliers and providing them a deadline before going public.

I dunno whether this is the best way to go about this. The pipeline for producing, distributing and assembling vulnerable hardware devices is much longer than the software life cycle, and delaying public disclosure only increases the number of affected systems, with little possibility of upgrading in the field.

Furthermore, I can very well imagine a scenario where one of the savvier suppliers would rush to file patent applications for the more obvious correctives and thus gain an edge over their competitors. (Unless the university folks did this themselves in the first place, or at least discussed in their paper some possible workarounds in order to pre-empt such a strategy…) If the fault is immediately made public, then everyone works under the same sense of urgency.

The demo reminds me of microphonic behaviour in local oscillators, or frequency pushing from audio floating on the DC supply. In an earlier life I tested my designs by banging on the housing before the unit went for a ride on the vibration table. I wonder whether these “smart” phones are even specified for any level of vibration.

This vulnerability also reminds me of the lousy common mode and harmonic handling of electrical utility meters mentioned by Clive (?) a few weeks back.

Meir Maor April 4, 2017 11:28 PM

This kind of attack can work well to amplify other weaknesses.
Many years ago I worked with a mission-critical military system which could not have any downtime, ever. It was built with redundancy upon redundancy: many different physical sites, each with redundant servers, redundant power, redundant networking, the works.
One sunny afternoon a single connected sensor started spewing out only zeros, which were not a valid value. The sensor output was encrypted and its integrity was cryptographically verified. When it reached the server, the server performed a simple division by zero and went down. Another server took the load and went down, a remote server took over and fell, and quickly all redundant servers crashed with the same error. In this case it was a malfunction, not an attack, but injecting data into sensors can be very serious.
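The lesson fits in a few lines. Something like this sketch (value range and names invented) lets a server reject the bad reading instead of crashing on it:

```python
# Illustrative sketch (value range and names invented): validate sensor
# readings before computing with them, so one bad value degrades gracefully
# instead of taking every redundant server down with the same exception.
VALID_RANGE = (1, 10_000)   # hypothetical: zero is not a valid reading

def process_reading(raw, last_good):
    if not (VALID_RANGE[0] <= raw <= VALID_RANGE[1]):
        return last_good, False      # reject, fall back to the last good value
    return 1.0 / raw, True           # the division that crashed the servers

print(process_reading(0, last_good=0.005))    # (0.005, False): rejected, no crash
print(process_reading(200, last_good=0.005))  # (0.005, True)
```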

Andy April 5, 2017 9:34 AM

@Tino, so more of a data upload sideband? That's nearly pointless. Like 20,000 FPS video-based audio extraction (off the crisp packet, if you remember).

Most obvious use would be, “Here, let me install an app on your phone and point this ultrasound machine at it. I’ll let you scan the app for malware first…”

Gert van den Berg April 23, 2017 4:19 AM

If it allows for spoofing output values of accelerometers, it might be usable to trigger airbags in current vehicles… No IoT devices required…
