Schneier on Security
A blog covering security and security technology.
February 15, 2013
Guessing Smart Phone PINs by Monitoring the Accelerometer
"Practicality of Accelerometer Side Channels on Smartphones," by Adam J. Aviv. Benjamin Sapp, Matt Blaze, and Jonathan M. Smith.
Abstract: Modern smartphones are equipped with a plethora of sensors that enable a wide range of interactions, but some of these sensors can be employed as a side channel to surreptitiously learn about user input. In this paper, we show that the accelerometer sensor can also be employed as a high-bandwidth side channel; particularly, we demonstrate how to use the accelerometer sensor to learn user tap and gesture-based input as required to unlock smartphones using a PIN/password or Android's graphical password pattern. Using data collected from a diverse group of 24 users in controlled (while sitting) and uncontrolled (while walking) settings, we develop sample rate independent features for accelerometer readings based on signal processing and polynomial fitting techniques. In controlled settings, our prediction model can on average classify the PIN entered 43% of the time and pattern 73% of the time within 5 attempts when selecting from a test set of 50 PINs and 50 patterns. In uncontrolled settings, while users are walking, our model can still classify 20% of the PINs and 40% of the patterns within 5 attempts. We additionally explore the possibility of constructing an accelerometer-reading-to-input dictionary and find that such dictionaries would be greatly challenged by movement-noise and cross-user training.
Posted on February 15, 2013 at 6:48 AM
Wouldn't this be easily countered by randomizing the location of numbers on the virtual keypad?
Hardly ... if a malicious app is allowed to access the accelerometer, it would be easy to also gain access to the display-controlling routines (at least by social-engineering the user into accepting some rather extensive permissions) and reconstruct the layout of a "randomized keypad."
If the app is malicious enough to access "display-controlling routines," why not capture the tapped locations directly rather than depending on accelerometers?
Perhaps the more interesting question is what else can you decipher from such side channels. Can it function as an effective keylogger for example?
The unlock PIN requires physical access. There are plenty of things one might enter on a phone that don't.
No point securing the unlock PIN screen e.g. shutting down all sensors at unlock if the useful stuff to steal is not the unlock PIN itself.
This is why I'm afraid of camera functionality that works without having to unlock the device, or anything that does screen captures. I'm no guru, just paranoid. I'm sure someone is researching that too. All you'd have to do is keep capturing the screen as soon as the unlock keypad is activated.
Guys, you cannot simply take a screenshot of another app's screen on an Android device. There isn't even a permission apps can request to do it.
Unless you social engineer your victim to root the phone or connect it to the SDK via USB. But if you succeed in doing that, you don't need to bother about the accelerometer, anyway.
So, yes, a randomized keypad will probably thwart accelerometer monitoring attacks.
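As a minimal sketch of what that countermeasure could look like: shuffle the digits (and the auxiliary keys) into random grid positions each time the unlock screen appears, so a tap-position trace no longer maps to fixed digits. The function name and 4x3 layout here are illustrative assumptions, not any platform's actual API.

```python
import random

def randomized_keypad_layout(seed=None):
    """Return a 4x3 PIN-pad layout with digits 0-9 in random positions.

    The two leftover cells hold the usual 'del' and 'ok' keys. Because the
    mapping from screen position to digit changes on every unlock, an
    accelerometer trace of tap locations no longer reveals the PIN directly.
    """
    rng = random.Random(seed)
    digits = list("0123456789")
    rng.shuffle(digits)
    keys = digits + ["del", "ok"]
    rng.shuffle(keys)  # also shuffle the non-digit keys into the grid
    return [keys[row * 3:row * 3 + 3] for row in range(4)]

for row in randomized_keypad_layout():
    print(row)
```

Of course, as the comment above notes, this only helps if the attacker cannot also observe or reconstruct the layout.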
A randomized keypad, unfortunately, would also thwart users sufficiently that most people who turned on such a feature would probably hit the unlock attempt limit, and lock or wipe their phone (depending on the config), within a few weeks.
@vwm - Hold down Power + Down Volume for a couple of seconds & Android will capture & save a screenshot. That even includes the unlock screen; I just tested it on my Galaxy S II. The functionality is there so there is a potential for malware to exploit it.
I had always thought that Skype claimed to encrypt all its traffic, including its instant messaging system. I have used Nirsoft's SkypeLogView program to access and print out some Skype instant message sessions. I did those sessions on my office computer. When I logged into the same Skype account on my home computer and ran SkypeLogView, the same instant message session records showed up, although I had not used that computer for those IM sessions.
Does this mean that Skype instant messages are sent in cleartext through Skype servers or, even scarier, through Skype's peer-to-peer network?
And here I was thinking that there really was a serious IM program that automatically encrypted its IM traffic and that with SkypeLogView the IM sessions could be easily printed out!
Any sensor input is better than none, and improves the accuracy of the password guess.
Think how good NASA got with remote sensing of distant moons, etc.: a few photons, or small changes in magnetic or gravitational fields, reveal size, composition, ... Same principles: filter out the noise and analyze the signal to "fill in the blanks."
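As a toy illustration of that filter-then-analyze idea: a simple moving-average filter pulls a periodic "tap" signal out of noisy samples. This is synthetic data and a deliberately basic filter; the paper itself uses more sophisticated signal processing and polynomial fitting.

```python
import math
import random

def moving_average(samples, window=5):
    """Smooth a 1-D signal by averaging each sample with its neighbors."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

# Synthetic signal: a slow sine "gesture" buried in Gaussian sensor noise.
random.seed(0)
clean = [math.sin(2 * math.pi * i / 20) for i in range(100)]
noisy = [s + random.gauss(0, 0.5) for s in clean]
smoothed = moving_average(noisy, window=7)

# The smoothed trace tracks the underlying signal more closely than the raw one.
err_raw = sum((n - c) ** 2 for n, c in zip(noisy, clean))
err_smooth = sum((s - c) ** 2 for s, c in zip(smoothed, clean))
print(err_smooth < err_raw)  # -> True
```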
This just goes back to the age-old principle: you must trust the machine you entrust your shared secret to. In terms of smartphones, I think this is going to mean locking the phone down during user verification (which, if nothing else, notifies malicious apps that someone has logged in, due to the downtime).
We want our smartphone to replace a whole bevy of old tools. The old tools had isolation; no surprise the smartphone is in desperate need of it.
Meanwhile, I wonder how hard it is to guess N% of PINs with this technique, as opposed to, say, displaying a fake login and trying to trap N% of users that way.
@JohnJ, yes, Android is able to capture screenshots (introduced with "Ice Cream Sandwich"). I've heard there are also some apps pre-installed by some telcos for that purpose.
But any app that you can install with normal user privileges will have a hard time taking shots of other apps' screens.
There might be some way to hack around the restrictions. But that requires actually hacking the device. And once you succeed with that, you can probably do any kind of nasty stuff -- without bothering with the accelerometer.
On the other hand, any app can request use of the accelerometer. No hacking required for that. Just add some more-or-less-cool gesture-recognition feature, and it's not even suspicious.
A better question: can a compromised smartphone's accelerometer be used to monitor keystrokes on an actual PC keyboard, i.e., by monitoring the vibrations and algorithmically determining unique keystrokes based on frequency analysis? The phone would need to be on a desk near the keyboard, of course.
Or alternately, using the smartphone's audio input to do the same.
Is anyone aware of any research in this area, or of the efficacy of proof-of-concepts?
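One building block such an attack would rely on is picking out the dominant frequency of a vibration burst. A hedged sketch of that step, using a naive discrete Fourier transform on synthetic data (purely illustrative; real keystroke recovery needs far more than one frequency peak):

```python
import cmath
import math

def dominant_frequency(samples, sample_rate):
    """Return the frequency (Hz) of the largest DFT peak (naive O(n^2) DFT)."""
    n = len(samples)
    best_bin, best_mag = 0, 0.0
    for k in range(1, n // 2):  # skip DC; positive frequencies only
        coeff = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        mag = abs(coeff)
        if mag > best_mag:
            best_bin, best_mag = k, mag
    return best_bin * sample_rate / n

# Synthetic 40 Hz vibration sampled at 400 Hz for one second.
rate = 400
signal = [math.sin(2 * math.pi * 40 * t / rate) for t in range(rate)]
print(dominant_frequency(signal, rate))  # -> 40.0
```

In practice one would compare such spectral fingerprints (plus timing) across keys, which is a much harder classification problem than this sketch suggests.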
@NickP - very interesting, thank you for these links.
Are you aware that, within the Skype client, you can pull up every conversation with a user by clicking on the clock at the top of the Skype window? So, Skype probably automatically pulls every conversation you've had from their servers into a plaintext file on your computer, though it's probably encrypted on the wire.
Anyway, Skype isn't exactly secure.
It would help if we could kill apps. Whenever I try to kill my apps on my Android phone, within 10 minutes over a dozen are back up and running again. It's like the days of Windows crapware all over again. I'm really starting to hate my Android phone.
If you need to install a malicious application on the phone to enable this method, then this attack's value is only academic. If you can monitor the sensors remotely, or without the need for an external application, then you'll have my attention.
"If you can monitor the sensors remotely, or without the need for an external application, then you'll have my attention."
Whilst it would certainly get my attention, that is in the main not the problem; it's how it happens which is the real problem.
The problem starts with the cognitive disconnect these various methods get around.
If I'd said to you, when Apple first put accelerometers into their phones, that it was a security risk because you could read what was being typed in, you would probably have looked at me as though I'd spent a little too much time in the sun...
Now it's been fairly well demonstrated to some parts of the security industry those parts accept it as a given, hence your comment.
But what of others?
They are still in that cognitive-disconnect state between what they think they know and the actuality of what can be done.
Thus when they load a game or whatever they think it's OK to let it have access to the accelerometers because they cannot see any risk involved with doing so.
Likewise, most programmers out there will not see an issue with access to them from their program either; hence their program provides transparent access to another application, etc.
Eventually you realise that such viewpoints are held even by OS designers, and that whole computers can become transparent to data leakage through them. A practical example of such a transparency issue is Matt Blaze and his students' "Keyboard JitterBug," enabling keystroke information to be covertly leaked via network packets ( http://www.crypto.com/papers/... ).
Which in effect does exactly what you are worried about.
Say we want to find out more about a user, but we cannot hack his phone...
We could also analyze the brightness of the screen to find out which app the user is using (patterns in how brightness/color changes), how long the login screen is up (how many characters are entered), etc.
So we don't need direct line of sight, or a hack, to deduce information from it.
Schneier.com is a personal website. Opinions expressed are not necessarily those of BT.