
Guessing Smart Phone PINs by Monitoring the Accelerometer

“Practicality of Accelerometer Side Channels on Smartphones,” by Adam J. Aviv, Benjamin Sapp, Matt Blaze, and Jonathan M. Smith.

Abstract: Modern smartphones are equipped with a plethora of sensors that enable a wide range of interactions, but some of these sensors can be employed as a side channel to surreptitiously learn about user input. In this paper, we show that the accelerometer sensor can also be employed as a high-bandwidth side channel; particularly, we demonstrate how to use the accelerometer sensor to learn user tap and gesture-based input as required to unlock smartphones using a PIN/password or Android’s graphical password pattern. Using data collected from a diverse group of 24 users in controlled (while sitting) and uncontrolled (while walking) settings, we develop sample rate independent features for accelerometer readings based on signal processing and polynomial fitting techniques. In controlled settings, our prediction model can on average classify the PIN entered 43% of the time and pattern 73% of the time within 5 attempts when selecting from a test set of 50 PINs and 50 patterns. In uncontrolled settings, while users are walking, our model can still classify 20% of the PINs and 40% of the patterns within 5 attempts. We additionally explore the possibility of constructing an accelerometer-reading-to-input dictionary and find that such dictionaries would be greatly challenged by movement-noise and cross-user training.
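The paper's "sample rate independent" features can be sketched roughly as follows. This is not the authors' actual feature pipeline, just a minimal illustration of the polynomial-fitting idea: normalizing time to a fixed interval before fitting makes the coefficients comparable across devices with different accelerometer sampling rates.

```python
import numpy as np

def poly_features(samples, degree=3):
    """Fit a polynomial to one axis of an accelerometer segment.

    Normalizing time to [0, 1] makes the coefficients independent of
    the device's sampling rate, which is the property the paper's
    features are designed to have.
    """
    t = np.linspace(0.0, 1.0, num=len(samples))
    return np.polyfit(t, samples, deg=degree)

def segment_features(xyz, degree=3):
    """Concatenate per-axis coefficients into one feature vector."""
    return np.concatenate([poly_features(axis, degree) for axis in xyz])

# Hypothetical tap segment: 3 axes (x, y, z), 50 samples each.
trace = np.random.default_rng(0).normal(size=(3, 50))
features = segment_features(trace)
print(features.shape)  # (12,): 3 axes x (degree + 1) coefficients
```

A classifier would then be trained on these fixed-length vectors, one per tap or gesture segment, regardless of the phone's sampling rate.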

Article.

Posted on February 15, 2013 at 6:48 AM

Using the iWatch for Authentication

Usability engineer Bruce Tognazzini talks about how an iWatch—which seems to be either a mythical Apple product or one actually in development—can make authentication easier.

Passcodes. The watch can and should, for most of us, eliminate passcodes altogether on iPhones, and Macs and, if Apple’s smart, PCs: As long as my watch is in range, let me in! That, to me, would be the single-most compelling feature a smartwatch could offer: If the watch did nothing but release me from having to enter my passcode/password 10 to 20 times a day, I would buy it. If the watch would just free me from having to enter pass codes, I would buy it even if it couldn’t tell the right time! I would happily strap it to my opposite wrist! This one is a must. Yes, Apple is working on adding fingerprint reading for iDevices, and that’s just wonderful, but it will still take time and trouble for the device to get an accurate read from the user. I want in now! Instantly! Let me in, let me in, let me in!

Apple must ensure, however, that, if you remove the watch, you must reestablish authenticity. (Reauthorizing would be an excellent place for biometrics.) Otherwise, we’ll have a spate of violent “watchjackings” replacing the non-violent iPhone-grabs going on today.
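The policy Tognazzini describes boils down to a small state machine, which can be sketched as follows. Everything here is hypothetical, since no such Apple API exists; the point is only that proximity grants access, while removal of the watch forces reauthentication.

```python
class WatchUnlock:
    """Sketch of the proximity-unlock policy described above.

    The device unlocks while a paired watch is in range, but once the
    watch comes off the wrist the wearer must reauthenticate (e.g.
    biometrically) before proximity alone grants access again. This is
    what defeats the "watchjacking" attack: a stolen watch is useless.
    """

    def __init__(self):
        # True only if the wearer has been verified since the watch
        # was last strapped on.
        self.authenticated = False

    def watch_put_on_and_verified(self):
        # Biometric or passcode check performed at strap-on time.
        self.authenticated = True

    def watch_removed(self):
        # Removal invalidates the session.
        self.authenticated = False

    def device_should_unlock(self, watch_in_range: bool) -> bool:
        return watch_in_range and self.authenticated

w = WatchUnlock()
w.watch_put_on_and_verified()
print(w.device_should_unlock(watch_in_range=True))   # True: strapped on and verified
w.watch_removed()
print(w.device_should_unlock(watch_in_range=True))   # False: must reauthenticate
```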

Posted on February 14, 2013 at 11:42 AM

Anti-Cheating Security in Casinos

Long article.

With over a thousand cameras operating 24/7, the monitoring room creates tremendous amounts of data every day, most of which goes unseen. Six technicians watch about 40 monitors, but all the feeds are saved for later analysis. One day, as with OCR scanning, it might be possible to search all that data for suspicious activity. Say, a baccarat player who leaves his seat, disappears for a few minutes, and is replaced with another player who hits an impressive winning streak. An alert human might spot the collusion, but even better, video analytics might flag the scene for further review. The valuable trend in surveillance, Whiting says, is toward this data-driven analysis (even when much of the job still involves old-fashioned gumshoe work). “It’s the data,” he says. “And cameras now are data. So it’s all data. It’s just learning to understand that data is important.”

Posted on February 14, 2013 at 6:32 AM

New al Qaeda Encryption Tool

There’s not a lot of information—and quite a lot of hyperbole—in this article:

With the release of the Asrar Al Dardashah plugin, GIMF promised “secure correspondence” based on the Pidgin chat client, which supports multiple chat platforms, including Yahoo Messenger, Windows Live Messenger, AOL Instant Messenger, Google Talk and Jabber/XMPP.

“The Asrar Al Dardashah plugin supports most of the languages in the world through the use of Unicode encoding, including Arabic, English, Urdu, Pashto, Bengali and Indonesian,” stated the announcement, which was posted on several top online Jihadist forums and GIMF’s official website.

“The plugin is easy and quick to use, and, like its counterpart, the Asrar Al Mujahideen program, it uses the technical algorithm RSA for asymmetric encryption, which is based [on] a pair of interrelated keys: a public key allocated for encrypting and a private key used for decrypting,” GIMF’s statement said. “To use the plugin, both of the communicating parties should install and activate the plugin and produce and import the Asrar Al Mujahideen private key into the Asrar Al Dardashah plugin, which automatically produces the corresponding public key of 2048-bit-length for use. It offers a level of encryption which has not been cracked or broken and can be relied upon entirely to protect the confidentiality of sensitive communication[s].”
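What the announcement describes, a public key allocated for encrypting and a private key for decrypting, is simply textbook RSA. A deliberately insecure toy illustration with tiny primes (real deployments use 2048-bit moduli, as the plugin claims, plus a padding scheme such as OAEP):

```python
# Toy RSA for exposition only: the primes are tiny and there is no
# padding, so this must never be used to protect anything.
p, q = 61, 53
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)        # Euler's totient of n
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent: d * e == 1 (mod phi)

message = 42
ciphertext = pow(message, e, n)     # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)   # decrypt with the private key (d, n)
print(recovered)  # 42
```

Note that the security claim in the announcement ("has not been cracked or broken") conflates the algorithm with its implementation; RSA's math being unbroken says nothing about key management or software quality.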

Posted on February 13, 2013 at 6:13 AM

Massive Police Shootout in Cleveland Despite Lack of Criminals

This is an amazing story. I urge you to read the whole thing, but here are the basics:

A November car chase ended in a “full blown-out” firefight, with glass and bullets flying, according to Cleveland police officers who described for investigators the chaotic scene at the end of the deadly 25-minute pursuit.

But when the smoky haze—caused by rapid fire of nearly 140 bullets in less than 30 seconds—dissipated, it soon became clear that more than a dozen officers had been firing at one another across a middle school parking lot in East Cleveland.

At the end of the scene, both unarmed—and presumably innocent—people in the car were dead.

There’s a lot that can be said here, but I don’t feel qualified to say it. There’s a whole body of research on decision making under stress—police, firefighters, soldiers—and how easy it is to get caught up in the heat of the moment. I have read one book on that subject, Sources of Power, but that was years ago.

What interests me right now is how this whole situation was colored by what “society” is talking about and afraid of, which became the preconceptions the officers brought to the event. School shootings are in the news, so as soon as the car drove into a school parking lot, the police assumed the worst. Firefights with dangerous criminals are what we see on TV, so that’s not unexpected, either. When you read the story, it’s clear how many of the elements that the officers believed—police cars being rammed, for example—are right out of television violence. This would have turned out very differently if the officers had assumed that, as is almost always true, the two people in the car were just two people in a car.

I’m also curious as to how much technology contributed to this. Reports on the radio brought more and more officers to the scene, and misinformation was broadcast over the radio.

Again, I’m not really qualified to write about any of this. But it’s what I’ve been thinking about.

Posted on February 12, 2013 at 12:55 PM

Our New Regimes of Trust

Society runs on trust. Over the millennia, we’ve developed a variety of mechanisms to induce trustworthy behavior in society. These range from a sense of guilt when we cheat, to societal disapproval when we lie, to laws that arrest fraudsters, to door locks and burglar alarms that keep thieves out of our homes. They’re complicated and interrelated, but they tend to keep society humming along.

The information age is transforming our society. We’re shifting from evolved social systems to deliberately created socio-technical systems. Instead of having conversations in offices, we use Facebook. Instead of meeting friends, we IM. We shop online. We let various companies and governments collect comprehensive dossiers on our movements, our friendships, and our interests. We let others censor what we see and read. I could go on for pages.

None of this is news to anyone. But what’s important, and much harder to predict, are the social changes resulting from these technological changes. With the rapid proliferation of fixed and mobile computing devices and in-the-cloud processing, new ways of socialization have emerged. Facebook friends are fundamentally different than in-person friends. IM conversations are fundamentally different than voice conversations. Twitter has no pre-Internet analog. More social changes are coming. These social changes affect trust, and trust affects everything.

This isn’t just academic. There has always been a balance in society between the honest and the dishonest, and technology continually upsets that balance. Online banking results in new types of cyberfraud. Facebook posts become evidence in employment and legal disputes. Cell phone location tracking can be used to round up political dissidents. Random blogs and websites become trusted sources, abetting propaganda. Crime has changed: easier impersonation, action at a greater distance, automation, and so on. The more our nation’s infrastructure relies on cyberspace, the more vulnerable we are to cyberattack.

Think of this as a “security gap”: the time lag between when the bad guys figure out how to exploit a new technology and when the good guys figure out how to restore society’s balance.

Critically, the security gap is larger when there’s more technology, and especially in times of rapid technological change. More importantly, it’s larger in times of rapid social change due to the increased use of technology. This is our world today. We don’t know *how* the proliferation of networked, mobile devices will affect the systems we have in place to enable trust, but we do know it *will* affect them.

Trust is as old as our species. It’s something we do naturally, and informally. We don’t trust doctors because we’ve vetted their credentials, but because they sound learned. We don’t trust politicians because we’ve analyzed their positions, but because we generally agree with their political philosophy—or the buzzwords they use. We trust many things because our friends trust them. It’s the same with corporations, government organizations, strangers on the street: this thing that’s critical to society’s smooth functioning occurs largely through intuition and relationship. Unfortunately, these traditional and low-tech mechanisms are increasingly failing us. Understanding how trust is being, and will be, affected—probably not by predicting, but rather by recognizing effects as quickly as possible—and then deliberately creating mechanisms to induce trustworthiness and enable trust, is the only thing that will enable society to adapt.

If there’s anything I’ve learned in all my years working at the intersection of security and technology, it’s that technology is rarely more than a small piece of the solution. People are always the issue and we need to think as broadly as possible about solutions. So while laws are important, they don’t work in isolation. Much of our security comes from the informal mechanisms we’ve evolved over the millennia: systems of morals and reputation.

There will exist new regimes of trust in the information age. They simply must evolve, or society will suffer unpredictably. We have already begun fleshing out such regimes, albeit in an ad hoc manner. It’s time for us to deliberately think about how trust works in the information age, and use legal, social, and technological tools to enable this trust. We might get it right by accident, but it’ll be a long and ugly iterative process getting there if we do.

This essay was originally published in The SciTech Lawyer, Winter/Spring 2013.

Posted on February 12, 2013 at 6:53 AM
