Entries Tagged "sensors"


Tracking People By Smart Phone Accelerometers

Interesting research: “We Can Track You If You Take the Metro: Tracking Metro Riders Using Accelerometers on Smartphones”:

Abstract: Motion sensors (e.g., accelerometers) on smartphones have been demonstrated to be a powerful side channel for attackers to spy on users’ inputs on the touchscreen. In this paper, we reveal another motion accelerometer-based attack which is particularly serious: when a person takes the metro, a malicious application on her smartphone can easily use accelerometer readings to trace her. We first propose a basic attack that can automatically extract metro-related data from a large amount of mixed accelerometer readings, and then use an ensemble interval classifier built from supervised learning to infer the riding intervals of the user. While this attack is very effective, the supervised learning part requires the attacker to collect labeled training data for each station interval, which is a significant amount of effort. To improve the efficiency of our attack, we further propose a semi-supervised learning approach, which only requires the attacker to collect labeled data for a very small number of station intervals with obvious characteristics. We conduct real experiments on a metro line in a major city. The results show that the inference accuracy could reach 89% and 92% if the user takes the metro for 4 and 6 stations, respectively.
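The authors haven’t released code, but the basic idea is straightforward to sketch: pull the long, smooth riding intervals out of the accelerometer trace, compute a few features per interval, and hand them to an ensemble classifier. The Python sketch below only illustrates that idea under my own assumptions; the segmentation threshold, the feature set, and the choice of scikit-learn’s RandomForestClassifier are not taken from the paper.

```python
# Hypothetical sketch of accelerometer-based metro-interval classification.
# Not the paper's code: thresholds, features, and the random-forest ensemble
# are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def riding_segments(accel, fs=50, min_len_s=60, quiet_thresh=0.4):
    """Split a 3-axis accelerometer trace (N x 3, in m/s^2) into candidate
    riding intervals: long stretches of low, smooth motion energy, as on a
    moving train between stations."""
    mag = np.abs(np.linalg.norm(accel, axis=1) - 9.81)        # deviation from gravity
    smooth = np.convolve(mag, np.ones(fs) / fs, mode="same")  # ~1 s moving average
    quiet = smooth < quiet_thresh
    segments, start = [], None
    for i, q in enumerate(quiet):
        if q and start is None:
            start = i
        elif not q and start is not None:
            if i - start >= min_len_s * fs:
                segments.append((start, i))
            start = None
    return segments

def interval_features(accel, seg, fs=50):
    """A few crude per-interval features: duration, mean and variance of the
    magnitude, and dominant vibration frequency (illustrative only)."""
    a, b = seg
    mag = np.linalg.norm(accel[a:b], axis=1)
    spectrum = np.abs(np.fft.rfft(mag - mag.mean()))
    peak_hz = np.argmax(spectrum) * fs / len(mag)
    return [(b - a) / fs, mag.mean(), mag.var(), peak_hz]

# Training on labeled station-to-station intervals (the supervised variant);
# the paper's semi-supervised refinement starts from only a few labels.
# X = [interval_features(trace, s) for trace, s in labeled_segments]
# clf = RandomForestClassifier(n_estimators=100).fit(X, y_station_pairs)
```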

The Internet of Things is the Internet of sensors. I’m sure all kinds of surveillance are possible from all kinds of sensing inputs.

Posted on June 8, 2015 at 6:09 AM

Metal Detectors at Sports Stadiums

Fans attending Major League Baseball games are being greeted in a new way this year: with metal detectors at the ballparks. Touted as a counterterrorism measure, they’re nothing of the sort. They’re pure security theater: They look good without doing anything to make us safer. We’re stuck with them because of a combination of buck passing, CYA thinking, and fear.

As a security measure, the new devices are laughable. The ballpark metal detectors are much more lax than the ones at an airport checkpoint. They aren’t very sensitive—people with phones and keys in their pockets are sailing through—and there are no X-ray machines. Bags get the same cursory search they’ve gotten for years. And fans wanting to avoid the detectors can opt for a “light pat-down search” instead.

There’s no evidence that this new measure makes anyone safer. A halfway competent ticketholder would have no trouble sneaking a gun into the stadium. For that matter, a bomb exploded at a crowded checkpoint would be no less deadly than one exploded in the stands. These measures will, at best, be effective at stopping the random baseball fan who’s carrying a gun or knife into the stadium. That may be a good idea, but unless there’s been a recent spate of fan shootings and stabbings at baseball games—and there hasn’t—this is a whole lot of time and money being spent to combat an imaginary threat.

But imaginary threats are the only ones baseball executives have to stop this season; there’s been no specific terrorist threat or actual intelligence to be concerned about. MLB executives forced this change on ballparks based on unspecified discussions with the Department of Homeland Security after the Boston Marathon bombing in 2013. Because, you know, that was also a sporting event.

This system of vague consultations and equally vague threats ensures that no one organization can be seen as responsible for the change. MLB can claim that the league and teams “work closely” with DHS. DHS can claim that it was MLB’s initiative. And both can safely relax because if something happens, at least they did something.

It’s an attitude I’ve seen before: “Something must be done. This is something. Therefore, we must do it.” Never mind if the something makes any sense or not.

In reality, this is CYA security, and it’s pervasive in post-9/11 America. It no longer matters if a security measure makes sense, if it’s cost-effective or if it mitigates any actual threats. All that matters is that you took the threat seriously, so if something happens you won’t be blamed for inaction. It’s security, all right—security for the careers of those in charge.

I’m not saying that these officials care only about their jobs and not at all about preventing terrorism, only that their priorities are skewed. They imagine vague threats, and come up with correspondingly vague security measures intended to address them. They experience none of the costs. They’re not the ones who have to deal with the long lines and confusion at the gates. They’re not the ones who have to arrive early to avoid the messes the new policies have caused around the league. And if fans spend more money at the concession stands because they’ve arrived an hour early and have had the food and drinks they tried to bring along confiscated, so much the better, from the team owners’ point of view.

I can hear the objections to this as I write. You don’t know these measures won’t be effective! What if something happens? Don’t we have to do everything possible to protect ourselves against terrorism?

That’s worst-case thinking, and it’s dangerous. It leads to bad decisions, bad design and bad security. A better approach is to realistically assess the threats, judge security measures on their effectiveness and take their costs into account. And the result of that calm, rational look will be the realization that there will always be places where we pack ourselves densely together, and that we should spend less time trying to secure those places and more time finding terrorist plots before they can be carried out.

So far, fans have been exasperated but mostly accepting of these new security measures. And this is precisely the problem—most of us don’t care all that much. Our options are to put up with these measures, or stay home. Going to a baseball game is not a political act, and metal detectors aren’t worth a boycott. But there’s an undercurrent of fear as well. If it’s in the name of security, we’ll accept it. As long as our leaders are scared of the terrorists, they’re going to continue the security theater. And we’re similarly going to accept whatever measures are forced upon us in the name of security. We’re going to accept the National Security Agency’s surveillance of every American, airport security procedures that make no sense and metal detectors at baseball and football stadiums. We’re going to continue to waste money overreacting to irrational fears.

We no longer need the terrorists. We’re now so good at terrorizing ourselves.

This essay previously appeared in the Washington Post.

Posted on April 15, 2015 at 6:58 AM

Automatic Scanning for Highly Stressed Individuals

This borders on ridiculous:

Chinese scientists are developing a mini-camera to scan crowds for highly stressed individuals, offering law-enforcement officers a potential tool to spot would-be suicide bombers.

[…]

“They all looked and behaved as ordinary people but their level of mental stress must have been extremely high before they launched their attacks. Our technology can detect such people, so law enforcement officers can take precautions and prevent these tragedies,” Chen said.

Officers looking through the device at a crowd would see a mental “stress bar” above each person’s head, and the suspects highlighted with a red face.

The researchers said they were able to use the technology to distinguish high blood-oxygen levels produced by stress from those produced by ordinary physical exertion.

I’m not optimistic about this technology.

Posted on August 13, 2014 at 6:20 AM

Building Retro Reflectors

A group of researchers has reverse-engineered the NSA’s retro reflectors and recreated them using software-defined radio (SDR):

An SDR Ossmann designed and built, called HackRF, was a key part of his work in reconstructing the NSA’s retro-reflector systems. Such systems come in two parts – a plantable “reflector” bug and a remote SDR-based receiver.

One reflector, which the NSA called Ragemaster, can be fixed to a computer’s monitor cable to pick up on-screen images. Another, Surlyspawn, sits on the keyboard cable and harvests keystrokes. After a lot of trial and error, Ossmann found these bugs can be remarkably simple devices – little more than a tiny transistor and a 2-centimetre-long wire acting as an antenna.

Getting the information from the bugs is where SDRs come in. Ossmann found that using the radio to emit a high-power radar signal causes a reflector to wirelessly transmit the data from keystrokes, say, to an attacker. The set-up is akin to a large-scale RFID-chip system. Since the signals returned from the reflectors are noisy and often scattered across different bands, SDR’s versatility is handy, says Robin Heydon at Cambridge Silicon Radio in the UK. “Software-defined radio is flexibly programmable and can tune in to anything,” he says.
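The receive side is conceptually just a demodulator: illuminate the bug, capture the reflected carrier with the SDR, and strip off whatever the bug has modulated onto it. The fragment below sketches that idea for an already-captured IQ recording; the sample rate, carrier offset, bit rate, and the assumption of simple on-off keying are mine, not details of Ossmann’s tools.

```python
# Rough illustration of recovering an on-off-keyed backscatter signal from
# IQ samples captured with an SDR. Sample rate, carrier offset, bit rate,
# and the modulation itself are assumptions for illustration only.
import numpy as np

FS = 2_000_000        # sample rate of the capture (Hz), assumed
F_OFFSET = 250_000    # reflector return offset from capture centre (Hz), assumed
BIT_RATE = 1_000      # assumed data rate of the bug (bits/s)

def recover_bits(iq: np.ndarray) -> np.ndarray:
    """Mix the assumed return frequency down to baseband, average over each
    bit period as a crude low-pass, and threshold to get a bit stream."""
    t = np.arange(len(iq)) / FS
    baseband = iq * np.exp(-2j * np.pi * F_OFFSET * t)   # frequency shift
    envelope = np.abs(baseband)                          # amplitude of the return
    sps = FS // BIT_RATE                                 # samples per bit
    n_bits = len(envelope) // sps
    per_bit = envelope[: n_bits * sps].reshape(n_bits, sps).mean(axis=1)
    return (per_bit > per_bit.mean()).astype(np.uint8)   # crude threshold

# iq = np.fromfile("capture.cf32", dtype=np.complex64)   # hypothetical capture file
# bits = recover_bits(iq)
```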

The NSA devices are LOUDAUTO, SURLYSPAWN, TAWDRYYARD, and RAGEMASTER. Here are videos that talk about how TAWDRYYARD and LOUDAUTO work.

This is important research. While the information we have about these sorts of tools is largely from the NSA, it is fanciful to assume that they are the only intelligence agency using this technology. And it’s equally fanciful to assume that criminals won’t be using this technology soon, even without Snowden’s documents. Understanding and building these tools is the first step to protecting ourselves from them.

Posted on June 23, 2014 at 6:51 AM

Heartwave Biometric

Here’s a new biometric I know nothing about:

The wristband relies on authenticating identity by matching the overall shape of the user’s heartwave (captured via an electrocardiogram sensor). Unlike other biotech authentication methods—like fingerprint scanning and iris-/facial-recognition tech—the system doesn’t require the user to authenticate every time they want to unlock something. Because it’s a wearable device, the system sustains authentication so long as the wearer keeps the wristband on.
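The vendor hasn’t published its matching algorithm, but a plausible way to do this kind of shape matching is to enroll an averaged heartbeat template and keep the session alive while fresh beats stay sufficiently correlated with it. The sketch below illustrates only that generic template-matching idea, with a made-up similarity threshold, not the wristband’s actual method.

```python
# Hypothetical sketch of shape-based heartbeat matching; the wristband's real
# algorithm is not public, and this is only a generic template-matching idea.
import numpy as np

def normalize(beat: np.ndarray) -> np.ndarray:
    """Zero-mean, unit-norm version of one fixed-length heartbeat window
    (e.g., samples centred on an R-peak)."""
    beat = beat - beat.mean()
    return beat / (np.linalg.norm(beat) + 1e-9)

def enroll(beats: list[np.ndarray]) -> np.ndarray:
    """Average several normalized beats from the wearer into a template."""
    return normalize(np.mean([normalize(b) for b in beats], axis=0))

def still_authenticated(template: np.ndarray, new_beat: np.ndarray,
                        threshold: float = 0.9) -> bool:
    """Correlate a fresh beat with the enrolled template; authentication
    persists as long as the score stays above the (assumed) threshold."""
    score = float(np.dot(template, normalize(new_beat)))
    return score >= threshold
```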

EDITED TO ADD (12/13): A more technical explanation.

Posted on December 5, 2013 at 1:16 PM

iPhone Sensor Surveillance

The new iPhone has a motion sensor chip, and that opens up new opportunities for surveillance:

The M7 coprocessors introduce functionality that some may instinctively identify as “creepy.” Even Apple’s own description hints at eerie omniscience: “M7 knows when you’re walking, running, or even driving…” While it’s quietly implemented within iOS, it’s not secret for third party apps (which require an opt-in through pop-up notification, and management through the phone’s Privacy settings). But as we know, most users blindly accept these permissions.

It all comes down to a question of agency in tracking our physical bodies.

The fact that my Fitbit tracks activity without matching it up with all my other data sources, like GPS location or my calendar, is comforting. These data silos can sometimes be frustrating when I want to query across my QS datasets, but the built-in divisions between data about my body—and data about the rest of my digital life—leave room for my intentional inquiry and interpretation.

Posted on October 16, 2013 at 7:33 AM
