It has long been known that individual analog devices have their own fingerprints. Decades ago, individual radio transmitters could be identified and tracked. Now, researchers have found that the accelerometers in smartphones are unique enough to be identifiable.
The researchers focused specifically on the accelerometer, a sensor that tracks the phone’s three-dimensional movements and is essential for countless applications, including pedometers, sleep monitors, and mobile games. Their findings suggest that other sensors could leave equally unique fingerprints.
“When you manufacture the hardware, the factory cannot produce the identical thing in millions,” Roy said. “So these imperfections create fingerprints.”
Of course, these fingerprints are only visible when accelerometer data signals are analyzed in detail. Most applications do not require this level of analysis, yet the data shared with all applications — your favorite game, your pedometer — bear the mark. Should someone want to perform this analysis, they could do so.
The researchers tested more than 100 devices over the course of nine months: 80 standalone accelerometer chips used in popular smartphones, 25 Android phones and two tablets.
The accelerometer chips were deliberately sourced from a variety of manufacturers, to ensure that the fingerprints weren’t simply defects resulting from a particular production line.
The researchers could distinguish one sensor from another with 96 percent accuracy.
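The paper’s actual feature set and classifier aren’t reproduced in this post; as a minimal sketch of the idea, here is a hypothetical pipeline that turns each accelerometer trace into a small feature vector and attributes new traces to the nearest enrolled device. The synthetic per-device bias and gain stand in for the manufacturing imperfections Roy describes; all names and numbers are illustrative.

```python
import numpy as np

def fingerprint(trace):
    # Summary-statistic feature vector for one accelerometer trace.
    # The real study used richer time- and frequency-domain features;
    # these four are just illustrative.
    return np.array([
        trace.mean(),                              # sensor bias offset
        trace.std(),                               # gain/noise signature
        np.abs(np.fft.rfft(trace))[1:].argmax(),   # dominant frequency bin
        np.sqrt((trace ** 2).mean()),              # RMS energy
    ])

def enroll(traces_by_device):
    # Average the fingerprints of known traces to get one centroid per device.
    return {dev: np.mean([fingerprint(t) for t in ts], axis=0)
            for dev, ts in traces_by_device.items()}

def identify(trace, centroids):
    # Nearest-centroid match: attribute the trace to the closest device.
    f = fingerprint(trace)
    return min(centroids, key=lambda d: np.linalg.norm(centroids[d] - f))

# Hypothetical devices: each has a slightly different bias and gain,
# mimicking manufacturing imperfections in the analog sensor.
rng = np.random.default_rng(0)
def simulate(bias, gain, n=512):
    t = np.linspace(0, 4 * np.pi, n)
    return bias + gain * np.sin(t) + rng.normal(0, 0.05, n)

devices = {"A": (0.02, 1.00), "B": (-0.03, 1.05), "C": (0.05, 0.97)}
train = {d: [simulate(b, g) for _ in range(5)] for d, (b, g) in devices.items()}
centroids = enroll(train)
print(identify(simulate(0.02, 1.00), centroids))  # matches device "A"
```

The point of the sketch is that none of this requires permission beyond ordinary sensor access: any app receiving raw accelerometer samples could compute such features.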
Posted on April 30, 2014 at 1:05 PM •
Good essay on the Quantified Toilet hoax, and the difference between public surveillance and private self-surveillance.
Posted on April 30, 2014 at 8:58 AM •
This is interesting:
Touch ID takes an 88×88, 500-ppi scan of your finger and temporarily sends that data to a secure cache located near the RAM. After the data is vectorized and forwarded to the Secure Enclave (located on the top left of the A7, near the M7 processor), it is immediately discarded. The fingerprint scanner reads subdermal ridge flows (the inner layer of skin) to prevent loss of accuracy if you have micro-cuts or debris on your finger.
With iOS 7.1.1, Apple now takes multiple scans of each position you place your finger in during setup, instead of a single one, and uses algorithms to predict potential errors that could arise in the future. Touch ID was supposed to gradually improve accuracy with every scan, but if you didn’t scan well during setup, it would ruin your experience until you re-enrolled your finger. iOS 7.1.1 not only fixes that problem and increases accuracy but also greatly reduces the calculations your iPhone 5S has to make while unlocking the device, which means you should see much faster unlock times.
Posted on April 29, 2014 at 6:47 AM •
Handycipher is a new pencil-and-paper symmetric encryption algorithm. I’d bet a gazillion dollars that it’s not secure, although I haven’t done the cryptanalysis myself.
Posted on April 28, 2014 at 6:45 AM •
It’s called “Tentacles.”
As usual, you can also use this squid post to talk about the security stories in the news that I haven’t covered.
Posted on April 25, 2014 at 4:17 PM •
Interesting essay about how Google’s lack of transparency is eroding trust in the company:
The reality is that Google’s business is and has always been about mining as much data as possible to be able to present information to users. After all, it can’t display what it doesn’t know. Google Search has always been an ad-supported service, so it needs a way to sell those users to advertisers — that’s how the industry works. Its Google Now voice-based service is simply a form of Google Search, so it too serves advertisers’ needs.
In the digital world, advertisers want to know more than that 100,000 people might be interested in buying a new car. They now want to know who those people are, so they can reach them with custom messages that are more likely to be effective. They may not know you personally, but they know your digital persona — basically, you. Google needs to know about you to satisfy its advertisers’ demands.
Once you understand that, you understand why Google does what it does. That’s simply its business. Nothing is free, so if you won’t pay cash, you’ll have to pay with personal information. That business model has been around for decades; Google didn’t invent that business model, but Google did figure out how to make it work globally, pervasively, appealingly, and nearly instantaneously.
I don’t blame Google for doing that, but I blame it for being nontransparent. Putting unmarked sponsored ads in the “regular” search results section is misleading, because people have been trained by Google to see that section of the search results as neutral. They are in fact not. Once you know that, you never quite trust Google search results again. (Yes, Bing’s results are similarly tainted. But Microsoft never promised to do no evil, and most people use Google.)
Posted on April 24, 2014 at 6:45 AM •
Surveillance is getting cheaper and easier:
Two artists have revealed Conversnitch, a device they built for less than $100 that resembles a lightbulb or lamp and surreptitiously listens in on nearby conversations and posts snippets of transcribed audio to Twitter. Kyle McDonald and Brian House say they hope to raise questions about the nature of public and private spaces in an era when anything can be broadcast by ubiquitous, Internet-connected listening devices.
This is meant as an art project to raise awareness, but the technology is getting cheaper all the time.
The surveillance gadget they unveiled Wednesday is constructed from little more than a Raspberry Pi miniature computer, a microphone, an LED and a plastic flower pot. It screws into and draws power from any standard bulb socket. Then it uploads captured audio via the nearest open Wi-Fi network to Amazon’s Mechanical Turk crowdsourcing platform, where McDonald and House pay workers small fees to transcribe the audio and post lines of conversation to Conversnitch’s Twitter account.
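The artists’ actual code isn’t published in this post, but the capture-and-crowdsource step is simple to sketch. Below is a hypothetical helper that builds the parameters a Mechanical Turk transcription task would need (the names `make_hit_question`, `hit_params`, and the bucket URL are all invented for illustration); the real submission call, boto3’s `mturk.create_hit()`, is deliberately omitted so the sketch runs without AWS credentials.

```python
# Sketch of a Conversnitch-style transcription task: each captured audio
# chunk is assumed to have been uploaded to some public URL, and a
# Mechanical Turk HIT asks a worker to transcribe it. Hypothetical names
# throughout; not the artists' actual implementation.

AUDIO_URL_TEMPLATE = "https://example-bucket.s3.amazonaws.com/{chunk_id}.wav"

def make_hit_question(chunk_id):
    # Build the HTMLQuestion XML that Mechanical Turk expects for one chunk.
    audio_url = AUDIO_URL_TEMPLATE.format(chunk_id=chunk_id)
    return f"""<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <audio controls src="{audio_url}"></audio>
    <p>Transcribe any clearly audible line of conversation.</p>
    <textarea name="transcript"></textarea>
  ]]></HTMLContent>
  <FrameHeight>300</FrameHeight>
</HTMLQuestion>"""

def hit_params(chunk_id, reward_usd="0.05"):
    # Keyword arguments for boto3's mturk.create_hit(); the network call
    # itself is omitted so this runs offline.
    return {
        "Title": "Transcribe a short audio clip",
        "Description": "Listen to a few seconds of audio and type what you hear.",
        "Reward": reward_usd,
        "MaxAssignments": 1,
        "LifetimeInSeconds": 3600,
        "AssignmentDurationInSeconds": 300,
        "Question": make_hit_question(chunk_id),
    }

params = hit_params("chunk-0001")
print(params["Reward"], "chunk-0001" in params["Question"])
```

At a nickel per chunk, the economics are the striking part: continuous human transcription of a room costs a few dollars an hour.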
Consumer spy devices are now affordable by the masses. For $54, you can buy a camera hidden in a smoke detector. For $80, you can buy one hidden in an alarm clock. There are many more options.
Posted on April 23, 2014 at 2:33 PM •
Interesting research on the security of code written in different programming languages. We don’t know whether the security is a result of inherent properties of the language, or the relative skill of the typical programmers of that language.
EDITED TO ADD (5/14): Direct link to the report.
Posted on April 23, 2014 at 7:53 AM •
To repeat, Heartbleed is a common mode failure. We would not know about it were it not open source (Good). That it is open source has been shown to be no talisman against error (Sad). Because errors are statistical while exploitation is not, either errors must be stamped out (which can only result in dampening the rate of innovation and rewarding corporate bigness) or that which is relied upon must be field upgradable (Real Politik). If the device is field upgradable, then it pays to regularly exercise that upgradability both to keep in fighting trim and to make the opponent suffer from the rapidity with which you change his target.
The whole thing is worth reading.
Posted on April 22, 2014 at 7:52 AM •
Russian law gives Russia’s security service, the FSB, the authority to use SORM (“System for Operative Investigative Activities”) to collect, analyze and store all data transmitted or received on Russian networks, including calls, email, website visits and credit card transactions. SORM has been in use since 1990 and collects both metadata and content. SORM-1 collects mobile and landline telephone calls. SORM-2 collects internet traffic. SORM-3 collects from all media (including Wi-Fi and social networks) and stores data for three years. Russian law requires all internet service providers to install an FSB monitoring device (called “Punkt Upravlenia”) on their networks that allows the direct collection of traffic without the knowledge or cooperation of the service provider. The providers must pay for the device and the cost of installation.
Collection requires a court order, but these are secret and not shown to the service provider. According to data published by Russia’s Supreme Court, almost 540,000 intercepts of phone and internet traffic were authorized in 2012. While the FSB is the principal agency responsible for communications surveillance, seven other Russian security agencies can have access to SORM data on demand. SORM is routinely used against political opponents and human rights activists to monitor them and to collect information to use against them in “dirty tricks” campaigns. Russian courts have upheld the FSB’s authority to surveil political opponents even if they have committed no crime. Russia used SORM during the Olympics to monitor athletes, coaches, journalists, spectators, and the Olympic Committee, publicly explaining that this was necessary to protect against terrorism. The system was an improved version of SORM that can combine video surveillance with communications intercepts.
EDITED TO ADD (4/23): This article from World Policy Journal is excellent.
Posted on April 21, 2014 at 5:55 AM •