It’s amazing that this is even possible: “SonarSnoop: Active Acoustic Side-Channel Attacks”:
Abstract: We report the first active acoustic side-channel attack. Speakers are used to emit human inaudible acoustic signals and the echo is recorded via microphones, turning the acoustic system of a smart phone into a sonar system. The echo signal can be used to profile user interaction with the device. For example, a victim’s finger movements can be inferred to steal Android phone unlock patterns. In our empirical study, the number of candidate unlock patterns that an attacker must try to authenticate herself to a Samsung S4 Android phone can be reduced by up to 70% using this novel acoustic side-channel. Our approach can be easily applied to other application scenarios and device types. Overall, our work highlights a new family of security threats.
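The core trick is classic sonar: emit a known, inaudible signal from the speaker, then locate its echo in the microphone recording; as a finger moves near the screen, the echo's delay and strength change from ping to ping. Here is a minimal sketch of that delay-estimation step. This is my own illustration, not the paper's implementation; the chirp parameters and the cross-correlation detector are assumptions:

```python
import numpy as np

FS = 48_000  # sample rate in Hz (assumed; typical for phone audio hardware)

def chirp(f0=18_000.0, f1=20_000.0, dur=0.01, fs=FS):
    """Near-ultrasonic linear chirp used as the sonar ping
    (18-20 kHz is inaudible to most adults)."""
    t = np.arange(int(fs * dur)) / fs
    return np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t ** 2 / (2 * dur)))

def echo_delay(recording, ping, fs=FS):
    """Estimate the round-trip delay (seconds) of the strongest echo
    by cross-correlating the known ping against the mic recording."""
    corr = np.correlate(recording, ping, mode="valid")
    return int(np.argmax(np.abs(corr))) / fs
```

Tracking how that delay drifts across successive pings is what lets the attack profile finger movement and prune the space of candidate unlock patterns.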
Posted on September 5, 2018 at 6:05 AM
Google is tracking you, even if you turn off tracking:
Google says that turning off Location History will prevent the company from remembering where you’ve been. Google’s support page on the subject states: “You can turn off Location History at any time. With Location History off, the places you go are no longer stored.”
That isn’t true. Even with Location History paused, some Google apps automatically store time-stamped location data without asking.
For example, Google stores a snapshot of where you are when you merely open its Maps app. Automatic daily weather updates on Android phones pinpoint roughly where you are. And some searches that have nothing to do with location, like “chocolate chip cookies,” or “kids science kits,” pinpoint your precise latitude and longitude - accurate to the square foot - and save it to your Google account.
On the one hand, this isn’t surprising to technologists. Lots of applications use location data. On the other hand, it’s very surprising—and counterintuitive—to everyone else. And that’s why this is a problem.
I don’t think we should pick on Google too much, though. Google is a symptom of the bigger problem: surveillance capitalism in general. As long as surveillance is the business model of the Internet, things like this are inevitable.
Posted on August 14, 2018 at 6:22 AM
Interesting research in using traffic analysis to learn things about encrypted traffic. It’s hard to know how critical these vulnerabilities are. They’re very hard to close without wasting a huge amount of bandwidth.
The active attacks are more interesting.
EDITED TO ADD (7/3): More information.
I have been thinking about this, and now believe the attacks are more serious than I previously wrote.
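To make the passive threat concrete: encryption hides payloads but not packet lengths or timing, and those alone can fingerprint what a connection carries. A deliberately simple sketch (bucketed length histograms compared by histogram intersection; real attacks use far stronger classifiers, and the labels here are hypothetical):

```python
from collections import Counter

def length_profile(packet_lengths, bucket=100):
    """Normalized histogram of packet lengths, bucketed to `bucket` bytes.
    Lengths survive TLS/VPN encryption, so this works on ciphertext."""
    total = len(packet_lengths)
    counts = Counter(n // bucket for n in packet_lengths)
    return {b: c / total for b, c in counts.items()}

def overlap(p, q):
    """Histogram intersection: 1.0 means identical length distributions."""
    return sum(min(p.get(b, 0.0), q.get(b, 0.0)) for b in p.keys() | q.keys())

def guess_label(trace, known_profiles):
    """Return the label whose stored profile best matches the observed trace."""
    p = length_profile(trace)
    return max(known_profiles, key=lambda label: overlap(p, known_profiles[label]))
```

Defeating this means padding every flow toward the same length distribution, which is exactly the bandwidth cost mentioned above.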
Posted on July 2, 2018 at 9:35 AM
Forbes reports that the Israeli company Cellebrite can probably unlock all iPhone models:
Cellebrite, a Petah Tikva, Israel-based vendor that’s become the U.S. government’s company of choice when it comes to unlocking mobile devices, is this month telling customers its engineers currently have the ability to get around the security of devices running iOS 11. That includes the iPhone X, a model that Forbes has learned was successfully raided for data by the Department for Homeland Security back in November 2017, most likely with Cellebrite technology.
It also appears the feds have already tried out Cellebrite tech on the most recent Apple handset, the iPhone X. That’s according to a warrant unearthed by Forbes in Michigan, marking the first known government inspection of the bleeding edge smartphone in a criminal investigation. The warrant detailed a probe into Abdulmajid Saidi, a suspect in an arms trafficking case, whose iPhone X was taken from him as he was about to leave America for Beirut, Lebanon, on November 20. The device was sent to a Cellebrite specialist at the DHS Homeland Security Investigations Grand Rapids labs and the data extracted on December 5.
This story is based on some excellent reporting, but leaves a lot of questions unanswered. We don’t know exactly what was extracted from any of the phones. Was it metadata or data, and what kind of metadata or data was it?
The story I hear is that Cellebrite hires ex-Apple engineers and moves them to countries where Apple can’t prosecute them under the DMCA or its equivalents. There’s also a credible rumor that Cellebrite’s tools only defeat the mechanism that limits the number of password attempts; they do not allow engineers to move the encrypted data off the phone and run an offline password cracker. If this is true, then strong passwords are still secure.
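The distinction matters because of simple arithmetic: if only the attempt limiter is defeated, every guess still has to run through the phone’s slow on-device key derivation, so the size of the keyspace determines everything. A back-of-the-envelope calculation (the roughly 80 ms-per-guess derivation time is an illustrative assumption, not a measured figure):

```python
SECONDS_PER_YEAR = 3600 * 24 * 365

def worst_case_years(keyspace, guesses_per_second):
    """Worst-case exhaustive-search time, in years."""
    return keyspace / guesses_per_second / SECONDS_PER_YEAR

RATE = 12.5  # guesses/second, i.e. ~80 ms per on-device key derivation (assumed)

pin_years = worst_case_years(10 ** 6, RATE)          # 6-digit numeric PIN
password_years = worst_case_years(62 ** 10, RATE)    # 10-char alphanumeric
```

A six-digit PIN falls in under a day at that rate; a ten-character alphanumeric password takes billions of years. That is why, if the rumor is accurate, strong passwords remain secure.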
EDITED TO ADD (3/1): Another article, with more information. It looks like there’s an arms race going on between Apple and Cellebrite. At least, if Cellebrite is telling the truth—which they may or may not be.
Posted on February 27, 2018 at 5:58 AM
Edward Snowden and Nathan Freitas have created an Android app that detects when it’s being tampered with. The basic idea is to install the app on a second phone and place that phone on or near something important, like your laptop. The app can then text you—and also record audio and video—when something happens around it: when it’s moved, when the lighting changes, and so on. This gives you some protection against the “evil maid attack” against laptops.
Micah Lee has a good article about the app, including some caveats about its use and security.
Posted on January 3, 2018 at 6:17 AM
The FDA has approved a pill with an embedded sensor that can report when it is swallowed. The pill transmits information to a wearable patch, which in turn transmits information to a smartphone.
Posted on December 11, 2017 at 6:37 AM
Andrew “bunnie” Huang and Edward Snowden have designed a hardware device that attaches to an iPhone and monitors it for malicious surveillance activities, even in instances where the phone’s operating system has been compromised. They call it an Introspection Engine, and their use model is a journalist who is concerned about government surveillance:
Our introspection engine is designed with the following goals in mind:
- Completely open source and user-inspectable (“You don’t have to trust us”)
- Introspection operations are performed by an execution domain completely separated from the phone’s CPU (“don’t rely on those with impaired judgment to fairly judge their state”)
- Proper operation of introspection system can be field-verified (guard against “evil maid” attacks and hardware failures)
- Difficult to trigger a false positive (users ignore or disable security alerts when there are too many positives)
- Difficult to induce a false negative, even with signed firmware updates (“don’t trust the system vendor”—state-level adversaries with full cooperation of system vendors should not be able to craft signed firmware updates that spoof or bypass the introspection engine)
- As much as possible, the introspection system should be passive and difficult to detect by the phone’s operating system (prevent black-listing/targeting of users based on introspection engine signatures)
- Simple, intuitive user interface requiring no specialized knowledge to interpret or operate (avoid user error leading to false negatives; “journalists shouldn’t have to be cryptographers to be safe”)
- Final solution should be usable on a daily basis, with minimal impact on workflow (avoid forcing field reporters into the choice between their personal security and being an effective journalist)
This looks like fantastic work, and they have a working prototype.
Of course, this does nothing to stop all the legitimate surveillance that happens over a cell phone: location tracking, records of who you talk to, and so on.
Posted on September 11, 2017 at 6:12 AM
Researchers demonstrated a really clever hack: they hid malware in a replacement smart phone screen. The idea is that you would naively bring your smart phone in for repair, and the repair shop would install this malicious screen without your knowledge. The malware is hidden in touchscreen controller software, which is trusted by the phone.
The concern arises from research that shows how replacement screens—one put into a Huawei Nexus 6P and the other into an LG G Pad 7.0—can be used to surreptitiously log keyboard input and patterns, install malicious apps, and take pictures and e-mail them to the attacker. The booby-trapped screens also exploited operating system vulnerabilities that bypassed key security protections built into the phones. The malicious parts cost less than $10 and could easily be mass-produced. Most chilling of all, to most people, the booby-trapped parts could be indistinguishable from legitimate ones, a trait that could leave many service technicians unaware of the maliciousness. There would be no sign of tampering unless someone with a background in hardware disassembled the repaired phone and inspected it.
Academic paper. BoingBoing post.
Posted on August 28, 2017 at 6:22 AM