I recently wrote about the new ability to disable Touch ID login on iPhones. This is important because of a quirk in current US law that protects people’s passcodes from forced disclosure in a way it does not protect actions, such as being forced to place a thumb on a fingerprint reader.
There’s another, more significant, change: iOS now requires a passcode before the phone will establish trust with another device.
In the current system, when you connect your phone to a computer, you’re prompted with the question “Trust this computer?” and you can tap yes or no. Now you also have to enter your passcode. That means that if the police have an unlocked phone, they can scroll through it looking for things, but they can’t download all of its contents onto another computer without also knowing the passcode.
This might be particularly consequential during border searches. The “border search” exception, which allows Customs and Border Protection to search anything going into the country, is a contentious issue when applied to electronics. It is somewhat (but not completely) settled law, but the fact that the U.S. government can, without any cause at all (not even “reasonable articulable suspicion,” let alone “probable cause”), copy all the contents of my devices when I reenter the country sows deep discomfort in me and many others. The only legal limitation appears to be a promise not to use this information to connect to remote services. The new iOS feature means that a Customs officer can browse through a device—a time-limited exercise—but not download its full contents.
Posted on September 15, 2017 at 6:28 AM •
Turns out that all the major voice assistants—Siri, Google Now, Samsung S Voice, Huawei HiVoice, Cortana, and Alexa—respond to audio at frequencies the human ear can’t hear. Hackers can hijack those systems with commands that are inaudible to their owners.
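The underlying trick is amplitude modulation: a voice-band command is used as the envelope of an ultrasonic carrier, and nonlinearity in the microphone hardware demodulates it back into the audible band. A minimal sketch of that modulation step, with illustrative frequencies (the carrier and sample rate here are assumptions, not taken from the research):

```python
import math

SAMPLE_RATE = 192_000   # a high sample rate is needed to represent an ultrasonic carrier
CARRIER_HZ = 30_000     # above the ~20 kHz limit of human hearing
BASEBAND_HZ = 400       # stand-in tone for a voice-band command signal

def am_ultrasound(duration_s=0.01, depth=1.0):
    """Amplitude-modulate a baseband tone onto an ultrasonic carrier.

    Nonlinearity in the target microphone recovers the envelope in the
    audible band, so the assistant "hears" what a human cannot.
    """
    n = int(SAMPLE_RATE * duration_s)
    samples = []
    for i in range(n):
        t = i / SAMPLE_RATE
        envelope = 1.0 + depth * math.sin(2 * math.pi * BASEBAND_HZ * t)
        carrier = math.sin(2 * math.pi * CARRIER_HZ * t)
        samples.append(0.5 * envelope * carrier)  # scaled to stay within [-1, 1]
    return samples
```

Played through a speaker with sufficient ultrasonic response, the humans nearby hear nothing, while the assistant’s microphone recovers the 400 Hz envelope.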
Posted on September 13, 2017 at 6:03 AM •
Andrew “bunnie” Huang and Edward Snowden have designed a hardware device that attaches to an iPhone and monitors it for malicious surveillance activities, even in instances where the phone’s operating system has been compromised. They call it an Introspection Engine, and their use model is a journalist who is concerned about government surveillance:
Our introspection engine is designed with the following goals in mind:
- Completely open source and user-inspectable (“You don’t have to trust us”)
- Introspection operations are performed by an execution domain completely separated from the phone’s CPU (“don’t rely on those with impaired judgment to fairly judge their state”)
- Proper operation of introspection system can be field-verified (guard against “evil maid” attacks and hardware failures)
- Difficult to trigger a false positive (users ignore or disable security alerts when there are too many positives)
- Difficult to induce a false negative, even with signed firmware updates (“don’t trust the system vendor”—state-level adversaries with full cooperation of system vendors should not be able to craft signed firmware updates that spoof or bypass the introspection engine)
- As much as possible, the introspection system should be passive and difficult to detect by the phone’s operating system (prevent black-listing/targeting of users based on introspection engine signatures)
- Simple, intuitive user interface requiring no specialized knowledge to interpret or operate (avoid user error leading to false negatives; “journalists shouldn’t have to be cryptographers to be safe”)
- Final solution should be usable on a daily basis, with minimal impact on workflow (avoid forcing field reporters into the choice between their personal security and being an effective journalist)
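The core check behind several of these goals can be reduced to a simple rule: alert when the radios are active even though the user believes they are off, while tolerating stray noise to keep false positives rare. A hypothetical sketch of that logic (the class and field names here are illustrative, not from the actual design):

```python
from dataclasses import dataclass

@dataclass
class IntrospectionMonitor:
    """Toy model of the alerting rule: an independent watcher flags
    radio bus activity that contradicts the user's declared state."""
    radios_expected_off: bool = False
    activity_events: int = 0
    threshold: int = 3  # require repeated events so false positives stay rare

    def observe(self, bus_active: bool) -> bool:
        """Record one observation of the radio bus; return True to alert."""
        if self.radios_expected_off and bus_active:
            self.activity_events += 1
        return self.activity_events >= self.threshold
```

The point of the threshold is the fourth goal above: a monitor that cries wolf gets disabled, so a real implementation must trade detection latency for a low false-positive rate.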
This looks like fantastic work, and they have a working prototype.
Of course, this does nothing to stop all the legitimate surveillance that happens over a cell phone: location tracking, records of who you talk to, and so on.
Posted on September 11, 2017 at 6:12 AM •
A new feature in Apple’s new iPhone operating system—iOS 11—will allow users to quickly disable Touch ID.
A new setting, designed to automate emergency services calls, lets iPhone users tap the power button five times in quick succession to call 911. This doesn’t automatically dial emergency services by default, but it brings up the option to do so, and it also temporarily disables Touch ID until you enter a passcode.
This is useful in situations where the police cannot compel you to divulge your passcode but can compel you to press your finger on the reader.
Posted on August 21, 2017 at 6:57 AM •
There’s interesting research on using a set of “master” digital fingerprints to fool biometric readers. The work is theoretical at the moment, but they might be able to open about two-thirds of iPhones with these master prints.
Definitely something to keep watching.
Research paper (behind a paywall).
EDITED TO ADD (6/13): The research paper is online.
Posted on May 24, 2017 at 6:44 AM •
The US Senate just approved Signal for staff use. Signal is a secure messaging app with no backdoor, and no large corporate owner who can be pressured to install a backdoor.
Susan Landau comments.
Maybe I’m being optimistic, but I think we just won the Crypto War. A very important part of the US government is prioritizing security over surveillance.
Posted on May 17, 2017 at 2:45 PM •
Interesting research: “A Study of MAC Address Randomization in Mobile Devices and When it Fails”:
Abstract: Media Access Control (MAC) address randomization is a privacy technique whereby mobile devices rotate through random hardware addresses in order to prevent observers from singling out their traffic or physical location from other nearby devices. Adoption of this technology, however, has been sporadic and varied across device manufacturers. In this paper, we present the first wide-scale study of MAC address randomization in the wild, including a detailed breakdown of different randomization techniques by operating system, manufacturer, and model of device. We then identify multiple flaws in these implementations which can be exploited to defeat randomization as performed by existing devices. First, we show that devices commonly make improper use of randomization by sending wireless frames with the true, global address when they should be using a randomized address. We move on to extend the passive identification techniques of Vanhoef et al. to effectively defeat randomization in 96% of Android phones. Finally, we show a method that can be used to track 100% of devices using randomization, regardless of manufacturer, by exploiting a previously unknown flaw in the way existing wireless chipsets handle low-level control frames.
Basically, iOS and Android phones are not very good at randomizing their MAC addresses. And an attack exploiting a flaw in how their chipsets handle low-level control frames can track devices regardless of randomization.
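One of the paper’s findings—devices sending frames with their true, global address when a randomized one is expected—is easy to spot, because randomized MACs set the locally administered bit (0x02) of the first octet while globally unique, manufacturer-assigned addresses leave it clear. A minimal sketch of that check (the function name and example addresses are illustrative):

```python
def is_randomized_mac(mac: str) -> bool:
    """Return True if the MAC address has the locally administered bit set.

    Randomized addresses set bit 0x02 of the first octet; a device that
    leaks its true, globally unique address leaves the bit clear.
    """
    first_octet = int(mac.split(":")[0], 16)
    return bool(first_octet & 0x02)
```

A passive observer who sees a frame where this check returns False has likely captured the device’s permanent hardware address, defeating the randomization entirely.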
Posted on March 20, 2017 at 5:05 AM •
There’s a Kickstarter for a sticker that you can stick on a glove and then register with a biometric access system like an iPhone. It’s an interesting security trade-off: swapping something you are (the biometric) with something you have (the glove).
Posted on November 14, 2016 at 9:26 AM •
Remember the San Bernardino killer’s iPhone, and how the FBI maintained that they couldn’t get the encryption key without Apple providing them with a universal backdoor? Many of us computer-security experts said that they were wrong, and there were several possible techniques they could use. One of them was manually removing the flash chip from the phone, extracting the memory, and then running a brute-force attack without worrying about the phone deleting the key.
The FBI said it was impossible. We all said they were wrong. Now, Sergei Skorobogatov has proved them wrong. Here’s his paper:
Abstract: This paper is a short summary of a real world mirroring attack on the Apple iPhone 5c passcode retry counter under iOS 9. This was achieved by desoldering the NAND Flash chip of a sample phone in order to physically access its connection to the SoC and partially reverse engineering its proprietary bus protocol. The process does not require any expensive and sophisticated equipment. All needed parts are low cost and were obtained from local electronics distributors. By using the described and successful hardware mirroring process it was possible to bypass the limit on passcode retry attempts. This is the first public demonstration of the working prototype and the real hardware mirroring process for iPhone 5c. Although the process can be improved, it is still a successful proof-of-concept project. Knowledge of the possibility of mirroring will definitely help in designing systems with better protection. Also some reliability issues related to the NAND memory allocation in iPhone 5c are revealed. Some future research directions are outlined in this paper and several possible countermeasures are suggested. We show that claims that iPhone 5c NAND mirroring was infeasible were ill-advised.
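The reason mirroring defeats the retry limit is that the counter lives in the NAND flash itself: clone the chip before guessing, and swap the clone back in whenever the counter nears the limit. A toy model of that loop (the class, the six-attempt limit, and the four-digit passcode space are illustrative, not Skorobogatov’s actual setup):

```python
class FlashWithRetryCounter:
    """Toy model of the target: the passcode retry counter is stored in
    NAND flash, so restoring a flash image rewinds it."""
    def __init__(self, passcode: str, limit: int = 6):
        self.passcode = passcode
        self.limit = limit
        self.attempts = 0

    def snapshot(self) -> int:
        return self.attempts            # clone the NAND contents

    def restore(self, image: int) -> None:
        self.attempts = image           # swap the mirrored chip back in

    def try_passcode(self, guess: str) -> bool:
        if self.attempts >= self.limit:
            raise RuntimeError("device wiped")
        self.attempts += 1
        return guess == self.passcode

def brute_force(flash: FlashWithRetryCounter) -> str:
    image = flash.snapshot()            # mirror before the first guess
    for guess in (f"{i:04d}" for i in range(10_000)):
        if flash.try_passcode(guess):
            return guess
        if flash.attempts == flash.limit:
            flash.restore(image)        # reset the counter and keep going
    raise ValueError("passcode not found")
```

With the counter neutralized, the attack is bounded only by how fast the flash can be rewritten, which is why a four- or six-digit passcode falls in hours or days rather than never.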
Susan Landau explains why this is important:
The moral of the story? It’s not, as the FBI has been requesting, a bill to make it easier to access encrypted communications, as in the proposed revised Burr-Feinstein bill. Such “solutions” would make us less secure, not more so. Instead we need to increase law enforcement’s capabilities to handle encrypted communications and devices. This will also take more funding as well as redirection of efforts. Increased security of our devices and simultaneous increased capabilities of law enforcement are the only sensible approach to a world where securing the bits, whether of health data, financial information, or private emails, has become of paramount importance.
Or: The FBI needs computer-security expertise, not backdoors.
Patrick Ball writes about the dangers of backdoors.
EDITED TO ADD (9/23): Good article from the Economist.
Posted on September 15, 2016 at 8:54 AM •