Entries Tagged "Apple"


Security Risks of Client-Side Scanning

Even before Apple made its announcement, law enforcement had shifted its battle for backdoors to client-side scanning. The idea is to leave the cryptography alone, but to eavesdrop on communications and systems before encryption or after decryption. It’s not a cryptographic backdoor, but it’s still a backdoor — and brings with it all the insecurities of a backdoor.

I’m part of a group of cryptographers that has just published a paper discussing the security risks of such a system. (It’s substantially the same group that wrote a similar paper about key escrow in 1997, and another about “exceptional access” proposals in 2015. We seem to have to do this every decade or so.) In our paper, we examine both the efficacy of such a system and its potential security failures, and conclude that it’s a really bad idea.

We had been working on the paper well before Apple’s announcement. And while we do talk about Apple’s system, our focus is really on the idea in general.

Ross Anderson wrote a blog post on the paper. (It’s always great when Ross writes something. It means I don’t have to.) So did Susan Landau. And there’s press coverage in the New York Times, the Guardian, Computer Weekly, the Financial Times, Forbes, El Pais (English translation), NRK (English translation), and — this is the best article of them all — the Register. See also this analysis of the law and politics of client-side scanning from last year.

Posted on October 15, 2021 at 9:30 AM

More on Apple’s iPhone Backdoor

In this post, I’ll collect links on Apple’s iPhone backdoor for scanning CSAM images. Previous links are here and here.

Apple says that hash collisions in its CSAM detection system were expected, and not a concern. I’m not convinced that this secondary system was originally part of the design, since it wasn’t discussed in the original specification.
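
The described mitigation is a second, independent perceptual hash run server-side before anything reaches human review. Here’s a self-contained toy of why that helps; these are stand-in hash functions, not Apple’s (NeuralHash and the server-side hash are both proprietary). A pair of inputs that collides under one short hash almost never collides under an independent one.

```python
import hashlib

def hash_a(data: bytes) -> int:
    # Stand-in for the on-device perceptual hash, truncated to 16 bits
    # so collisions are easy to find.
    return int.from_bytes(hashlib.sha256(data).digest()[:2], "big")

def hash_b(data: bytes) -> int:
    # Stand-in for the independent server-side hash.
    return int.from_bytes(hashlib.sha256(b"second-hash:" + data).digest()[:2], "big")

# Find two inputs that collide under hash_a; with only 2**16 possible
# outputs, the pigeonhole principle guarantees one within 2**16 + 1 tries.
seen: dict[int, bytes] = {}
a = b = None
for i in range(2**16 + 1):
    data = i.to_bytes(4, "big")
    h = hash_a(data)
    if h in seen:
        a, b = seen[h], data
        break
    seen[h] = data

print(hash_a(a) == hash_a(b))  # True: a collision in the first hash
print(hash_b(a) == hash_b(b))  # Almost certainly False: the second hash catches it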

Good op-ed from a group of Princeton researchers who developed a similar system:

Our system could be easily repurposed for surveillance and censorship. The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.
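
Their point is easy to make concrete. In the sketch below, with all names hypothetical, the matching logic is parameterized by a hash database and never sees what category of content it targets; swapping the database repurposes the system entirely.

```python
import hashlib
from typing import Callable, Set

def toy_phash(image: bytes) -> int:
    # Stand-in perceptual hash, for illustration only.
    return int.from_bytes(hashlib.sha256(image).digest()[:8], "big")

def make_scanner(target_hashes: Set[int]) -> Callable[[bytes], bool]:
    # The matching logic never knows what category of content
    # the database represents.
    return lambda image: toy_phash(image) in target_hashes

# Identical machinery, very different policies; only the database differs.
csam_db    = {toy_phash(b"known-abuse-image")}
protest_db = {toy_phash(b"banned-protest-meme")}

scan_for_csam    = make_scanner(csam_db)
scan_for_dissent = make_scanner(protest_db)
print(scan_for_dissent(b"banned-protest-meme"))  # True
```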

EDITED TO ADD (8/30): Good essays by Matthew Green and Alex Stamos, Ross Anderson, Edward Snowden, and Susan Landau. And also Kurt Opsahl.

EDITED TO ADD (9/6): Apple is delaying implementation of the scheme.

Posted on August 20, 2021 at 8:54 AM

Apple’s NeuralHash Algorithm Has Been Reverse-Engineered

Apple’s NeuralHash algorithm — the one it’s using for client-side scanning on the iPhone — has been reverse-engineered.

Turns out it was already in iOS 14.3, and someone noticed:

Early tests show that it can tolerate image resizing and compression, but not cropping or rotations.

We also have the first collision: two images that hash to the same value.

The next step is to generate innocuous images that NeuralHash classifies as prohibited content.
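
The methodology behind those robustness tests is easy to reproduce in spirit. The sketch below substitutes a simple 64-bit average hash for NeuralHash (the real thing is a neural network, so treat this purely as an illustration of the testing approach) and measures Hamming distance under each transformation. It assumes Pillow is installed and a local test image exists.

```python
from PIL import Image

def average_hash(img: Image.Image) -> int:
    # Classic 64-bit average hash: downscale to 8x8 grayscale, then
    # set one bit per pixel depending on whether it exceeds the mean.
    small = img.convert("L").resize((8, 8))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

orig = Image.open("photo.jpg")  # any local test image
base = average_hash(orig)
w, h = orig.size
print(hamming(base, average_hash(orig.resize((w // 2, h // 2)))))  # small: resizing survives
print(hamming(base, average_hash(orig.rotate(90))))                # large: rotation breaks it
print(hamming(base, average_hash(orig.crop((0, 0, w // 2, h)))))   # large: so does cropping
```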

This was a bad idea from the start, and Apple never seemed to consider the adversarial context of the system as a whole, rather than just the cryptography.

Posted on August 18, 2021 at 11:51 AM

Apple Adds a Backdoor to iMessage and iCloud Storage

Apple’s announcement that it’s going to start scanning photos for child abuse material is a big deal. (Here are five news stories.) I have been following the details and discussing them on several different email lists. I don’t have time right now to delve into the details, but wanted to post something.

EFF writes:

There are two main features that the company is planning to install in every Apple device. One is a scanning feature that will scan all photos as they get uploaded into iCloud Photos to see if they match a photo in the database of known child sexual abuse material (CSAM) maintained by the National Center for Missing & Exploited Children (NCMEC). The other feature scans all iMessage images sent or received by child accounts — that is, accounts designated as owned by a minor — for sexually explicit material, and if the child is young enough, notifies the parent when these images are sent or received. This feature can be turned on or off by parents.

This is pretty shocking coming from Apple, which is generally really good about privacy. It opens the door to all sorts of other surveillance: now that the system is built, it can be used to scan for all sorts of other message content. And it breaks end-to-end encryption, despite Apple’s denials:

Does this break end-to-end encryption in Messages?

No. This doesn’t change the privacy assurances of Messages, and Apple never gains access to communications as a result of this feature. Any user of Messages, including those with communication safety enabled, retains control over what is sent and to whom. If the feature is enabled for the child account, the device will evaluate images in Messages and present an intervention if the image is determined to be sexually explicit. For accounts of children age 12 and under, parents can set up parental notifications which will be sent if the child confirms and sends or views an image that has been determined to be sexually explicit. None of the communications, image evaluation, interventions, or notifications are available to Apple.

Notice Apple changing the definition of “end-to-end encryption.” No longer is the message a private communication between sender and receiver. A third party is alerted if the message meets certain criteria.
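
To make that concrete, here is a minimal, self-contained sketch of the iMessage flow Apple describes. All names and thresholds here are assumptions (the real implementation is closed and runs on-device); the point is structural: a party other than sender and receiver can be notified based on message content.

```python
from dataclasses import dataclass

@dataclass
class ChildAccount:
    age: int
    safety_enabled: bool
    parent_notifications: bool

def looks_sexually_explicit(image: bytes) -> bool:
    # Stand-in for Apple's on-device ML classifier.
    return b"explicit" in image

def receive_image(acct: ChildAccount, image: bytes) -> None:
    if acct.safety_enabled and looks_sexually_explicit(image):
        print("blur the image and warn the child")
        if acct.age <= 12 and acct.parent_notifications:
            # The content-triggered alert to a third party: the parent.
            print("notify the parent")
    print("deliver the image")

receive_image(ChildAccount(age=11, safety_enabled=True, parent_notifications=True),
              b"...explicit...")
```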

This is a security disaster. Read tweets by Matthew Green and Edward Snowden. Also this. I’ll post more when I see it.

Beware the Four Horsemen of the Information Apocalypse. They’ll scare you into accepting all sorts of insecure systems.

EDITED TO ADD: This is a really good write-up of the problems.

EDITED TO ADD: Alex Stamos comments.

An open letter to Apple criticizing the project.

A leaked Apple memo responding to the criticisms. (What are the odds that Apple did not intend this to leak?)

EDITED TO ADD: John Gruber’s excellent analysis.

EDITED TO ADD (8/11): Paul Rosenzweig wrote an excellent policy discussion.

EDITED TO ADD (8/13): Really good essay by EFF’s Kurt Opsahl. Ross Anderson did an interview with Glenn Beck. And this news article talks about dissent within Apple about this feature.

The Economist has a good take. Apple responds to criticisms. (It’s worth watching the Wall Street Journal video interview as well.)

EDITED TO ADD (8/14): Apple released a threat model.

EDITED TO ADD (8/20): Follow-on blog posts here and here.

Posted on August 10, 2021 at 6:37 AM

Apple Will Offer Onion Routing for iCloud/Safari Users

At this year’s Apple Worldwide Developer Conference, Apple announced something called “iCloud Private Relay.” That’s basically its private version of onion routing, which is what Tor does.

Private Relay is built into both the forthcoming iOS and macOS versions, but it will only work if you’re an iCloud Plus subscriber and you have it enabled from within your iCloud settings.

Once it’s enabled and you open Safari to browse, Private Relay splits up two pieces of information that — when delivered to websites together as normal — could quickly identify you. Those are your IP address (who and exactly where you are) and your DNS request (the name of the website you want to visit, which DNS resolves into a numeric address).

Once the two pieces of information are split, Private Relay encrypts your DNS request and sends both the IP address and now-encrypted DNS request to an Apple proxy server. This is the first of two stops your traffic will make before you see a website. At this point, Apple has already handed over the encryption keys to the third party running the second of the two stops, so Apple can’t see what website you’re trying to access with your encrypted DNS request. All Apple can see is your IP address.

Although it has received both your IP address and encrypted DNS request, Apple’s server doesn’t send your original IP address to the second stop. Instead, it gives you an anonymous IP address that is approximately associated with your general region or city.
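
Conceptually, the partition of information looks like the toy below. This is not Apple’s actual protocol (Private Relay runs over QUIC-based proxies, and the XOR “cipher” here is illustration only), but it shows what each hop can and cannot see.

```python
import os
from hashlib import sha256

def xor_cipher(key: bytes, msg: bytes) -> bytes:
    # Toy symmetric cipher: XOR against a hash-derived keystream.
    stream = sha256(key).digest() * (len(msg) // 32 + 1)
    return bytes(a ^ b for a, b in zip(msg, stream))

HOP2_KEY = os.urandom(16)  # the client has this key; hop 1 (Apple) never does

def client(ip: str, dns_query: bytes):
    # Encrypt the DNS request so that only hop 2 can read it.
    return hop1_apple(ip, xor_cipher(HOP2_KEY, dns_query))

def hop1_apple(client_ip: str, enc_query: bytes):
    # Sees who you are (your IP) but not where you're going.
    coarse_ip = client_ip.rsplit(".", 1)[0] + ".0"  # region-level stand-in
    return hop2_partner(coarse_ip, enc_query)

def hop2_partner(coarse_ip: str, enc_query: bytes):
    # Sees where you're going, but only a coarse, anonymized IP.
    return coarse_ip, xor_cipher(HOP2_KEY, enc_query)

print(client("203.0.113.7", b"example.com"))  # ('203.0.113.0', b'example.com')
```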

Not available in China, of course — and also Belarus, Colombia, Egypt, Kazakhstan, Saudi Arabia, South Africa, Turkmenistan, Uganda, and the Philippines.

Posted on June 22, 2021 at 6:54 AM

