More on Apple’s iPhone Backdoor

In this post, I’ll collect links on Apple’s iPhone backdoor for scanning CSAM images. Previous links are here and here.

Apple says that hash collisions in its CSAM detection system were expected, and not a concern. I’m not convinced that this secondary system was originally part of the design, since it wasn’t discussed in the original specification.
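To see why collisions are unsurprising in this class of system, here is a toy sketch of a perceptual “average hash” (my illustrative stand-in, not Apple’s NeuralHash): two images that differ in almost every pixel, but share the same coarse light/dark layout, produce the same fingerprint.

```python
# Toy "average hash" (aHash) sketch -- NOT Apple's NeuralHash, just an
# illustration of the general class: perceptual hashes keep only coarse
# image structure, so visibly different images can share a fingerprint.
import numpy as np

def average_hash(img, size=8):
    """Block-average down to size x size, threshold at the mean, pack bits."""
    h, w = img.shape
    img = img[: h - h % size, : w - w % size]  # trim to a multiple of size
    blocks = img.reshape(size, img.shape[0] // size,
                         size, img.shape[1] // size).mean(axis=(1, 3))
    bits = (blocks > blocks.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

rng = np.random.default_rng(0)
# Image A: smooth 8x8 blocks blown up to 256x256 pixels.
img_a = np.kron(rng.random((8, 8)), np.ones((32, 32)))
# Image B: pixel-for-pixel a completely different texture, but with the
# same coarse light/dark layout as A -- a deliberately constructed collision.
img_b = np.where(img_a > img_a.mean(), 0.9, 0.1)
img_b = img_b + rng.normal(0.0, 0.02, img_b.shape)

print(hamming(average_hash(img_a), average_hash(img_b)))  # typically 0
```

Any hash that deliberately ignores fine detail, as perceptual hashes must in order to survive resizing and recompression, will map some visibly different images to the same value; the question is how cheaply such pairs can be constructed on purpose.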

Good op-ed from a group of Princeton researchers who developed a similar system:

Our system could be easily repurposed for surveillance and censorship. The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.

EDITED TO ADD (8/30): Good essays by Matthew Green and Alex Stamos, Ross Anderson, Edward Snowden, and Susan Landau. And also Kurt Opsahl.

EDITED TO ADD (9/6): Apple is delaying implementation of the scheme.

Posted on August 20, 2021 at 8:54 AM • 29 Comments

Comments

++Don August 20, 2021 9:23 AM

Regarding the quote from that op-ed, Apple has at least implicitly acknowledged all along that the technology could be used for other purposes. That’s why they’ve been saying, “We promise not to do that.” Whether we should take them at their word is another question entirely, of course.

Cara Valentine August 20, 2021 9:37 AM

I’ve been telling Apple for over 4 years that I’m being cyberstalked. My cyberstalkers are at root level, going through the back door. So I guess they are correct when they state it could be used for other purposes…

mexaly August 20, 2021 9:57 AM

Thanks for calling it “Apple’s Backdoor.”
Sadly, public policy battles require good sound bites to get any traction.

TimH August 20, 2021 1:08 PM

“We promise not to do that” doesn’t work when a government orders them to “do that”.

Also, the “We can’t do that” argument becomes moot.

They know this too, which suggests to me that there’s an important missing subtext.

zaxxon August 20, 2021 1:23 PM

And regarding the quote from that op-ed:
The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.

That sort of thing could also be said, for example, about WhatsApp’s so-called “end-to-end encryption”.

sven August 20, 2021 1:44 PM

Do we know why Apple decided to piss away their previously good “What happens on iPhone stays on iPhone” reputation for security and do this? Is this going to get them a single extra iPhone sale? Any chance this is Apple telling the world: “Come on, you’re not stupid! Obviously it’s not just child porn this system can be used for. We were so successful with security technically that we’ve been secretly pwned legally (an NSL), but we can’t tell you publicly that it’s not safe, so consider this system a warrant canary”?

Clive Robinson August 20, 2021 2:01 PM

@ TimH, ALL,

which suggests to me that there’s an important missing subtext.

As is oft said this side of the pond,

You don’t say!

Or

Not half mate!

Basically Apple know which side “the bread is buttered” and what side it’s going to land face down when various Governments push it off the table.

Now the $64,000 question is: are Apple just putting in the mechanism as an excuse, or are they putting in a hurdle for governments to fall at?

That is, put in just enough to meet what most would consider adequate against the “think of the children” FUD, as well as “lay it off” onto a pair of third parties.

Thereby, from Apple’s point of view, “moving the goal posts” elsewhere, onto somebody else’s turf…

Look at it this way: if two Five Eyes countries put in the same “cop-tag” for something that does not meet the “think of the children” criteria, who gets to carry the can? Not Apple; thus they may well be setting up a “poisoned chalice” for that attack.

Now consider the result now that the algorithm is outed… you can be sure that it’s going to get tested, if not reversed, in some way. This makes it highly likely that each new “cop-tag” is going to get examined by several sets of eyes who are going to look, and look hard, at it. The two obvious groups –though there are others– are,

1, Security Researchers
2, Those who traffic images

Both are going to effectively attack the system to make it fail in one way or another. Both failures would suit Apple and the rest of the tech industry, as they would in effect,

A, Show the “back door” fails.
B, Cause bad publicity in larger numbers of the public against such Government FUD.

As I’ve already indicated, if someone is a Type 2 person who traffics images, what is the easiest solution for them, considering how the cop-tags are generated?

Well, the cop-tags are generated from “known” images. So the easy way for a Type 2 person is to,

Simply go out and create a whole load of new images, rather than re-working old images.

Which means a whole load of new victims are created by this FUD.

Just let that sink in for a while, and you will realise just how dishonest those who peddle the “think of the children” line are. And if you doubt this dishonesty, look at all the harm the UK’s CEOP and earlier initiatives under ex-police officer –of dubious repute– Jim Gamble caused, as I’ve mentioned before.

Norio August 20, 2021 3:22 PM

Perhaps I’m being excessively cynical, but it seems to me that Apple can sell MORE of their hardware in China and other authoritarian regimes after this technology becomes embedded in their systems. It’ll become a big selling point they can advertise to dictators. As Mayer and Kulshrestha indicate in their op-ed:

China is Apple’s second-largest market, with probably hundreds of millions of devices. What stops the Chinese government from demanding Apple scan those devices for pro-democracy materials? Absolutely nothing, except Apple’s solemn promise. This is the same Apple that blocked Chinese citizens from apps that allow access to censored material, that acceded to China’s demand to store user data in state-owned data centers and whose chief executive infamously declared, “We follow the law wherever we do business.”

When you are an organization whose bottom line is only money, it becomes very difficult to ignore or say no to a market with over a billion potential customers.

SpaceLifeForm August 20, 2021 3:54 PM

@ sven

Note the date

hxtps://gigaom.com/2014/09/18/apples-warrant-canary-disappears-suggesting-new-patriot-act-demands/

[I think I know what is going on currently, but obviously I have no proof. As with any tool, it can be used for good or for bad. Time will tell.]

Geordie August 20, 2021 4:09 PM

I have been waiting for someone to clarify this, but it does not seem to have happened in the comments on either the original post or this one: the issues with the hashing are not Apple’s. From their technical white paper, “First, Apple receives the NeuralHashes corresponding to known CSAM from the above child-safety organizations.” We can disagree about whether Apple’s approach of scanning on your device before allowing a photo to be transferred to their servers is the least-harm way of achieving the goal of reducing CSAM on their infrastructure, but the hash output that suffers from collisions is the same one that Google, Facebook and all the other vendors have been using for years. I am not taking a position on whether or not hash collisions are something to be concerned about; however, if they are, we need to lay the blame on NCMEC etc., not Apple.
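As a concrete illustration, the device-side flow being described amounts to something like the grossly simplified sketch below. The class name, the plain-set database, and the voucher list are all my inventions; the real design uses NeuralHash plus private set intersection and threshold secret sharing, so the device never learns whether an individual image matched. The threshold of roughly 30 matches is the figure Apple cited publicly.

```python
# Grossly simplified sketch of on-device CSAM matching -- a toy, not
# Apple's protocol (no NeuralHash, no private set intersection, no
# threshold secret sharing). Names and data structures are hypothetical.
from dataclasses import dataclass, field

THRESHOLD = 30  # roughly the match count Apple said precedes human review

@dataclass
class UploadScanner:
    known_hashes: set                  # hashes supplied by NCMEC et al.
    vouchers: list = field(default_factory=list)

    def scan_before_upload(self, image_hash: bytes) -> None:
        """Runs at iCloud-upload time; a match quietly records a voucher."""
        if image_hash in self.known_hashes:
            self.vouchers.append(image_hash)

    def account_flagged(self) -> bool:
        """Only past the threshold would the account be reviewed at all."""
        return len(self.vouchers) >= THRESHOLD

# Example: two of three uploads match the supplied database.
scanner = UploadScanner(known_hashes={b"hash-a", b"hash-b"})
for h in (b"hash-a", b"hash-z", b"hash-b"):
    scanner.scan_before_upload(h)
print(len(scanner.vouchers), scanner.account_flagged())  # 2 False
```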

Also, I believe there is a subtle difference being glossed over about how this is different from the refusal to create a version of the OS that circumvents encryption. The creation of a broken OS could be used to retroactively remove encryption that already existed and was assumed, at the time the user purchased the phone, to be unbreakable. What we are talking about now is significantly different in that it happens prior to encryption, and if you decide you don’t like it you can opt out by turning off iCloud syncing or deciding not to use an iPhone. It is perfectly reasonable to dislike what is being proposed, but this slippery slope is metaphorically sliding down to a much shallower pit.

Again I am not taking a position on whether or not what Apple is doing is an acceptable solution. I just want to eliminate some of the noise that distracts from the core privacy implications.

echo August 20, 2021 8:21 PM

@Geordie

Again I am not taking a position on whether or not what Apple is doing is an acceptable solution. I just want to eliminate some of the noise that distracts from the core privacy implications.

You did a very good impression of Gus from “Drop the Dead Donkey” defending Sir Roysten Merchant. I’ll give you that.

SpaceLifeForm August 20, 2021 10:34 PM

Dog or Cat? Same NeuralHash.

hxtps://pseudorandom.resistant.tech/neuralhash-collisions.html

[I go with Dog in left image, Cat in right image. Think about it]

Clive Robinson August 21, 2021 12:09 AM

@ SpaceLifeForm,

Dog or Cat? Same NeuralHash

At least the author of this article understands the various laws of small numbers,

“That is not to say that the discovery of such collisions undermines the numbers that Apple disclosed, we don’t really have enough information to make any kind of judgement at all about those numbers”

My personal feelings on why the collisions are showing up as quickly as they are,

1, The algorithm is very bad with some domains.
2, The algorithm is actually weak across several image domains.

It is noticeable that some of the matches are, in essence, blocks of colour divided up into areas which have a low-frequency content, whilst the dividing lines have a high-frequency content on what is a reasonably narrow line.

Obviously, changing colour to black and white and adjusting grey-scale / intensity to similar levels produces almost identical backgrounds. If you then high-pass the image, the low-frequency content effectively disappears, leaving bands of high-frequency “noise”.
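To make that concrete, here is a minimal numpy sketch of the grayscale / normalise / high-pass steps just described (a separable box blur stands in for a proper Gaussian, and the “images” are synthetic): two differently exposed versions of the same scene collapse to near-identical residue.

```python
# Minimal sketch of the argument above: normalise intensity, then
# high-pass (image minus a blurred copy). Flat low-frequency regions
# cancel, leaving only edge "noise", so two differently lit versions of
# the same scene end up nearly identical. Box blur stands in for Gaussian.
import numpy as np

def to_gray_norm(img):
    """Grayscale (if RGB) and stretch intensities to [0, 1]."""
    if img.ndim == 3:
        img = img @ np.array([0.299, 0.587, 0.114])
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo + 1e-9)

def high_pass(img, k=15):
    """Subtract a box-blurred copy: low frequencies vanish, edges remain."""
    pad = np.pad(img, k // 2, mode="edge")
    kern = np.ones(k) / k
    rows = np.apply_along_axis(np.convolve, 1, pad, kern, mode="valid")
    blur = np.apply_along_axis(np.convolve, 0, rows, kern, mode="valid")
    return img - blur

rng = np.random.default_rng(1)
edge = np.zeros((64, 64))
edge[:, 32:] = 1.0                                           # vertical edge
bright = 0.6 * edge + 0.3 + rng.normal(0, 0.01, edge.shape)  # well-lit
dim = 0.2 * edge + 0.1 + rng.normal(0, 0.01, edge.shape)     # under-exposed
diff = np.abs(high_pass(to_gray_norm(bright)) - high_pass(to_gray_norm(dim)))
print(diff.mean())  # small: flat regions cancel; the edge dominates both
```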

It’s known that humans tend to ignore low-frequency content of nearly the same intensity (skin tones)[1]. Likewise, humans tend to ignore high-frequency noise on edges (think hair)[2].

Thus you might expect the algorithm to work in a similar way, based on its approximate design specification.

Thus, to the algorithm, such common-domain images are going to be the same, based on the assumption that anyone trying to hide an image would make most changes where humans would least perceive them.

When you start to think this way you can see why the Tony Blair and G W Bush images would look very similar to the algorithm, but not to humans, who “know the faces” and thus do not really “see” the backgrounds.

Thinking on this, you can see why I suspect the actual algorithm is not that good. In order to be strong in some ways it has to be at best weak, if not very fragile, in others. So as humans start to “see” how it “matches”, matching image pairs will get easier and easier to find, and Apple’s numbers will start to look increasingly wrong.

However, will that really be the case?

I guess it all depends on what is and is not considered “random” for images. After all, most “landscape” domain pictures are very far from random in the low-frequency / large-area sense. The sky/ground ratio remains more or less the same, with most high-frequency changes on or very close to these area edges, because that’s what humans find natural, and so pleasing, which the person taking a photo would almost instinctively take into account.

So images are mostly not random; if Apple chose their database to be “mostly random”, their figure estimates are going to be off by quite a way.

I’m guessing that those Apple figures will be found to be somewhat “over optimistic”…

[1] It’s why cartoon faces look rather more real than they should do, and blocky “Lego” and “Minecraft” images likewise.

[2] Most “natural” images have very noisy edges (think hair, plants/trees, waves/ripples, and just about everything else). It’s why “regular” edges like the hexagons of the Giant’s Causeway make us think “man-made”. Donald Knuth, the mathematician, wrote several quite readable documents on why including “randomness” in fonts and images makes them feel more natural or aesthetically pleasing.

Sut Vachz August 21, 2021 5:36 AM

https: //condenaststore.com/featured/mrs-hammond-id-know-you-anywhere-from-little-edward-frascino.html

Steve August 21, 2021 8:33 AM

iCloud troublin’ your joy, lock the front door, oh boy!
Look at all the happy snoops spyin’ dusk till dawn
Pass me there a Snapple, today, I’ll buy no Apple
Doo, doo, doo, lookin’ for that back door

With sincere apologies to both John Fogerty and anyone with a sense of rhythm or rhyme.

Andy August 21, 2021 12:20 PM

Given the brand reputational damage/risk this is causing Apple, there has to be more going on here for them to persist with this clearly unpopular measure.
It’s reported that 4.2 million individuals are eligible to access classified information. Paranoia about leaks is off the scale. Might this device scanning be an attempt to weed out potential blackmail targets?

John Medcalf August 21, 2021 3:30 PM

Apple must be feeling pressure to fall in line with all the other big tech companies, to have a ready-made excuse to give government agencies what they demand. Beats me though why checking for old child porn on iPhones is even posited as a significant step to reducing child abuse and thus a plausible excuse for installing their back door. Seems pretty lame. Or maybe Apple made it lame so their constituents would save them from committing this sin.

Bryan August 22, 2021 4:58 PM

Beats me though why checking for old child porn on iPhones is even posited as a significant step to reducing child abuse and thus a plausible excuse for installing their back door.

Apple released two products simultaneously. One of them, for children’s phones in a family group, warns the child that if they send a message with nude content, or receive an image with nude content, then their parents will be notified.

It’s to prevent iMessage, which has end-to-end encryption, from being used by people to groom underage kids.

That is a (minor) step in reducing abuse.

The CSAM scan, which this is about, is a separate product. To the extent it might detect a few perverts, it also reduces abuse, but not as much as the iMessage work.

The EU has recently been discussing how end-to-end encryption helps protect child abusers (the new terrorists). If this method works, it will allow Apple to prove that governments don’t need to outlaw E2E in order to protect against child sex abuse.

If it fails, all E2E systems will come under intense legislative attack, such that the average person in the street will effectively lose their right to digital privacy.

We should all want Apple to succeed in this.

echo August 23, 2021 9:10 AM

@Bryan

Apple released two products simultaneously. One of them, for children’s phones in a family group, warns the child that if they send a message with nude content, or receive an image with nude content, then their parents will be notified.

It’s to prevent iMessage, which has end-to-end encryption, from being used by people to groom underage kids.

That is a (minor) step in reducing abuse.

The CSAM scan, which this is about, is a separate product. To the extent it might detect a few perverts, it also reduces abuse, but not as much as the iMessage work.

The EU has recently been discussing how end-to-end encryption helps protect child abusers (the new terrorists). If this method works, it will allow Apple to prove that governments don’t need to outlaw E2E in order to protect against child sex abuse.

If it fails, all E2E systems will come under intense legislative attack, such that the average person in the street will effectively lose their right to digital privacy.

We should all want Apple to succeed in this.

It’s not the easy problem people think it is. There are legal issues of competency, consent, and privacy. You cannot have a company driven by “we own your data and our T&Cs override your legal rights” simply imposing a system because they can. It just does not work this way in Europe.

I personally want a much stronger legal discussion of the type the UK’s Law Commission would typically have before opening that door. We also need to hear more views from law enforcement, criminologists, and sociologists to put everything into context, and this will include historical issues and legal drift. It may involve discussions about the nature of policing, which is typically gung-ho and completely lacking with regard to white-collar crime and social-policy-led policing.

There’s also the issue of whether the internet as organised is revealing a problem or creating a problem, and this needs to be looked at. You also need to look deeper into healthcare and support services, and media and advertising. The very protocols the internet is based on are exploited both by security services and criminals, which creates or allows things which probably should not happen.

I intensely dislike being bounced into decisions and need to make my own mind up. I’m simply not going to agree if I think peer pressure or job titles or emotional blackmail or anything like this is being used to push a decision, and I think a lot of this has been happening recently.

The world is complex, and there can be knock-on effects and unintended consequences, both direct and indirect. I’m old enough to have seen enough knee-jerk initiatives, rushed-through decisions, and expectations of agreement “just because” to be digging my heels in quite strongly.

SpaceLifeForm August 23, 2021 3:47 PM

Nothing new, nothing to see here

hxtps://9to5mac.com/2021/08/23/apple-scans-icloud-mail-for-csam/

lurker August 23, 2021 9:23 PM

@SLF
Nothing to see if images are wrapped in plain brown paper, maybe stronger than uuencode. Then the new magic eye would try to scan the mail too.

Who? August 24, 2021 5:50 AM

I do not think Apple’s backdoor affects only iPhones. Right now I would consider any Apple-branded device and/or service deeply compromised; in other words, useless for anything but playing on the Internet.

1&1~=Umm August 24, 2021 7:49 AM

@Who:

“Right now I would consider any Apple-branded device and/or service deeply compromised; in other words, useless for anything but playing on the Internet.”

There is an irony in that statement that caused a wry smile.

Based on supposed US social-media figures of what people get up to “playing on the Internet” these days, it’s mainly,

‘The 5G’ of ‘Girls, Gambling, Games, Gossip and Gripe.’

At least one of which apparently gets nearly everyone’s dopamine sloshing around their grey cells, and thus is “addictive” to them in some way…

Which is probably why all of ‘The 5G’ are almost certainly legally questionable if not prohibited somewhere…

AL August 24, 2021 10:25 AM

“Apple released two products simultaneously. One of them …”
is an eavesdropping device that currently scans images but could be repurposed to scan text, particularly if Apple is issued a warrant accompanied by a gag order.

With the presence of eavesdropping software, the assurance that it will be used benevolently is no assurance at all. And now that everyone is on notice about iMessage, what would stop the grooming of children over FaceTime? There is a missing shoe to drop on FaceTime.

If parents need this stuff, then it should be installed as an add-on, not something built into everyone’s phone that can be activated surreptitiously.

hamslabs August 24, 2021 1:30 PM

We have no way of knowing whether Apple or Google is already collecting everything on our phones for their own use or because of governmental pressure. At some level it’s about trust. Either you trust the mobile vendor and their suppliers or you don’t. If you want to use these devices, you’re stuck trusting someone, and they might not be forthcoming about what they are actually doing. So Apple saying “we won’t do it” really means nothing if they are already doing things we don’t know about. The choice really is “do I use a smartphone or not”, and there is no way around that.

That said, Apple has the best infrastructure to do things without a backdoor, since they control the processor, the OS, and just about every piece of their devices. This isn’t to say that they aren’t doing bad things, just that the collection of companies you need to trust is smaller.

Clive Robinson August 24, 2021 4:05 PM

@ hamslabs,

At some level it’s about trust. Either you trust the mobile vendor and their suppliers or you don’t.

Why would I trust any supplier to do anything other than I pay them to do?

In fact why should I even trust them that far?

So I’d be foolish to invest any trust in them, especially as I should have no need to. And if trusting them would be foolish for me, why would you or anyone else do it? Saying,

If you want to use these devices, you’re stuck trusting someone …

Does not exactly make it even close to true… In fact, for a couple of thousand years or more –probably even more than four millennia– people have failed to trust each other. So they have sat down, thought about it, and then put in place measures to mitigate “betrayal” by a second party to a third party, etc.

The earliest evidence we have is baked clay tablets with writing on them, which were then baked inside what are effectively clay envelopes. Later we have more direct evidence of codes and ciphers being used.

The point is that “betrayal of trust” is, with a little thought, either avoidable or can be mitigated to where potential harm/loss is minimized.

All you have to do is “think things through”; if you cannot, then maybe you are not smart enough to be doing what you are planning to do.

To be blunt, when you analyse it, every time people get betrayed/conned it’s because they made the mistake of “trusting”, or were too lazy or stupid to mitigate what it is they were doing (EncroChat etc. being prime examples).

As for second or third parties and,

… they might not be forthcoming about what they are actually doing.

If you mitigate correctly then that is irrelevant.

Many people make the mistake of thinking if they have “sufficient hold over/on someone” then they can be trusted… Well centuries of “turning Kings/States Evidence” and more recently “witness protection” show the futility of that stratagem.

But sit down and think about it and you realise that there are better ways to mitigate; you just have to think them through.

Have a think about it and you will see that it is true.

name.withheld.for.obvious.reasons August 25, 2021 3:14 PM

@Clive Robinson

Well centuries of “turning Kings/States Evidence” and more recently “witness protection” show the futility of that stratagem.

I’ll take “What are not whistleblowers” for $1000, Alex.
