Apple Adds a Backdoor to iMessage and iCloud Storage

Apple’s announcement that it’s going to start scanning photos for child abuse material is a big deal. (Here are five news stories.) I have been following the details, and discussing it in several different email lists. I don’t have time right now to delve into the details, but wanted to post something.

EFF writes:

There are two main features that the company is planning to install in every Apple device. One is a scanning feature that will scan all photos as they get uploaded into iCloud Photos to see if they match a photo in the database of known child sexual abuse material (CSAM) maintained by the National Center for Missing & Exploited Children (NCMEC). The other feature scans all iMessage images sent or received by child accounts—that is, accounts designated as owned by a minor—for sexually explicit material, and if the child is young enough, notifies the parent when these images are sent or received. This feature can be turned on or off by parents.
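To make the first feature concrete, here is a deliberately naive sketch of matching photos against a hash database (illustrative only: Apple’s actual design uses a perceptual “NeuralHash” with blinded on-device tables and a threshold scheme, not a plain SHA-256 set lookup):

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in hash. A real system uses a perceptual hash so that resized or
    # re-encoded copies of an image still match; SHA-256 would not.
    return hashlib.sha256(data).hexdigest()

# Hypothetical stand-in for the NCMEC-derived database of known-image hashes.
known_hashes = {image_hash(b"known-abuse-image-bytes")}

def flag_before_upload(photo: bytes) -> bool:
    """Return True if the photo matches the known-hash set."""
    return image_hash(photo) in known_hashes

print(flag_before_upload(b"known-abuse-image-bytes"))  # True: flagged
print(flag_before_upload(b"vacation-photo-bytes"))     # False: uploads normally
```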

This is pretty shocking coming from Apple, which is generally really good about privacy. It opens the door for all sorts of other surveillance, since now that the system is built it can be used for all sorts of other messages. And it breaks end-to-end encryption, despite Apple’s denials:

Does this break end-to-end encryption in Messages?

No. This doesn’t change the privacy assurances of Messages, and Apple never gains access to communications as a result of this feature. Any user of Messages, including those with communication safety enabled, retains control over what is sent and to whom. If the feature is enabled for the child account, the device will evaluate images in Messages and present an intervention if the image is determined to be sexually explicit. For accounts of children age 12 and under, parents can set up parental notifications which will be sent if the child confirms and sends or views an image that has been determined to be sexually explicit. None of the communications, image evaluation, interventions, or notifications are available to Apple.

Notice Apple changing the definition of “end-to-end encryption.” No longer is the message a private communication between sender and receiver. A third party is alerted if the message meets certain criteria.

This is a security disaster. Read tweets by Matthew Green and Edward Snowden. Also this. I’ll post more when I see it.

Beware the Four Horsemen of the Information Apocalypse. They’ll scare you into accepting all sorts of insecure systems.

EDITED TO ADD: This is a really good write-up of the problems.

EDITED TO ADD: Alex Stamos comments.

An open letter to Apple criticizing the project.

A leaked Apple memo responding to the criticisms. (What are the odds that Apple did not intend this to leak?)

EDITED TO ADD: John Gruber’s excellent analysis.

EDITED TO ADD (8/11): Paul Rosenzweig wrote an excellent policy discussion.

EDITED TO ADD (8/13): Really good essay by EFF’s Kurt Opsahl. Ross Anderson did an interview with Glenn Beck. And this news article talks about dissent within Apple about this feature.

The Economist has a good take. Apple responds to criticisms. (It’s worth watching the Wall Street Journal video interview as well.)

EDITED TO ADD (8/14): Apple released a threat model.

EDITED TO ADD (8/20): Follow-on blog posts here and here.

Posted on August 10, 2021 at 6:37 AM

Comments

fp August 10, 2021 7:06 AM

The link following the two Twitter ones isn’t correct, unless, of course, Apple is releasing pillows with a backdoor.

Ian August 10, 2021 7:39 AM

The link to the Forbes article appears to be wrong. Unless of course, it’s some kind of coded message that we need to decode, in which case that would be awesome.

Hedo August 10, 2021 7:59 AM

Time to ditch your overpriced “impenetrable” (LOL) Macs, that is, if you own them. Why support a company that does this? MS & Google are in the same trash bin in my life. If you know how to build and secure your own computers you should be semi-fine. Always been that way, always will be that way. Of course, if you’re a big fan of clouds, don’t be surprised when it suddenly rains on you, but it’ll be a shitstorm instead of a rainstorm.

Again, this opens up many new business opportunities for a bunch of other “fruits” (some may not like Apple that much) that will attract a bunch of new sheeple (customers) by promising them 100% privacy/security/anonymity. IF you trust them with your data.
Where is the system like the safe-deposit boxes at some banks: two keys, one with you and one with the bank, and only both together can open the box? Here, you give your data to complete strangers and you have NO IDEA who has access to it.

A cultural change needs to take place (a RE-education) regarding whom to trust, why, and based on what. If you care for your data, you’ll keep it to yourself, and in your OWN cloud. Where there’s a will, you’ll find a way. Always.

Only a big-time fool can be persuaded that data handed to anyone else for safekeeping will remain private. You don’t urinate out in the open showing your private parts, do you? Private means not shown to strangers, ESPECIALLY not strangers in the cloud. What a name, CLOUD: shady, can’t see through it, non-transparent. Actually quite fitting in my view.
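The two-key idea does have a standard cryptographic counterpart: 2-of-2 secret sharing, where neither share alone reveals anything. A minimal sketch (toy code, not any vendor’s actual scheme):

```python
import os

def split(secret: bytes):
    """Split a secret into two shares; both are required to recover it."""
    customer_key = os.urandom(len(secret))  # uniformly random share
    bank_key = bytes(s ^ c for s, c in zip(secret, customer_key))
    return customer_key, bank_key

def combine(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

secret = b"data-encryption-key"
customer_key, bank_key = split(secret)
assert combine(customer_key, bank_key) == secret
# Either share alone is statistically indistinguishable from random noise.
```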

Winter August 10, 2021 8:39 AM

I am wondering what type of (ab)users Apple are targeting?

A hard-core (ab)user would likely do his (it seems to always be men) abuse in a VM and most certainly would not store it in iCloud, or Google Photos for that matter. Is there any indication that criminal people store the evidence of their criminal activity unencrypted in the Cloud at all? I know criminals tend to be stupid, but natural selection and all should weed that out pretty quickly, I suspect.

I doubt that Apple will be able, or willing to invest the resources, to scan activity inside VMs.

SwashbucklingCowboy August 10, 2021 8:41 AM

Two thoughts:

  1. Does this imply that end to end encryption already is broken?
  2. Did Apple do this, at least in part, to blunt law enforcement criticism of the strength of the encryption on phones?

Denton Scratch August 10, 2021 9:06 AM

@Crusher

I’m ashamed of Apple.

Why? Did you make Apple? Do you work for Apple? If not, why should you be ashamed of the behaviour of an entity that isn’t you?

HMFSB August 10, 2021 9:26 AM

That second feature is not based on hashing. It will detect nude images of your own children via AI. And AI is bad at that type of detection. There will be lots of false positives, angering parents. Also, it means that Apple is able to accuse PARENTS of child pornography if their KIDS do something stupid with their iPhone. This problem of kids misusing their phones in this way is said to be widespread. This could become an avalanche of notifications, false positives, arrests, and even actual inappropriate photos that should be for parents (or maybe schools) to address, not a big tech company.

Apple has, without any legal authorization, put itself in charge of your children. What if the “child” is a minor but of legal age? What if the “child” is 18, but the phone is accidentally set up as a “child” account? They have made themselves into a Law Enforcement Organization because they have the power, though they lack the right.

What if a trusted insider decides to add code to this system, or — more likely — what if a hacker from organized crime spreads a virus that changes the system, so that the AI detection would be used to gather and disseminate new child porn images, rather than report them?

1991: “Oh, no, my computer has a virus and I need to run McAfee stinger.”
2021: “Oh, no, my kid’s iPhone has a virus, and now child porn featuring him/her is irrevocably on the dark web.”

Are there no laws governing what Big Tech can do? When you buy an iPhone, do you own it, or does Apple still own it? The latter. Apple is behaving as if they own your phones and are co-parents with you over your kids.

TimH August 10, 2021 9:34 AM

I suggest that Apple’s behaviour is not to support LE, save the kiddies, etc. It’s to protect themselves if Section 230 collapses (future) and to defend against charges of enabling child abuse (now).

The $10M question is what they’ll do if given an NSL with a secrecy clause instructing them to add further capabilities. They can deal with that by showing the community (not thee & me, but Bruce and colleagues) that the system cannot be subverted to examine other images without major work, which is then an issue that they can force into the courts.

Ross Snider August 10, 2021 9:54 AM

Having looked in the past at how law enforcement requests are handled for Apple products, I learned that the privacy features of Apple products are a thin veil. Apple maintains the ability and right to decrypt content synced from your phone to iCloud to investigate it. Apple makes a great fuss about how this is “impossible” but are pulling the same trick as they are here with end-to-end encryption – redefining the word.

It’s not that I don’t trust that Apple’s heart is in a good place. They are required by law to collaborate in certain ways with law enforcement over communications. The pattern I’ve observed is that they will try to invent their way out of seemingly contradictory, over-constrained problems. It’s the PR side that takes this to the public as a differentiating feature of the products and hams it up into these impossible claims.

I agree, though, that this is more of an overreach. There’s some limited control a device owner has over iCloud synchronization (though less than first appears). Furthermore, there has to be some active request for data in the former case justifying the access. Finally, such investigations are latent, coming at a point in time of a legal investigation. This new proposal monitors communications as they happen, does not exhibit user choice, and acts regardless of there being a specific warrant or need.

I don’t use Apple products for personal communications either way. But it’s a shame to see the company take this direction. I suspect that even if they repeal the proposal and go on a PR apology tour, they will lose a significant amount of trust, and in turn market share, from this. At least, I hope. The market has been a dismal place for privacy.

Hedo August 10, 2021 10:01 AM

As HMFSB noted in this post
https://www.schneier.com/blog/archives/2021/08/apple-adds-a-backdoor-to-imesssage-and-icloud-storage.html/#comment-386428

“There will be lots of false positives…”

and HMFSB is correct in that many people might start pulling off pranks such as applying various “filters” to otherwise completely normal images of fully dressed humans, making them appear butt-naked.

Hey Apple – it won’t work.
I AM ALL for completely rooting out child pornography; in fact, when I hear or see someone getting caught doing it, it truly makes me angry and ashamed at the same time to belong to the same (human) species as the animals engaging in it.
But this is not a good way to go about it at all.

Steve August 10, 2021 10:09 AM

Any bets on whether this abomination was first called Sexual Child Abuse Material until someone noticed the obvious acronym?

But, still, let’s assume that Apple is sincere in its intent to “refuse any such demands” for expanding the technology.

Experience shows that there are other entities (I’m talking to you, NSO Group) that are willing and clearly able to penetrate anything Apple throws up as a defense and weaponize it.

Sofa August 10, 2021 10:24 AM

Alex Stamos (former head of security at Facebook, currently at Stanford Internet Observatory), in a thread on Twitter (via DaringFireball.net):

In my opinion, there are no easy answers here. I find myself constantly torn between wanting everybody to have access to cryptographic privacy and the reality of the scale and depth of harm that has been enabled by modern comms technologies.

Nuanced opinions are OK on this.

-Sofa

Etienne August 10, 2021 10:26 AM

Encryption should never be performed in the same device that acts as the modem. That is, a smartphone or a radio.

A secure radio has an encryption engine that is separate and functionally removed: a tiny circuit board that plugs into the radio.

Computer networks use encryption devices that are external and single-function to perform ciphers using secret keys.

Any device which performs encryption as an algorithm in the same device that modulates or demodulates the information, is a toy.

Do not use toys where the penalties are removal of your freedoms, and/or your wealth.

End-to-end encryption cannot be performed wholly by a smartphone or a radio. It requires an external encryption appliqué, or manually encrypting the message to be transported.

All smartphone encryption is a fraud, and the same concept as putting your money in a system that uses more electricity than Cuba to perform the accounting.
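A minimal sketch of the external-encryption idea (assuming the third-party `cryptography` package; the key lives on an offline device, and the phone only ever carries ciphertext):

```python
# Assumes: pip install cryptography
from cryptography.fernet import Fernet

# On the offline device: the key never touches the phone.
key = Fernet.generate_key()  # shared with the recipient out of band
ciphertext = Fernet(key).encrypt(b"meet at noon")

# Only `ciphertext` is moved to the phone and transmitted; a compromised
# phone OS sees neither the plaintext nor the key.

# On the recipient's offline device:
assert Fernet(key).decrypt(ciphertext) == b"meet at noon"
```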

Per S Nickety August 10, 2021 10:36 AM

Notice Apple changing the definition of “end-to-end encryption”

This is what I have previously stated about WhatsApp’s and others’ so-called “end-to-end” encryption.

Because such companies have agreements with law enforcement / alphabet agencies, the “end-to-end encryption” is not necessarily end-to-end encryption.

Table August 10, 2021 10:49 AM

@Per S Nickety

On the definition of “end-to-end encryption”: I have not read much on this yet, but it would not surprise me if this means the photos are sent decrypted to some external server that then does the comparing.

Because of course the iCloud server will be able to process the photo unencrypted (it being one of the two endpoints of the “end-to-end encryption”). But a side effect of this is that it can also send the decrypted version to whomever. Kind of like what WhatsApp etc. can do.

Sloth August 10, 2021 10:53 AM

Summarised in a sentence:
“Power corrupts; absolute power corrupts absolutely.”

With proprietary software the proprietor has a say; with free software the people have it.

Choose wisely. Don’t buy jails! Buy freedom: use free (as in freedom) software, and support it.

Impossibly Stupid August 10, 2021 10:54 AM

It opens the door for all sorts of other surveillance, since now that the system is build it can be used for all sorts of other messages.

Another correction, Bruce: “build” there should clearly be “built”.

Hot on the heels of your AirDrop article, I agree that Apple is stepping in it. I have never understood why violence gets a pass in our society in a way that sex doesn’t. If they can scan messages for “fleshy” content, they can scan them for weapons. Why is Apple giving terrorists a pass!?!? And I don’t know why they’d limit it to just images, either, since message text can be matched easily to a list of “bad words” even without machine learning. A request for an explicit image is very likely to occur before the image is taken, so why is Apple choosing to wait until the deed is done before warning children/parents of the direction the conversation has gone?

And it breaks end-to-end encryption

Technically it doesn’t, but it’s really all just weasel words on Apple’s part. They have indeed backdoored their own software, though, making the end points for the encryption compromised. So if they’re not using an open protocol that allows for a drop-in replacement of Messages that isn’t scanning and reporting on their usage, I would hope they can be sued for false advertising.

@Winter

I am wondering what type of (ab)users Apple are targeting? . . . Is there any indication that criminal people store the evidence of their criminal activity unencrypted in the Cloud at all?

This is the right line of questioning. Some reporter needs to hold Apple’s feet to the fire and ask them how much child porn is being stored on their iCloud servers that makes this action necessary. They need to establish a benchmark for the conviction of criminals that justifies the intrusion into the day-to-day communication of everyone. There is no science that I can see behind Apple’s decision to do this.

@HMFSB

Apple has, without any legal authorization, put itself in charge of your children.

Big tech has done this for decades. Google decides who Gmail users can contact, for example. So, really, we have Apple again serving as the focus for an entire industry that has no ethical core. Just another case of convenience giving security a beatdown.

echo August 10, 2021 11:00 AM

I find the whole discussion is too caught up in technical issues and there isn’t enough legal discussion or abuse-domain-expert discussion. There’s a lack of reasoning and of data. I also think everyone has forgotten that the rest of the world beyond the US exists.

Before this discussion began there were already concerns about the cloud and telemetry and consent. I find it especially annoying to be on the receiving end of the American discussion firehose, and of corporate greed, and of constitutional and other weaknesses in the American system constantly fighting to take over, not to mention the well-funded corporate lobbying and dark money. As a European I’d rather you kept your problems to yourself.

I think it would help to slow this whole discussion down. I want to hear from European human rights lawyers and abuse domain experts. I want to know exactly what problem Apple think they are trying to solve and what the real-world challenges are beyond technical gimmicks before giving the technical gimmick the time of day. I want to know if it makes a real difference or causes more problems than it solves. I want to know what educational and resource and enforcement and social policy issues exist. None of that discussion will involve technical computer issues. Only then do I want to hear the technical computer issues before forming an opinion, because I don’t want the tail wagging the dog.

I’m also concerned this is taking attention away from dealing with social media and monetized hate and dark money and lobbying.

Politics is about priorities and I’m not persuaded it should be a private company which sets the agenda.

AL August 10, 2021 11:03 AM

Some really good news. 😜 Apple is considering protecting me from 3rd party apps.
https://www.macrumors.com/2021/08/09/apple-child-safety-features-third-party-apps/

but one possibility could be the Communication Safety feature being made available to apps like Snapchat, Instagram, or WhatsApp

If we’re talking WhatsApp, well, Telegram and Signal aren’t too far away from this client-side scanner. The back door will be in all confidential communications.

I don’t think there is the slightest chance that this client-side scanner stays limited to image files. Either it already is scanning text, or it will be. Mission creep is inevitable.

JonKnowsNothing August 10, 2021 11:07 AM

@ Winter • August 10, 2021 8:39 AM

re: I am wondering what type of (ab)users Apple are targeting? A hard core (ab)user would likely [not] store it in iCloud, or Google pictures for that matter.

From MSM and LEA arrest touts, the folks that the LEAs are targeting store their kit on the Dark Web. Only a fledgling would upload such fun-phots to their iCloud accounts.

More likely the fun-phots that are uploaded include group activity chains where the victim is conked out and the group live-streams the action to their buddies. These may or may not be illegal (depends on where, when, location, social status, and historical time frame).

Criminal actions by adults and youth for those activities and other Sx crimes are already covered by laws and are not indicated as part of the KiddieFids that Apple claims they and their LEA buddies are targeting.

If they were, a US Supreme Court Justice or two may have some increased concerns.

Another aspect is the “no one looks but the computer” claim, which holds only until there is a match. Then some human has to look at all the images from that selector. All of them, because they have to verify IF the images pass the threshold set.

Is Apple (et al) donating all their content moderators to the US Law Enforcement Agencies?

They already have extensive connections, and folks who do that work require a lot of health support, which is limited in the GigEcon.

Rolf E August 10, 2021 11:10 AM

@Bruce,

What I would appreciate is an explanation of how end-to-end encryption would not be end-to-end encryption. The presumption is that the uploaded items are encrypted with strong encryption and can only be decrypted by the user. If the math and the implementation of the encryption are sound, what can go wrong?

Etienne August 10, 2021 11:28 AM

“…an explanation of how end-to-end encryption would not be end-to-end encryption.”

When there is a man in the middle, or parts of the message bypass the encryption function, and allow third parties to monitor the channel.

echo August 10, 2021 11:35 AM

@AL

Apple went back on its commitment to open the FaceTime standard, which blocked the establishment of a common videoconferencing standard. (Microsoft buying up Skype did something similar.) None of this inspires trust. I’d rather not get involved with half a dozen different applications, all of variable quality, just to speak with someone.

American standards of speech versus human rights are lower than Europe’s, and I haven’t seen social media or any of the other big tech companies, including Apple, rush to do anything about this.

Apple have effectively taken it upon themselves to make public policy. Their rigid silence and refusal to allow scrutiny or discussion of their methodologies, knowledge, and data isn’t acceptable in terms of public policy discussion. All their silence is going to do is make people angrier, unless Tim Cook thinks that being a thin-skinned authoritarian with the balance of power on his side means they can beat down and depress people into acquiescence. Myself, I feel it is worrying that corporations believe they can effectively make public policy in this way. There’s already too much of the wrong kind of exercise of this kind of power.

AL August 10, 2021 11:36 AM

@Bruce
There are two completely separate scanners. I’ll let someone else deal with the image scanner that deals with CSAM hashes, and checks uploaded files against a child porn database.

There is also a scanner/backdoor on iMessage, and suggestions that this particular scanner will be extended to WhatsApp and beyond. It is the message scanner that I see breaking encrypted messaging, because the scanner is juxtaposed between the user and the transit of the message, and on the receiving side, between the transit and the recipient.

Apple says this scanner is currently configured to be active on children’s accounts when the parents have opted in, and looks for sexually explicit images.

So, what Apple represents is, they have eavesdropping software that is currently configured to be benign.

Generalizing, they have intercepting software that scans messages and notifies a 3rd party. Do we have any information that this eavesdropping scanner can’t be reconfigured to scan text and notify someone other than the parents?

https://cdt.org/press/cdt-apples-changes-to-messaging-and-photo-services-threaten-users-security-and-privacy/

The mechanism that will enable Apple to scan images in iMessages is not an alternative to a backdoor — it is a backdoor. Client-side scanning on one “end” of the communication breaks the security of the transmission, and informing a third-party (the parent) about the content of the communication undermines its privacy.

I’m seeing an internet effort to conflate the two scanners into the hash scanner, and confine the discussion to that scanner. There are two scanners that need to be discussed separately. And the one they don’t want us to discuss, the eavesdropper, is the one we should concentrate on.
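To make the CDT point concrete, here is a schematic sketch of where the scan sits in the pipeline (every name here is hypothetical; this is one reading of Apple’s description, not their code):

```python
def is_explicit(image: bytes) -> bool:
    # Placeholder for the on-device ML classifier Apple describes.
    return False

def notify_parent(image: bytes) -> None:
    # Placeholder for the third-party report that a strict E2EE system could
    # never produce, since no third party would ever see the plaintext.
    pass

def send_image(image: bytes, encrypt, transmit, child_account: bool) -> None:
    if child_account and is_explicit(image):
        notify_parent(image)   # a content judgment leaves the two endpoints
    transmit(encrypt(image))   # the wire itself stays encrypted throughout
```

The encryption on the wire is untouched; the compromise is that the plaintext is inspected, and potentially reported, before it is ever encrypted.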

AL August 10, 2021 11:47 AM

@Echo
With iMessage gone, I’ve been wondering when the shoe will drop on FaceTime, the last Apple E2EE software. The silence on that sticks out like a sore thumb.

Humdee August 10, 2021 12:35 PM

Look, in most situations I am one of the most fervent critics of binary thinking. However, some situations really are binary. Yes, death is a process, but at some point you’re either dead or alive. At some point in time you have stopped breathing or you haven’t. Your brain activity has stopped or it hasn’t. Stamos is wrong to say that nuanced opinions are OK. They are not. You have to choose. He writes,

“I find myself constantly torn between wanting everybody to have access to cryptographic privacy and the reality of the scale and depth of harm that has been enabled by modern comms technologies.”

Yep. It’s a tough choice. However, the tough choice cannot be evaded by handwringing appeals to “nuance”. Either the message is end-to-end encrypted (as Bruce defines the term) or it is not. The number is either a zero or a one. There are no fuzzy logic gates in this debate.

AL August 10, 2021 1:10 PM

@echo

I find the whole discussion is too caught up in technical issues … I want to know exactly what problem Apple think they are trying to solve … Only then do I want to hear the technical computer issues

You’ve got to be kidding me. Maybe what Apple is “solving” is complying with a court order that is accompanied by a gag order. If that is the case, the matter is on hold indefinitely.

Also, there is a concept called multitasking. We can address the technical and non-technical issues simultaneously.

It seems to me that Apple has launched a multi-billion-dollar grass-roots negative advertising campaign against itself. And it isn’t like they couldn’t have anticipated that. So, something is going on that is a bit bigger than Apple. I don’t see putting the technical discussion of this client-side scanning on hold until we figure out what that is.

Steve August 10, 2021 1:37 PM

@sloth sez: “Power corrupts; absolute power corrupts absolutely”

As comedian, satirist, and voice actor Harry Shearer puts it:

If absolute power corrupts absolutely, does absolute powerlessness make you pure?

@R-Squared: I haven’t had the hardware to try it out lately, but FreeBSD offers jails.

As does the free open source Linux security tool fail2ban.

TimH August 10, 2021 1:58 PM

“If you don’t use iCloud Photo Library, no images on your devices are fingerprinted.”

I’d be happier with that if:

  1. If iPL isn’t used, the CSAM database is not loaded onto the phone so that the image analysis can’t happen.
  2. The phone explicitly states whether the CSAM database is or is not loaded. That makes Big Fruit legally liable if they are caught lying.

Timh August 10, 2021 2:01 PM

Thomas Brewster’s lede “Apple Is Trying To Stop Child Abuse On iPhones” makes a massive presumption as to Apple’s motive.

I posit: if that is really the motive, why limit the analysis to iCloud uploads only?

name.withheld.for.obvious.reasons August 10, 2021 2:12 PM

I know Apple has struggled with whatever internal machinations legal and policy folks tend to focus on, and possibly failed to come to some sort of legal position that keeps them in China and compliant with a host of national laws. I don’t really see how this works under the EU’s data protection regime. I suspect a court challenge is in the offing.

But hey, Microsoft has been ahead of Apple on this one; the disclosure came during the release of Windows 10 and Microsoft’s latest Privacy Policy. Essentially Microsoft contends that if they find anything of question on your system they have a RIGHT/OBLIGATION to report it to authorities. This does not just apply to cloud applications but to local storage as well.

lurker August 10, 2021 3:19 PM

What size is this CSAM database? iOS/MacOS is already loaded with so much bloatware, will this not chew up more RAM/ROM, and d/l update bandwidth?

Note also that “for now” this feature is USA only. Apple’s techs and lawyers must be discussing how they can make it work with pix from country A sent to country B with all the combinations of age/gender of consent in different jurisdictions. Oh wait, iCloud lives in California, iMessage servers are in CA, so Apple neatly brings the whole world under Californian law. Welcome to the Age of Empire.

AL August 10, 2021 3:53 PM

@lurker
You are mixing scanners.

Apple’s techs and lawyers must be discussing how they can make it work with pix from country A sent to country B with all the combinations of age/gender of consent in different jurisdictions.

The CSAM scanner is between the device and Apple’s iCloud Photos. Not between users (yet).

There is a different scanner, which Apple says doesn’t use CSAM hashes, that monitors images sent between users. That is a message-monitoring scanner that Apple says it wants to extend to apps like WhatsApp. For now, Apple says this will monitor children’s accounts if the parents opt in. If content is flagged, the scanner will open a communication to the parents, Apple says.

SpaceLifeForm August 10, 2021 4:35 PM

@ R-Squared, Steve

Software Jails are not enough.

The crypto and the comms must be separated.

lurker August 10, 2021 4:37 PM

@AL, There is a different scanner…

Riiight, the bloatware just keeps bloating… It used to be that you could send a TXT or PXT, which was a simple transaction between the user and the telco. That’s not possible now; it’s handled by a “Messages” app, on both A & G’s systems, with such a variety of different-colored bells and whistles that who knows where your msg goes before it gets delivered.

Maybe I missed it in all the shouting, but what hash is being used, MD5 or NCMEC’s own PhotoDNA? Because, as Hackerfactor (linked above) points out, the latter is reversible, which would be a nasty own goal by Apple.

AL August 10, 2021 4:48 PM

@lurker

both A & G’s systems

But it is Apple that introduces client-side scanning, and says it wants to extend it to 3rd-party software. So, after IOS 15, Signal on Android is safer than on IOS, at least until Google introduces client-side scanning. One thing that is looking up for Android is that devices don’t get updated. I just forced Signal onto an Android tablet where the last update was 2019. Android 8.1.

As far as all the other stuff is concerned, things like PhotoDNA are more relevant to the other scanner. I’m paying attention to the message scanner, which is separate and distinct (and perhaps wiretap capable).

lurker August 10, 2021 6:29 PM

@SpaceLifeForm: I read the techcrunch article and I still don’t get it. If I attempt to upload an image to iPhoto-Cloud, or if I attempt to send (or receive) an image via iMessage, the image is compared with a database of known CSAM. If I’m creating new CSAM on my iDevice I’m in the clear until I get a hash collision?

SpaceLifeForm August 10, 2021 6:31 PM

@ Bruce, ALL

https://pseudorandom.resistant.tech/obfuscated_apples.html

Generating noise in a way which is indistinguishable from real signal is a ridiculously hard problem. Obfuscation does not hide signal, it only adds noise.

[somehow unredacted via C+P: if you take anything away from this article please let it be this fact.]

Sadly, most people operate under the assumption that adding noise to a system is all that it takes to make the signal unrecoverable. This logic is very clearly in operation in the technical summary of Apple’s new proposal for on-device scanning, which, among other things, proposes generating synthetic matches to hide the true number of real matches in the system.
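A toy illustration of the quoted point (all parameters invented): if the synthetic-match rate is known, an observer simply subtracts the expected noise, and averaging over time or accounts shrinks the error bars.

```python
import random

REAL_MATCHES = 7         # the "hidden" signal
SYNTHETIC_RATE = 0.001   # assumed-known per-upload probability of a synthetic match
UPLOADS = 100_000

synthetic = sum(random.random() < SYNTHETIC_RATE for _ in range(UPLOADS))
observed = REAL_MATCHES + synthetic
estimate = observed - SYNTHETIC_RATE * UPLOADS  # subtract the expected noise
print(f"observed={observed}, estimated real matches ~ {estimate:.0f}")
```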

JPA August 10, 2021 6:57 PM

@Steve
“As comedian, satirist, and voice actor Harry Shearer puts it:

If absolute power corrupts absolutely, does absolute powerlessness make you pure?”

No.

If a then b => If not b then not a.

If absolute power corrupts absolutely, then if you are not absolutely corrupt, you do not have absolute power. But you can be powerless and corrupt. You just can’t cause damage.
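A quick exhaustive check of both claims (contraposition holds; the converse does not):

```python
implies = lambda p, q: (not p) or q

# Contraposition holds for every truth assignment:
for a in (False, True):
    for b in (False, True):
        assert implies(a, b) == implies(not b, not a)

# The converse fails: a=False, b=True makes a->b true but b->a false.
print(implies(False, True), implies(True, False))  # True False
```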

Andy August 10, 2021 8:18 PM

NSO called Apple’s iOS security laughable. It probably isn’t, but the iPhone of Khashoggi’s fiancée and Jeff Bezos’ iPhone X were hacked with NSO spyware.

On-device real-time scanning and reporting seems like a charter for those who want to frame opponents with one of the worst things anyone can be accused of.

R-Squared August 10, 2021 10:01 PM

@ AL

… database of known CSAM. …

https://www.apple.com/child-safety/

Databases of known what? What are you saying there? Isn’t it about high time to pull the fire alarm, get out of the Apple Store, stop vaping legal weed at that damned Genius Bar whatever it’s called, and never ever buy another Apple Macintosh product again?

I mean I don’t know what these people are so intent on getting in trouble with the law for, but I don’t want anything to do with it and I can’t see how anybody in his right mind would either.

SpaceLifeForm August 10, 2021 10:57 PM

Another link to add to the pile. Hour long video discussion

hxtps://www.youtube.com/watch?v=dbYZVNSOVy4

SpaceLifeForm August 11, 2021 12:28 AM

@ lurker

It sure looks like fuzzy hashing to me.

I don’t think Apple realizes the can of worms they opened.

Clive Robinson August 11, 2021 1:20 AM

@ Rolf E, ALL,

What I would appreciate is an explanation of how end-to-end encryption would not be end-to-end encryption.

That is fairly easy,

When it is NOT “end to end” to the “two communicating parties and nobody else”.

Which may not be of help, as only about 10% of the population see the world in words; the majority need a picture.

So here’s the description of what to draw… On a piece of paper draw,

1, Two boxes marked User A and user B horizontally spaced.

2, Draw a line between the two boxes and mark it insecure comms channel.

3, Draw a vertical line down through the middle of each box and call it “security end point”. Shade the area between the two lines and call it “insecure area”.

4, Label the left of the first vertical line and the right of the second vertical line as “needs to be secure”.

5, Now draw a third box in the shaded “insecure area”, call it “third party”, and draw an arrow to the insecure side of the two user boxes.

Now… If your system is set up correctly, the two unshaded areas left of the first user box and right of the second user box should be secure…

But on modern Smart devices and laptops and other computing devices they usually are very very far from being secure.

Under both user boxes add a second box the same size and mark it “OS and Apps”, so it too crosses the vertical line marked “security end point”. In the unshaded, supposedly “secure” areas draw a fold-back line from the “OS and Apps” box to the output side of the user box. In the shaded insecure area draw a line from the third-party box to the insecure side of the “OS and Apps” box.

Now look at the diagram and see how the third party can “end run” around the security in the user box via the insecure “OS and Apps” box, thereby getting convenient access to the “plain text user interface” that you think is secure. The reality is that it is not, because of all those “end run attacks” in the “OS and Apps” box…

Those vertical lines are called “the security end point”, and you should now see it is in the wrong place, and thus offers no security whatsoever on your smart device, phone, or computer.

I’ve been saying this on this blog for some years, despite all the “Security Gurus” etc. saying these alleged security apps are secure. Actually the reality is,

They are all insecure by design.

And the designers like Moxie Marlinspike etc. should know this and say what the problem is, but they do not… So everyone who puts Signal, WhatsApp, etc. on their smart device or phone is living in a fantasy world at best; at worst, they or their loved ones will be killed and chopped into pieces by an agent of the House of Saud.

Because the plaintext UI is on the same device as the communications, any one of thousands of end runs in the other apps, OS, or device drivers will give a third-party eavesdropper full “plain text access” to the UI of the supposed “Secure Messenger App”.

I’ve also repeatedly indicated how you “move the security end point” so it is secure… But most users are way too lazy to do this…

It’s up to you to decide what you want to do… But I refuse to use any supposed “Secure Messenger App”, because people will not use them correctly, and I see no reason to allow them the opportunity to put my life, liberty, etc. at risk for their stupidity or venal activities.

As I’ve explained in the past, the proper use of a One Time Pad (OTP) not only gives you secure communications, it gives you something no other currently used encryption system can: if used correctly, full deniability against betrayal by the second party to the authorities etc… Something no doubt all those who spent vast amounts on those supposed secure phones from EncroChat etc. now wish they had done…
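For the curious, a minimal OTP sketch of that deniability property (toy code, not an operational system): for any ciphertext you can exhibit a key that “decrypts” it to any cover text of the same length.

```python
import os

def otp(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))  # XOR encrypts and decrypts

real_msg = b"attack at dawn"
key = os.urandom(len(real_msg))  # true one-time pad: random, same length, used once
ciphertext = otp(real_msg, key)

# Deniability: construct a fake key mapping the same ciphertext to a
# harmless cover text of the same length.
cover_msg = b"buy more bread"
fake_key = otp(ciphertext, cover_msg)
assert otp(ciphertext, fake_key) == cover_msg
assert otp(ciphertext, key) == real_msg
```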

Dave August 11, 2021 1:36 AM

Other things that Apple will soon be scanning for:

People talking about women’s rights in Saudi Arabia, people criticising the BJP in India and the CCP in China, and in the future, people thinking bad thoughts about President Ted Cruz.

Clive Robinson August 11, 2021 1:49 AM

@ Humdee, ALL

Stamos is wrong to say that nuanced opinions are OK. They are not. You have to choose.

I’ve been very unimpressed by Stamos for nearly all the time I’ve heard or read his statements.

He is just another foolish self promoting “security guru” who pretends to know rather more than he actually does about security.

Or worse, he is deliberately selling innocent people to undesirables like the “House of Saud”, whose notion of civilised behaviour is to chop people up, probably whilst still alive, with meat saws and the like.

All whilst allowing certain “venture capitalists” to make vast fortunes creating and selling anti-security companies that will sell anything to the likes of the House of Saud and other despots, tyrannies, and psychopathic dictators, to go on such killing sprees if the price is right.

Stamos does not understand the binary technical option of “secure or insecure”. There is no touchy-feely fantasyland in between, of NOBUS and socially responsible law enforcement, as he pretends should exist; it does not, nor ever will.

There are just “Directing Minds”, whom others might look upon in future times to decide, via human morality, if the actions of those Directing Minds were good or bad. Of course by then, potentially many, many people will have been prejudiced against, suffered mental or physical harm, or even died by extrajudicial means.

The best we can do currently is “Full Independent Overview and Review”, but as can be seen in the pinnacles of self-appointed democracy in certain Western nations, no such systems exist, or where they are supposed to, they are fought by every deceit imaginable.

Anyone who thinks “Full Independent Overview and Review” will ever be allowed to happen is deluding themselves; those in positions of power will never allow constraints to be put upon them.

echo August 11, 2021 3:41 AM

https://www.reuters.com/article/us-usa-tech-prison-idUSKBN2FA0OO

For people like Heather Bollin, a 43-year-old woman in Texas engaged to a man who is currently incarcerated, constant surveillance is a fact of life: the three daily phone calls they have together are subject to monitoring by prison officials.

“We are never able to communicate without being under surveillance,” she told the Thomson Reuters Foundation in a phone interview, asking that the prison her fiance is in remain anonymous because she fears retaliation.

Prisons in the United States could get more high-tech help keeping tabs on what inmates are saying, after a key House of Representatives panel pressed for a report to study the use of artificial intelligence (AI) to analyze prisoners’ phone calls.

But prisoners’ advocates and inmates’ families say relying on AI to interpret communications opens up the system to mistakes, misunderstandings and racial bias.

https://thehill.com/policy/technology/567206-amazon-awarded-secret-10b-nsa-cloud-computing-contract-report

The National Security Agency has awarded a secret cloud computing contract worth up to $10 billion to Amazon Web Services

[REDACTED LINK BECAUSE MODERATION WON’T LET THE LINK THROUGH]

My last comment went walkies so I’m changing tack. Apple’s initiative seems to be coincident with a lot of surveillance and cloud technology being used on low-hanging fruit.

People seem to have missed things like a ramping up of big tech lobbying and, in the UK at least, government ministers “getting off” either deleting their own communications or developing a love of off-the-record meetings.

Lots of tech is flying about, but very little is being done to firm up governance or social-policy-led initiatives. This is an artificial narrowing of the definition of “security”, as well as excluding experts in other fields who have legitimate input.

I am extremely sceptical of the dominance of tech in these discussions. It may be where the power is and where the bikeshedding happens, but there are a whole lot of other factors at play here outside of the bubble.

Blim August 11, 2021 4:16 AM

Just want to call out the irony here that, however unlikely, if Apple’s system falsely flags photos of my kid frolicking around naked, a total stranger will be evaluating those images as to whether or not they are sexually gratifying.

Clive Robinson August 11, 2021 8:36 AM

@ ALL,

There is another aspect to Apple’s action that you might not be aware of.

The UK, as @echo has pointed out, is stamping down on free speech in every which way it can, whilst still trying to appear democratic (which it most certainly is not).

Towards that conviction of BoJo & Co., let’s hang out exhibit A,

https://www.eff.org/deeplinks/2021/07/uks-draft-online-safety-bill-raises-serious-concerns-around-freedom-expression

Whilst on the surface it sounds like functional legislation, it is not, by a very, very long way.

Essentially it is fully under “Political Office” control via the Home Office and a highly suspect organisation called OfCom, who are known to have repeatedly perjured themselves in court on the excuse of “politics”… (Look up Clive Corrie if you want to see but one example of an OfCom perjurer.)

In essence the draft bill is a “You’re Guilty Because We Say You Are” prototype act for the Home Office Minister to wield at anyone they choose, and it is far from clear if it has “Sovereign Judicial Limits” or not.

Thus Apple legal eagles may well be trying to mitigate it before the act even gets out of draft.

Which is really a bad idea, because that will just give the power-hungry idiots in Whitehall more reason to be draconian and embolden worse future action, much as we lambast the “idiot from marketing” Scott Morrison, Australian Prime Minister and overall “bad egg and worse”, who clearly works for News International and Rupert “the bare-faced liar” Murdoch.

Sloth August 11, 2021 11:13 AM

@Steve, Jails that capture the unauthorised user are never comparable to those that tie users’ hands. An OS that restricts what you can install is a jail for you, not for an intruder.

Sloth August 11, 2021 11:20 AM

@Steve Also, a powerless person need not be thought about, since in theory he is powerless. The thing is, power is always delegated, and we must see the line. If you delegate the power of your secure messaging etc. to Apple, see what you will get!

uh, Mike August 11, 2021 1:47 PM

Apple looked so good, to the security conscious, when they publicly turned down the FBI.
So few of us are privy to what’s really going on behind the scenes.
If the enemies of security manage to turn Apple (or if they already have), we’re all owned.

Clive Robinson August 11, 2021 3:56 PM

@ uh, Mike,

So few of us are privy to what’s really going on behind the scenes.

Whilst true, we do not need to know the specifics. We can make assumptions much as we do about creatures that come under threat in the wild.

Whilst it does boil down to “fight or flight”, there are a few semi-autonomous steps in between, based on relative size and energy expenditure. That is, your average grazing herbivore does not run in the presence of blood-sucking ticks; it tends to swish its tail instead. But the sight of a stalking big cat, or another carnivore from the size of a domestic dog upwards, will make it run.

Which brings us to,

If the enemies of security manage to turn Apple (or if they already have), we’re all owned.

It can definitely be said of “Scott from marketing”, the handle-of-incompetence the less-than-popular Prime Minister of Australia has already been christened with, over the moronic legislation on privacy he has already passed.

Or Blow-Job Johnson, the blond-quiffed right-wing thug and UK Prime Minister, who is bringing such legislation to the vote.

Fill your “already have” and “to turn” requirements via legislation, and Apple has only five logical options:

1, To ignore the legislation.
2, To change the legislation.
3, To fight the legislation.
4, To accept the legislation.
5, To move out of the jurisdiction.

The first is not a logical option unless the plan is to move to the last option.

The second option is only available before the legislation becomes law, after which it becomes the third, which carries a whole load of risk and cost, as politicians do not like to have their legislation, no matter how bad, “struck down” by courts, and they tend, in the case of BoJo in particular, to get very vindictive and throw the toys out of the pram.

Which, if Apple feels they are not going to get the support of the rest of the industry (and let’s be honest, Google has proven craven repeatedly), leaves the fourth option.

Thus the Apple strategy is probably a mixture of option 2 flowing into a compromise and an option 4 outcome, as they have not yet shown willingness to take option 5 anywhere.

Doing option 5 is actually not wise, not just for Apple but for consumers as well. Arguably, Apple’s position will make the legislation hard/expensive for the authorities under option 4. However, if they go with option 5, those that “fill the void” will almost certainly make option 4 very easy and low-cost for the authorities, which in turn will make it easy and low-cost for every cyber-crook and venture-capital con artist out there to develop device-based malware.

I looked into this a number of years ago and realised two things,

A, You are either secure or you are insecure, there is no middle ground.
B, No matter what back-doors or golden-keys or other nonsense legislation calls for, you can always overlay any prescribed by law insecure systems with a secure layer.

Thus I would prefer Apple take option 4 over option 5, as it is likely to significantly increase the authorities’ required resources over almost all other option 4 vendors.

Because the legislators are playing a game of FUD to get their draconian and tyrannical wishes in what are still considered democratic political systems.

That is, the authoritarians have to give ground, or the FUD of “think of the children” etc. becomes obviously false and their legislation gets voted down by fairly vehement public opinion working on more risk-averse representatives (see the history of UK RIPA).

Thus sufficient pressure can cause sufficient ground to be given, which gives more privacy-orientated designs more room to make life more expensive, resource-wise, for the Guard Labour and their ilk. That limits the number of individual citizens who can be “data raped” at any one point in time, causing a fairly real and measurable tipping point for authorities between “mass surveillance” and “targeted surveillance”.

“Collect it all” is, at the end of the day, “garbage for landfill” collection, unless you have or eventually get the resources to process it from “garbage” to “actionable” intelligence. The higher we can keep that resource cost, the more selective the Guard Labour has to be for its actionable intelligence, and that is to the average citizen’s benefit, despite the FUD the Guard Labour and their legislators would have you believe.

Whilst it does not change the average person / citizen from insecure to secure, it does make those cautiously practicing secure communications very much harder to find.

SpaceLifeForm August 11, 2021 3:57 PM

@ Weather

Consider:

  1. This is retro-cover, that it has been ongoing for some time.
  2. They are training an AI, and it is not really all about CSAM.

SpaceLifeForm August 11, 2021 4:43 PM

@ lurker

Fuzzy Hashing

Take this with a grain of salt. Mentioning md5 was not a great idea. But, it gives you the idea.

hxtps://towardsdatascience.com/black-box-attacks-on-perceptual-image-hashes-with-gans-cc1be11f277

Many applications assume these hashes are privacy-preserving, but these results above show that they can be reversed. Any service that claims security by storing sensitive image hashes is misleading its users and at potential risk for an attack like this.
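A toy perceptual hash makes both properties obvious: tolerance to small changes, and the structural leakage that enables attacks like the one above (a bare-bones aHash-style sketch, nothing like PhotoDNA or NeuralHash in detail):

```python
def ahash(pixels):
    """Threshold each pixel against the mean; tiny edits barely move the hash."""
    mean = sum(pixels) / len(pixels)
    return tuple(int(p > mean) for p in pixels)

def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

image = [10, 200, 30, 180, 90, 220, 40, 160, 70]  # flat grayscale "image"
tweaked = [p + 3 for p in image]                   # slight re-encoding noise

print(hamming(ahash(image), ahash(tweaked)))  # 0: still "matches"
# The flip side: the hash is a coarse thumbnail of the image's structure,
# which is exactly what makes reversal/approximation attacks plausible.
```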

any moose August 11, 2021 5:30 PM

@Shelley “Snowden is in no way a credible source for ANYTHING.”

You deserve a beer. A large, tasty beer.

@Dave “people thinking bad thoughts about President Ted Cruz”

Whatever. Try, people thinking contrary opinions about COVID-19 lockdowns or expressing anti-CRT sentiments.

I really like Apple’s scheme, but then again I have never traded in child porn, nor have I ever wanted to. And I have never used anyone’s cloud.

What makes people think they have a right to do anything illegal via smartphone? Your smartphone is your property (though not the software it employs), but the networks it uses certainly are not.

Tõnis August 11, 2021 6:01 PM

Apple claims this will never go beyond “csam.” So, some organization(s) supply the “csam” hashes and Apple adds them to the phones. What’s to say that the “csam” hash library isn’t (“accidentally” or otherwise) supplemented with hashes for other “unapproved” content e.g. crown virus “misinformation” (memes, pdf files, etc)? And as for the Messages policing, the executive has already asked text message service providers to intervene in people’s private texts, and it’s not even about “csam,” it’s crown virus “misinformation”!

Steve August 11, 2021 6:45 PM

@JPA: I thought the “comedian, satirist” qualifiers would have made the intent of the comment obvious and not to be taken literally.

@Sloth: Obviously.

Pro Tip: Never take anything I say seriously. Because I don’t.

- August 11, 2021 8:48 PM

@oldnumberseven:

“I am curious who owns the servers where one’s iCloud photos have always been stored?”

The only thing you really need to know legally on that score is,

‘Not You’.

The same applies to all “one’s iCloud photos” etc. from the microsecond they touched the server; all you legally need to know is,

‘You don’t own them any more’.

Ismar August 11, 2021 10:50 PM

Ok,
I am going to argue that, regardless of how concerning this is, we are focusing on the wrong aspect of this issue.
The problem is that we do need this type of back door, in order to control usage of a shared platform for facilitating all sorts of despicable behaviour. More interesting is that the access itself is not audited and can be used for other purposes (I am happy to be proven wrong, but chances are Apple does not have guards in place for this).
One of these other purposes can be planting incriminating material on one’s cloud and falsely accusing them of breaking the law.

blim August 12, 2021 2:14 AM

@any moose: “I really like Apple’s scheme, but then again I have never traded in child porn, nor have I ever wanted to.”

But you’re OK with complete strangers evaluating photos of people’s kids – without their knowledge – to determine whether or not they’re sexually gratifying, so what are the rest of us to believe?

“What makes people think they have a right to do anything illegal via smartphone?”

Illegal? Nobody posting here… oh, you mean “Illegal in e.g. Saudi Arabia,” like the way the Saudis infiltrated Twitter to chop dissidents into pieces?

Clive Robinson August 12, 2021 3:23 AM

@ Ismar,

The problem is that we do need this type of back door, in order to control usage of a shared platform for facilitating all sorts of despicable behaviour,

No, we don’t. That is the “Kool-Aid” from the “poisoned chalice” argument of idiots like ex-US Attorneys General and FBI directors, who very clearly have other, very much more frightening agendas.

That is, the level of harm they intend toward society is much, much greater than anything the FUD they raise to blindside you with has ever caused.

What you need to understand beyond all else is,

You cannot solve moral questions with technological arguments; it always fails, and usually fairly badly.

As I repeatedly point out,

1, Technology is agnostic to use.
2, Good or Bad is a human perspective that changes constantly.

Therefore it is not the technology you should be destroying; rather, you should be constraining the “Directing Mind” that uses technology as a tool.

You do not destroy all printing presses because a few people print words you “personally” dislike. It is a disproportionate harm to society.

The same applies to typewriters, computers, and printers: the actual benefit to society always outweighs the alleged harm that a few “self-entitled”, often mentally ill or deficient people insist they will enable.

The changeable “good or bad” viewpoint of society is a healthy sign of society evolving. Anyone insisting that this evolution should not happen is hurting not just some but all around them, as well as themselves. They are, at the end of the day, doomed to fail, often by their own destruction; this is what history teaches us over and over.

Thus the solution to “good or bad” is a question of morals that move with time and understanding. In the main the solution is the same: people have to understand the harm, its scope, and those who practice it. Then use this knowledge to stop the harm.

If people foolishly try to stop a harm by blocking a general technological means, all those committing those harms will do will be simply to move to a different technology, not stop their harms.

I could go on and describe how, every time, such controls as you insist are necessary actually are not, and worse, become the tools of tyrannical oppression.

I won’t because you should go away and learn what history teaches.

echo August 12, 2021 4:06 AM

@Clive

If people foolishly try to stop a harm by blocking a general technological means, all those committing those harms will do will be simply to move to a different technology, not stop their harms.

My countermeasures to Apple’s new scheme went walkies so I’m going to have to sit on things. I have, however, been doing some digging for entirely other reasons and discovered a few gotchas Apple may or may not have considered but which may impact other manufacturers. It’s going to need someone to step through the entire processing chain, but potentially this ends up with a single walled-off secret enclave, much like Intel’s CPU-within-a-CPU, only with extra functionality.

It’s going to impact every single SoC manufacturer. If the sale of ARM goes through to a company within the American legal jurisdiction, this is going to create a lot of problems for a lot of people.

Law enforcement and human rights aside, it is a known known among patent lawyers that it is wise to download the patent database and do offline searches, because you will be snooped on if you do an online search. Add metadata, and we’re talking not just millions or billions but trillions in GDP going walkies. The direction of entire industries and the wealth of nations can be impacted. It has geopolitical implications.

Myself I feel this discussion needs criminologists and sociologists and therapists and support workers and other expertise such as school teachers to have a say so the proper balance can be established and/or confirmed. It will also need political scientists and economists and anti-poverty workers to have their say.

Such a massive public policy change cannot be allowed through on the nod.

AL August 12, 2021 9:55 AM

@Tõnis

Apple claims this will never go beyond “csam.”

It is already beyond CSAM. There are two separate scanners. Only the scanner that monitors photos shared with iCloud Photos uses CSAM.

The iMessage scanner that intercepts conversations, according to Apple, doesn’t use CSAM hashes. Instead it uses AI to determine what is a sexually explicit photo. I see mission creep into scanning text and wiretapping with that scanner.

Tõnis August 12, 2021 4:19 PM

@AL, I, myself, have already made changes. As much as I love iCloud, and as much as I love iMessage, I’ve turned off photo upload to iCloud, downloaded Signal just in case I need more privacy, and have forgone iCloud backups. I’ll continue to use iMessage for day-to-day innocuous stuff because it’s so convenient, but I’ve turned off Messages in iCloud. I’ll make local, encrypted backups using iTunes and save my photos on the device or to my PC. I’m not deluded, thinking that my PC is safer, but I don’t want to support Apple on this; I want to take it away from Apple. If Apple were to reverse course and not go through with this travesty, I would be right back to the convenience of iCloud on all of it.

lurker August 12, 2021 7:53 PM

@Tõnis: …local, encrypted backups using iTunes…

Can iTunes do that without phoning home? Are you sure? One of the reasons I flicked iTunes was its insistent random demands for a live ‘net connection.

AL August 12, 2021 9:04 PM

@Tõnis
Well, according to Apple, these changes will roll out in iOS 15, but we’re on iOS 14 for now. So I think we’re good for August; no need to do anything in a rush.

Supposedly we can stay on iOS 14 when iOS 15 comes out, and they’ll ship security fixes for iOS 14. That’s the plan for now while I hear more.

Erdem Memisyazici August 13, 2021 12:55 AM

This is pretty much the failure of the liberty vs. security ideology, governed mostly by hysteria, historically known as the Four Horsemen of the Infocalypse. https://en.m.wikipedia.org/wiki/Four_Horsemen_of_the_Infocalypse

On one hand there is this sense of security, but really it’s the path paved toward a world free of crime, which is not possible today. A lot of our social structures are designed around inequalities, so the wise thing to do would be to keep liberty in mind when unleashing things which may be contrary to the spirit of the 4th Amendment. In a digital world where your personal information is stored not in ink and paper at home but in 1s and 0s on a handheld computer, one rather expects the same privacy from the device as from a table at home.

SpaceLifeForm August 13, 2021 2:44 PM

Hashing Fuzzy vs Fuzzy Hashing

Where the hashes can meet is an issue.

hxtps://www.twitter.com/living_syn/status/1426002352037896200

SpaceLifeForm August 13, 2021 3:55 PM

Fuzzy Hashing vs Hashing Fuzzy

hxtps://arstechnica.com/tech-policy/2021/08/apple-defends-iphone-photo-scanning-calls-it-an-advancement-in-privacy/2/

Federighi also said the CSAM database “is constructed through the intersection of images from multiple child safety organizations,” including the NCMEC, and that at least two “are in distinct jurisdictions.”

[So? Multiple jurisdictions does not help; it is a hindrance for someone trying to defend themselves]

“Such groups and an independent auditor will be able to verify that the database only consists of images provided by those entities, he said,” the Journal wrote.

[So? Why should someone that has been falsely accused trust this faux reassurance?]
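
Whatever one makes of the trust question, the mechanics of the claimed construction are simple to state. A minimal sketch, with entirely hypothetical organization names and hash values:

```python
# Sketch of Federighi's "intersection" claim: only image hashes flagged
# independently by at least two child-safety organizations in distinct
# jurisdictions get shipped. All values below are hypothetical.
ncmec_hashes = {0xAAAA, 0xBBBB, 0xCCCC}       # US clearinghouse
second_org_hashes = {0xBBBB, 0xCCCC, 0xDDDD}  # a second jurisdiction

shipped_database = ncmec_hashes & second_org_hashes
print(sorted(hex(h) for h in shipped_database))  # ['0xbbbb', '0xcccc']
# 0xAAAA never ships: an entry inserted unilaterally via one
# organization is excluded -- assuming the process is followed and the
# organizations are genuinely independent, which is exactly the doubt
# raised above.
```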

SpaceLifeForm August 13, 2021 4:54 PM

LOL. So, I googled (Federighi star wars).

Top link: hxtps://thetechtack.com/craig-federighi-clarifies-details-about-apples-child-safety-scanning/

UnicornsAreReal August 13, 2021 5:45 PM

Maybe Apple knows something. I mean, maybe something happened, like a group of abusers using iCloud and iPhones, and that crime became known to Apple, and then this was their response, many months later. Otherwise, it just seems an odd thing to come out of the blue like that.

SpaceLifeForm August 13, 2021 6:28 PM

@ UnicornsAreReal

Of course Apple knows. There is no doubt.

As to why this has become public at this time, well that is the interesting angle.

It’s a leak.

See Florida. Names redacted.

Anonymous August 13, 2021 9:00 PM

This tool alerts Apple when it detects images that match criteria that Apple controls. What’re the chances this isn’t really about CSAM, but industrial espionage and leak control (look for images of competitors’ prototype devices to pass to Apple’s spies, and images of unreleased Apple products to identify potential leakers)?

lurker August 13, 2021 9:38 PM

it’s a single (db) image across all countries – Federighi

Sorry, all countries don’t work like that…

SpaceLifeForm August 14, 2021 1:47 AM

Their lips are moving

hxtps://www.reuters.com/technology/after-criticism-apple-only-seek-abuse-images-flagged-multiple-nations-2021-08-13/

Asked why it had only announced that the U.S.-based National Center for Missing and Exploited Children would be a supplier of flagged image identifiers when at least one other clearinghouse would need to have separately flagged the same picture, an Apple executive said that the company had only finalized its deal with NCMEC.

Clive Robinson August 14, 2021 10:01 AM

@ lurker, ALL,

Sorry, all countries don’t work like that…

How very true…

But we do know that some elements of every country’s government want to work in a certain way, a way even authoritarian followers might not wish to be treated if it were explained in terms they understood applied as much to them as to everyone else…

@ SpaceLifeForm, ALL,

Their lips are moving

Yes, and you and I know they are lying from the bottom of their dark twisted little PR/marketing souls, to sell enslavement as though it were life in a gilded bird cage.

It’s not exactly difficult to show that everything they say is just words; the technology can do oh so much more, which tyrants, dictators and worse will see as a golden opportunity.

What should not be lost on people is that “it runs on your phone”. You can almost guarantee that in the first court this comes up in, the judge will nod through the prosecutor’s argument that it was not an illegal search because “you invited it onto your phone”, or something similar.

I would say this is going to be a PR disaster when it happens, but by then it will be too late, not just for the customers but for the suppliers and designers as well.

They might as well hang their heads in shame now, because the one thing you can guarantee is that this little “political nicety” is going to get people killed, a lot of them.
We already know that from what the House of Saud was caught doing; they very assuredly are not alone in having people killed, just dumb enough to get caught doing so.

And this “political nicety” will certainly hurt more innocent people over time than it will ever save from criminals…

Let’s be honest: if you were the sort of criminal this Cop-Tag system is supposed to catch, you’ve been given plenty of warning to “clean up” and “clear out”, and that is what you will do. There are hundreds of other ways to communicate these illegal images, way too many for them all to be monitored; in fact, way too many for 99.9% of them to be monitored.

Thus all such systems are going to fail against the real criminals, and the majority of those caught will be idiot teenagers doing the idiot-teenager things they have always done in one way or another, and which most will grow out of very fast.

But the LEOs get quick, easy convictions, which leads to promotion, better pay, etc., so they will give this a thumbs up.

The fact that it will create “paper criminals”, who will then be persecuted into becoming other types of criminal, is just more bonus points for them, and more taxes flowing into the pockets of private-prison lobbyists and shareholders. That in turn gives further stump-bashing nonsense to political tub-thumpers looking for re-election for being not just useless but actual enemies of society.

Mark well the “idiot words” of certain Texas politicians; they tell you clearly what they really think of the voters. And it’s not just them, it’s most other politicians, who are perhaps just that little bit smarter and don’t tell the real truth on record…

SpaceLifeForm August 14, 2021 4:28 PM

Fuzzyhash

hxtps://www.twitter.com/SarahJamieLewis/status/1426611447325368320

hxtps://git.openprivacy.ca/sarah/fuzzyhash

fuzzyhash – A toy (s-t)Detectable Hash Function

This package contains a toy implementation of an (s-t)Detectable Hash Function as described in The Apple PSI System by Abhishek Bhowmick, Dan Boneh, Steve Myers, Kunal Talwar, and Karl Tarbe.

WARNING: This is a toy implementation. Do not use this as anything other than a toy.
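
Alongside the detectable-hash machinery, the paper pairs matching with a threshold scheme: each positive match releases one share of a per-account secret, and the server can decrypt the safety vouchers only once it holds enough shares (Apple has said roughly 30). The sketch below illustrates only that threshold property, using ordinary Shamir secret sharing rather than the paper’s actual (s-t)DHF construction; the parameters are illustrative.

```python
# Toy threshold property via Shamir secret sharing (Python 3.8+ for
# three-argument pow). Not the Apple PSI (s-t)DHF -- just the idea that
# below s shares the secret is unrecoverable.
import random

P = 2**61 - 1  # prime field, big enough for a demo

def make_shares(secret, s, n):
    """Split `secret` into n shares such that any s reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(s - 1)]
    def f(x):                       # evaluate the degree-(s-1) polynomial
        y = 0
        for c in reversed(coeffs):  # Horner's rule
            y = (y * x + c) % P
        return y
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=123456789, s=30, n=1000)
assert reconstruct(shares[:30]) == 123456789  # 30 matches: recovered
# With 29 shares the interpolation yields an unrelated value (with
# overwhelming probability), so below the threshold nothing is learned.
assert reconstruct(shares[:29]) != 123456789
```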

SpaceLifeForm August 14, 2021 5:33 PM

Schrödinger’s Apple

Will it land on the ground, or go into orbit?

hxtps://gizmodo.com/apple-will-keep-clarifying-this-csam-mess-until-morale-1847484296/amp

chuqli August 15, 2021 8:40 AM

And what about OSX-based devices – are we OK with those not being scanned?

How about Windows or Linux desktops with iCloud browser logins?

Of course not. There is no logical difference between end-point platforms in terms of the unacceptability of this content. Laptop and desktop users can be just as bad as mobile users.

Apple knows this too!

In my opinion, that is why a cloud-based approach is best.

If they roll this out to OSX then there is a whole gamut of other technical security problems to fix – on a general-purpose computer with a file system and the possibility of loading untrusted applications.

I don’t even think it can be rolled out to Linux or Windows.

- August 15, 2021 3:01 PM

@chuqli:

“In my opinion, that is why a cloud-based approach is best.”

Which shows you do not understand the fundamental rule of security about ‘data ownership’, nor much else with regard to the rest of security, such as ‘a back door is just a door for anyone to use, and they will’; there is no such thing as NOBUS or other such nonsense.

Just get away from the “think of the children” FUD nonsense that is stopping you thinking logically, and realise what is actually proposed in the much broader sense.

You should then realise that the best solution, because it does the least harm not just to innocent individuals but to the environment as well, is “do not scan at all”.

The entire idea is really very stupid from before day one, and based on a totally false premise: that Apple will retain control… They can’t and they won’t. It will take a judge less than five minutes to take it away from them, and then it’s ‘privacy gone for good’ for all users. The whole idea is something you would think Apple would understand it cannot do by now…

If you ‘put a mechanism in place’, such as file scanning, a lawyer will tell you, or should tell you, ‘you will be told legally what to use that mechanism for’. No amount of ‘it cannot be used for that’ will be accepted by a court. That’s the way it works, because a prosecutor will take but seconds to tell the judge ‘all files are equal, just a bag of bits’, and you will not be able to argue against it, because it is true.

I just wish people would stop falling for the FUD of ‘Think of the children’ etc and actually think.

Chris Drake August 15, 2021 7:26 PM

“notifies the parent” …
“None of the communications … are available to Apple.”

How exactly does any iPhone “notify” anyone in any way that preserves the “None of the communications … are available to Apple.” claim ?

“None” is a very strong word. Reveal even the fact that a communication took place at all, and the word “None” can no longer apply…

When you tug at the thread of their lie, everything else they said falls apart.

anon August 15, 2021 7:34 PM

Now that known-pedophile photos all self-destruct, all new ones are going to have to be taken, right?

How is any of this tech “protecting” anyone? The net-outcome here will be MORE exploitation!

lurker August 15, 2021 7:50 PM

@Chris Drake

How exactly does any iPhone “notify” anyone in any way that preserves the “None of the communications … are available to Apple.” claim ?

Was Apple “notified” amongst the recipients of the Airdropped gun photo? https://www.schneier.com/blog/archives/2021/07/airdropped-gun-photo-causes-terrorist-scare.html

Apple’s claims of respecting user privacy while at the same time being able to control what happens on the device are becoming so confusing that I’m glad I gave up on iOS/macOS.

.. August 17, 2021 6:24 AM

Wow, massive extrapolation and non sequitur…

@chuqli:

“In my opinion, that is why a cloud-based approach is best.”

“Which shows you do not understand the fundamental rule of security about ‘data ownership’, nor much else with regard to the rest of security, such as ‘a back door is just a door for anyone to use, and they will’; there is no such thing as NOBUS or other such nonsense…”

(remainder of rant cut for brevity)

- August 17, 2021 8:26 AM

@..:

“Wow. massive extrapolation and non-sequitur…”

Really?

Have you anything meaningful to say, that is also germane to the subject being discussed?

John Harris August 19, 2021 3:12 PM

@Etienne

Would you mind elaborating on your assertion below or else pointing me to a resource that would enable me to better (a) understand it and (b) secure my privacy in light of it?

All smartphone encryption is a fraud, and the same concept as putting your money in a system that uses more electricity than Cuba to perform the accounting.


Thank you.

Dee August 31, 2021 7:29 AM

But what if the device is run by a child yet registered as an adult? Like a 12-year-old holding an iPhone without it being set up under a family account?!
