Facebook Plans on Backdooring WhatsApp

This article points out that Facebook’s planned content moderation scheme will result in an encryption backdoor into WhatsApp:

In Facebook’s vision, the actual end-to-end encryption client itself such as WhatsApp will include embedded content moderation and blacklist filtering algorithms. These algorithms will be continually updated from a central cloud service, but will run locally on the user’s device, scanning each cleartext message before it is sent and each encrypted message after it is decrypted.

The company even noted that when it detects violations it will need to quietly stream a copy of the formerly encrypted content back to its central servers to analyze further, even if the user objects, acting as true wiretapping service.

Facebook’s model entirely bypasses the encryption debate by globalizing the current practice of compromising devices by building those encryption bypasses directly into the communications clients themselves and deploying what amounts to machine-based wiretaps to billions of users at once.
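To make the described mechanism concrete, here is a minimal sketch, in Python, of the kind of client-side hook the article describes. It is purely illustrative: the function names, the hashed blocklist, and the URLs are hypothetical placeholders, not anything Facebook has published.

```python
# Illustrative sketch only: hypothetical names and URLs, not Facebook code.
# Assumes the third-party 'requests' package.
import hashlib
import requests

BLOCKLIST_URL = "https://moderation.example/blocklist"  # hypothetical endpoint
REPORT_URL = "https://moderation.example/report"        # hypothetical endpoint


def fetch_blocklist() -> set:
    """Pull the latest hashed blocklist from the central cloud service."""
    resp = requests.get(BLOCKLIST_URL, timeout=10)
    resp.raise_for_status()
    return set(resp.json()["hashes"])


def flagged(plaintext: str, blocklist: set) -> bool:
    """Local scan, run before encryption (or after decryption)."""
    digest = hashlib.sha256(plaintext.encode("utf-8")).hexdigest()
    return digest in blocklist


def send(plaintext: str, encrypt, transmit, blocklist: set) -> None:
    if flagged(plaintext, blocklist):
        # This is the step critics object to: the plaintext leaves the device
        # even though the conversation is nominally end-to-end encrypted.
        requests.post(REPORT_URL, json={"content": plaintext}, timeout=10)
    transmit(encrypt(plaintext))

# A client would refresh the blocklist periodically and call send() for each
# outgoing message, e.g. send("hello", my_encrypt, my_transmit, fetch_blocklist()).
```

The structural point is that the scan sees the plaintext before the end-to-end encryption ever happens, so the strength of the encryption is irrelevant to what the filter, and whoever updates it, can see.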

Once this is in place, it’s easy for the government to demand that Facebook add another filter—one that searches for communications that they care about—and alert them when it gets triggered.

Of course alternatives like Signal will exist for those who don’t want to be subject to Facebook’s content moderation, but what happens when this filtering technology is built into operating systems?

The problem is that if Facebook’s model succeeds, it will only be a matter of time before device manufacturers and mobile operating system developers embed similar tools directly into devices themselves, making them impossible to escape. Embedding content scanning tools directly into phones would make it possible to scan all apps, including ones like Signal, effectively ending the era of encrypted communications.

I don’t think this will happen—why does AT&T care about content moderation?—but it is something to watch.

EDITED TO ADD (8/2): This story is wrong. Read my correction.

Posted on August 1, 2019 at 6:51 AM • 74 Comments

Comments

Uoyathe August 1, 2019 7:00 AM

Security in WhatsApp????!!!! It is a hoax. Security in WhatsApp is a lie. A big lie to fool people into using it more.

Winter August 1, 2019 7:31 AM

I do not see how content moderation makes sense to Facebook in private conversations, because WhatsApp is mainly private communication.

Pickle Rick August 1, 2019 7:51 AM

@Winter – the main events driving this ‘feature’ were, at least partially, events such as the recent lynch mobs in India and other fly-blown countries with very poor and uneducated people, which form after people share fake crime stories that go viral via WhatsApp, and then a lynch mob forms and people die because of those (otherwise very obviously, to educated people) fake stories…

Facebook was ostensibly trying to do something about that, so that their app wasn’t responsible for enabling innocent people to be killed by angry mobs. Their fix is to remove basically all privacy and security from every single user on their platform.

Would we expect anything less from Facebook?

Egen von Greyez August 1, 2019 8:16 AM

I don’t understand why anyone still trusts Facebook and would use WhatsApp if they’re literally telling you that they’re putting a backdoor in it.

Deleting WhatsApp and installing Signal (or another) takes less than 5 minutes. There just isn’t any excuse anymore. Do it. Do it now. Right now.

Noah Vail August 1, 2019 8:30 AM

AT&T doesn’t care about content moderation. It deeply cares about firehosing personal data to the US IC, to which it’s been bone grafted for decades.

Winter August 1, 2019 8:33 AM

“Facebook was ostensibly trying to do something about that, so that their app wasn’t responsible for enabling innocent people to be killed by angry mobs.”

They already blocked that by limiting the number of recipients of a message. The content moderation will be much less effective, if it becomes effective at all.

Dio August 1, 2019 9:07 AM

Next come jailbroken phones and custom apps designed to feed false information to the listeners; then jailbroken phones are considered “items of suspicion” for arrest and larger monitoring; and eventually everyone gets a giant smart TV installed for free and the government uses it to both monitor all activity and deliver targeted messages.

Janet August 1, 2019 9:38 AM

@Dio

Haven’t TVs already done a lot of that for decades? I’m not talking tin-hat 1984 kind, I’m talking: ads make us think we need certain products that we otherwise wouldn’t need, news directly influences how we think based on what’s reported and how it’s slanted, movies give us the values and character that the film makers want us to have, and it’s all done while the decision-making and critical-thinking parts of our brains are shut off and we “immerse” ourselves in the story, etc… (oh, and those parts of our brains atrophy if we leave them shut off for large portions of our lives). You think most of what I’ve said is delusional? Watch the face of someone who’s watching TV and decide for yourself what’s going on… Watch as the blank dull stare never changes, even as violent murders and rapes are played…

Wouldn’t it be even more sinister if we got all this and we paid for it ourselves because we wanted it, instead of getting it for free?

parabarbarian August 1, 2019 10:06 AM

Do not worry, citizen. We will only use it to frustrate those evil right wingers and other enemies of the collective.

Trust us for we are the Brave New World.

Seriously, this is not surprising. At some point a message has to be in plaintext for people to read it and that is obviously the best place to intercept it. Law enforcement and other spies have been doing it for years on a limited basis. Facebook is just doing it wholesale, which sucks, but I doubt more than 1% of current users will abandon them. Probably much less.

Randy August 1, 2019 10:38 AM

Sure, AT&T does not care about moderation. But they care about revenue and isn’t advertising a great way to generate some? Give them enough time and they’ll be using those backdoors to insert ads on the fly.

Gunter Königsmann August 1, 2019 10:55 AM

I always believed WhatsApp included a mechanism that allows the server to select an end-to-end encryption key for the user. That could be used as a government backdoor if a government, besides the metadata and all the images, voice messages, and videos (which on Android seem to be stored unencrypted), also wants to read the message contents.
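(A toy sketch of that concern, using generic X25519 key agreement from the Python cryptography package rather than WhatsApp’s actual protocol: if the server gets to choose or substitute the public keys the two parties see, it can derive both shared secrets and read everything in the middle.)

```python
# Toy key-substitution sketch; generic X25519, not WhatsApp's real protocol.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()
server = X25519PrivateKey.generate()

# Honest exchange: Alice and Bob derive the same secret from each other's keys.
assert alice.exchange(bob.public_key()) == bob.exchange(alice.public_key())

# Backdoored exchange: the server hands each party its *own* public key
# instead of the peer's real one.
alice_secret = alice.exchange(server.public_key())
bob_secret = bob.exchange(server.public_key())

# The server can reconstruct both secrets, so it can decrypt, read, and
# re-encrypt every message passing between Alice and Bob.
assert server.exchange(alice.public_key()) == alice_secret
assert server.exchange(bob.public_key()) == bob_secret
```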

Bong-Smoking Primitive Monkey-Brained Spook August 1, 2019 10:59 AM

@Randy,

they’ll be using those backdoors to insert ads on the fly.

Great idea! You’d be on a call with your friend, and in the middle of the call you’d hear a voice advertisement: “Ryan, you should be looking at buying the fishing poles from Acme! Based on your conversation, your skill isn’t inhibiting catching fish! It’s the pole, dawg! Say “yes” to be redirected to Acme (and put your friend on hold) or say “no” to continue with your pathetic rant” 🙂

Chris August 1, 2019 12:03 PM

@Janet: I have not yet achieved the requisite levels of paranoia and cynicism to believe that this is necessarily about “oppression,” and I don’t attribute to malice that which is adequately explained by some other human shortcoming. In this case, that shortcoming is pusillanimity on Facebook’s part. Someone at the company thinks it’s bad for the brand to have WhatsApp involved in a terrorist plot, so they made a business decision that’s good for them but bad for nearly everyone else. What else is new with Facebook? Why is this even remarkable? As for AT&T, as another poster pointed out, they rolled over years ago when the government came calling.

If this comes to pass, the worst outcome — at least in the West — will be that it will lead to stupid criminals, which will eventually and inevitably lead to stupid law enforcement. If encryption is outlawed (or weakened), only outlaws will use strong encryption, and the police will become less and less capable of and interested in preventing their crimes, because it will be easier to thwart many more lesser capers on WhatsApp.

Petre Peter August 1, 2019 12:47 PM

Forget about the OS level; maybe spying is already happening at the hardware level. Why else would it be called Intel?

Clive Robinson August 1, 2019 12:55 PM

@ Bruce, All,

but what happens when this filtering technology is built into operating systems?

It’s called an “end run attack” and there is a known solution to the problem, which I’ve been saying since before WhatsApp, Signal, and all the other “known to be insecure by design” messaging apps were developed by the likes of Moxie Marlinspike et al.

Moving the surveillance to the plaintext side of the security end point, when it is possible, has been the obvious attack since before the Internet. In fact, for such a long time that designers of secure equipment during and after WWII have ensured that it is not possible for an attacker to get at the plaintext side. It’s a big chunk of what TEMPEST and EmSec are all about.

The solution is,

    Move the security end points, such that getting to the plaintext side is not possible for an attacker.

It’s not exactly difficult to do, as I’ve mentioned more than a few times here even before smartphone apps became popular (see using two PCs to maintain not just an “air gap” but an “energy gap”).
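As a minimal sketch of that idea, assuming a pre-shared key and the Python cryptography package (the workflow and names are illustrative, not a finished design): the key and the plaintext exist only on an offline machine, and only ciphertext is ever carried, by hand, to the networked phone, so a compromised messaging client or OS has no plaintext to scan.

```python
# Illustrative only: security end point on an offline machine; the
# communications end point (the phone) never sees plaintext or keys.
from cryptography.fernet import Fernet

# Generated once on the offline machine and shared with the other party
# out of band (e.g. in person); it never touches a networked device.
key = Fernet.generate_key()


def encrypt_offline(plaintext: str) -> bytes:
    """Run only on the offline machine; output is safe to paste into any app."""
    return Fernet(key).encrypt(plaintext.encode("utf-8"))


def decrypt_offline(ciphertext: bytes) -> str:
    """Run only on the offline machine, on ciphertext copied off the phone."""
    return Fernet(key).decrypt(ciphertext).decode("utf-8")


# The phone only ever carries this opaque token (typed in, pasted, or scanned
# from a QR code), so scanning the messaging client gains nothing.
token = encrypt_offline("meet at noon")
assert decrypt_offline(token) == "meet at noon"
```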

So not doing it is really, really stupid, but then that describes the apps, the app designers, and in most cases the users.

Why the users? Well, even when you tell them the problem and a solution, laziness or convenience wins and they just carry on behaving insecurely, on the obviously incorrect assumption that nobody is interested in them…

@ ALL

The real question you should ask yourself is: if Facebook does do this and they get in first, which is likely… what’s stopping politicos from legislating that all apps have to use Facebook’s filter?

Thus, irrespective of your desires, you will all be forced to use Facebook, even for tax returns, banking, insurance, medical applications, etc., etc…

Remember, on the Internet it’s almost always “first to market owns the market.”

So consider that, along with Facebook’s digital currency, they will end up owning world finance and the world censorship filter. Just how much power does that put in Mark Zuckerberg’s hands?

Gregory Magarshak August 1, 2019 1:17 PM

That’s the point.

We have to TRUST WhatsApp, Telegram, Signal, et al because there are no good OPEN SOURCE alternatives. Otherwise we could just run our own.

We have to TRUST the current PKI and DNS because the alternatives are still immature.

But EVEN IF we develop robust and mature alternatives, and wide adoption, we have to TRUST the Device, OS and Browser makers.

The Trusted Computing Base is made by a handful of companies. Open source in hardware is still a nascent field.

It’s interesting to ask whether one can ever be truly sure that some part or chip hasn’t been interdicted. Apple tries to check components as they arrive against its hardware designs.

But hardware will always be the weak link. Keyloggers. Cameras watching your fingers from the ceiling.

The future of surveillance is in this kind of stuff. We will ALMOST have secure communications, but not really. The only thing you can be sure of is sending quantum entangled particles from airgapped rooms.

CallMeLateForSupper August 1, 2019 1:19 PM

“[…] Facebook’s planned content moderation scheme will result in an encryption backdoor into WhatsApp”

Cool. One more reason to stay far away from both Facebook and those verdammt cell phones.

Ten years ago I did what I had known for twenty-five years I ought to do: stop polluting the air and enriching the DMV and oil companies. I sent my 24-year-old car to the scrap yard and didn’t replace it(1). Seeing no compelling reason to join Facebook, I never did(2). And having no reason to be reachable 24/7/52, I’ve never owned a cell phone(3).

And yet I lived a healthy, happy, productive life all those years and to the present. I don’t feel relative deprivation; I wonder why the f**k millions of people willingly enslave themselves to things that inevitably, repeatedly bite them in the tender parts. I’m just sitting here watching the wheels go ’round and ’round.

(1) I bicycle, year ’round.

(2) Would you believe: I have never laid eyes on a Facebook page? Well, I haven’t.

(3) Funny story… Back in the oughts, my sister was driving us back to her house from an appt. in her local Big City. She asked me to dig her cell phone out of her purse and ring hubby to tell him we were in traffic and she would be late starting dinner. An E.E. with 40 years’ experience, I’m no stranger to many kinds of electronic devices, but I was a virgin cell phonist … I had to ask sis for step-by-step instructions.

lurker August 1, 2019 1:54 PM

@ CallMeLateForSupper

I have never laid eyes on a Facebook page?

Ah, but maybe FB has laid eyes on a lot of pages you watched while you were watching. That little blue f tells Mr.Z you were there…

Gabriel Bauman August 1, 2019 2:15 PM

Hey Bruce,

The kind of backdoor WhatsApp is apparently considering would be trivial to implement across all apps if Google or Apple were ordered to do so.

If ANY app displays text in a UI control on screen, it could easily be snagged pre-encryption or post-decryption by a malicious OS-level accessibility service pushed by Google or Apple on the government’s orders. Or the Play or Apple Store could wrap apps in a malicious payload that adds a backdoor, then re-sign the executable and ship it only to targeted Google/Apple users.

Touch keyboards are apps with internet access. Everything we type on our phones should be considered immediately compromised by the keyboard developer.

And let’s not even mention the cellular baseband, which is a separate OS running next to iOS and Android that can be updated at any time by carriers. In many cases it has full access to RAM and storage.

End-to-end encryption by apps is already meaningless. We are fully compromised. Our devices are malicious hosts for malicious apps, and we have already accepted it.

zoobab August 1, 2019 2:34 PM

Just move the problem to another device that cannot be backdoored, like a simple Arduino with a simple Bluetooth stack that does the encryption/decryption of messages.

Clive Robinson August 1, 2019 3:18 PM

@ Benjamin,

Have you seen BSD running on a Microchip $1 CPU?

Well, it was done over five years ago, which kind of says just how far ordinary 32-bit microcontrollers can be pushed. OK, no windows, just the command line, but four serial devices before considering “bit banging.”

There’s even an OS-less version of Python for running on microcontrollers.

But of more importance are the lightweight OSs, such as RTOSes for microcontrollers, many of which are available as fairly easy-to-understand C source code.

Such RTOSes are unlikely to get backdoored for a couple of reasons: firstly, because the code base is too small to easily hide a backdoor; secondly, because when run effectively as “air-gapped” systems they don’t have the hardware support to give backdoors that would not be very obvious to an embedded systems designer.

Clive Robinson August 1, 2019 3:25 PM

@ All,

Those looking for a more appropriate security model to work with should go and have a look at “Tin Foil Chat.” It’s still up on GitHub, and just reading the documents alone should give you good ideas:

https://github.com/maqp/tfc/blob/master/README.md

It was designed and built by Markus Ottela when at the University of Helsinki, Finland.

It was talked about quite a bit on this blog as it was being developed.

Importantly, the design moves the security end points off of the communications end points, thus getting rid of the majority of OS-based and end run attacks.

Friso August 1, 2019 3:45 PM

I’m sorry to say your source is based on overdrawn conclusions from a speculative article.

The Forbes article you link to (F1) goes to another Forbes article (F2), which links to the developer talk.

F2 is a speculative article based on the Facebook talk, which one can figure out by its second paragraph:

I have long suggested that the encryption debate would not be ended by forced vulnerabilities in the underlying communications plumbing but rather by monitoring on the client side and that the catalyst would be not governmental demands but rather the needs of companies themselves to continue their targeted advertisements, harvest training data for deep learning and combat terroristic speech and other misuse of their platforms.

Facebook suggests that it wants to use Edge AI for automated content moderation. One of the challenges they name is that they don’t know whether the algorithms work, which requires that they send violating content to their servers. They name this as a privacy challenge.

F2 also makes the inference that this could be used to bypass E2E encryption if they do send flagged content to Facebook servers. F2 suggests that encrypted messaging may become a target of these same algorithms, although Facebook never stated this. Instead they used the vague “our platform,” so it’s not an entirely strange conclusion to draw.

F1 then declares the death of encryption at the hands of Facebook, magnifying the speculations of F2 into conclusions. We find the link to F2 in this piece of text:

Facebook announced earlier this year preliminary results from its efforts to move a global mass surveillance infrastructure directly onto users’ devices where it can bypass the protections of end-to-end encryption.

On the same site, it went from speculative to conclusive.

Whether this method of content moderation and Facebook’s implementation is good can be debated, but I would not use this article as a basis for that. It’s misinformation that gets clicks by playing on our confirmation bias.

mrfox August 1, 2019 4:20 PM

@ Clive Robinson,

Signal and all the other “known to be insecure by design” messaging apps

Care to elaborate?

@ Gregory Magarshak,

We have to TRUST […] Signal […] because there are no good OPEN SOURCE alternatives.

Signal is Free and Open Source: https://github.com/signalapp/.

I am in no way affiliated with Signal (other than using it personally), but I’m surprised by the bashing it’s getting here.

Was Venger the Dungeon Master? August 1, 2019 4:51 PM

It’s too bad all that negative publicity concerning the “B” word got all out of hand. For example, back when the fibbonacci sequence was getting a lot of *.FLA (C) & (K) for simply attempting to block and capture a murderer, the wierding ways of the “B” word were demonstrated.

In fact, the “B” word was the wrong word. It should have been the “F” word instead!

B = “backdoor”
F = “firmware update”

It’s amazing how much word choice (diction = “D” word) can get people acting silly!

Nevertheless, all this publicity about faceblank.com and faceplant.com really still doesn’t help any of us really needing to block the likes of tw*tter.com

Not to get too involved in the 2024 or 2028 election cycles, but just how many candidates past and present own stock in tw*tter.com anyhow??????

https://i.postimg.cc/KjvrXdbC/PNG.png
http://4.bp.blogspot.com/-DrOlgnTYZdM/UHH3nWp_OAI/AAAAAAAAADQ/qhTecPlSYhI/s1600/DungeonMasterVenger.png

May the f*rce be with you.

America Jones August 1, 2019 5:15 PM

Maybe there are no technical solutions to the problems of electronic surveillance. Expecting Congress — who normalized these practices after 9/11 — to step in is lunacy.

Recall that the labor movement, the women’s movement, and the civil rights movement all made dramatic gains without “social media” — don’t try to organize on a wiretap…

Nsaid August 1, 2019 5:31 PM

@mrfox

Signal has had several flaws by design, and was and still is not transparent about the actions it takes on user systems. In a few words, way too much data gets sent around to call it secure. A great PR guy, but nowhere near security-minded, that Moxie.

gordo August 1, 2019 5:37 PM

@ Clive Robinson,

“end run attack”

For the corporates that is an answer to this question:

How can we keep our surveillance business models intact and keep governments “off our backs?”

The slow drumbeat of acquiescence continues apace.

Venger was NOT the Dungeon Master August 1, 2019 5:52 PM

Question: So what’s all this got to do with “5G” (networks)(?) being mandated upon us?

I state this question, because so many digital decisions are publicized as if the speed or bandwidth benefits will be increased. I pretty much doubt that, since the bloatware makers have a long tradition of filling in every last gap of breathable headspace with their own pre[…]:

  • prefetch
  • fetch
  • superfetch

etcetera, etcetera

Answer to a different question:
Q: https://i.postimg.cc/7LpJBWPM/collective-soul.png

A: Collectivism (again?)

aside: “It’s nice to know, exactly where I’m going” –Curve

mrfox August 1, 2019 6:51 PM

@ Nsaid,

No software or company is flawless, including Signal. But throwing around accusations of it being broken by design (and insinuating the flaws are there for malicious reasons?) requires a bit more proof. If there is an analysis of its security that highlights the problems you’re talking about, I’d love to see it.

FWIW, it comes recommended by at least somewhat trustworthy sources (EFF, Snowden, the NSA calling it a pain in their arse (in so many words))…

David August 1, 2019 8:11 PM

“Just move the problem to another device that cannot be backdoored. Like a simple arduino with a simple bluetooth stack”
Someone who has clearly never used the Bluetooth API; it is a bloated and muddled protocol that is almost impossible to keep secure.

Cocotoni August 2, 2019 1:02 AM

If content moderation gets baked into the phone OS, the logical next step is for the phone to start getting fingerprint and mugshot databases from the police and comparing them against the face/fingerprint the owner uses to unlock the phone. Then, in case of a match, it starts siphoning all the data, starting with location, to law enforcement. The phone vendors can still sugarcoat it by saying the fingerprint/face is hashed by the algorithm and never leaves the phone. Sounds scary?

carrots August 2, 2019 2:05 AM

@Daniel

To quote Schneier, the amount of rat feces in your breakfast cereal is not zero because zero is too expensive.

Same goes for child porn. Zero child porn in secure communications platforms is too expensive for a healthy democracy, because it would mean no secure communication.

So the politically incorrect and sad truth is: nothing, at least in a preventative sense.

Alejandro August 2, 2019 3:59 AM

Seems FB still operates on the “move fast, break things rule”, despite claims of rehabilitation.

Not that long ago, after the Congressional hearings and getting fined $5 billion for privacy violations, etc., they claimed to have improved privacy and espoused a desire to become more…”transparent”.

Then this, which will without doubt break encryption once and for all. If it works, there is little doubt every two-bit app in the world will follow with a similar backdoor.

What’s to stop them?

I think FB has a very special relationship with the US government, such that they are protected from any real accountability for furthering mass surveillance, and indeed may have had technical assistance from various government agencies in creating their privacy-breaking routines.

Regarding the fine: have they actually paid one dime of it, or was that yet another side show for us rubes?

Footnote: Several of my family members are addicted to FB. They’ve got to have it. I am embarrassed to say, I get dragged along. When I explain to them why they shouldn’t do this, they look me straight in the eye and tell me I am “just paranoid.”

Maybe so.

But, I still resist.

Clive Robinson August 2, 2019 6:53 AM

@ MrFox,

Care to elaborate?

I’ve actually said it a number of times before on this blog and else where as well as in my first comment on this page,

    The security end point is before/on the communications end point, thus the plaintext interface is easily available to an attacker.

It’s a fundamental design flaw that has been known for a very long time. It’s not just WhatsApp or Signal that suffer from this; it’s true of all the major so-called “secure messaging apps.”

It’s the first attack type any SigInt agency looks for. After all why bother breaking crypto algorithms, key update mechanisms etc when you can just get directly to the plaintext, especially in “real time” as is the case with mobile device apps.

End run attacks come in many forms, but essentially it’s one of the things you “design out” at the earliest stages of secure system design, even before you doodle ideas on the back of a napkin.

Oh, and it’s three times worse on mobile phone devices, because by legislation you are not allowed to play with the “on air interface,” so that is controlled by your SIM provider. Likewise, most smart device manufacturers don’t allow you real control of the hardware or OS, so they have their telemetry running, which, if you think back to the CarrierIQ scandal, means they can spy on your device any which way they choose. Then there is the OS supplier, like Google, with its telemetry on your plaintext as well.

The way to fix the problem is by the TEMPEST/EmSec fundamental of “segregation,” which first started back in WWI over a century ago. It came about from observing failings in even older systems using just humans with pencil and paper, telegraphs, or horses. That is, you simply take the security end point off of the communications device and move it beyond the influence of anything an attacker can do to the communications system. The greater the gap between the communications end point and the security end point, the easier it is to ensure there is no influence.

Clive Robinson August 2, 2019 7:07 AM

@ gordo,

The slow drumbeat of acquiescence continues apace.

Yup and 99.99..% of people don’t realise they are in the frog boiling experiment.

Pasi Patama August 2, 2019 10:49 AM

For just these reasons, we’ve been working on a #CleanHardware-based mobile secure computing unit. Removing trust-based security from software, hardware, and the supply chain is tough, but it can be done. Our approach was that we need to compile every bit of source, have no binary blobs, and protect RAM to the maximum.

You just cannot have security on consumer platforms, for various reasons.

Shay August 2, 2019 11:13 AM

I might have missed something, but I have a problem following the source for this – that Forbes opinion column links to another opinion column by the same author, which links to a presentation which I’m not sure actually says what the author implies. What am I missing?

Alejandro August 2, 2019 1:16 PM

Back in April, two key executives resigned from FB: Chris Cox and Chris Daniels. The resignations came after Zuckerberg declared his plan for the future of the company.

Z. has altered his vision for the organization and is now planning to transform it into a messaging company built on a foundation of strong encryption and to merge its three apps into one: WhatsApp, Facebook, and Instagram.

???

I would suggest Cox and Daniels did not share Z’s enthusiasm for the changes.

Now we find out there is a plan to completely backdoor FB encryption too. The end result, of course, is that FB and friends, such as the US govt., will have complete access to user content while leaving the impression (delusion?) that user content is secured by unbreakable encryption.

https://us.blastingnews.com/tech/2019/03/two-high-profile-facebook-executives-resign-as-the-company-goes-into-restructuring-mode-002871851.html

Seems like the end of private encrypted messaging is near. Maybe here already when it comes to FB.

I wonder if the other big players will piggyback on Zuck’s vision (likely) or refuse to cave in?

Michael August 2, 2019 1:20 PM

@Shay The Forbes opinion piece was a complete fabrication. It had no source, and was written by a person who was fired from the University of Illinois for fabricating research.

Facebook is an awful company with no respect for the privacy of their users, but there are plenty of real cases that we can write about that demonstrate that point.

It’s a bit disappointing to see Schneier spreading misinformation like this, but the silver lining is that doing so prompted Cathcart (WhatsApp product VP) to publicly state that they will not be implementing client-side scanning.

Steve Weis August 2, 2019 2:51 PM

The source Forbes article is not based on fact, but rather speculation by the author. I strongly encourage Bruce Schneier and his readers to actually watch the source video from Facebook.

There is no mention of WhatsApp. There is no mention of end-to-end encryption. There is no mention of sending raw plaintext back to Facebook.

What this video is talking about is moving content filtering — which already happens today for Facebook posts — from the server to a client device. The end result is a privacy win for users.

Imagine you accidentally posted a private image to Facebook. Today, that will go to Facebook, get flagged, and then likely be saved on their side. With client-side filtering, it would be detected before it ever leaves your device.

sofot August 2, 2019 3:03 PM

@Pickle Rick:
the recent lynch mobs in India and other fly-blown countries
with very poor and uneducated people, which form after people
share fake crime stories that go viral via WhatsApp, and then
a lynch mob forms and people die because of those (otherwise
very obviously, to educated people) fake stories…

That’s a very interesting issue: people with almost no education being given access to relatively advanced communication tools without moderation.

I once noticed that some people from rural areas of Ukraine – a poor and unstable post-Soviet state – similarly (and surprisingly, at least for me, because almost every Ukrainian graduated from high school at least) tend to believe crazy news spreading on the Internet, including via messengers like WhatsApp or Viber (quite popular in countries of the former Soviet Union).

@Janet
There are open source operating systems too. There’s free access to it all, worldwide.

Yes, and this is why I’m even more grateful to Richard Stallman. He was unfortunately right in his warnings against proprietary software.

Alejandro August 2, 2019 3:56 PM

Certainly, FB moderates a ton of content at the home office via AI and staff review.

Certainly, FB is talking about end-to-end encryption.

Certainly, FB is talking about merging FB, WhatsApp and Instagram.

Certainly, FB is in business to make money off user content.

The problem becomes: how does one moderate the end-to-end encrypted content of those three major apps and still make billions?

The backdoor described seems plausible if not ideal, …for FB and the police.

But, for the sake of argument, I ask: can content be moderated on the device without plaintext ever leaving for the mother ship?

I don’t think so, but it would be nice if FB could explain that all to us.

Are we supposed to believe there is a client app to replace cloud-based AI moderation and thousands of employees?

That’s a big leap of faith, in my opinion.

Especially considering it’s FB, again.

Will Cathcart August 2, 2019 4:12 PM

Professor Schneier,

The team here wants you and everyone who reads the blog to hear from us: we haven’t added a backdoor to WhatsApp. We aren’t planning such client side scanning for messaging services and I’d oppose any effort to do so. One of the key reasons we would not pursue this approach is because of the reasons you raised in your blog about how governments could seek to manipulate this practice.

End-to-end encryption is one of the most important security tools we have today to help keep people’s conversations private and secure. We’ve defended this technology within our app and in courts around the world and will continue to do so.

We’re not asking anyone to just “trust us” on this point. The security community has rightly never relied on words alone. We value the work that cryptographers and security researchers do all the time to evaluate the safety and integrity of apps like ours. That work makes everyone safer. A client-side scan would be immediately detectable, not just by experts but by users who would notice specific messages getting dropped or blocked.

Thank you for raising your voice on this – and the chance for us to address this important issue.

Will Cathcart

Clive Robinson August 2, 2019 6:30 PM

@ Alejandro,

How would FB/WhatsApp/Instagram moderate E2E encrypted content?

Without access to the plaintext, they cannot. So if they are moderating it, they would have to be either:

1. Breaking the crypto, or
2. Doing an end run around the crypto.

Those are realistically the only two options.

If you look through @Will Cathcart’s statement, this paragraph stands out:

    End-to-end encryption is one of the most important security tools we have today to help keep people’s conversations private and secure. We’ve defended this technology within our app and in courts around the world and will continue to do so.

So I’m guessing option two of an end run attack is the remaining option, if such plaintext access was required.

It will be interesting, as an uninvolved party, to see how the game plays out, because I’ve been predicting behaviour like this as the next logical step for quite a while. Thus I’ve been giving a couple of methods to mitigate, if not eliminate, the issue.

Josh August 3, 2019 1:35 AM

From the article:
“In Facebook’s vision, the actual end-to-end encryption client itself such as WhatsApp will include embedded content moderation and blacklist filtering algorithms. These algorithms will be continually updated from a central cloud service, but will run locally on the user’s device, scanning each cleartext message before it is sent and each encrypted message after it is decrypted. ”

If this is a “proposed solution,” then it is pretty clear the problem they ran into was one of processing power.

They just don’t have enough bells and whistles to handle all the decryptions/demodulations, or didn’t want to pay for it. A “secure” E2E messaging system relies on each client to process the demodulation of complex communication algos instead of burdening the central servers (which is a prerequisite to the “invisible third party” proposal).

This is akin to mass surveilling the populace and handing them the bill for it.

To anyone who is going to say it won’t work because people can fool it with “modified” clients: that is unlikely to happen, given FB and WhatsApp’s user base.

How many people do you know who run a “modified” WhatsApp?

TRX August 3, 2019 8:47 PM

Of course alternatives like Signal

Um.

A few months ago some people tried to get me to install the Signal app on my phone. I went to the Signal web site, read about how it was “open source” etc., and looked for the source code. There is no link on the Signal web site to the source code. I looked for a download link. The only “official” means of getting the app is to create a Google account, install “Google Play”, and download it from Google’s app store. My LineageOS phone has never been connected to a Google account, does not have the Google Play app, and will never have the Google Play app, which is a flaming security hole.

There’s a project called “Signal” on GitHub, and it points back to the Signal web site, but it’s strictly a one-way thing. It could be anybody; there’s no mention of it anywhere on the Signal web site.

Even if the Signal people are straight-arrow, that’s two major security problems – available only from the Play Store, and no pointer to the promised source. If it was a “pin the tail on the donkey” game, yeah, I could understand, but not for anything that promotes itself as security-oriented.

They failed so hard there was no point in going any further. That kind of carelessness or disinterest in basic security right off the top was too much to wave away.

Ted August 4, 2019 4:41 AM

@TRX,

Signal has many issues; among them, I believe they should allow federation of private servers to communicate with their official servers. This would make it a true open-source, crowd-run service, which Tor claims to be.

On the security front, I believe Signal is using a combination of openness and security by obscurity. On one hand, they claim all source code is “out there,” but on the other hand they run proprietary servers.

In a nutshell, it did not pass the “sniff test”.

Rover August 4, 2019 8:23 AM

Hello,

In relation to preserving users’ privacy: E2EE provides content security, but what about the metadata that WhatsApp has collected? I would like to hear more discussion of this area, which is a determining factor in improving privacy and security.

If you compare and analyze WhatsApp (source code not available) with Signal (source code available), the standout difference is that WhatsApp collects and retains metadata (what it collects and retains is not precisely published, and it may even be kept in perpetuity), while Signal has been demonstrated in a court case to hold almost none of significance (only the last time the user contacted the Signal server).

While the WhatsApp VP mentions WhatsApp’s strong effort to ward off government demands for data, if WhatsApp followed what Signal does (retaining no metadata), it would save WhatsApp lots of legal fees and pacify a lot of people (me included) who distrust WhatsApp.

Rover

Thomas August 6, 2019 5:21 AM

@Irritated,

This line particularly stands out.

“SIGNAL DOES NOT WARRANT THAT ANY INFORMATION PROVIDED BY US IS ACCURATE, COMPLETE, OR USEFUL, THAT OUR SERVICES WILL BE OPERATIONAL, ERROR-FREE, SECURE, OR SAFE”

Thomas August 6, 2019 5:26 AM

@Rover,

If you scroll down the “sesame” page you’d get to these lines:
“There is a server which stores the current record of all users and devices.”
“The server temporarily stores the messages that devices send to each other, until the messages are fetched.”

Thus we can see it is incorrect to assume Signal does not store any messages, because without storing any messages such a messaging service cannot function correctly. The word “temporarily” can be very loosely interpreted.

Irritated August 6, 2019 1:49 PM

@Thomas

Whoop-de-doo.

You ever read the GPL license of pretty much any libre software?

One section is usually inevitably there:

” This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.”

Thomas August 7, 2019 12:18 AM

@Irritated,

Correct. The “white papers” on its website read more like a bunch of mumbo-jumbo buzzwords strung loosely together. Furthermore, the site’s “terms of service” explicitly exempt its writers from any misinformation. However, it should not be categorized as a “libre” service, because the service instance is a closed loop, as federation of servers is strictly prohibited by the operators.

Irritated August 7, 2019 3:15 PM

Hmm, more like restricted, rather than prohibited?

I don’t think OWS is stopping anyone from running their own servers.

They’re just not going to allow them to connect to theirs, for “reasons”. I don’t recall Moxie’s reasoning on it off-hand. But one day I did wade through multiple pages of forum threads.

If all the users you’re interacting with use non-official Signal builds connecting to your own servers, I don’t really see the problem.

Then again, it could be easier and safer to just use Riot / Matrix / Tox / your preferred software.

But what do I know? I’m just a layman.

Dom August 8, 2019 1:41 AM

@Irritated,

“They’re just not going to allow them to connect to theirs, for “reasons”. I don’t recall Moxie’s reasoning on it off-hand. But one day I did wade through multiple pages of forum threads.”

This is called “federation,” by my understanding, which the aforementioned poster wrote about, for reasons nefarious or not. This is often cited as a criticism of Signal’s proclaimed “openness,” because in reality it isn’t open.

Patriot August 8, 2019 10:37 AM

Facebook–what bothers me is the constant lying.

They destroy communities. They are in it to win it: get the money and apologize later.

I hope that others involved in information security, computer science, cryptography, etc., will be active in limiting the collection activities of this vampire squid called Facebook.

It might just be time to get together and face them. We need to come up with a list of effective actions that will limit their collection.

David August 10, 2019 4:32 AM

@Patriot,

What I believe Facebook is doing, in contrast to Google, is not claiming to participate in “content moderation” of political speech. Google is notorious for doing so because of its openly pro-liberal stance. However, Facebook is keen to keep itself above and out of the political clashes of laymen, and it does not seek to influence or shape them the way Google does.

This may draw the ire of certain special-interest parties, but it serves Facebook well as a platform through which people can freely associate with others. Totalitarian entities have been known to force their participants to behave in certain ways, which may seem effective on the surface, but the suppression of expression does not change people’s core beliefs; it only makes the reinforcement of personal viewpoints harder.

In the upcoming election year it will be interesting to see how these parties try to influence the results while hiding behind the veil of fighting “fake news.”

