Judging Facebook's Privacy Shift

Facebook is making a new and stronger commitment to privacy. Last month, the company hired three of its most vociferous critics and installed them in senior technical positions. And on Wednesday, Mark Zuckerberg wrote that the company will pivot to focus on private conversations over the public sharing that has long defined the platform, even while conceding that “frankly we don’t currently have a strong reputation for building privacy protective services.”

There is ample reason to question Zuckerberg’s pronouncement: The company has made—and broken—many privacy promises over the years. And if you read his 3,000-word post carefully, Zuckerberg says nothing about changing Facebook’s surveillance capitalism business model. All the post discusses is making private chats more central to the company, which seems to be a play for increased market dominance and to counter the Chinese company WeChat.

In security and privacy, the devil is always in the details—and Zuckerberg’s post provides none. But we’ll take him at his word and try to fill in some of the details here. What follows is a list of changes we should expect if Facebook is serious about changing its business model and improving user privacy.

How Facebook treats people on its platform

Increased transparency over advertiser and app access to user data. Today, Facebook users can download and view much of the data the company has about them. This is important, but it doesn’t go far enough. The company could be more transparent about what data it shares with advertisers and others, and about how it allows advertisers to select the users they show ads to. Facebook could use its substantial skills in usability testing to help people understand the mechanisms advertisers use to target them with ads, or the reasoning behind what it chooses to show in user timelines. It could deliver on promises in this area.

Better—and more usable—privacy options. Facebook users have limited control over how their data is shared with other Facebook users and almost no control over how it is shared with Facebook’s advertisers, which are the company’s real customers. Moreover, the controls are buried deep behind complex and confusing menu options. To be fair, some of this is because privacy is complex, and it’s hard to understand the results of different options. But much of this is deliberate; Facebook doesn’t want its users to make their data private from other users.

The company could give people better control over how—and whether—their data is used, shared, and sold. For example, it could allow users to turn off individually targeted news and advertising. By this, we don’t mean simply making those advertisements invisible; we mean turning off the data flows into those tailoring systems. Finally, since most users stick to the default options when it comes to configuring their apps, a changing Facebook could tilt those defaults toward more privacy, requiring less tailoring most of the time.

More user protection from stalking. “Facebook stalking” is often thought of as “stalking light,” or “harmless.” But stalkers are rarely harmless. Facebook should acknowledge this class of misuse and work with experts to build tools that protect all of its users, especially its most vulnerable ones. Such tools should guide normal people away from creepiness and give victims power and flexibility to enlist aid from sources ranging from advocates to police.

Fully ending real-name enforcement. Facebook’s real-names policy, requiring people to use their actual legal names on the platform, hurts people such as activists, victims of intimate partner violence, police officers whose work makes them targets, and anyone with a public persona who wishes to have control over how they identify to the public. There are many ways Facebook can improve on this, from ending enforcement to allowing verified pseudonyms for everyone, not just celebrities like Lady Gaga. Doing so would mark a clear shift.

How Facebook runs its platform

Increased transparency of Facebook’s business practices. One of the hard things about evaluating Facebook is the effort needed to get good information about its business practices. When violations are exposed by the media, as they regularly are, we are all surprised at the different ways Facebook violates user privacy. Most recently, the company used phone numbers provided for two-factor authentication for advertising and networking purposes. Facebook needs to be both explicit and detailed about how and when it shares user data. In fact, a move from discussing “sharing” to discussing “transfers,” “access to raw information,” and “access to derived information” would be a visible improvement.

Increased transparency regarding censorship rules. Facebook makes choices about what content is acceptable on its site. Those choices are controversial, made by thousands of low-paid workers rapidly applying unclear rules. These are tremendously hard problems without clear solutions. Even obvious rules like banning hateful words run into challenges when people try to legitimately discuss certain important topics. Whatever Facebook does in this regard, the company needs to be more transparent about its processes. It should allow regulators and the public to audit the company’s practices. Moreover, Facebook should share any innovative engineering solutions with the world, much as it currently shares its data center engineering.

Better security for collected user data. There have been numerous examples of attackers targeting cloud service platforms to gain access to user data. Facebook has a large and skilled product security team that says some of the right things. That team needs to be involved in the design trade-offs for features and not just review the near-final designs for flaws. Shutting down a feature based on internal security analysis would be a clear message.

Better data security so Facebook sees less. Facebook eavesdrops on almost every aspect of its users’ lives. On the other hand, WhatsApp—purchased by Facebook in 2014—provides users with end-to-end encrypted messaging. While Facebook knows who is messaging whom and how often, Facebook has no way of learning the contents of those messages. Recently, Facebook announced plans to combine WhatsApp, Facebook Messenger, and Instagram, extending WhatsApp’s security to the consolidated system. Changing course here would be a dramatic and negative signal.

Collecting less data from outside of Facebook. Facebook doesn’t just collect data about you when you’re on the platform. Because its “like” button is on so many other pages, the company can collect data about you when you’re not on Facebook. It even collects what it calls “shadow profiles”—data about you even if you’re not a Facebook user. This data is combined with other surveillance data the company buys, including health and financial data. Collecting and saving less of this data would be a strong indicator of a new direction for the company.

Better use of Facebook data to prevent violence. There is a trade-off between Facebook seeing less and Facebook doing more to prevent hateful and inflammatory speech. Dozens of people have been killed by mob violence because of fake news spread on WhatsApp. If Facebook were doing a convincing job of controlling fake news without end-to-end encryption, then we would expect to hear how it could use patterns in metadata to handle encrypted fake news.

How Facebook manages for privacy

Create a team measured on privacy and trust. Where companies spend their money tells you what matters to them. Facebook has a large and important growth team, but what team, if any, is responsible for privacy, not as a matter of compliance or rule-pushing, but as a matter of engineering? Transparency in how it is staffed relative to other teams would be telling.

Hire a senior executive responsible for trust. Facebook’s current team has been focused on growth and revenue. Its former chief security officer, Alex Stamos, was not replaced when he left in 2018, which may indicate that having an advocate for security on the leadership team led to debate and disagreement. Retaining a voice for security and privacy issues at the executive level, before those issues affected users, was a good thing. Now that responsibility is diffuse: it’s unclear how Facebook measures and assesses its own progress, and who might be held accountable for failings. Facebook can begin the process of fixing this by designating a senior executive who is responsible for trust.

Engage with regulators. Much of Facebook’s posturing seems to be an attempt to forestall regulation. Facebook sends lobbyists to Washington and other capitals, and until recently the company sent support staff to politicians’ offices. It has run secret lobbying campaigns against privacy laws. And Facebook has repeatedly violated a 2011 Federal Trade Commission consent order regarding user privacy. Regulating big technical projects is not easy. Most of the people who understand how these systems work understand them because they build them. Societies will regulate Facebook, and the quality of that regulation requires real education of legislators and their staffs. While businesses often want to avoid regulation, any focus on privacy will require strong government oversight. If Facebook is serious about privacy being a real interest, it will accept both government regulation and community input.

User privacy is traditionally against Facebook’s core business interests. Advertising is its business model, and targeted ads sell better and more profitably—and that requires users to engage with the platform as much as possible. Increased pressure on Facebook to manage propaganda and hate speech could easily lead to more surveillance. But there is pressure in the other direction as well, as users equate privacy with increased control over how they present themselves on the platform.

We don’t expect Facebook to abandon its advertising business model, relent in its push for monopolistic dominance, or fundamentally alter its social networking platforms. But the company can give users important privacy protections and controls without abandoning surveillance capitalism. While some of these changes will reduce profits in the short term, we hope Facebook’s leadership realizes that they are in the best long-term interest of the company.

Facebook talks about community and bringing people together. These are admirable goals, and there’s plenty of value (and profit) in having a sustainable platform for connecting people. But as long as the most important measure of success is short-term profit, doing things that help strengthen communities will fall by the wayside. Surveillance, which allows individually targeted advertising, will be prioritized over user privacy. Outrage, which drives engagement, will be prioritized over feelings of belonging. And corporate secrecy, which allows Facebook to evade both regulators and its users, will be prioritized over societal oversight. If Facebook now truly believes that these latter options are critical to its long-term success as a company, we welcome the changes that are forthcoming.

This essay was co-authored with Adam Shostack, and originally appeared on Medium OneZero. We wrote a similar essay in 2002 about judging Microsoft’s then newfound commitment to security.

Posted on March 13, 2019 at 6:51 AM • 29 Comments


Winter March 13, 2019 6:59 AM

I ask myself how much of this “pivot” (in the press) is also due to the looming GDPR sword dangling by a silk thread over the company’s head.

Fines of up to 2% of global turnover for lesser violations (4% for the most serious ones) in the biggest market in the world might be a strong incentive to at least be seen to be doing something about privacy.
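Those percentages correspond to the GDPR’s two fine tiers (Article 83: up to 2% of worldwide annual turnover or €10M, whichever is greater, for lesser infringements; up to 4% or €20M for the most serious). A rough sketch of the arithmetic, using Facebook’s reported 2018 revenue of roughly $55.8B purely as an illustrative stand-in for turnover:

```python
def max_gdpr_fine(annual_turnover: float, serious: bool) -> float:
    """Upper bound of a GDPR administrative fine for a given turnover.

    Lower tier: up to 2% of worldwide annual turnover (or EUR 10M,
    whichever is greater); upper tier: up to 4% (or EUR 20M).
    """
    if serious:
        return max(0.04 * annual_turnover, 20_000_000)
    return max(0.02 * annual_turnover, 10_000_000)

# Facebook's reported 2018 revenue, used as an approximation of turnover:
revenue = 55_800_000_000

print(round(max_gdpr_fine(revenue, serious=False) / 1e9, 2))  # ~1.12 billion
print(round(max_gdpr_fine(revenue, serious=True) / 1e9, 2))   # ~2.23 billion
```

The fixed floors matter for small firms, but for a company at Facebook’s scale the percentage caps dominate, which is the commenter’s point about incentives.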

Old Bull Lee March 13, 2019 8:42 AM

Facebook will welcome regulation. They’ll influence it in their favor, but more importantly, they’ll make sure it creates a hefty barrier to any competition.

Winter March 13, 2019 9:00 AM

@Old Bull Lee
“Facebook will welcome regulation. They’ll influence it in their favor, but more importantly, they’ll make sure it creates a hefty barrier to any competition.”

It seems you want to say that regulation does not matter in this case: Facebook will not change with or without regulation.

Historically, this has not been what has happened in, say, car and workplace safety.

1&1~=Umm March 13, 2019 9:04 AM


“The company has made — and broken — many privacy promises over the years.”

“But we’ll take him at his word…”

Fool me once, shame on you, fool me twice shame on me.

Does a cheater change its spots?


“Zuckerberg says nothing about changing Facebook’s surveillance capitalism business model.”

Or for that matter, do a wild cheater’s feeding habits change?

No, it sees something move and gives chase, catches, kills and, if not attacked itself, consumes; however, lay fresh food at its feet and it will starve. Only if caged from birth do its feeding habits change.

So the question arises,

‘Is Zuckerberg smarter than a cheater?’

I’m guessing most have the shortest of answers for that. He and those who have worked with or for him for so long have proved themselves to be untrustworthy, not just once but many times, most recently by selling people’s phone numbers, provided for 2FA, to advertisers and to databases that then sell them on to lowlifes who run ‘credit cons’ and behave worse than many stalkers.

Let’s be honest, the selling of the phone numbers was entirely predictable. What you give Zuckerberg and his ilk they will monetize in some way, and they will try their best to give you no choice.

There was a film back in the early 1980s called ‘WarGames’* with a young Matthew Broderick and Ally Sheedy. The main protagonist, however, was WOPR, the military war computer in charge of the US nuclear deterrent. WOPR effectively went rogue and was trying to start global Armageddon with a first-strike attack. However, after learning the notion of ‘no win’ from playing Tic-Tac-Toe against itself, then trying the same with the war games, it concludes that it is “a strange game” in which “the only winning move is not to play.”

The thing is, that film was fiction, and the thing about fiction is that it has to have more than a degree of fact in it to be believable. Life, however, is not like fiction: it does not have to be believable before the event, and for many not after it either, hence ‘disaster/survivor shock’, where people cannot come to terms with breaches of trust. Most here should have learned this from the Ed Snowden revelations. So if a government can flout the law and routinely do what shocks people to the core, all the while lying to keep the true nature of the breach of trust hidden, why should one or more corporations be any different?

That is why it is better ‘to be wise before the event’ rather than after. And in the case of social media and other PII-gathering opportunities, the sensible thing is,

“the only winning move is not to play.”

If you give them no trust then they cannot breach it. You don’t have to be paranoid, just develop a healthy scepticism and be distrusting. After all, what do we aim to teach children with “Stranger Danger”? To be cautious in order to survive and thrive, not to be paranoid and in fear.

  * Various people claim that the film strongly influenced then-President Reagan into issuing the first computer security EO.


Even at the time it was seen in the press as a new threat.


The early ’80s were also an interesting time politically, with parallels to today. Back then the new Russian premier was the ‘Butcher of Budapest’, ex-KGB leader Yuri Andropov, who was totally convinced, due to his own past behaviour, that the West was going to ‘first strike’. Unfortunately the behaviour of Reagan and Thatcher virtually pushed him over the edge; he certainly managed to make the KGB virtually impotent. Eerily, nearly four decades later we are seeing not too dissimilar events starting to unfold: an ex-KGB premier seeing torrents of attacks from the West, but this time deciding to build up a new nuclear arsenal of tactical rather than strategic nuclear weapons. Probably in preparation not for a Western first strike but for one from the East, with China not being party to the intermediate-range nuclear forces treaty, which the current US President has decided is probably pointless now that there is a third player who has not signed…

Humdee March 13, 2019 9:33 AM

The underlying problem for FB is that the advertising ecosystem has moved to quality-based assessments. So anything that genuinely protects user privacy also undermines data quality, reducing profitability. So one of two things is true: either FB is lying, again, or it is getting something in return for reduced profits, such as a reduction in organizational threat.

James March 13, 2019 10:08 AM

Facebook? Privacy? Does anyone really believe this BS? If you use Facebook you have no expectation of privacy whatsoever.

Sed Contra March 13, 2019 12:53 PM


  1. What does FB do with the information it harvests/surveils from advertisers? What are they doing about it?

  2. FB is really an app, like a glorified dating app. Ggggl deals in more fundamental technology. Which will consume or morph into the other? The franchise wars [1] are coming.

  3. FB seems simply an invitation to infantilism. Is there a Nobel prize category for research leading to the discovery of a compelling reason for it?

  4. https://m.youtube.com/watch?v=xFiDoOgRTpk

Wisdom March 13, 2019 1:44 PM

Jonathon writes, “This looks like the usual corporate strategy of co-opting your critics. PoliSci101”

My thought as well. So let us focus on what this means for the field of public interest technology. It suggests that, as far as FB and other big companies are concerned, the field of public interest technology will merely serve as the minor leagues of corporate technology. If any of those public interest technologists show the slightest bit of verve or above-average intelligence, FB will remove the thorn in its side by dangling large sums of money in their faces. FB can afford to cherry-pick the good ones, leaving the field of public interest technology to the second-rate and the gutless, with the rare odd fellow who can’t be bought trotted out to meet the press when the swamp gets smelly.

If I were in Zuckerberg’s shoes I’d be head over heels in love with what Bruce is doing, because he has much more to gain than to lose from public interest technology.

Locked Out March 13, 2019 1:54 PM

Facebook is in fact so secure that a hacker logged into my Instagram account, DELETED it, and there is no official mechanism for restoring it. I have been looking for help for months.

Michael Heindrichs March 13, 2019 1:56 PM

the company hired three of its most vociferous critics
and installed them in senior technical positions

In fact, the company bought the critics and thereby silenced them.

gordo March 13, 2019 9:09 PM

Speaking of even more “ample evidence”, this also says a lot about how the data-sharing ecosystem, of which Facebook is just one player, operates. In my view, given that the FTC’s consent order against Facebook was public knowledge, no one’s hands are clean on this one. Maybe this will finally get everyone’s attention:

Facebook’s Data Deals Are Under Criminal Investigation
By Michael LaForgia, Matthew Rosenberg and Gabriel J.X. Dance | New York Times | March 13, 2019

Federal prosecutors are conducting a criminal investigation into data deals Facebook struck with some of the world’s largest technology companies, intensifying scrutiny of the social media giant’s business practices as it seeks to rebound from a year of scandal and setbacks.

A grand jury in New York has subpoenaed records from at least two prominent makers of smartphones and other devices, according to two people who were familiar with the requests and who insisted on anonymity to discuss confidential legal matters. Both companies had entered into partnerships with Facebook, gaining broad access to the personal information of hundreds of millions of its users.

The companies were among more than 150, including Amazon, Apple, Microsoft and Sony, that had cut sharing deals with the world’s dominant social media platform. The agreements, previously reported in The New York Times, let the companies see users’ friends, contact information and other data, sometimes without consent. Facebook has phased out most of the partnerships over the past two years.




Kevin March 13, 2019 10:21 PM

At the core of this is the difference between what the users want and FBs business model.

I’m pretty sure I could write an app to match Facebook. What if I did – designed it for the USER, with no advertising and full privacy, but charged everyone $5/yr to use it? I’m sure that would be a viable business for me, and affordable by everyone – but do you think people would pay?

Users are spoilt and expect everything for free.

Matt March 13, 2019 10:33 PM

All of this is basically irrelevant unless FB changes their business model. The saying remains true: If you’re not paying for it, you’re the product. As long as Facebook is “free” to use, they will always, by necessity, do everything they can to acquire as much information about their users as possible, so that they can sell it.

It doesn’t matter what they say or even what they do: as long as their business model is selling info about you to advertisers (and selling ad space), their true motive will be unchanged.

Gerard van Vooren March 14, 2019 12:54 AM

@ 1&1~=Umm,

To think about it: WOPR was one of the first AI. That would have been a serious achievement, if it wasn’t in the movies.

@ Bruce Schneier,

This was one rather large wish list. Isn’t it time to create some laws?

lurker March 14, 2019 3:31 AM

I watched the birth of Facebook: it started as a frat prank, and it’s still a frat prank. It caused Mr Z’s early exit from his alma mater. So how come there are so many “followers”? P. T. Barnum, or somebody like him, said “There’s one born every minute”, or was it “You can never underestimate the intelligence of an audience”? Of course the higher they fly the harder they fall – I see reports trickling out of a database meltdown, an inability to spin up spare machines fast enough …

)required( March 14, 2019 4:24 AM

“Facebook ? Privacy ? Does anyone really believe this BS ? If you use Facebook you have no expectation of privacy whatsoever.”


“As long as Facebook is “free” to use, they will always, by necessity, do everything they can to acquire as much information about their users as possible, so that they can sell it.”

A thieving company that preys on victims will not stop doing so just because it can also charge a premium to the dumbest of its victims; at least, not in my experience.

Ergo Sum March 14, 2019 5:42 AM


All of this is basically irrelevant unless FB changes their business model. The saying remains true: If you’re not paying for it, you’re the product.

That may have been true a few years ago, but nowadays even paid software does the same. From apps to OSs, all of them call home for updates and to upload “telemetry data”.

Not to mention all of the cloud based applications and free data storage from the like of Apple, Google, Microsoft, the usual suspects. And yes, the usual suspects do cooperate.

Take Apple, for example, where privacy is a human right according to Tim Cook. He promises that Apple does not collect your information, and it does not; the collection is effectively outsourced to Google, which pays a reported $3B per year to be the default search engine for Safari, plus an unknown amount per year to be the search engine for Siri.

Then there’s Windows “telemetry”, which uploads all kinds of data to the MS cloud, some of it voluntary, some involuntary. Add to this mix the replacement of MS’s own browser engine with the Chrome browser engine, plus the support for Linux commands/tools, and pretty soon Windows 10 will be called Winix, a closed-source Linux OS…

As someone mentioned earlier, this is surveillance capitalism and there’s no escape from it…

1&1~=Umm March 14, 2019 11:46 AM

@Ergo Sum:

“From apps to OSs, all of them call home for updates and to upload “telemetry data”.”

Which is why it should be made illegal to do so, especially when what they upload generally has nothing whatsoever to do with them.

Look at it this way: if you were to invade the privacy of one of these big-data corps, how do you think they would react?

Well, we don’t need to ask, do we, because we’ve just had an example of it, one which has resulted in divorce proceedings and a degree of blowback involving the US President…

Speaking of politicos we know after a number of events that they don’t like it very much either.

So perhaps we should be asking how much of a hold has this Silicon Valley nonsense got over both the politicos and the corporation owners?

I suspect our host @Bruce is correct that the lead on this will probably come from Europe. So hopefully the GDPR will be just the little toe in the water compared to what should come next.

albert March 14, 2019 12:05 PM

It’s one thing to make your own bed and lie in it, but it’s quite another to lie in a bed that someone else has made. That’s what FB, GG, TT, etc. are doing to their hapless victims*, I mean, members. By subscribing to those services, the members turn them from fads into fads perceived as necessities. They are not. Smart people, rich people, public figures, etc. use proxies to handle social media; the proxies assume responsibility for content and are able to edit out stupid or legally dangerous content.

What surprises me (a little bit) is companies that use FB as a company website. Is there anyone out there who has more than two brain cells connected together?

The batteries in my Diogenian lamp are running low, how will I replace them?

*how I do miss overstrike
. .. . .. — ….

albert March 14, 2019 12:14 PM

This FB playacting reminds me of Microsoft’s Corporate Ethics Award from years back.

Beware of Asocial Media websites run by feckless frat boys.

“…If you’re not paying for it, you’re the product…”


It doesn’t matter if you pay for it or not, you’re -still- the product.

I leave you with this rhetorical question: “How can you tell if a CEO is lying?”

. .. . .. — ….

1&1~=Umm March 14, 2019 12:51 PM


“How can you tell if a CEO is lying?”

Because not only are their lips moving they are also trying to look honest…

As Elon Musk found out to his $20 million cost, being transparent is just the thing bureaucrats love to punish, as they regard it as a worse crime than total dishonesty 😉

albert March 14, 2019 2:21 PM


It’s SBP, Standard Business Practice, AKA Smart Business.
The ethical, moral or legal need not apply.

I was an arrogant MF since high school. It was only after I mellowed with age that I began to fail in business…

Being smart and arrogant is the key to business success.

. .. . .. — ….

Anon Y. Mouse March 14, 2019 3:06 PM


The batteries in my Diogenian lamp are running low, how will I replace them?

There is a vending machine in the maze of twisty passageways, all different, but you will need the coins.

Tatütata March 15, 2019 9:41 AM

Regarding censorship:

The above link to The Verge only mentions US-based serfs.

The French-German channel arte aired in 2018 an 85-minute WDR (Cologne) documentary, titled in English “The Cleaners”, on the social media “content moderation” industry that has developed in low-wage countries such as the Philippines, where the conditions are arguably much worse than in Phoenix. (The sight of the workers commuting to their graveyard shifts reminded me of another documentary, from 2005, on Indian call centers; its title, “John & Jane”, is a reference to the generic Western names the attendants are required to use in their work.)

There’s a short CBC interview with the filmmakers on Ewetoob.

“Social media” was mentioned in most reports on today’s load of obscenity…

albert March 16, 2019 3:32 PM

“…Does a cheater change its spots?…”

Not without a good stylist.

@Anon Y. Mouse,
“…There is a vending machine in the maze of twisty passageways, all different, but you will need the coins….”

And the coins are crypto-currency, no doubt.

. .. . .. — ….

