Lessons From the Facebook Riots

By Bruce Schneier
Wired News
September 21, 2006

Earlier this month, the popular social networking site Facebook learned a hard lesson in privacy. It introduced a new feature called "News Feeds" that shows an aggregation of everything members do on the site: friends added or deleted, a change in relationship status, a new favorite song, a new interest. Instead of a member's friends having to go to his page to view any changes, these changes are all presented to them automatically.

The outrage was enormous. One group, Students Against Facebook News Feeds, amassed over 700,000 members. Members planned to protest at the company's headquarters. Facebook's founder was completely stunned, and the company scrambled to add some privacy options.

Welcome to the complicated and confusing world of privacy in the information age. Facebook didn't think there would be any problem; all it did was take available data and aggregate it in a novel way for what it perceived was its customers' benefit. Facebook members instinctively understood that making this information easier to display was an enormous difference, and that privacy is more about control than about secrecy.

But on the other hand, Facebook members are just fooling themselves if they think they can control information they give to third parties.

Privacy used to be about secrecy. Someone defending himself in court against the charge of revealing someone else's personal information could use as a defense the fact that it was not secret. But clearly, privacy is more complicated than that. Just because you tell your insurance company something doesn't mean you don't feel violated when that information is sold to a data broker. Just because you tell your friend a secret doesn't mean you're happy when he tells others. Same with your employer, your bank or any company you do business with.

But as the Facebook example illustrates, privacy is much more complex. It's about who you choose to disclose information to, how, and for what purpose. And the key word there is "choose." People are willing to share all sorts of information, as long as they are in control.

When Facebook unilaterally changed the rules about how personal information was revealed, it reminded people that they weren't in control. Its 9 million members put their personal information on the site based on a set of rules about how that information would be used. It's no wonder those members -- high school and college kids who traditionally don't care much about their own privacy -- felt violated when Facebook changed the rules.

Unfortunately, Facebook can change the rules whenever it wants. Its Privacy Policy is 2,800 words long, and ends with a notice that it can change at any time. How many members ever read that policy, let alone read it regularly and check for changes?

Not that a Privacy Policy is the same as a contract. Legally, Facebook owns all data that members upload to the site. It can sell the data to advertisers, marketers and data brokers. (Note: There is no evidence that Facebook does any of this.) It can allow the police to search its databases upon request. It can add new features that change who can access what personal data, and how.

But public perception is important. The lesson here for Facebook and other companies -- for Google and MySpace and AOL and everyone else who hosts our e-mails and webpages and chat sessions -- is that people believe they own their data. Even though the user agreement might technically give companies the right to sell the data, change the access rules to that data or otherwise own that data, we -- the users -- believe otherwise. And when we who are affected by those actions start expressing our views -- watch out.

Facebook should have added the feature as an option, allowing members to opt in if they wanted to. Then, members who wanted to share their information via News Feeds could do so, and everyone else would not have felt they had no say in the matter.

This is definitely a gray area, and it's hard to know beforehand which changes need to be implemented slowly and which won't matter. Facebook and others need to talk to their members openly about new features. Remember: Members want control.

The lesson for Facebook members might be even more jarring: If they think they have control over their data, they're only deluding themselves. They can rebel against Facebook for changing the rules, but the rules have changed, regardless of what the company does.

Whenever you put data on a computer, you lose some control over it. And when you put it on the internet, you lose a lot of control over it. News Feeds brought Facebook members face to face with the full implications of putting their personal information on Facebook.

It had just been an accident of the user interface that it was difficult to aggregate the data from multiple friends into a single place. And even if Facebook eliminates News Feeds entirely, a third party could easily write a program that does the same thing. Facebook could try to block the program, but would lose that technical battle in the end.

We're all still wrestling with the privacy implications of the internet, but the balance has tipped in favor of more openness. Digital data is just too easy to move, copy, aggregate and display. Companies like Facebook need to respect the social rules of their sites, to think carefully about their default settings (they have an enormous impact on the privacy mores of the online world) and to give users as much control over their personal information as they can.

But we all need to remember that much of that control is illusory.
