Entries Tagged "social media"


Facebook and Data Control

Earlier this month, the popular social networking site Facebook learned a hard lesson in privacy. It introduced a new feature called “News Feeds” that shows an aggregation of everything members do on the site: added and deleted friends, a change in relationship status, a new favorite song, a new interest, etc. Instead of a member’s friends having to go to his page to view any changes, these changes are all presented to them automatically.

The outrage was enormous. One group, Students Against Facebook News Feeds, amassed over 700,000 members. Members planned to protest at the company’s headquarters. Facebook’s founder was completely stunned, and the company scrambled to add some privacy options.

Welcome to the complicated and confusing world of privacy in the information age. Facebook didn’t think there would be any problem; all it did was take available data and aggregate it in a novel way for what it perceived was its customers’ benefit. Facebook members instinctively understood that making this information easier to display was an enormous difference, and that privacy is more about control than about secrecy.

But on the other hand, Facebook members are just fooling themselves if they think they can control information they give to third parties.

Privacy used to be about secrecy. Someone defending himself in court against the charge of revealing someone else’s personal information could use as a defense the fact that it was not secret. But clearly, privacy is more complicated than that. Just because you tell your insurance company something doesn’t mean you don’t feel violated when that information is sold to a data broker. Just because you tell your friend a secret doesn’t mean you’re happy when he tells others. Same with your employer, your bank, or any company you do business with.

But as the Facebook example illustrates, privacy is much more complex. It’s about who you choose to disclose information to, how, and for what purpose. And the key word there is “choose.” People are willing to share all sorts of information, as long as they are in control.

When Facebook unilaterally changed the rules about how personal information was revealed, it reminded people that they weren’t in control. Its eight million members put their personal information on the site based on a set of rules about how that information would be used. It’s no wonder those members—high school and college kids who traditionally don’t care much about their own privacy—felt violated when Facebook changed the rules.

Unfortunately, Facebook can change the rules whenever it wants. Its Privacy Policy is 2,800 words long, and ends with a notice that it can change at any time. How many members ever read that policy, let alone read it regularly and check for changes? Not that a Privacy Policy is the same as a contract. Legally, Facebook owns all data members upload to the site. It can sell the data to advertisers, marketers, and data brokers. (Note: there is no evidence that Facebook does any of this.) It can allow the police to search its databases upon request. It can add new features that change who can access what personal data, and how.

But public perception is important. The lesson here for Facebook and other companies—for Google and MySpace and AOL and everyone else who hosts our e-mails and webpages and chat sessions—is that people believe they own their data. Even though the user agreement might technically give companies the right to sell the data, change the access rules to that data, or otherwise own that data, we—the users—believe otherwise. And when we who are affected by those actions start expressing our views—watch out.

What Facebook should have done was add the feature as an option, and allow members to opt in if they wanted to. Then, members who wanted to share their information via News Feeds could do so, and everyone else wouldn’t have felt that they had no say in the matter. This is definitely a gray area, and it’s hard to know beforehand which changes need to be implemented slowly and which won’t matter. Facebook, and others, need to talk to their members openly about new features. Remember: members want control.

The lesson for Facebook members might be even more jarring: if they think they have control over their data, they’re only deluding themselves. They can rebel against Facebook for changing the rules, but the rules have changed, regardless of what the company does.

Whenever you put data on a computer, you lose some control over it. And when you put it on the Internet, you lose a lot of control over it. News Feeds brought Facebook members face to face with the full implications of putting their personal information on Facebook. It had just been an accident of the user interface that it was difficult to aggregate the data from multiple friends into a single place. And even if Facebook eliminates News Feeds entirely, a third party could easily write a program that does the same thing. Facebook could try to block the program, but would lose that technical battle in the end.
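To see how little stands in the way, here is a rough, hypothetical sketch of what such a third-party aggregator might look like. Nothing in it is tied to Facebook’s actual pages or any real API: fetch_profile() is a made-up stand-in for whatever scraping or API call would retrieve a friend’s profile, and the diffing is deliberately naive.

# Hypothetical sketch of a third-party "news feed": poll each friend's
# profile, diff it against the last snapshot, and merge the changes into
# one chronological feed. fetch_profile() is a placeholder, not a real API.

import time

def fetch_profile(friend_id):
    """Placeholder: return the friend's current profile as a dict of fields."""
    raise NotImplementedError("site-specific scraping or API call goes here")

def diff_profiles(old, new):
    """Return (field, old_value, new_value) tuples for fields that changed."""
    changes = []
    for field in new:
        if old.get(field) != new[field]:
            changes.append((field, old.get(field), new[field]))
    return changes

def aggregate_feed(friend_ids, snapshots, feed):
    """One polling pass: update snapshots and append any changes to the feed."""
    for fid in friend_ids:
        current = fetch_profile(fid)
        previous = snapshots.get(fid, {})
        for field, old_value, new_value in diff_profiles(previous, current):
            feed.append((time.time(), fid, field, old_value, new_value))
        snapshots[fid] = current
    return feed

Blocking something like this means fighting an arms race against every possible scraper, which is the technical battle Facebook would eventually lose.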

We’re all still wrestling with the privacy implications of the Internet, but the balance has tipped in favor of more openness. Digital data is just too easy to move, copy, aggregate, and display. Companies like Facebook need to respect the social rules of their sites, to think carefully about their default settings—they have an enormous impact on the privacy mores of the online world—and to give users as much control over their personal information as they can.

But we all need to remember that much of that control is illusory.

This essay originally appeared on Wired.com.

Posted on September 21, 2006 at 5:57 AM

Privacy Risks of Public Mentions

Interesting paper: “You are what you say: privacy risks of public mentions,” Proceedings of the 29th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, 2006.

Abstract:

In today’s data-rich networked world, people express many aspects of their lives online. It is common to segregate different aspects in different places: you might write opinionated rants about movies in your blog under a pseudonym while participating in a forum or web site for scholarly discussion of medical ethics under your real name. However, it may be possible to link these separate identities, because the movies, journal articles, or authors you mention are from a sparse relation space whose properties (e.g., many items related to by only a few users) allow re-identification. This re-identification violates people’s intentions to separate aspects of their life and can have negative consequences; it also may allow other privacy violations, such as obtaining a stronger identifier like name and address.

This paper examines this general problem in a specific setting: re-identification of users from a public web movie forum in a private movie ratings dataset. We present three major results. First, we develop algorithms that can re-identify a large proportion of public users in a sparse relation space. Second, we evaluate whether private dataset owners can protect user privacy by hiding data; we show that this requires extensive and undesirable changes to the dataset, making it impractical. Third, we evaluate two methods for users in a public forum to protect their own privacy, suppression and misdirection. Suppression doesn’t work here either. However, we show that a simple misdirection strategy works well: mention a few popular items that you haven’t rated.
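The re-identification idea rests on a simple observation: items that only a few people relate to carry far more linking power than popular ones. Below is a minimal sketch of that intuition, not the paper’s actual algorithm: it scores each user in a private ratings dataset against the items a public user mentions, weighting rarer items more heavily. All names and signatures here are made up for illustration.

# Minimal sketch of re-identification in a sparse relation space (not the
# paper's algorithm): score private-dataset users against a public user's
# mentioned items, weighting rare items more heavily (inverse popularity).

import math

def reidentify(mentioned_items, private_ratings):
    """mentioned_items: set of item ids a public user talks about.
    private_ratings: dict mapping user id -> set of item ids they rated.
    Returns candidate user ids sorted by descending match score."""
    total_users = len(private_ratings)

    # Count how many private users rated each item: rare items identify more.
    popularity = {}
    for items in private_ratings.values():
        for item in items:
            popularity[item] = popularity.get(item, 0) + 1

    scores = {}
    for user, items in private_ratings.items():
        score = 0.0
        for item in mentioned_items & items:
            # Inverse-popularity weight, akin to IDF.
            score += math.log(total_users / popularity[item])
        scores[user] = score
    return sorted(scores, key=scores.get, reverse=True)

In this toy scoring, at least, the misdirection defense the authors report makes intuitive sense: mentioning a popular item you haven’t rated boosts the scores of the many other users who did rate it while adding nothing to yours, pushing you down the candidate ranking.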

Unfortunately, the paper is only available to ACM members.

EDITED TO ADD (8/24): Paper is here.

Posted on August 23, 2006 at 2:11 PM

MySpace Used as Forensics Tool

From CNN:

Detectives used profiles posted on the MySpace social networking Web site to identify six suspects in a rape and robbery….

[…]

She knew only their first names but their pictures were posted on MySpace.

“Primarily, we pulled up her friends list. It helped us identify some of the players,” said Bartley.

Posted on March 28, 2006 at 1:19 PM

