Schneier on Security
A blog covering security and security technology.
May 23, 2012
Privacy Concerns Around "Social Reading"
Interesting paper: "The Perils of Social Reading," by Neil M. Richards, from the Georgetown Law Journal.
Abstract: Our law currently treats records of our reading habits under two contradictory rules: rules mandating confidentiality, and rules permitting disclosure. Recently, the rise of the social Internet has created more of these records and more pressures on when and how they should be shared. Companies like Facebook, in collaboration with many newspapers, have ushered in the era of “social reading,” in which what we read may be “frictionlessly shared” with our friends and acquaintances. Disclosure and sharing are on the rise.
This Article sounds a cautionary note about social reading and frictionless sharing. Social reading can be good, but the ways in which we set up the defaults for sharing matter a great deal. Our reader records implicate our intellectual privacy: the protection of reading from surveillance and interference so that we can read freely, widely, and without inhibition. I argue that the choices we make about how to share have real consequences, and that “frictionless sharing” is not frictionless, nor is it really sharing. Although sharing is important, the sharing of our reading habits is special. Such sharing should be conscious and only occur after meaningful notice.
The stakes in this debate are immense. We are quite literally rewiring the public and private spheres for a new century. Choices we make now about the boundaries between our individual and social selves, between consumers and companies, between citizens and the state, will have unforeseeable ramifications for the societies our children and grandchildren inherit. We should make choices that preserve our intellectual privacy, not destroy it. This Article suggests practical ways to do just that.
Posted on May 23, 2012 at 7:25 AM
It's not just "social reading".
Imagine what the Maoists or Leninists would have done with your Amazon purchase history.
What's that? You purchased "Wealth of Nations" 15 years ago? To the re-education camp with you!
Nor is it the nature of the information that's the problem. It's not qualitatively different from information we would be happy sharing; it's basically the same stuff we would tell anyone.
But "quantity has a quality all its own", as Stalin is reputed to have said.
Harmless information, held in bulk, is no longer harmless.
George Orwell (of 1984 fame) pointed out the perils of the state having too much knowledge of its citizens, and the ways that knowledge could be abused, in various of his works during the 1930s and '40s.
The US went through the "Reds Under the Bed" scare of the McCarthy years.
History tells us, every fifty years or so, just how badly the state can oppress those who choose to be well educated and thus fight for others' rights.
But at each turn of the wheel the technology assisting the state has become many times cheaper, and thus more widespread in its use.
Sadly we appear to have reached the point where the cost to the state of wholesale surveillance is negligible, and the cost of sifting through it gets cheaper by the minute.
How long before "thought crime" in its various guises becomes a normal, everyday arrestable offence?
We already have some "thought crime" in that you can be arrested because, "in an officer's opinion, the offender was going equipped to commit a crime". But this normally applies to tools useful for "breaking and entering" etc. We also have laws about "conspiracy", where walking into the wrong room can get you accused of being a conspirator.
Again with each turn of the wheel the crimes become more numerous and the defence that much more difficult.
We see, in the likes of China and Russia, false accusations and trials whose sole purpose is to dispose of a potential political rival, or to grab their assets and reassign them to others currently "in favour".
We already see laws put in place to "raise income" by fines, where in many cases it is almost impossible to avoid being fined. (In the UK, fines have been handed out to people for having a tea bag in the wrong rubbish bin, yet those levying the fines also require you to store your rubbish bins in publicly accessible places, without any kind of lock or other protection against others putting rubbish in your bin...)
We further have statistics showing how many WiFi access points are misused, how much malware silently downloads information onto a user's computer (click-through advertising etc.), and how few users know what is or is not dangerous to do on the internet, so we can safely say it is well-nigh impossible to protect people from false accusation.
So the likelihood of seeing people arrested or fined for (supposedly) reading the wrong book or other information is so high that it can only be a matter of time before it becomes widespread and common.
Your purchase history? It's your wishlist you should be worried about. It's already public and lists everything that was bought from it as well as that which you want...
What amazes me is that people need to be reminded that maybe publishing the details of your life to the world -- or to interested but marginally involved governmental or corporate parties -- might be a bad idea.
I haven't read the paper (yet), but to the extent that its recommendations to manage such sharing are feasible, it sounds like a valuable piece of work.
I don't think there is any inherent problem with social reading as such. The problem is one of content retention. 1980s: Students go to class and discuss reading assignment; everyone forgets it by the end of the semester. 2010: Student goes to class on Blackboard or Canvas and discusses reading. Data still around in 2100.
What's needed is not new ways to "manage" sharing. What we need is a way for the internet to *forget*. A hard drive never forgets unless it's told to. So we need to create enforceable rules that ensure hard drives forget.
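The commenter's "forgetting" idea can be sketched as a store that attaches an expiry to every record and deletes anything past its time-to-live. This is purely illustrative (the class and its methods are hypothetical, not any real system); real enforcement would have to happen at the storage and backup layers, not in a single application.

```python
import time

class ForgetfulStore:
    """Toy key-value store where every record carries a time-to-live.

    Illustrative only: sketches what 'hard drives that forget' might
    look like at the application level.
    """

    def __init__(self):
        self._data = {}  # key -> (value, expiry timestamp)

    def put(self, key, value, ttl_seconds):
        self._data[key] = (value, time.time() + ttl_seconds)

    def get(self, key, now=None):
        now = time.time() if now is None else now
        entry = self._data.get(key)
        if entry is None or now >= entry[1]:
            # Expired records are purged on access, not merely hidden.
            self._data.pop(key, None)
            return None
        return entry[0]

store = ForgetfulStore()
store.put("read:article-42", "reading record", ttl_seconds=60)
print(store.get("read:article-42"))                        # still remembered
print(store.get("read:article-42", now=time.time() + 61))  # forgotten
```

Systems like Redis implement the same idea natively with per-key expiry (EXPIRE), though, as the comment notes, the hard part is making such deletion a rule rather than an option.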
@Daniel - A hard drive never forgets unless it's told to.
Good point. I'm sure a lot of us entering our 4th decade of net use can remember questionable files and e-mails we naively sent in the early days, assuming our transfers were private. Hopefully the DEL buttons were a lot more effective than a lot of the hardware was back then.
@echowit: Too bad we can't go back and selectively delete academic usenet posts. Oh Philosophy -L how you still mock me...
I'm not so worried about my hard drives. I have a group of them beside me as I write this, lined up for indexing and archiving with de-dup, followed by triple-overwrite, then disposal. Yes, they are old enough that overwriting will actually work. The new drives will need a belt-sander.
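For magnetic drives old enough that overwriting still works, the multi-pass overwrite described above can be done with GNU coreutils' `shred`. Shown here on a file rather than a whole block device (e.g. /dev/sdX) for safety; the filename is just an example.

```shell
# Create a file standing in for data to be destroyed.
printf 'old library records' > secrets.txt

# Overwrite its contents with three passes of random data,
# then truncate, rename, and unlink it (-u).
shred -n 3 -u secrets.txt
```

Note that on SSDs and journaling or copy-on-write filesystems, overwriting in place gives no such guarantee, which is why the comment reaches for a belt-sander for newer drives.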
The issue is stuff that ever left my control. This is not such a new problem. For example, the Usenet postings from the late 1980s that were dredged up during a job interview in 2006. And then there are my childhood library records. What would happen to a kid today who checked out books on pyrotechnics, 10th-century warfare, submarines, radio and electronics, and a goth-porn book that had been mis-shelved with science-fiction and fantasy? Good thing our library records are protected these days, oh, wait...
Oddly, since it's pretty clear to see which apps have rights on your Facebook, I never thought about this much, except for one thing.
If someone does post that they're reading an article via one of those apps, you can't follow the link and read without approving the app. Which is handled in a fairly tricky form where the normal app "Approve" button is replaced with a "Read Article" button.
I agree with many of the concerns, but as is standard (can't recall the rfc number, sorry) with other internet privacy issues, the debate is occurring years after the genie was let out of the bottle.
A friend of mine was recently shocked to find that the articles she was reading on Yahoo News were appearing in the new 'Trending articles' section on Facebook. If it wasn't obvious to her (an experienced web designer) that the 'Read article' button would have this side effect, then it's not going to be obvious to the general public. (Personally, I just copy and paste the article title into Google to avoid any interaction with social reading apps.)
"If someone does post that they're reading an article via one of those apps, you can't follow the link and read without approving the app."
Actually, if you copy the article title that you see in FB and paste in Google search, you will find the article outside of FB.
Fortunately, Google doesn't store that search next to your identity. Oh, wait....
"Actually, if you copy the article title that you see in FB and paste in Google search, you will find the article outside of FB."
Sure. And if you happen to be logged onto, say, Gmail at the time, well then Google knows all about your reading rather than FB.
This, of course, is one of my many objections to the recent proposals for the UK government to mandate ISP filtering of "unsuitable" content, with a rather Orwellian terminology inversion having them talk of people having to "opt in" to not being filtered: a ready-made hit list of people who have revealed they access material the government doesn't want them to.
Needless to say, the filter itself is another bone of contention - mobile networks have already capitulated, and at least one has been caught filtering an unpopular but perfectly legal political party's website as "hate speech", as well as all the usual false positives and dodgy calls such as blogs and non-adult discussion groups.
As long as you're always aware and in control of the sharing, it's not such a threat - but all it takes is your boss seeing "Bruce just read 1001 job-hunting tips" or a partner seeing you're reading an article about STD symptoms... I was quite irritated by the Guardian's Facebook setup, trying to announce to all my contacts as soon as I clicked a link of theirs, though. I'm paranoid enough to have caught and blocked the attempt, but how many unwittingly grant it permission?
@MikeA: bringing up "fun facts" from 30 years ago in a job interview is hardly relevant or ethical in the first place (unless you are applying for a job with an intelligence agency, maybe). I know it is now easier to google a name than to hire a private investigator or something.
Anyway I will refuse to deal with such an employer. Fortunately, there's still a lot of reasonable ones.
It's hardly a surprise that politicians, governments, corporations and other special interest groups try to mould and regulate the internet for their own benefit and control.
The real surprise is how little most people seem to be aware of the profound impact stuff like SOPA, PIPA, ACTA and the like are about to have on freedom of speech, expression and assembly. However much they are cleverly disguised as "Protect the Children", "Stop Cyberbullying" or "War on Terrorism" acts, they are about censorship and the erosion of privacy and other civil liberties. No more, no less. The same goes for the unprecedented way in which legitimate and criminal organisations alike are spying on our lives through the window of our computer screens big and small, selling off the collected data to the highest bidder or using it to defraud us.
Even more sad is how many even informed folks hide behind the fallacy that they don't have anything to hide and that those who do surely must be up to no good. Or that all of these things happen way above their heads and far beyond their control.
I guess a well-known quote attributed to Edmund Burke (Irish political philosopher, 1729–1797) is in order here: "All that is necessary for the triumph of evil is that good men do nothing." Or from the same man "Nobody made a greater mistake than he who did nothing because he could do only a little."
Schneier.com is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc.