Bruce Schneier's Data and Goliath—Solution or Part of the Problem?
Think of some of the ways the Enlightenment helped advance the human individual. The ability to shape your identity. The ability to own and control your stuff. Economic autonomy. All three help to define the modern world; they're how we know that "now" is not like "before". All three are founded on the sanctity of the individual. And all three are interlinked.
For example, our identity means little if we can't express it creatively, by protecting our inventions and creations and having some say over their use. We don't have economic autonomy if we cannot negotiate what spoils come from exploiting the value of our work. Privacy is built on the same respect, and it's a more modern and much more culturally specific idea—laws and norms come from what societies think and feel about the individual. Japanese and Chinese views on privacy are as different as German and American ones.
Today's tech oligarchs like Google and Facebook—the winners of the first era of the internet—are busy setting fire to many of these ideas. To Larry Page or Mark Zuckerberg, the human individual doesn't really exist; it's a legacy idea that's a nuisance to them. The tech oligarchs engage in (often permissionless) data collection, often for its own sake—it might be useful some day—and lobby politically for the destruction of individual rights over property and privacy. You can't take their assault on privacy in isolation from their assault on identity and their assault on property. Google lashed out so viciously at the González vs Google Spain ruling that people began to notice what an adolescent and inadequate corporate citizen it is. We've been pointing it out for years. It took the tantrum for this to be more widely noticed.
Cryptographer Bruce Schneier has written a hefty book focussing on privacy and data hoarding. It's in two parts. Half of it is a good overview of the extent of corporate and state collection. But the problems start with the second half—Schneier's recommendations to fix the problems.
Schneier approaches this as an American cryptographer would. He doesn't build his argument on the history of the individual, or of ideas about how the individual was considered in political and legal thought.
"Our rights are all over the place ... Standardising this is important," he asserts. But why? We benefit when things like voltages and character sets are standardised, but when laws are standardised there are winners and losers. To a visiting Martian anthropologist, Germany and the USA probably look identical: they're Protestant Western countries. But you couldn't find two more passionate or different approaches to the way the privacy of the individual should be treated today—and we're about to be treated to a spectacular demonstration of this in Europe. Here in the United Kingdom we didn't even call our privacy law a privacy law—the right to privacy was established by common law in 1849.
Schneier often dispenses with rival solutions without much or any justification. Some of these are advanced by technical people as sophisticated and knowledgeable as Schneier, but who have the advantage of being able to build on human history and philosophy.
"Privacy needs to be a fundamental right, not a property right", he asserts as he dismisses Jaron Lanier's Swiftian idea that data grabbers like Facebook and Google should bill us for our data. This is in a passage headed "Give People Rights To Our Data". Schneier dismisses Lanier so:
"Making this work would be extraordinarily complex, and in the end would require constant surveillance."
It isn't clear why it would require "constant surveillance", but he's committing the Original Nerd's Sin of taking it literally. Lanier is making a ha ha, only serious point about ownership and value. He reminds us that we're getting the raw end of a deal. Tim Worstall argued here recently that because only data processing gives the data any value at all, we should all shut up and be grateful for the free stuff, for Gmail. Readers aren't buying it. If we withheld that data, or traded it, there would be no data processing business at all. The great privacy-busting empires of Google and Facebook are built on sand.
Meanwhile, we're not only compromising our privacy, we're surrendering our moral authority when we comply with these asymmetrical terms of trade. How can we express righteous outrage at Facebook's or Google's relentless privacy assault when we continue to hand valuable stuff over?
Although Schneier says we should have "Rights to our Data", without the concept of the individual asserting ownership—without a "property-ish" right—it ends up very wishy-washy. There is a movement that articulates this, called Habeas Data—I wrote about it here. Schneier doesn't mention it. A utopian aversion to the ownership of digital things is why people who care about privacy are so ineffective. I've no doubt the Martha Lane Foxes and Cory Doctorows deeply, deeply care—but every waking hour is spent fighting ownership rights. When the knife fight starts, they charge in waving a carrot.
But wait, we haven't finished with Schneier's philosophical problems just yet.
Fundamentally... everything cancels out
Schneier strongly believes that privacy is a "fundamental right"—as if saying so fixes the problem. In fact it may have dire consequences for individuals. Using fundamental rights as a starting point has some advantages—Europe prefers doing it this way. But let's look at how these are implemented in the European Convention on Human Rights. I'll use freedom of speech as an example.
The first bit sounds really good:
1. Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers. This Article shall not prevent States from requiring the licensing of broadcasting, television or cinema enterprises.
Excellent. Then comes this bit:
2. The exercise of these freedoms, since it carries with it duties and responsibilities, may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law and are necessary in a democratic society, in the interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or the rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.
So "Fundamental" doesn't mean "absolute". It isn't a magic wand. The state grants you a limited licence of expression. And can take it away pretty easily. The implementation here follows the European tradition of a benign state parcelling out freedom bit by bit—the diametric opposite of the British common law based on residual freedom, and the US model. We're a lot less "free" after incorporating this "freedom" into British law.
So the ECHR's privacy protection, Article 8, defends the "right to respect for his private and family life, his home and his correspondence" but it's qualified by "except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others".
Furthermore, no fundamental right is really absolute. Even property rights aren't absolute—you can't shoot somebody simply for being on your property without permission. In short, calling something "a fundamental right" is barely the start of the story, not the end. If you thought it odd that technology giants are very keen to sign up to Manifestos, Declarations and Magna Cartas—we do not lack these—now you can see why.
As I said at the top, the economics is important—individual rights are interlocking.
Google and Facebook hoard and collect data for two reasons: because they can infer something only "Big Data" can reveal, so they need lots of it, and because it might be useful one day. Both justifications are pretty dubious, and the companies get away with such a one-sided exchange because the services are given away for free. Both justifications also prevent Google and Facebook from exploring new, imaginative and mutually useful (to customer and provider) ways of doing business—ways that don't require data collection and hoarding.
Sadly, the section highlighting this (titled "Incent new business models"—double wince) is brief, yet it points to one of the more promising avenues to explore. If Google or Facebook only needed to collect and store what economists call revealed preferences—real transactions—then both incentives would disappear. Tesco knows what you've bought, and that's very valuable. It doesn't need your inside leg measurement or your Wi-Fi network password or your lover's health history. Its incentive is to sell more stuff, not to hoard for the sake of it.
In essence, Google and Facebook believe they are hugely successful because they hoard data—they're very nervous about doing business openly with us. That's why they spend so much time pretending they're not doing business with us—but saving the world. The utopianism is a cynical PR gloss. They never get called out on this.
In other recommendations, Schneier calls for the NSA to be disbanded—presumably the USA would then cease all signals intelligence—warns of the "cyber sovereignty movement" (meaning China and Russia), and invites us to conclude that a global government is required to sort it all out.
For all his excellence as a computer security guru, Schneier doesn't realistically offer anything that's going to help change things for the better. The large portions of the book devoted to "solutions" could usefully be replaced with one sentence: "Can everyone in the world doing bad things please stop doing bad things?"