The Problems with Managing Privacy by Asking and Giving Consent

New paper by Daniel Solove in the Harvard Law Review: "Privacy Self-Management and the Consent Dilemma":

Privacy self-management takes refuge in consent. It attempts to be neutral about substance -- whether certain forms of collecting, using, or disclosing personal data are good or bad -- and instead focuses on whether people consent to various privacy practices. Consent legitimizes nearly any form of collection, use, or disclosure of personal data. Although privacy self-management is certainly a laudable and necessary component of any regulatory regime, I contend that it is being tasked with doing work beyond its capabilities. Privacy self-management does not provide people with meaningful control over their data. First, empirical and social science research demonstrates that there are severe cognitive problems that undermine privacy self-management. These cognitive problems impair individuals' ability to make informed, rational choices about the costs and benefits of consenting to the collection, use, and disclosure of their personal data.

Second, and more troubling, even well-informed and rational individuals cannot appropriately self-manage their privacy due to several structural problems. There are too many entities collecting and using personal data to make it feasible for people to manage their privacy separately with each entity. Moreover, many privacy harms are the result of an aggregation of pieces of data over a period of time by different entities. It is virtually impossible for people to weigh the costs and benefits of revealing information or permitting its use or transfer without an understanding of the potential downstream uses, further limiting the effectiveness of the privacy self-management framework.

Posted on June 3, 2013 at 6:15 AM • 24 Comments


name.withhed.for.obvious.reasons • June 3, 2013 6:55 AM

I bring this up here because the legal underpinnings relate to discovery during a proceeding or grand jury investigation. This story is fiction, but it is useful as an instrument to inform potential parties to a fictional lawsuit.

An interesting exercise would be a lawsuit whose premise is to determine the criminal and civil issues arising from an online behavioral scheme to automate the misrepresentation of identities using information acquired from a private data company. This data includes real names and related PID.

Summary of the case against the defendant:

The defendant in this case has developed an application that plays on the edges of the behavioral models of Google, Bing, and Yahoo and their third-party data partners. The malware written by the defendant, hidden in a UEFI hook, provides an invisible, persistent Trojan. The Trojan has been instructed (and mapped) to impersonate the actual person related to information stolen in a hacking incident six months earlier; in other words, a level of authenticity is given to the behavior exiting the infected host. This proves wildly successful--for the hacker--and a large number of marketing and advertising schemes in the online market start failing--failing fast.

Here is where it gets fun...discovery.

The judge orders discovery of all the data related to the monitoring of internet data, activity, accounts, e-mail, etc. The prosecutor asks the judge for the information so that the people can determine the extent and scope of the original crime, and what changed between the period before the hack attack and the resulting cyber crime. Companies have been destroyed, Google has lost 50% of its value, and trust on the net has reached an all-time low. About two weeks into executing the discovery, the prosecution realizes that it could take up to a year just to identify and acquire the data related to the ascribed events... oh, and some of that data is classified--aggregated data is also held by the newly anointed "National Security Corporate Military Services Complex".

Roxanne • June 3, 2013 8:42 AM

My cellphone came with a bunch of apps. One of them - I never did figure out which - made my battery run hot. I deleted everything that I thought wouldn't compromise the phone. When I over-shot and needed to reinstall one, it announced that it would be monitoring my phone usage.

I'm in a dilemma. The phone sort of needs that app to do some things, but I don't want a third party monitoring who I call on my phone. It's bad enough that the phone company keeps those records. Upon review, it's obvious that Google (maker of the operating system) will be collecting that data - and there's no way to get Google out of an Android phone.

Last week I read an article about how Kenya is the top country for banking-by-cellphone. Imagine if Google had access to all of the banking info out there. Yow. Because they probably do.

ZG • June 3, 2013 11:32 AM

The current implementations of personal consent allow for the perfect excuse to do a multitude of things with personal data, some of them not so great for the person in question. Consent is not generally understandable by the average citizen (i.e., someone who is not a privacy enthusiast), and it is not manageable even when it is well understood (the choices are usually binary: have the service or don't).

It does, however, provide a person with the illusion of control, and it allows a company or government entity to hide behind "consent" when they want to do questionable things with personal information. "She gave us consent to do this!"

I expect that our current models of consent implementation will have a very long life, because they provide great benefit to the organisations that promote them.

paul • June 3, 2013 1:24 PM

The other thing about "consent" is that it's seldom truly freely given. Take, for example, the site that changes its terms of service and privacy policy after you've been using it for a while. Sure, you can refuse consent to the new terms and stop using that site -- if you throw away all the data you've entrusted to it or generated with it thus far.

I suppose at this point we should treat any set of privacy terms we consent to as essentially meaningless, since they are subject to change. But that doesn't really get us very far.

Julien Couvreur • June 3, 2013 1:35 PM

Saying that consent is cognitively difficult or costly does not imply that consent isn't the right model, nor does it imply that legislation would be a more appropriate form of regulation.

We are not limited to only two options, one being isolated and helpless individuals, and the other being a centralized monopoly on "protection". There are many ways to creatively solve such problems without resorting to monopoly or coercion.

Consider a few examples.

Regarding the complexity of information and terms, those can be reviewed and summarized. There are many examples, such as Consumer Reports, product reviews, and the TL;DR service (for terms of use). Here is a concrete illustration:

Regarding the difficulty of knowing which entities collect information on which pages, tools can be created to assist users. I can imagine browsers giving users warnings or even blocking pages: "You are about to visit a page which uses Google trackers. To learn more, read this. If you are OK with Google tracking you, continue loading the page."
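A warning tool of the kind Julien imagines can be sketched in a few lines. This is a hypothetical illustration, not a real browser API: the tracker domains and the page's resource list below are made-up inputs, and a real tool would hook the browser's network layer and draw on a maintained blocklist.

```python
from urllib.parse import urlparse

# Hypothetical blocklist; a real tool would use a maintained source.
TRACKER_DOMAINS = {"doubleclick.net", "google-analytics.com", "scorecardresearch.com"}

def tracker_warnings(resource_urls):
    """Return the tracker domains a page would contact, for a pre-load warning."""
    found = set()
    for url in resource_urls:
        host = urlparse(url).hostname or ""
        # Match the domain itself or any subdomain of it.
        for domain in TRACKER_DOMAINS:
            if host == domain or host.endswith("." + domain):
                found.add(domain)
    return sorted(found)

page_resources = [
    "https://example.com/index.html",
    "https://www.google-analytics.com/analytics.js",
    "https://stats.doubleclick.net/pixel.gif",
]
print(tracker_warnings(page_resources))
# -> ['doubleclick.net', 'google-analytics.com']
```

The browser would show the warning (and its "continue loading" choice) whenever this returns a non-empty list.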

Regarding the difficulty of making granular decisions, you can delegate most such decisions in bulk to a trusted entity (such as the EFF or the ACLU). An example of this is AdBlock filter lists (you don't have to manage filters one by one yourself; just pick a trusted list provider for the default behavior). Similarly, you rely on reputable companies to provide blacklists for viruses (Norton) and for URLs (Microsoft, Google).
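The delegation model here (pick a trusted list provider and let software enforce its rules) can be sketched with a toy filter list. The list format below is made up for illustration; real AdBlock filter syntax is considerably richer.

```python
# A tiny, hypothetical filter list as a subscriber might fetch it from a
# trusted provider: "!" lines are comments, "@@" prefixes mark exceptions.
FILTER_LIST = """\
! Example list (not a real AdBlock subscription)
ads.example.net
tracker.example.org
@@tracker.example.org/opt-out
"""

def parse_filter_list(text):
    """Split a subscribed list into blocking rules and exception rules."""
    blocked, exceptions = [], []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("!"):
            continue
        (exceptions if line.startswith("@@") else blocked).append(line.lstrip("@"))
    return blocked, exceptions

def should_block(url, blocked, exceptions):
    """Block a URL if it matches a rule and no exception overrides it."""
    if any(exc in url for exc in exceptions):
        return False
    return any(rule in url for rule in blocked)

blocked, exceptions = parse_filter_list(FILTER_LIST)
print(should_block("https://ads.example.net/banner.js", blocked, exceptions))   # True
print(should_block("https://tracker.example.org/opt-out", blocked, exceptions)) # False
```

The user's only decision is which provider to subscribe to; every per-URL decision is delegated to the list.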

In short, the cognitive and economic challenges of consent are opportunities to serve customers better, which competitive pressures tend to support.
Similar challenges not only exist in political "solutions"; they are made worse by the incentives and monopoly structure of politics.

[These comments are mine; I do not speak for my employer.]

Patrick Cahalan • June 3, 2013 2:49 PM

"There are too many entities collecting and using personal data to make it feasible for people to manage their privacy separately with each entity."

Not to mention the fact that those entities trade that data with each other.

Once upon a time, I got a cell phone and I paid a $200 deposit rather than give my cell phone provider my social security number to run a credit check.

Fast-forward four years, my cell phone provider has merged with another provider and I'm about to change some of my options, and I figure, hey, let's see if I can do that via the website instead of over the phone.

And there's a login screen prompting me for my cell phone number and the last four digits of my social security number. Well damn, I figure, I'll type in the last four and it will pop up with "that doesn't match what we have on record, call this number to get your account activated"... and instead, it just logs me in.

And it turns out that they have my social on record.

Now, they didn't get it from *me*...

Daniel • June 3, 2013 3:05 PM

The US Supreme Court has just held that American citizens have given their consent to have their DNA collected and stored by the government forever simply by doing anything an officer of the law deems suspicious.

Daniel • June 3, 2013 3:28 PM


There are several significant problems that your response overlooks. The first is foresight. For example, parents often act for their kids, but parents might not be as mindful about privacy as the child will be as an adult. Similarly, I might choose to give up my privacy now but want to regain it at some later point in time. That leads to the second issue, which is perpetuity. Right now, once I lose my privacy, it's gone... permanently.

The underlying problem is that consent has significant issues with foreseeability and perpetuity. I personally think that people would be far less worried about issues such as complexity or granularity if they were confident that at some point they could backtrack and undo what they did. I fear, however, that history shows that once someone has their measly little paws on the data, they are reluctant to let it go, because knowledge is power. So the only answer is not to let anyone have that data to begin with.

Figureitout • June 3, 2013 7:09 PM

The US Supreme Court has just held that American citizens have given their consent to have their DNA collected and stored by the government forever simply by doing anything an officer of the law deems suspicious.
--And I can't even fish in the pond in my own goddamn backyard b/c I have this sneaking suspicion that I'm breaking a motherfucking law.

Figureitout • June 3, 2013 7:10 PM

And these motherfuckers enter your fucking home. Sometimes people need a bitchslap of what is happening.

Dirk Praet • June 3, 2013 7:57 PM

@ Julien Couvreur

Saying that consent is cognitively difficult or costly does not imply that consent isn't the right model, nor does it imply that legislation would be a more appropriate form of regulation.

You're being horribly naive. First and foremost: there is no such thing as self-regulation where sex, power or money is involved. That's just the human condition, and that's why laws and regulations were invented in the first place. As usual in this context, I refer to the prologue of Hammurabi's Code, dated around 1780 BC, which explains its purpose: "... to bring about the rule of righteousness in the land, to destroy the wicked and the evil-doers; so that the strong should not harm the weak ..."

Second, and as the paper correctly points out, there are just way too many entities collecting and using personal data to make it feasible for people to manage their privacy separately with each entity, or to even delegate it to 3rd parties, irrespective of technical or other controls being used to this purpose.

Most readers and commenters on this blog are reasonably or very security- and privacy-aware folks, but I bet only a few will agree that you have any chance whatsoever of upholding your privacy, both in the physical world and on the internet, where governments, gangsters and corporations alike are trying to track your every move, in an abundance of ways many of which the average citizen is not even aware of. Sure, you can harden your OS and applications, apply updates and patches zealously, use https, VPNs, Tor, ssh tunnels, browser extensions, malware scanners, S/MIME, PGP, DNSCrypt, etc., but in the end you will always find yourself on the losing end of the fight. This goes especially for determined and resourceful opponents that are themselves willing and able to enact legislation/regulation to make you comply. Surely you have heard of SOPA, CISPA, PIPA, ACTA, CALEA 2 and countless other "Protect The Children/Guard Us from Terrorists" bills, and that's only in the US.

An additional concern about big data is that, as it evolves, there is a particularly nasty danger lurking around the corner: partially or entirely preserving your privacy may sooner or later blow up in your own face, when governmental, financial, medical or other entities come to consider you a liability for lack of sufficient aggregated and cross-referenced data.

I don't want to throw the concept of consent entirely out the window, but what I'm saying is that it is insufficient and far too easy to abuse by legal boffins, most of whom do not even understand the full ramifications of a T&C document drafted by the colleague sitting next to them. Such documents are intentionally designed to be lengthy, vague, open to interpretation and riddled with loopholes; in general, they are comprehensible only to parties with serious legal resources. Which again kinda rules out the average citizen, and for all practical purposes turns consent into a fig leaf for privacy.

Personally, I would very much welcome a Miranda equivalent for online activities, applicable to both governments and corporations, along with the right to opt out of any tracking of personal data without a corporation being able to deny access to a service in response. It's probably not gonna happen anywhere, but initiatives like Bonnie Lowenthal's "Right to Know Act" in California and the EU's Data Protection Directive are definitely steps in the right direction.

BS Detector • June 3, 2013 8:08 PM

@Dirk Praet: "there is no such thing as self-regulation where sex, power or money is involved." Taken literally, you seem to be saying that nobody controls their sexual urges; that everyone begs, borrows and steals without limitation; and that everyone is happy to use force to make others do as they wish... Frankly, I disagree. I exert and experience self-control in these facets of life all the time, and I think you do too. At least, you experience others' self-control, whether or not you exhibit it yourself.

George H.H. Mitchell • June 3, 2013 8:32 PM

I hope the rest of the cited paper is better than the two paragraphs you quoted, which seem to boil down to, "People can't be trusted to understand what they're consenting to."

Wael • June 4, 2013 12:27 AM

@ Daniel,

The US Supreme Court has just held that American citizens have given their consent to have their DNA

Was consent really necessary? Don't they® welcome babies to this world with a slap on the butt and a needle prick to collect a blood sample at birth?

Clive Robinson • June 4, 2013 7:04 AM

@ figureitout,

    And I can't even fish in the pond in my own goddamn backyard b/c I have this sneaking suspicion that I'm breaking a motherfucking law

You might well be, and you can probably blame the Roman Catholic Church for it.

Basically, there is a requirement in church canon that fish, not meat, be eaten on certain dates. Until the last century and the rise of the railways, this meant either dried/salted/smoked preserved fish (which, at the level of preservation required, is unpalatable) or nice fresh fish from a pond.

Now, as you are probably aware, several hundred years ago you could not move anything without paying a toll of some form, be it a road-usage toll or a landing toll for using a town wharf, etc. Originally the major beneficiary of such tolls was the Church...

So preserved fish was expensive, and thus people started what we now call aquaculture (part of which involved shoveling horse muck into ponds as fish food). The Church was originally doing very nicely out of tolls, but non-church fish ponds were cutting into their take... The development of fish ponds deprived them of income, so they used their position to raise money on the fish ponds instead, and this gave rise to the position of Water Bailiff.

Well, as you might be aware, Kings and Princes did not like the Church, for various reasons, one being its considerable tax take. So over time the Kings and Princes, and later governments, took over the collecting of such tolls.

Thus in the UK you need a fishing licence to drop your tackle into moving water or common-land ponds, etc. The situation with private ponds not fed from moving water is slightly different... In that case the "Water Boards", as they were, could raise "rates" against ponds for various reasons, including their use for farming, sport or other leisure activities...

So yes, there is probably someone waiting to nab you for using a pond without first paying the Danegeld they demand...

P.S. Can you be a little less expressive in what you type? Certain words trip filters, and in some organisations that can be enough to blacklist an entire site.

Dirk Praet • June 4, 2013 9:30 AM

@ BS Detector

Frankly, I disagree. I exert and experience self control in these facets of life all the time, and I think you do to.

But of course we do. The problem with self-regulation lies not with the majority that can and does, but with those minorities who can't or won't. Only in a perfect world does everyone adhere to the highest moral and ethical standards in pursuit of the common good. Add less than 1% of sociopaths and other deviants, and you have an entirely different situation. I refer to Bruce's "Liars and Outliers".

Julien Couvreur • June 4, 2013 9:59 AM


Regarding foresight and changing one's mind, you assume that adults become wiser about privacy as they age. Let's go with that plausible assumption for argument's sake.

First, children often inherit the defaults picked by their family (your family probably set up your first bank account and insurance), or their family helps them choose. There is also education about important topics and about actions that cannot easily be undone (just as teen pregnancy has long-term consequences).

Second, this possible change of mind is foreseeable, just like getting a tattoo. If you are unsure about your future preferences, then pick more conservative options.
That said, you are correct: some might have a strong time preference and discount the future anyway.

All that said, such cognitive difficulties also apply to politics. People may vote for lax privacy regulation and regret it later, or they may vote for burdensome regulations and pay the opportunity cost of less overall satisfaction.
Furthermore, such mistakes would not only affect them, but everyone (even those who put a lot of effort into making wise decisions).
If people are rational enough to vote, then they are also rational enough to choose which brand of protection they want. If some are not, then it is better not to expose other people through politics.

Julien Couvreur • June 4, 2013 10:30 AM

@Dirk Praet

"there is no such thing as self-regulation where sex, power or money is involved. [...] that's why laws and regulations were invented in the first place."

If power corrupts, then how is legislation (the use of political power) a solution? Rather, it introduces a new problem and source of abuse.
Who are these angels who are supposed to regulate us? (a reference to Milton Friedman)

In your second point, you mainly list government abuses, which supports my point. Expanding government's authority is not protecting people. Pointing to consent failures with regard to private organizations does not mean that government intervention is the solution.

Regarding the last point, I have no qualms with restrictions on government use of online data (as you suggest), but I don't have high confidence that they would be effective (restriction by the Constitution appears to be a failure). We already have laws preventing government from spying on innocent journalists, yet it still happens with few consequences. Not to mention undeclared wars, droning, etc.

In general, I don't share your pessimism about the impossibility of technological and voluntary solutions for dealing with corporations.
Use the tools you suggested above, plus a filter to block any site or app that does not meet stringent criteria (what logging policy it has, what tracking it uses, how it handles government requests, ...).

When it comes to corporations abusing their customers, competitive pressure and reputation provide strong (though admittedly imperfect) checks on such behavior. The choice of which services to consume is powerful leverage, and market forces function even with a small minority of sociopaths.
The choice of politicians is nowhere near as powerful: it is diluted by voting, it lacks incentives (see the literature on public choice theory), and it is vulnerable to capture by the sociopaths. Not to mention that politics is coercion, which we don't normally tolerate in civil society (i.e., a uniquely dangerous power).

Figureitout • June 4, 2013 12:41 PM

@clive robinson
--The church is a dying institution; literally so, in the case of my last living (barely) grandmother's church. So no, I'm going to blame the individuals enforcing the laws.

Dirk Praet • June 4, 2013 6:55 PM

@ Julien Couvreur

If power is corrupting, then how is legislation (the use of political power) a solution?

Unless you do not believe in the power of a (real) democratic process, I think it would be wrong to throw out the baby with the bath water. Just like you, I am deeply sceptical of any government-imposed legislation or regulation, especially in pseudo-democracies where a considerable part of the legislative and executive branches is on the payroll of corporations and other special-interest groups.

I even agree with you that competitive pressure, reputation and other controls provide strong incentives for corporations to keep things clean, but the financial crisis has proven beyond a doubt that a free market and self-regulation alone just don't cut it. It is still beyond me how a very small minority of people was able to bring the financial system to the brink of full collapse, in which trillions of dollars evaporated into thin air, affecting hundreds of thousands of people; and, adding insult to injury, the taxpayer got the bill. The damage caused was orders of magnitude beyond what any terrorist organisation ever managed to pull off. No one was convicted or even indicted.

Corporations are about profit. They don't adopt self-regulation and restrictions of all sorts of their own choosing, but under pressure, whether societal, competitive or government-induced. And in the end it still requires verification by a legal or regulatory framework. Then again, that's just my 5 cents on the issue. Your mileage may vary.

Autolykos • June 6, 2013 7:07 AM

@Dirk Praet: Here, the problem is actually that the majority does not possess the skills necessary for an informed judgment and doesn't bother to acquire them. That way corporations can get away with outrageous conditions, and everyone who actually understands them is left with the choice of grudgingly accepting the intolerable or being locked out of modern society and civilization.
I'm very careful with my data (I don't participate in social networks, don't have a homepage, register only if it's unavoidable and with a minimum of data, and never post under my real name), and still, Google has somewhat inaccurate but largely correct data on me in its dashboard (it thinks I'm ten years older, knows about half of my interests and got my degree wrong - but both are STEM, so it's still quite good). Amazon OTOH is ridiculously bad at guessing my taste in books and music (but I don't buy there that often, and it's mostly gifts).
Has anyone ever tried the active approach of posting loads of misinformation to confuse the algorithms?

Nick P • June 6, 2013 11:34 AM

@ Autolykos

"Has anyone ever tried the active approach of posting loads of misinformation to confuse the algorithms?"

I've noticed that Netflix accounts with two different users have poor predictions. I'd imagine Amazon puts more weight on what you buy than on what you view, but I'm sure they count the views too. I figure you could throw the systems off if you post a bunch of disinformation. Google searches Gmail for nuggets of information, so maybe put a bunch of keywords in emails to yourself or in drafts. And Facebook's image-based profiling system definitely suffers if you aren't tagged in photos and if bad tags are used often.
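The disinformation approach has in fact been automated: the TrackMeNot browser extension issues random decoy search queries to dilute a search profile. A minimal sketch of the idea, with made-up topic word lists standing in for the much larger, rotating lists a real tool would use:

```python
import random

# Hypothetical pools of unrelated topics; a real decoy tool would draw from
# much larger, regularly refreshed word lists.
TOPICS = {
    "gardening": ["rose pruning", "compost ratio", "tomato blight"],
    "astronomy": ["perseid meteor shower", "jupiter opposition", "dark nebula"],
    "cooking":   ["sourdough starter", "braising times", "knife sharpening"],
}

def decoy_queries(n, rng=None):
    """Return n search queries drawn across topics to dilute a search profile."""
    rng = rng or random.Random()
    queries = []
    for _ in range(n):
        topic = rng.choice(sorted(TOPICS))  # sorted keys for reproducibility
        queries.append(rng.choice(TOPICS[topic]))
    return queries

for q in decoy_queries(5, random.Random(42)):
    print(q)
```

Each decoy query would then be submitted through the same channel as real ones, so the profiler cannot easily separate signal from noise.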



Schneier on Security is a personal website. Opinions expressed are not necessarily those of Resilient Systems, Inc.