Federal Trade Commissioner Julie Brill on Obscurity

I think this is good:

Obscurity means that personal information isn't readily available to just anyone. It doesn't mean that information is wiped out or even locked up; rather, it means that some combination of factors makes certain types of information relatively hard to find.

Obscurity has always been an important component of privacy. It is a helpful concept because it encapsulates how a broad range of social, economic, and technological changes affects norms and consumer expectations.

Posted on April 24, 2015 at 12:42 PM • 14 Comments

Comments

Spaceman Spiff • April 24, 2015 12:56 PM

I guess she never heard of the principle that there is no security via obscurity...

Martin Walsh • April 24, 2015 1:17 PM

"the principle that there is no security via obscurity" ??
You are confused. If that were true, the way you understand it, then there would be no value in using block ciphers either, because you have to hide the key.

Read the first sentence again, for the correct meaning of obscurity in this context. Obscurity is very important. You are probably thinking about system designs that depend upon secret implementations and afford some security but only until the implementation is exposed. An example is XOR'ing data again and again, using a fixed block of random data stored away someplace.
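Martin's XOR example can be made concrete. The toy snippet below (all names and values invented) shows why a scheme that depends on a hidden, fixed XOR block is obscurity rather than security: the moment the block leaks, or two ciphertexts are combined, everything falls out.

```python
# Toy illustration (not a real cipher): "encrypting" by XOR with a
# fixed, secret block of random bytes, repeated over the message.
from itertools import cycle

def xor_fixed_block(data: bytes, block: bytes) -> bytes:
    return bytes(d ^ b for d, b in zip(data, cycle(block)))

secret_block = b"\x13\x37\xc0\xde"  # the hidden "implementation detail"
msg1 = b"attack at dawn"
msg2 = b"attack at dusk"

ct1 = xor_fixed_block(msg1, secret_block)
ct2 = xor_fixed_block(msg2, secret_block)

# Security rests entirely on the block staying obscure: anyone who
# learns it decrypts everything, since XOR is its own inverse.
assert xor_fixed_block(ct1, secret_block) == msg1

# Worse, XORing two ciphertexts cancels the block out completely,
# leaking the XOR of the two plaintexts without knowing the block.
assert bytes(a ^ b for a, b in zip(ct1, ct2)) == \
       bytes(a ^ b for a, b in zip(msg1, msg2))
```

Contrast this with a proper block cipher, where only the key is secret and the algorithm can be published without weakening it (Kerckhoffs's principle).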

Andrew Wallace • April 24, 2015 1:21 PM

If I do not want my car broken into I will hide my satnav in my glove compartment to make my car less appealing to criminals.

And so on and so forth.

Andrew

Ryan G. • April 24, 2015 2:20 PM

Part of security is visibility... and if obscurity increases your visibility by removing "clutter" (such as SSH on a different port), then it is beneficial in my world. The notorious line of "there is no security through obscurity" is really meant as "relying on obscurity for security doesn't work".
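As an illustration of Ryan's SSH example (a stock OpenSSH `sshd_config` is assumed; the port number is an arbitrary choice), moving the daemon off port 22 removes most automated scan noise from the logs without pretending to add real security:

```
# /etc/ssh/sshd_config -- example only; 2222 is an arbitrary choice
Port 2222                  # cuts log clutter from bots probing port 22
# Real security still comes from settings like these, not the port:
PasswordAuthentication no
PermitRootLogin no
```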

LessThanObvious • April 24, 2015 5:30 PM

Obscurity is a useful tool. It provides complete security up until the moment it doesn't. Once the obscurity is dissolved it is basically worthless, but it bought you time up to that point. We trash-talk obscurity, but I think the reality is we all use it one way or another. The problems come when we use it as an alternative to actually handling a vulnerability, or when we overestimate our own cleverness. If someone knows what they are looking for, they can predict how other people might go about hiding it.

There are plenty of examples of the effectiveness of obscurity at reducing the likelihood of crimes of opportunity. Not only does obscurity reduce the attention given by people looking for opportunities, it reduces the incidence of people committing crimes they otherwise wouldn't have if they had not stumbled upon an opportunity too easy to pass up.

Clive Robinson • April 24, 2015 6:29 PM

Julie Brill appears to conflate obscurity with obsolescence in the first part of the article, which gives me no great confidence as to the rest of her argument.

Further, she appears to have no comprehension of how a stateless entity like the Internet works around state jurisdictions, which renders her argument moot. She compounds the error by citing the ECJ ruling on the right to have search results on a data subject withheld.

As many here know, it only applied to results served in the EU jurisdictional area; thus searches run via the likes of proxies outside the EU, against search engines likewise outside the EU, still enabled those inside the EU to see the withheld results...

Further, as many here know, the "safe harbour" agreement between the EU and US was and still is a failure, for a multitude of reasons that were well known right from the get-go.

The first sensible step the US should take is to establish that PII is owned by the data subject, not whoever collects the data. However, this is not going to happen, due to the "vested interests of marketing", which is the largest --and about the most useless-- industry in the world.

However, there is a more subtle issue to consider, which is that of "data holes" that can be filled in from other data. That is, if an individual elects to have their data withheld, it creates a hole in other known data from which the missing withheld information can be fairly well approximated. The usual example given is the "total of salaries, for directors" etc.: if there are only two directors and one salary is known, the other can be trivially found.
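The directors' salary hole is just subtraction; with invented figures:

```python
# Published aggregate from an annual filing (figures invented).
total_directors_pay = 520_000  # "total of salaries for directors"
known_director_pay = 300_000   # one of the two directors' pay is public

# With only two directors, the "withheld" salary falls straight out.
withheld_pay = total_directors_pay - known_director_pay
print(withheld_pay)  # 220000
```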

The techniques available to de-anonymise anonymised or missing data are many and varied, but suffice it to say that experience over the past twenty years has shown that anonymised data and data holes are, to a great extent, an illusion.

Thus I feel that Julie Brill's ideas are based on false or impractical assumptions.

Wael • April 25, 2015 2:18 AM

@Clive Robinson,

I didn't know what to make of this discussion. I wonder what Bruce liked about it.

which renders her argument moot

It came across that way to me as well!

The usual example given is the "total of salaries, for directors"

This wouldn't apply in some countries, such as Sweden, where salaries aren't perceived as "private" data. As far as I know, salaries there are public record, available to anyone. I could be wrong, but that's what I heard when I was there...

Clive Robinson • April 25, 2015 8:43 AM

@ Wael,

This wouldn't apply in some countries, such as Sweden...

There you have an advantage over me; I'm not aware of what the Swedish equivalent of the UK's "Companies House" requires in a company's annual filings. However, I suspect the "total" for directors' salaries is recorded somewhere, as it's quite an important benchmark for investors and shareholders, and most western governments track it in some way.

That said, there are many other examples of PII "data holes" that can be filled from other badly anonymised data. One is post/zip/area codes and medical spending against age bands and certain treatments. And this is just with one specific type of database; when you bring in other available databases, such as mobile phone location and credit card records, these can strip the anonymisation quickly and effectively... So much so that it appears PII cannot be anonymised to a level that protects the individual while the resulting data remains of any use for the research etc. that might make collecting it worthwhile.
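A minimal sketch of the linkage attack Clive describes, using invented records: an "anonymised" medical table still carries quasi-identifiers (postcode, age band), and joining it against an identified public dataset on those fields re-identifies any row where the combination is unique.

```python
# Invented example: names stripped from the medical data, but the
# quasi-identifiers (postcode, age band) survive and act as a join key.
medical = [  # "anonymised" release
    {"postcode": "SW1A", "age_band": "40-49", "treatment": "cardiology"},
    {"postcode": "EC2N", "age_band": "30-39", "treatment": "oncology"},
]
electoral_roll = [  # separately available, identified data
    {"name": "A. Smith", "postcode": "SW1A", "age_band": "40-49"},
    {"name": "B. Jones", "postcode": "EC2N", "age_band": "30-39"},
]

def link(anon_rows, id_rows, keys=("postcode", "age_band")):
    """Re-identify anonymised rows whose quasi-identifiers are unique."""
    out = []
    for a in anon_rows:
        matches = [r for r in id_rows if all(r[k] == a[k] for k in keys)]
        if len(matches) == 1:  # unique match => identity recovered
            out.append({"name": matches[0]["name"], **a})
    return out

for row in link(medical, electoral_roll):
    print(row["name"], "->", row["treatment"])
```

With real data the join keys are rarely this clean, but the same idea scales: each extra database shrinks the set of candidates until the "anonymised" row points at one person.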

We have had this debate come up in the UK on a number of occasions, where the Government feels it has the right to sell PII it has forced out of people with legislation etc. to third parties, often outside the UK's jurisdiction and any constraints it might apply.

And it's not just the UK Gov. Take "work perks" given in lieu of salary, such as gym membership or, more pertinently in the UK, health insurance. If you read the form carefully, you find that by signing the health care form you are giving not just the health insurer but any interested parties the right to snoop into any and all of your private information they choose, at any time, and to pass it on to other agencies and organisations -- including, but by no means limited to, not just your medical information but bank details, property records, etc...

It will probably be of no surprise to anyone reading this blog that I don't have, nor have I ever had, work or other health care insurance, and I have no intention of ever taking it out while they ask me to sign away the entirety of my privacy...

Wael • April 25, 2015 10:23 AM

@Clive Robinson,

That said there are many other examples of PII "data holes" that can be filled by other badly anonymized data.

PII data de-anonymization and extrapolation are becoming easier tasks. So-called "metadata" collection serves as an input to data-analytics engines, which makes the "data holes" wide enough to drive a truck through them.

Nick P • April 25, 2015 11:51 AM

It's good that she's promoting privacy-enhancing legislation. However, it's likely to fail because the obscurity concept is too vague. What she describes is the kind of thing dirty lawmakers and lawyers are going to have a field day with. I'd rather have legislation more along the lines of the E.U.'s data protection scheme. Most of their principles are pretty straightforward to understand, even to lay judges. They're also easy to comply with outside the "secure the data" mandate.

Curious • April 25, 2015 12:44 PM

I would think the only really interesting aspect of 'obscurity' with regard to 'privacy' is guaranteed obscurity, the same way 'security' is only interesting insofar as it revolves around guaranteed security.

So a merely pragmatic approach won't do when someone's goal is to de-obscure something.

And so, if one thinks of government initiatives or efforts at collecting data, trying to call the data "obscured" because it isn't looked at makes it obvious that 'privacy' in this sense makes no sense at all, because an individual has no say in how data about his person is handled.

moz • April 27, 2015 2:20 AM

@Clive; I think you aren't reading the word "obsolete" carefully enough.

She isn't saying that obsolete is the same as, or even related to, obscure.

She's saying that non-obsolete data will not be forced to be made obscure by her legislation. In other words, it's a dog-whistle signal to investors in data companies not to be afraid, and to corporate lawyers not to accuse her of interfering in legitimate business interests.

This is a very important part of her future defence of obscurity legislation against attacks from those pushing corporate rights under the US 1st amendment. It's also very useful for understanding what holes there will be in the legislation, and why.

Clive Robinson • April 27, 2015 3:12 AM

@ moz,

In other words it's a dog whistle signal to investors in data companies not to be afraid and to corporate lawyers not to accuse her of interfering in legitimate business interests.

As a non-US resident supposedly under the protection of both national and EU data protection legislation, who has had his PII misappropriated by US Corps in the past, I find that to be of no comfort, as it indicates the same predatory behaviour will continue unrestrained, if not actually be encouraged further by the FTC.

Thus I now have even lower expectations of any constraint being applied at the US end, which is likely to give rise to more ECJ judgments against US Corps...

However, there is a rather nasty "back door" the US is insisting goes into all trade agreements, which allows US Corps to get around such judgments...

Peter Gerdes • April 28, 2015 4:55 PM

Unfortunately, while it would be nice to retain obscurity, we now have a choice between giving everyone access to previously obscure information and reserving that power to governments, large corporations, and the like.

Once information EXISTS on the internet, sufficient computing power and network connections let anyone with those resources uncover it. Thus the choice is merely between ending obscurity in a democratic manner and allowing only the rich and powerful to pull aside the veil of obscurity.
