Research on Balancing Privacy with Surveillance

Interesting research: Michael Kearns, Aaron Roth, Zhiwei Steven Wu, and Grigory Yaroslavtsev, “Private algorithms for the protected in social network search,” PNAS, Jan 2016:

Abstract: Motivated by tensions between data privacy for individual citizens and societal priorities such as counterterrorism and the containment of infectious disease, we introduce a computational model that distinguishes between parties for whom privacy is explicitly protected, and those for whom it is not (the targeted subpopulation). The goal is the development of algorithms that can effectively identify and take action upon members of the targeted subpopulation in a way that minimally compromises the privacy of the protected, while simultaneously limiting the expense of distinguishing members of the two groups via costly mechanisms such as surveillance, background checks, or medical testing. Within this framework, we provide provably privacy-preserving algorithms for targeted search in social networks. These algorithms are natural variants of common graph search methods, and ensure privacy for the protected by the careful injection of noise in the prioritization of potential targets. We validate the utility of our algorithms with extensive computational experiments on two large-scale social network datasets.
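
For intuition, here is a minimal sketch of the kind of algorithm the abstract describes: a frontier search over a social graph in which each candidate's priority is perturbed with Laplace noise before the next costly examination is chosen. This is not the authors' actual algorithm; the function name, the adjacency-dict representation, and the noise scale are illustrative assumptions.

```python
import random

def noisy_targeted_search(adj, seed_targets, is_target, budget, epsilon=1.0):
    """Toy frontier search over an adjacency dict {node: set(neighbors)}.
    Candidates are prioritized by how many already-discovered targets they
    neighbor, with Laplace noise added to each score so the examination
    order leaks less about protected (non-target) individuals."""
    def laplace(scale):
        # The difference of two exponentials is Laplace-distributed.
        return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

    discovered = set(seed_targets)      # known members of the targeted subpopulation
    examined = set(seed_targets)        # nodes on which the costly check was already spent
    frontier = {n for t in discovered for n in adj[t]} - examined

    for _ in range(budget):             # budget = number of costly checks we can afford
        if not frontier:
            break
        # Noisy prioritization: true score + Laplace(1/epsilon) noise.
        v = max(frontier,
                key=lambda v: sum(n in discovered for n in adj[v]) + laplace(1.0 / epsilon))
        frontier.discard(v)
        examined.add(v)                 # costly check: surveillance, background check, test
        if is_target(v):
            discovered.add(v)
            frontier |= adj[v] - examined
    return discovered
```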

Posted on February 24, 2016 at 6:05 AM

Comments

Privacy February 24, 2016 6:42 AM

The primary use of social media (Facebook, Twitter, Instagram, et al.) is to communicate with friends, family and, sometimes, businesses. The nature of social data aggregation means that it is impossible to maintain real privacy, as many services have 'real name' policies and sell data in order to provide the service. Truly anonymous services are used by only a few individuals.

Until there is legislative change, businesses will continue to sell data with impunity, relying upon the implied/explicit consent of their users. We can't rely on money-making organizations to respect privacy.

I'm finding now that more people – especially within the 18-25 age range – are using ephemeral messaging apps like Snapchat. These seem to be a step towards privacy (and reduce the permanency of other social media platforms), and in part I think that's welcome. However, we have market leaders such as WhatsApp that are looking to monetize their service by opening their platform to advertisers/businesses.

Encrypted apps like Signal* (for voice and text) and Telegram* (for text) are used by very few people. In an ideal world everybody would be using such apps, but we're still left with the problem of metadata; there is a Windows/Mac/Linux-based application called Ricochet that tries to eliminate metadata, but I've got no experience of it.

Does anybody know how secure the ‘Secret Chats’ of Telegram are?
Is Signal still considered secure (assuming the operating system isn’t compromised)?

*
https://whispersystems.org/
https://telegram.org/
https://ricochet.im/

jbmartin6 February 24, 2016 8:30 AM

I’m not sure there is a balance. If you are under surveillance, by definition you don’t have privacy.

BoppingAround February 24, 2016 9:03 AM

are using ephemeral messaging apps like Snapchat

Allegedly ephemeral. Message contents can still be stored, whether by some smart hackery or by photographing the screen with another smartphone or camera.

Zuul February 24, 2016 9:12 AM

I’m concerned about the definition of “the targeted subpopulation”. Who decides that definition? If the answer is “law enforcement”, which “law enforcement”? China’s? Saudi Arabia’s? Ferguson’s? The FBI? I don’t trust any government not to abuse such a system.

Behrs February 24, 2016 9:21 AM

"I'm not sure there is a balance."

There is NO balance.

Anyone who speaks of “Balance” between surveillance & privacy … or liberty & security — is guaranteed to favor much less privacy and much less liberty.

The referenced "research" above is Orwellian nonsense, speaking of "targeted subpopulations" for surveillance. We now know with certainty that the U.S. government targets the entire population ("Collect It All!").

The U.S. Fourth Amendment does not specify some vague balance concept permitting general government surveillance, search and seizure against the populace; instead, it specifies strict judicial probable-cause requirements against specifically identified individuals and the specifically identified information/property of those specific persons (not "subpopulations").

But why be concerned with fundamental law when ethereal notions of "Balance" can be introduced to excuse outrageous government conduct?

Johnny Cage February 24, 2016 9:45 AM

Bruce,

Continuing this narrative, check out this informative post on Slashdot.

http://yro.slashdot.org/story/16/02/23/1812257/doj-wants-apple-to-decrypt-12-more-iphones

The Wall Street Journal (paywalled) is reporting that the Department of Justice is seeking Apple's help in decrypting 12 other iPhones that may contain crime-related evidence. The cases are not identified; a list of the 12 phones in question has come out, but it is not known what level of Apple assistance is required (i.e., how many of those cases are waiting on the FBI's request for special firmware to be developed and used on "one more phone"). It appears Tim Cook's assertion that hundreds of requests are waiting on this software may not be a fabrication, and that the goal is not just one phone but a precedent to unlock more phones.

As TechDirt (which also lists those 12 cases, a list which certainly does not encompass all the phones the Feds would like to peer into) puts it, “[O]nce again, Director Comey was flat out lying when he claimed the FBI has no interest in setting a precedent.”

Pizza Cup February 24, 2016 9:51 AM

Johnny Cage – Also interesting to note that 6 of the 12 devices the DOJ wants to unlock contain a Secure Enclave chip. I take Cook at his word on this: there is a HUGE, unseen backlog of iPhones just waiting for a precedent-setting court case that swings the way of Big Brah. The circles never touch in the Venn diagram of Comey's mouth moving and truth.

Sasparilla February 24, 2016 10:28 AM

Um, there's not supposed to be a true balance in this – at least here in the U.S. Our Constitution is written specifically to handicap the government, because the writers knew it could not be trusted.

Great article over on Ars:

http://arstechnica.com/tech-policy/2016/02/snowden-lawyer-bill-of-rights-was-meant-to-make-governments-job-more-difficult/

Favorite quote from it: “the Constitution was written by people who were more worried about a government with too much power than they were about bad guys getting away. There’s no other way to read the Bill of Rights. At every turn it is making the government’s job deliberately more difficult. Not because they hated government, but because they understood it too well.”

Main Core #E19327546 February 24, 2016 11:21 AM

@Sasparilla, thanks for the nostalgic stroll down memory lane! The Constitution, those were the days.

Your constitution’s gone. You’re not getting it back. You live in the United States of COG.

http://www.washingtonsblog.com/2013/06/is-this-the-real-reason-for-the-government-spying-on-americans.html

http://www.christopherketcham.com/wp-content/uploads/2010/02/The%20Last%20Roundup,%20Radar%20Magazine.pdf

Fortunately, you’ve got something much better and harder to revoke: the ICCPR. The COG regime is tying itself in knots trying to excuse their coup d’état to the outside world. When we finally root out the domestic CIA spies and put their heads on sticks, we’ll return to the ICCPR and not to that obsolete and half-assed bill of rights.

DavidFMM February 24, 2016 11:22 AM

[…] the Department of Justice is seeking Apple’s help in decrypting 12 other iPhones that may contain crime-related evidence […]

Ars Technica has posted an interview with Ben Wizner (an ACLU attorney who is helping represent Edward Snowden), who mentions that the FBI is waiting for a win on its request so that it can force Apple to help break into 170 more iPhones.

So much for the statement that this is just an isolated, one-time situation.

Here is the interview:
http://arstechnica.com/tech-policy/2016/02/snowden-lawyer-bill-of-rights-was-meant-to-make-governments-job-more-difficult/

Thoth February 24, 2016 6:33 PM

@Privacy
Tor doesn't give much anonymity anymore. State actors and researchers have become very capable of observing traffic in Tor and attacking it. This automatically invalidates Ricochet's claim of anonymity. In fact, it should be called pseudo-anonymity, as it is not truly anonymous.

How would you benchmark Signal as secure? Do you mean the metadata or the message content? In an ideal environment (secure HW & SW), the message content is most likely secure, but Signal is not designed to mask metadata.

The huge problem with these open-source COMSEC protocols is that most of them rely on Tor for anonymity when it has been shown to be targeted successfully by HSAs (High Strength Attackers – state actors and the like), and almost all of these protocols have headers, are likely weak against traffic analysis (i.e., measurement of message length), and leak metadata.

The more ideal COMSEC protocol should have the following properties:

  • Uses common P2P/F2F networks to chain-broadcast encrypted messages. If everyone is broadcasting messages, there is a many-to-many relation, which makes tracking point-to-point traffic harder.

  • Uses a seemingly common P2P/F2F protocol to disguise the protocol's true nature, so that most observers misidentify it, and so that it can bypass deep packet inspection by firewalls.

  • Messages should not have predictable headers. Most common message formats have indicator headers and flags that are give-away signs of a protocol. To send a secure message, you use the recipient's Public Key to wrap the Message Key, then encrypt and sign the message. The recipient must be forced to use the Private Key to retrieve the Message Key and then verify and decrypt the message as the only proof that it is addressed to them, instead of relying on headers that indicate recipient Public Key hashes and the like, which leak too much information about the target recipient (this is also what makes PGP crypto so leaky). This is especially useful if the common P2P/F2F protocol used as a wrapper doesn't support crypto and the content can be peered into. If the data uses crypto without predictable headers, all an observer sees is a jumble of data; even if the observer collects a million such messages, the pseudo-randomness of headerless crypto makes the traffic impossible to fingerprint, though it also makes the protocol very rigid and inflexible (which can be good and bad).

  • Finally, traffic analysis on message length has always been a headache, and protocol developers have been unwilling to cap the length of each message and send multiple message blocks. Setting a fixed-length block per message helps against traffic analysis. An accompanying method to boost traffic-analysis resistance is to send messages (including dummy messages) constantly, at randomized times, in bulk message blocks (a store-and-forward tactic), the way a postman delivers the day's mail. Store-and-forward bulk messaging can be expensive and impractical on mobile data plans with ever-shrinking bandwidth limits, unless you are OK with storing large message blocks on the device and offloading them onto the network over your home WiFi. (A rough sketch of the headerless, fixed-length sealing idea appears after this list.)
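
Here is a rough sketch of the headerless, fixed-length sealing idea from the last two points, assuming the PyNaCl library. The block size, function names, and the use of sealed boxes are my own illustrative choices, not a finished protocol.

```python
# Requires: pip install pynacl
import struct
from nacl.public import PrivateKey, SealedBox
from nacl.exceptions import CryptoError

BLOCK_SIZE = 4096   # every message occupies the same padded size (hypothetical choice)

def seal(recipient_public_key, plaintext: bytes) -> bytes:
    """Encrypt to a fixed-size, headerless blob: no recipient hash, no flags.
    A 4-byte length prefix plus the plaintext is padded out to BLOCK_SIZE."""
    if len(plaintext) > BLOCK_SIZE - 4:
        raise ValueError("message too long for one block; split into multiple blocks")
    padded = struct.pack(">I", len(plaintext)) + plaintext
    padded += b"\x00" * (BLOCK_SIZE - len(padded))
    return SealedBox(recipient_public_key).encrypt(padded)

def try_open(private_key, blob: bytes):
    """Trial decryption: the only way to learn a blob is yours is to try your key."""
    try:
        padded = SealedBox(private_key).decrypt(blob)
    except CryptoError:
        return None                      # not addressed to us (or corrupted)
    (length,) = struct.unpack(">I", padded[:4])
    return padded[4:4 + length]

# Usage: everyone on the broadcast network runs try_open() on every blob they see.
alice = PrivateKey.generate()
blob = seal(alice.public_key, b"meet at noon")
assert try_open(alice, blob) == b"meet at noon"
assert try_open(PrivateKey.generate(), blob) is None
```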

Ross February 24, 2016 9:38 PM

@Bruce

The framing (you mentioned it yourself in a recent post) CAN'T be "privacy" versus "security"; if you frame it that way, you've already lost.

The “Security” being given requires the state to track and curate the spread of ideas – to magnify the ‘good’ ones and to gut the ‘bad’ ones.

It isn’t then a ‘privacy’ issue, but a freedom and liberty issue.

Please use the terminology privacy and liberty.

Mark Mayer February 24, 2016 11:02 PM

@Ross

I think Bruce was just using the framing used in the paper to make an accurate headline. Bruce’s stance should be pretty well known by now and I don’t think he needs to communicate that stance every time he blogs about some research paper.

That said, I certainly agree with you that the current big issue is security vs. surveillance (or, as I like to put it, security vs. making things easier for the FBI). The current mass-media framing is privacy vs. security, which is a false dichotomy because no security is being provided, although privacy vs. surveillance does seem to be a genuine issue. (A large part of privacy vs. surveillance can be subsumed under security vs. surveillance, imho.)

Security February 25, 2016 12:39 AM

So who exactly is “targeted” to not have BASIC HUMAN RIGHTS??? Prisoners? Slaves? Who exactly???? um… everyone?

JdL February 25, 2016 7:07 AM

…we introduce a computational model that distinguishes between parties for whom privacy is explicitly protected, and those for whom it is not (the targeted subpopulation).

It’s not hard to guess who the two populations will be: government thugs will be protected and the rest of us will be targeted.

Bilbo February 25, 2016 8:40 AM

Message contents can still be stored, whether by some smart hackery or by photographing the screen with another smartphone or camera.

Teenagers have already figured out a way. It's clever, but no "hackery": take a screenshot while in airplane mode.

Max Shron February 25, 2016 10:09 AM

Differential privacy is a great idea, but it suffers from the interesting side effect that the noise has to go up the more questions you ask. It is, in some sense, a "self-destroying" window onto a data set. As time goes on and the data set changes, you can bring the noise back down, but it puts a pretty sharp limit on how often you can access the data.

The upside is that it forces you to be very careful about how often you get to ask questions. The downside is that it's a handicap that courts would probably not like.
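
To make the "self-destroying window" concrete, here is a toy sketch of basic sequential composition: each Laplace-noised query spends part of a fixed epsilon budget, and once the budget is spent the data set can't be queried again until it is refreshed. The class name and the numbers are illustrative, not any standard library's API.

```python
import random

class PrivacyBudget:
    """Toy epsilon accountant: each noisy query spends part of a fixed budget,
    and once the budget is gone the data set can no longer be queried."""
    def __init__(self, total_epsilon: float):
        self.remaining = total_epsilon

    def noisy_count(self, true_count: float, epsilon: float, sensitivity: float = 1.0):
        if epsilon > self.remaining:
            raise RuntimeError("privacy budget exhausted; wait for fresh data")
        self.remaining -= epsilon
        scale = sensitivity / epsilon
        # Difference of two exponentials gives Laplace(0, scale) noise.
        noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
        return true_count + noise

budget = PrivacyBudget(total_epsilon=1.0)
print(budget.noisy_count(4213, epsilon=0.1))   # small epsilon: noisier answer, cheaper
print(budget.noisy_count(4213, epsilon=0.5))   # larger epsilon: cleaner answer, costlier
```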
