June 15, 2016
by Bruce Schneier
CTO, Resilient, an IBM Company
schneier@schneier.com
https://www.schneier.com
A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.
For back issues, or to subscribe, visit <https://www.schneier.com/crypto-gram.html>.
You can read this issue on the web at <https://www.schneier.com/crypto-gram/archives/2016/…>. These same essays and news items appear in the “Schneier on Security” blog at <http://www.schneier.com/>, along with a lively and intelligent comment section. An RSS feed is available.
In this issue:
- The Unfalsifiability of Security Claims
- Arresting People for Walking Away from Airport Security
- News
- Suckfly
- Schneier News
- Google Moving Forward on Automatic Logins
- Security and Human Behavior (SHB 2016)
The Unfalsifiability of Security Claims
Interesting research paper: Cormac Herley, “Unfalsifiability of security claims”:
There is an inherent asymmetry in computer security: things can be declared insecure by observation, but not the reverse. There is no observation that allows us to declare an arbitrary system or technique secure. We show that this implies that claims of necessary conditions for security (and sufficient conditions for insecurity) are unfalsifiable. This in turn implies an asymmetry in self-correction: while the claim that countermeasures are sufficient is always subject to correction, the claim that they are necessary is not. Thus, the response to new information can only be to ratchet upward: newly observed or speculated attack capabilities can argue a countermeasure in, but no possible observation argues one out. Further, when justifications are unfalsifiable, deciding the relative importance of defensive measures reduces to a subjective comparison of assumptions. Relying on such claims is the source of two problems: once we go wrong we stay wrong and errors accumulate, and we have no systematic way to rank or prioritize measures.
This is both true and not true.
Mostly, it’s true. It’s true in cryptography, where we can never say that an algorithm is secure. We can either show how it’s insecure, or say something like: all of these smart people have spent lots of hours trying to break it, and they can’t—but we don’t know what a smarter person who spends even more hours analyzing it will come up with. It’s true in things like airport security, where we can easily point out insecurities but are unable to similarly demonstrate that some measures are unnecessary. And this does lead to a ratcheting up on security, in the absence of constraints like budget or processing speed. It’s easier to demand that everyone take off their shoes for special screening, or that we add another four rounds to the cipher, than to argue the reverse.
But it’s not entirely true. It’s difficult, but we can analyze the cost-effectiveness of different security measures. We can compare them with each other. We can make estimations and decisions and optimizations. It’s just not easy, and often it’s more of an art than a science. But all is not lost.
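As a minimal illustration of what such an analysis can look like, here is a sketch that ranks hypothetical countermeasures by expected annual loss avoided per dollar spent. Every number in it is invented; the point is the structure of the comparison, not the values.

    # Toy cost-effectiveness comparison of security countermeasures.
    # All numbers are invented for illustration; a real analysis needs
    # defensible estimates of attack probability, impact, and efficacy.

    measures = [
        # (name, annual cost, attack probability per year,
        #  loss if an attack succeeds, fraction of that risk removed)
        ("extra screening step",    900_000, 0.02,  50_000_000, 0.10),
        ("four more cipher rounds",   5_000, 0.001, 10_000_000, 0.01),
        ("staff training",           50_000, 0.20,   2_000_000, 0.30),
    ]

    def value_per_dollar(cost, p_attack, loss, risk_reduction):
        """Expected annual loss avoided per dollar of annual cost."""
        return (p_attack * loss * risk_reduction) / cost

    for name, cost, p, loss, reduction in measures:
        print(f"{name:24s} {value_per_dollar(cost, p, loss, reduction):5.2f} "
              "dollars of expected loss avoided per dollar spent")

Once the assumptions are written down like this, they can be argued with and revised, which is exactly the kind of self-correction the paper says unfalsifiable claims lack.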
Still, a very good paper and one worth reading.
Blog entry URL:
https://www.schneier.com/blog/archives/2016/05/…
Unfalsifiability of security claims:
http://research.microsoft.com/pubs/256133/…
Arresting People for Walking Away from Airport Security
A proposed law in Albany County, NY, would make it a crime to walk away from airport screening.
Aside from wondering why county lawmakers are getting involved with what should be national policy, you have to ask: what are these people thinking?
They're thinking in stories, of course. They have a movie plot in their heads, and they are imagining how this measure solves it.
The law is intended to cover what [Albany County Sheriff Craig] Apple described as a soft spot in the current system that allows passengers to walk away without boarding their flights if security staff flags them for additional scrutiny.
That could include would-be terrorists probing for weaknesses, Apple said, adding that his deputies currently have no legal grounds to question such a person.
Does anyone have any idea what stories these people have in their heads? What sorts of security weaknesses are exposed by walking up to airport security and then walking away?
http://www.timesunion.com/local/article/…
News
Here's an interesting case of doctored urine-test samples from the Sochi Olympics. Evidence points to someone defeating the tamper resistance of the bottles: figuring out how to open them, swap out the liquid, and replace the caps without leaving any visible signs of tampering.
http://www.nytimes.com/2016/05/14/sports/…
http://www.nytimes.com/interactive/2016/05/13/…
At the last match of the year for Manchester United, someone found what appeared to be a bomb in a toilet, and security evacuated all 75,000 people and canceled the match. It turned out to be a fake bomb left behind after a recent training exercise.
https://www.theguardian.com/football/2016/may/15/…
The Intercept is starting to publish a lot more documents from the Snowden archives. Last month, it published the first year of an internal newsletter called SIDtoday, along with several articles based on the documents.
https://theintercept.com/snowden-sidtoday/
It’s also making the archive available to more researchers.
https://theintercept.com/2016/05/16/…
Economists argue that the differing security needs of various crops explain differences in the size of the civilizations that grew them.
https://www.washingtonpost.com/news/wonk/wp/2016/04/…
Jonathan Mayer, Patrick Mutchler, and John C. Mitchell, “Evaluating the privacy properties of telephone metadata.”
http://www.pnas.org/content/113/20/5536.full
New research, but not a new result. There have been several similar studies over the years. This one uses only anonymized call and SMS metadata to identify people who volunteered for the study.
https://news.stanford.edu/2016/05/16/…
http://techcrunch.com/2016/05/17/…
http://www.dailydot.com/politics/…
https://www.theguardian.com/science/2016/may/16/…
Really interesting article on the difficulties involved with explosive detection at airport security checkpoints.
https://www.ctc.usma.edu/posts/…
I disagree with the author's conclusion that more explosive-detection technology is needed in more places in society, but the technical information on how explosives detection works is fascinating.
Really interesting research: “Online tracking: A 1-million-site measurement and analysis,” by Steven Englehardt and Arvind Narayanan:
http://randomwalker.info/publications/…
https://freedom-to-tinker.com//englehardt/…
GCHQ discloses two OS X vulnerabilities to Apple:
http://www.scmagazine.com/…
https://support.apple.com/en-us/HT206567
Good debate in the Wall Street Journal on whether you should be allowed to prevent drones from flying over your property. This isn’t an obvious one; there are good arguments on both sides.
http://www.wsj.com/articles/…
There's a new trend among Silicon Valley startups: not collecting and saving data on their customers. I believe that all this data isn't nearly as valuable as the big-data people are promising. Now that companies are recognizing that it is also a liability, I think we're going to see more rational trade-offs about what to keep, for how long, and what to discard.
https://www.washingtonpost.com/news/the-switch/wp/…
The Skein hash function is now part of FreeBSD.
https://reviews.freebsd.org/D6166
People can be identified from their driving patterns, using data from their vehicles' internal computer networks. The paper: "Automobile Driver Fingerprinting," by Miro Enev, Alex Takakuwa, Karl Koscher, and Tadayoshi Kohno.
https://www.wired.com/2016/05/…
http://www.autosec.org/pubs/fingerprint.pdf
This is a good summary article on the fallibility of DNA evidence. Most interesting to me are the parts on the proprietary algorithms used in DNA matching. It's the same problem as with any biometric: we need to know the rates of both false positives and false negatives. And if these algorithms are being used to determine guilt, we have a right to examine them. (A quick back-of-the-envelope illustration of why those rates matter follows the links below.)
http://www.theatlantic.com/magazine/archive/2016/06/…
http://lawprofessors.typepad.com/evidenceprof/2016/…
http://www.theatlantic.com/science/archive/2015/10/…
http://www.texasmonthly.com/articles/false-impressions/
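To make the false-positive concern in the DNA item above concrete, here is a quick back-of-the-envelope calculation. The error rate and database size are made up; no real DNA statistics are implied.

    # Base-rate arithmetic for a match found by trawling a large database.
    # Both numbers below are invented for illustration.

    false_positive_rate = 1e-6    # chance an unrelated profile "matches"
    database_size = 5_000_000     # profiles searched

    expected_coincidental_matches = false_positive_rate * database_size
    print(f"expected coincidental matches: {expected_coincidental_matches:.1f}")

    # Even a one-in-a-million false-positive rate produces about five
    # innocent "hits" in a five-million-profile trawl, which is why a
    # match alone is weak evidence unless you know both error rates.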
Stealth Falcon: new malware from (probably) the UAE:
https://citizenlab.org/2016/05/stealth-falcon/
http://www.nytimes.com/2016/05/30/technology/…
Lockpicking competitions in the 1850s.
https://muse.jhu.edu/article/597409/pdf
There’s a new piece of malware called Irongate, which is obviously inspired by Stuxnet. We don’t know who is responsible for it.
https://www.fireeye.com//threat-research/2016/…
https://motherboard.vice.com/read/…
http://www.darkreading.com/threat-intelligence/…
https://it.slashdot.org/story/16/06/02/2016208/…
There's a new report on security vulnerabilities in the software updaters that PC vendors preinstall on their machines, which an attacker can hijack to install malware. (A sketch of the authentication step these updaters are missing follows the links below.)
https://duo.com//…
https://duo.com/assets/pdf/…
https://www.wired.com/2016/05/2036876/
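The common flaw in these updaters is fetching and running code without authenticating it. Here is a minimal, hypothetical sketch of the missing step, verifying a downloaded update against a public key pinned in the updater; the file names and key handling are illustrative, not any vendor's actual mechanism.

    # Sketch of an authenticated update check: verify a detached Ed25519
    # signature over the downloaded package before installing anything.
    # File names and key distribution are hypothetical placeholders.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    def verify_update(package_path, signature_path, pubkey_path="update_pubkey.bin"):
        with open(pubkey_path, "rb") as f:
            pubkey = Ed25519PublicKey.from_public_bytes(f.read())  # 32 raw bytes
        with open(package_path, "rb") as f:
            package = f.read()
        with open(signature_path, "rb") as f:
            signature = f.read()
        try:
            pubkey.verify(signature, package)  # raises InvalidSignature on mismatch
            return True
        except InvalidSignature:
            return False

    # Install the package only if verify_update(...) returns True;
    # otherwise discard the download and alert the user.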
New paper: "Physical Key Extraction Attacks on PCs," by Daniel Genkin, Lev Pachmanov, Itamar Pipman, Adi Shamir, and Eran Tromer. Among other attacks, they recover keys acoustically, from the high-frequency "coil whine" given off by the circuits, at a distance of about ten meters.
http://m.cacm.acm.org/magazines/2016/6/…
http://www.theregister.co.uk/2016/06/04/…
This is a long article, based on documents obtained via FOIA, about Edward Snowden’s attempts to raise his concerns inside the NSA.
https://news.vice.com/article/…
Really good investigative reporting on the algorithms used to predict recidivism.
https://www.propublica.org/article/…
People who don’t want Waze routing cars through their neighborhoods are feeding it false data. Interesting story of data poisoning in real life, and the security measures to detect and discount it.
https://www.washingtonpost.com/local/…
This interesting essay argues that cyber risks to the financial system are generally not systemic risks, and are instead much smaller. That's certainly been our experience to date.
http://voxeu.org/article/cyber-risk-systemic-risk
There's an interesting detail in the documents about Snowden that Vice magazine got out of the NSA with a FOIA request: at least as of 2012, the NSA was using Word macros internally.
https://motherboard.vice.com/read/…
https://www.lawfareblog.com/nsas-word-problem
http://www1.icsi.berkeley.edu/~nweaver/…
The Washington Post is reporting that Russian hackers penetrated the network of the Democratic National Committee and stole opposition research on Donald Trump. The evidence comes from CrowdStrike. This seems like standard political espionage to me. We certainly don't want it to happen, but we shouldn't be surprised when it does.
https://www.washingtonpost.com/world/…
http://www.politico.com/story/2016/06/…
http://arstechnica.com/security/2016/06/…
https://politics.slashdot.org/story/16/06/14/…
Typosquatting is the old trick of registering a domain name one typo away from a popular one and using it for various nefarious purposes. Nikolai Philipp Tschacher just published a bachelor's thesis in which he applies the same trick to the names of popular code libraries, tricking 17,000 computers into running his arbitrary code. (A sketch of a simple defensive name check follows the links below.)
http://incolumitas.com/2016/06/08/…
http://incolumitas.com/data/thesis.pdf
http://arstechnica.com/security/2016/06/…
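On the consumer side, one simple mitigation is to flag a requested package name that is within a small edit distance of a much more popular one. Here is a hedged sketch; the list of "popular" names is illustrative, and real tooling would pull names and download counts from the registry itself.

    # Toy typosquatting check: warn when a requested package name is close
    # to a well-known package but is not actually that package.
    # The POPULAR set below is illustrative, not a real registry snapshot.

    POPULAR = {"requests", "numpy", "django", "flask", "cryptography"}

    def edit_distance(a, b):
        """Levenshtein distance via the classic dynamic program."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1,                  # deletion
                               cur[j - 1] + 1,               # insertion
                               prev[j - 1] + (ca != cb)))    # substitution
            prev = cur
        return prev[-1]

    def looks_like_typosquat(name, max_distance=2):
        if name in POPULAR:
            return False
        return any(edit_distance(name, p) <= max_distance for p in POPULAR)

    print(looks_like_typosquat("reqeusts"))   # True: two edits from "requests"
    print(looks_like_typosquat("requests"))   # False: it is the real package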
Suckfly
Suckfly seems to be another Chinese nation-state espionage tool, first stealing South Korean certificates and now attacking Indian networks.
Symantec has done a good job of explaining how Suckfly works, and there’s a lot of good detail in the blog posts. My only complaint is its reluctance to disclose who the targets are. It doesn’t name the South Korean companies whose certificates were stolen, and it doesn’t name the Indian companies that were hacked.
My guess is that Symantec can’t disclose those names, because those are all customers and Symantec has confidentiality obligations towards them. But by leaving this information out, Symantec is harming us all. We have to make decisions on the Internet all the time about who to trust and who to rely on. The more information we have, the better we can make those decisions. And the more companies are publicly called out when their security fails, the more they will try to make security better.
Symantec's motivation in releasing information about Suckfly is marketing, and that's fine. In publicizing the research, its interests and the interests of the research community are aligned. But in withholding the victims' names, the interests diverge, and that is where mandatory disclosure laws have value.
http://www.symantec.com/connect/s/…
http://www.symantec.com/connect/s/…
Schneier News
I'm speaking at the CSISAC Forum in Cancun, Mexico, on June 21.
http://csisac.org/events/cancun16/
I spoke at InfoSec Europe in June, and there were several press articles:
http://www.infosecurity-magazine.com/news/…
http://www.theregister.co.uk/2016/06/10/…
http://www.computerweekly.com/news/450298210/…
http://www.v3.co.uk/v3-uk/news/2460942/…
http://linkis.com/techweekeurope.co.uk/kj7Zs
Google Moving Forward on Automatic Logins
Google is trying to bring this automatic-login technology to Android developers by the end of the year:
Today, secure logins—like those used by banks or in the enterprise environment—often require more than just a username and password. They tend to also require the entry of a unique PIN, which is generally sent to your phone via SMS or emailed. This is commonly referred to as two-factor authentication, as it combines something you know (your password) with something you have in your possession, like your phone.
With Project Abacus, users would instead unlock devices or sign into applications based on a cumulative “Trust Score.” This score would be calculated using a variety of factors, including your typing patterns, current location, speed and voice patterns, facial recognition, and other things.
Basically, the system replaces traditional authentication—something you know, have, or are—with surveillance. So maybe this is a good idea, and maybe it isn’t. The devil is in the details.
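Public details about Abacus are thin, so this is only a generic sketch of the idea: several continuously collected signals, each scored for how well it matches the device's owner, are combined into one number and compared against a per-application threshold. The signal names, weights, and thresholds below are invented, not Google's actual design.

    # Generic sketch of threshold-based "trust score" authentication.
    # Signals, weights, and thresholds are invented for illustration.

    SIGNAL_WEIGHTS = {
        "typing_pattern_match": 0.30,
        "location_familiarity": 0.20,
        "voice_match":          0.25,
        "face_match":           0.25,
    }

    # Each signal is a confidence in [0, 1] that the current user is the owner.
    current_signals = {
        "typing_pattern_match": 0.9,
        "location_familiarity": 1.0,
        "voice_match":          0.7,
        "face_match":           0.0,   # e.g., the camera is not in use right now
    }

    def trust_score(signals):
        return sum(SIGNAL_WEIGHTS[name] * value for name, value in signals.items())

    # Higher-risk applications demand a higher score.
    THRESHOLDS = {"banking_app": 0.85, "game": 0.40}

    score = trust_score(current_signals)
    for app, threshold in THRESHOLDS.items():
        decision = "allow" if score >= threshold else "require stronger authentication"
        print(f"{app}: score={score:.2f}, threshold={threshold:.2f} -> {decision}")

Whether this is an improvement depends entirely on how those signals are collected and how hard they are to spoof.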
It’s being called creepy. But, as we’ve repeatedly learned, creepy is subjective. What’s creepy now is perfectly normal two years later.
http://techcrunch.com/2016/05/23/…
http://www.cnet.com/news/…
http://www.engadget.com/2016/01/15/…
Security and Human Behavior (SHB 2016)
Earlier this month, I was at the ninth Workshop on Security and Human Behavior, hosted at Harvard University.
SHB is a small invitational gathering of people studying various aspects of the human side of security. The fifty or so people in the room include psychologists, economists, computer security researchers, sociologists, political scientists, philosophers, neuroscientists, lawyers, anthropologists, business school professors, and a smattering of others. It's not just an interdisciplinary event; most of the people here are individually interdisciplinary.
These are the most intellectually stimulating two days of my year; this year someone called it “Bruce’s brain in conference form.”
The goal is maximum interaction and discussion. We do that by putting everyone on panels. There are eight six-person panels over the course of the two days. Everyone gets to talk for ten minutes about their work, and then there’s half an hour of discussion in the room. Then there are lunches, dinners, and receptions—all designed so people meet each other and talk.
Workshop on Security and Human Behavior:
https://www.schneier.com/shb2016/
This page lists the participants and gives links to some of their work.
https://www.schneier.com/shb2016/participants/
As usual, Ross Anderson liveblogged the talks.
https://www.lightbluetouchpaper.org/2016/05/31/…
Here are my posts on previous SHB workshops. Follow those links to find summaries, papers, and audio recordings of the workshops.
https://www.schneier.com/blog/archives/2008/06/…
https://www.schneier.com/blog/archives/2009/06/…
https://www.schneier.com/blog/archives/2010/06/…
https://www.schneier.com/blog/archives/2011/06/…
https://www.schneier.com/blog/archives/2012/06/…
https://www.schneier.com/blog/archives/2013/06/…
https://www.schneier.com/blog/archives/2014/06/…
https://www.schneier.com/blog/archives/2015/06/…
Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <https://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.
Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.
CRYPTO-GRAM is written by Bruce Schneier. Bruce Schneier is an internationally renowned security technologist, called a “security guru” by The Economist. He is the author of 13 books—including his latest, “Data and Goliath”—as well as hundreds of articles, essays, and academic papers. His influential newsletter “Crypto-Gram” and his blog “Schneier on Security” are read by over 250,000 people. He has testified before Congress, is a frequent guest on television and radio, has served on several government committees, and is regularly quoted in the press. Schneier is a fellow at the Berkman Center for Internet and Society at Harvard Law School, a program fellow at the New America Foundation’s Open Technology Institute, a board member of the Electronic Frontier Foundation, an Advisory Board Member of the Electronic Privacy Information Center, and the Chief Technology Officer at Resilient, an IBM Company. See <https://www.schneier.com>.
Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of Resilient, an IBM Company.
Copyright (c) 2016 by Bruce Schneier.