Roger Grimes has written an interesting paper: "Implementing a Data-Driven Computer Security Defense." His thesis is that most organizations don't match their defenses to the actual risks. His paper explains how it got to be this way, and how to fix it.
Newly declassified: "A History of U.S. Communications Security (Volumes I and II)," the David G. Boak Lectures, National Security Agency (NSA), 1973. (The document was initially declassified in 2008. We just got a whole bunch of additional material declassified. Both versions are in the document, so you can compare and see what was kept secret seven years ago.)
In 2001, the Bush administration authorized -- almost certainly illegally -- the NSA to conduct bulk electronic surveillance on Americans: phone calls, e-mails, financial information, and so on. We learned a lot about the bulk phone metadata collection program from the documents provided by Edward Snowden, and it was the focus of debate surrounding the USA FREEDOM Act. E-mail metadata surveillance, however, wasn't part of that law. We learned the name of the program -- STELLAR WIND -- when it was leaked in 2004. But supposedly the NSA stopped collecting that data in 2011, because it wasn't cost-effective.
"The internet metadata collection program authorized by the FISA court was discontinued in 2011 for operational and resource reasons and has not been restarted," Shawn Turner, the Obama administration's director of communications for National Intelligence, said in a statement to the Guardian.
When Turner said that in 2013, we knew from the Snowden documents that the NSA was still collecting some Americans' Internet metadata from communications links between the US and abroad. Now we have more proof. It turns out that the NSA never stopped collecting e-mail metadata on Americans. They just cancelled one particular program and changed the legal authority under which they collected it.
The report explained that there were two other legal ways to get such data. One was the collection of bulk data that had been gathered in other countries, where the N.S.A.'s activities are largely not subject to regulation by the Foreign Intelligence Surveillance Act and oversight by the intelligence court.
The N.S.A. had long barred analysts from using Americans' data that had been swept up abroad, but in November 2010 it changed that rule, documents leaked by Edward J. Snowden have shown. The inspector general report cited that change to the N.S.A.'s internal procedures.
The other replacement source for the data was collection under the FISA Amendments Act of 2008, which permits warrantless surveillance on domestic soil that targets specific noncitizens abroad, including their new or stored emails to or from Americans.
In Data and Goliath, I wrote:
Some members of Congress are trying to impose limits on the NSA, and some of their proposals have real teeth and might make a difference. Even so, I don't have any hope of meaningful congressional reform right now, because all of the proposals focus on specific programs and authorities: the telephone metadata collection program under Section 215, bulk records collection under Section 702, and so on. It's a piecemeal approach that can't work. We are now beyond the stage where simple legal interventions can make a difference. There's just too much secrecy, and too much shifting of programs amongst different legal justifications.
The NSA continually plays this shell game with Congressional overseers. Whenever an intelligence-community official testifies that something is not being done under this particular program, or this particular authority, you can be sure that it's being done under some other program or some other authority. In particular, the NSA regularly uses rules that allow them to conduct bulk surveillance outside the US -- rules that largely evade both Congressional and Judicial oversight -- to conduct bulk surveillance on Americans. Effective oversight of the NSA is impossible in the face of this level of misdirection and deception.
In 2013, in the early days of the Snowden leaks, Harvard Law School professor and former Assistant Attorney General Jack Goldsmith reflected on the increase in NSA surveillance post-9/11. He wrote:
Two important lessons of the last dozen years are (1) the government will increase its powers to meet the national security threat fully (because the People demand it), and (2) the enhanced powers will be accompanied by novel systems of review and transparency that seem to those in the Executive branch to be intrusive and antagonistic to the traditional national security mission, but that in the end are key legitimating factors for the expanded authorities.
Goldsmith is right, and I think about this quote as I read news articles about surveillance policies with headlines like "Political winds shifting on surveillance after Paris attacks?"
The politics of surveillance are the politics of fear. As long as the people are afraid of terrorism -- regardless of how realistic their fears are -- they will demand that the government keep them safe. And if the government can convince them that it needs this or that power in order to keep the people safe, the people will willingly grant them those powers. That's Goldsmith's first point.
Today, in the wake of the horrific and devastating Paris terror attacks, we're at a pivotal moment. People are scared, and already Western governments are lining up to authorize more invasive surveillance powers. The US wants to back-door encryption products in some vain hope that the bad guys are 1) naive enough to use those products for their own communications instead of more secure ones, and 2) too stupid to use the back doors against the rest of us. The UK is trying to rush the passage of legislation that legalizes a whole bunch of surveillance activities that GCHQ has already been doing to its own citizens. France just gave its police a bunch of new powers. It doesn't matter that mass surveillance isn't an effective anti-terrorist tool: a scared populace wants to be reassured.
And politicians want to reassure. It's smart politics to exaggerate the threat. It's smart politics to do something, even if that something isn't effective at mitigating the threat. The surveillance apparatus has the ear of the politicians, and the primary tool in its box is more surveillance. There's minimal political will to push back on those ideas, especially when people are scared.
...the officials of that security state have bet the farm on the preeminence of the terrorist 'threat,' which has, not so surprisingly, left them eerily reliant on the Islamic State and other such organizations for the perpetuation of their way of life, their career opportunities, their growing powers, and their relative freedom to infringe on basic rights, as well as for that comfortably all-embracing blanket of secrecy that envelops their activities.
Goldsmith's second point is more subtle: when these power increases are made in public, they're legitimized through bureaucracy. Together, the scared populace and their scared elected officials serve to make the expanded national security and law enforcement powers normal.
Terrorism is singularly designed to push our fear buttons in ways completely out of proportion to the actual threat. And as long as people are scared of terrorism, they'll give their governments all sorts of new powers of surveillance, arrest, detention, and so on, regardless of whether those powers actually combat the threat. This means that those who want those powers need a steady stream of terrorist attacks to enact their agenda. It's not that these people are actively rooting for the terrorists, but they know a good opportunity when they see it.
Although "the legislative environment is very hostile today," the intelligence community's top lawyer, Robert S. Litt, said to colleagues in an August e-mail, which was obtained by The Post, "it could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement."
The Paris attacks could very well be that event.
I am very worried that the Obama administration has already secretly told the NSA to increase its surveillance inside the US. And I am worried that there will be new legislation legitimizing that surveillance and granting other invasive powers to law enforcement. As Goldsmith says, these powers will be accompanied by novel systems of review and transparency. But I have no faith that those systems will be effective in limiting abuse any more than they have been over the last couple of decades.
There hasn't been that much written about surveillance and big data being used to manipulate voters. In Data and Goliath, I wrote:
Unique harms can arise from the use of surveillance data in politics. Election politics is very much a type of marketing, and politicians are starting to use personalized marketing's capability to discriminate as a way to track voting patterns and better "sell" a candidate or policy position. Candidates and advocacy groups can create ads and fund-raising appeals targeted to particular categories: people who earn more than $100,000 a year, gun owners, people who have read news articles on one side of a particular issue, unemployed veterans...anything you can think of. They can target outraged ads to one group of people, and thoughtful policy-based ads to another. They can also fine-tune their get-out-the-vote campaigns on Election Day, and more efficiently gerrymander districts between elections. Such use of data will likely have fundamental effects on democracy and voting.
A new research paper looks at the trends:
Abstract: This paper surveys the various voter surveillance practices recently observed in the United States, assesses the extent to which they have been adopted in other democratic countries, and discusses the broad implications for privacy and democracy. Four broad trends are discussed: the move from voter management databases to integrated voter management platforms; the shift from mass-messaging to micro-targeting employing personal data from commercial data brokerage firms; the analysis of social media and the social graph; and the decentralization of data to local campaigns through mobile applications. The de-alignment of the electorate in most Western societies has placed pressures on parties to target voters outside their traditional bases, and to find new, cheaper, and potentially more intrusive, ways to influence their political behavior. This paper builds on previous research to consider the theoretical tensions between concerns for excessive surveillance, and the broad democratic responsibility of parties to mobilize voters and increase political engagement. These issues have been insufficiently studied in the surveillance literature. They are not just confined to the privacy of the individual voter, but relate to broader dynamics in democratic politics.
Divers are counting them:
Squid gather and mate with as many partners as possible, then die, in an annual ritual off Rapid Head on the Fleurieu Peninsula, south of Adelaide.
Department of Environment divers will check the waters and gather data on how many eggs are left by the spawning squid.
No word on how many are expected. Ten? Ten billion? I have no idea.
As usual, you can also use this squid post to talk about the security stories in the news that I haven't covered.
Reputation is a social mechanism by which we come to trust one another, in all aspects of our society. I see it as a security mechanism. The promise and threat of a change in reputation entices us all to be trustworthy, which in turn enables others to trust us. In a very real sense, reputation enables friendships, commerce, and everything else we do in society. It's old, older than our species, and we are finely tuned to both perceive and remember reputation information, and broadcast it to others.
The nature of how we manage reputation has changed in the past couple of decades, and Gloria Origgi alludes to the change in her remarks. Reputation now involves technology. Feedback and review systems, whether they be eBay rankings, Amazon reviews, or Uber ratings, are reputational systems. So is Google PageRank. Our reputations are, at least in part, based on what we say on social networking sites like Facebook and Twitter. Basically, what were wholly social systems have become socio-technical systems.
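PageRank is a good illustration of reputation-as-algorithm: a page's score is computed from the scores of the pages that link to it, so reputation is defined recursively in terms of other reputations. Here's a minimal power-iteration sketch of the idea -- a toy model with a made-up three-page graph, not Google's actual (far more elaborate) implementation:

```python
# Toy PageRank via power iteration. Each node's score is built from
# the scores of the nodes linking to it -- reputation computed from
# reputation. The graph below is invented for illustration.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each node to the list of nodes it links to."""
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}       # start uniform
    for _ in range(iterations):
        # every node keeps a small "teleport" baseline...
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, outlinks in links.items():
            if outlinks:
                # ...and passes the rest of its score to pages it links to
                share = damping * rank[node] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # dangling node: spread its score evenly over everyone
                for target in nodes:
                    new_rank[target] += damping * rank[node] / n
        rank = new_rank
    return rank

# Page "a" is linked to by both other pages, so it ends up with the
# highest score; "c", which nothing links to, ends up with the lowest.
scores = pagerank({"a": ["b"], "b": ["a"], "c": ["a"]})
```

The damping factor and the teleport term are what keep the scores from collapsing into loops -- without them, a closed cycle of mutually linking pages would soak up all the reputation in the system.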
This change is important, for both the good and the bad of what it allows.
An example might make this clearer. In a small town, everyone knows each other, and lenders can make decisions about whom to loan money to, based on reputation (like in the movie It's a Wonderful Life). The system isn't perfect; it is prone to "old-boy network" preferences and discrimination against outsiders. The real problem, though, is that the system doesn't scale. To enable lending on a larger scale, we replaced personal reputation with a technological system: credit reports and scores. They work well, and allow us to borrow money from strangers halfway across the country -- and lending has exploded in our society, in part because of it. But the new system can be attacked technologically. Someone could hack the credit bureau's database and enhance her reputation by boosting her credit score. Or she could steal someone else's reputation. All sorts of attacks that just weren't possible against a wholly personal reputation system become possible against one that rests on technology.
We like socio-technical systems of reputation because they empower us in so many ways. People can achieve a level of fame and notoriety much more easily on the Internet. Totally new ways of making a living -- think of Uber and Airbnb, or popular bloggers and YouTubers -- become possible. But the downsides are considerable. The hacker tactic of social engineering involves fooling someone by hijacking the reputation of someone else. Most social media companies make their money leeching off our activities on their sites. And because we trust the reputational information from these socio-technical systems, anyone who can figure out how to game those systems can artificially boost their reputation. Amazon, eBay, Yelp, and others have been trying to deal with fake reviews for years. And you can buy Twitter followers and Facebook likes cheap.
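To see how cheaply a naive reputational system can be gamed, consider a toy average-rating aggregator. The numbers are purely illustrative -- this is not any real site's algorithm -- but the arithmetic shows why purchased reviews are such an attractive attack:

```python
# Illustrative only: a naive average-rating system and how fake
# reviews shift it. All ratings here are invented for the example.

def average_rating(ratings):
    """The simplest possible reputation score: the arithmetic mean."""
    return sum(ratings) / len(ratings)

genuine = [3, 4, 2, 3, 3]    # a mediocre seller's real customer ratings
fake = [5] * 20              # a cheap batch of purchased five-star reviews

honest_score = average_rating(genuine)          # 3.0
gamed_score = average_rating(genuine + fake)    # 4.6
```

Twenty fake reviews move a 3.0-star seller most of the way to a perfect score, which is why real platforms weight reviews by reviewer history, purchase verification, and fraud signals rather than taking a bare mean -- and why the arms race between enhancement and detection never ends.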
Reputation has always been gamed. It's been an eternal arms race between those trying to artificially enhance their reputation and those trying to detect those enhancements. In that respect, nothing is new here. But technology changes the mechanisms of both enhancement and enhancement detection. There's power to be had on either side of that arms race, and it'll be interesting to watch each side jockeying for the upper hand.
They're for carrying cash through dangerous territory:
SDR Traveller caters to people who, for one reason or another, need to haul huge amounts of cash money through dangerous territory. The bags are made from a super strong, super light synthetic material designed for yacht sails, are RFID-shielded, and are rated by how much cash in US$100 bills each can carry....
Photo of Bruce Schneier by Per Ervland.
Schneier on Security is a personal website. Opinions expressed are not necessarily those of Resilient Systems, Inc.