Entries Tagged "secrecy"


DHS Puts Its Head in the Sand

On the subject of the recent Washington Post Snowden document, the DHS sent this e-mail out to at least some of its employees:

From: xxxxx
Sent: Thursday, July 11, 2013 10:28 AM
To: xxxxx
Cc: xxx Security Reps; xxx SSO; xxxx;xxxx
Subject: //// SECURITY ADVISORY//// NEW WASHINGTON POST WEBPAGE ARTICLE—DO NOT CLICK ON THIS LINK

I have been advised that this article is on the Washington Post’s Website today and has a clickable link title “The NSA Slide you never seen” that must not be opened. This link opens up a classified document which will raise the classification level of your Unclassified workstation to the classification of the slide which is reported to be TS/NF. This has been verified by our Mission Partner and the reason for this email.

If opened on your home or work computer you are obligated to report this to the SSO as your computer could then be considered a classified workstation.

Again, please exercise good judgment when visiting these webpages and clicking on such links. You are violating your Non-Disclosure Agreement in which you promise by signing that you will protect Classified National Security Information. You may be subject to any administrative or legal action from the Government.

SSOs, please pass this on to your respective components as this may be a threat to the systems under your jurisdiction.

This is not just ridiculous, it’s idiotic. Why put DHS employees at a disadvantage by trying to prevent them from knowing what the rest of the world knows? The point of classification is to keep something out of the hands of the bad guys. Once a document is public, the bad guys have access to it. The harm is already done. Can someone think of a reason for this DHS policy other than spite?

Posted on July 17, 2013 at 2:45 PM

A Problem with the US Privacy and Civil Liberties Oversight Board

I haven’t heard much about the Privacy and Civil Liberties Oversight Board. The board recently held hearings regarding the Snowden documents.

This particular comment stood out:

Rachel Brand, another seemingly unsympathetic board member, concluded: “There is nothing that is more harmful to civil liberties than terrorism. This discussion here has been quite sterile because we have not been talking about terrorism.”

If terrorism harms civil liberties, it’s because elected officials react in panic and revoke them.

I’m not optimistic about this board.

Posted on July 16, 2013 at 7:11 AM

Musing on Secret Languages

This is really interesting. It starts by talking about a “cant” dictionary of 16th-century thieves’ argot, and ends up talking about secret languages in general.

Incomprehension breeds fear. A secret language can be a threat: signifier has no need of signified in order to pack a punch. Hearing a conversation in a language we don’t speak, we wonder whether we’re being mocked. The klezmer-loshn spoken by Jewish musicians allowed them to talk about the families and wedding guests without being overheard. Germanía and Grypsera are prison languages designed to keep information from guards—the first in sixteenth-century Spain, the second in today’s Polish jails. The same logic shows how a secret language need not be the tongue of a minority or an oppressed group: given the right circumstances, even a national language can turn cryptolect. In 1680, as Moroccan troops besieged the short-lived British city of Tangier, Irish soldiers manning the walls resorted to speaking as Gaeilge, in Irish, for fear of being understood by English-born renegades in the Sultan’s armies. To this day, the Irish abroad use the same tactic in discussing what should go unheard, whether bargaining tactics or conversations about taxi-drivers’ haircuts. The same logic lay behind North African slave-masters’ insistence that their charges use the Lingua Franca (a pidgin based on Italian and Spanish and used by traders and slaves in the early modern Mediterranean) so that plots of escape or revolt would not go unheard. A Flemish captive, Emanuel d’Aranda, said that on one slave-galley alone, he heard “the Turkish, the Arabian, Lingua Franca, Spanish, French, Dutch, and English.” On his arrival at Algiers, his closest companion was an Icelander. In such a multilingual environment, the Lingua Franca didn’t just serve for giving orders, but as a means of restricting chatter and intrigue between slaves. If the key element of the secret language is that it obscures the understandings of outsiders, a national tongue can serve just as well as an argot.

Posted on July 10, 2013 at 5:55 AM

Secrecy and Privacy

Interesting article on the history of, and the relationship between, secrecy and privacy.

As a matter of historical analysis, the relationship between secrecy and privacy can be stated in an axiom: the defense of privacy follows, and never precedes, the emergence of new technologies for the exposure of secrets. In other words, the case for privacy always comes too late. The horse is out of the barn. The post office has opened your mail. Your photograph is on Facebook. Google already knows that, notwithstanding your demographic, you hate kale.

Posted on June 26, 2013 at 12:35 PM

Details of NSA Data Requests from US Corporations

Facebook (here), Apple (here), and Yahoo (here) have all released details of US government requests for data. They each say that they’ve turned over user data for about 10,000 people, although the time frames are different. The exact number isn’t important; what’s important is that it’s much lower than the millions implied by the PRISM document.

Now the big question: do we believe them? If we don’t, what would it take before we did believe them?

Posted on June 18, 2013 at 4:00 PM

NSA Secrecy and Personal Privacy

In an excellent essay about privacy and secrecy, law professor Daniel Solove makes an important point. There are two types of NSA secrecy being discussed. It’s easy to confuse them, but they’re very different.

Of course, if the government is trying to gather data about a particular suspect, keeping the specifics of surveillance efforts secret will decrease the likelihood of that suspect altering his or her behavior.

But secrecy at the level of an individual suspect is different from keeping the very existence of massive surveillance programs secret. The public must know about the general outlines of surveillance activities in order to evaluate whether the government is achieving the appropriate balance between privacy and security. What kind of information is gathered? How is it used? How securely is it kept? What kind of oversight is there? Are these activities even legal? These questions can’t be answered, and the government can’t be held accountable, if surveillance programs are completely classified.

This distinction is also becoming important as Snowden keeps talking. There are a lot of articles about Edward Snowden cooperating with the Chinese government. I have no idea whether this is true—Snowden denies it—or whether it’s part of an American smear campaign designed to shift the debate from the NSA surveillance programs to the whistleblower’s actions. (It worked against Assange.) In anticipation of the inevitable questions, I want to revise a previous assessment: I consider Snowden a hero for whistleblowing on the existence and details of the NSA surveillance programs, but not for revealing specific operational secrets to the Chinese government. Charles Pierce wishes Snowden would stop talking. I agree; the more this story is about him, the less it is about the NSA. Stop giving interviews and let the documents do the talking.

Back to Daniel Solove: his excellent 2011 essay on the value of privacy is making the rounds again. And it should.

Many commentators had been using the metaphor of George Orwell’s 1984 to describe the problems created by the collection and use of personal data. I contended that the Orwell metaphor, which focuses on the harms of surveillance (such as inhibition and social control) might be apt to describe law enforcement’s monitoring of citizens. But much of the data gathered in computer databases is not particularly sensitive, such as one’s race, birth date, gender, address, or marital status. Many people do not care about concealing the hotels they stay at, the cars they own or rent, or the kind of beverages they drink. People often do not take many steps to keep such information secret. Frequently, though not always, people’s activities would not be inhibited if others knew this information.

I suggested a different metaphor to capture the problems: Franz Kafka’s The Trial, which depicts a bureaucracy with inscrutable purposes that uses people’s information to make important decisions about them, yet denies the people the ability to participate in how their information is used. The problems captured by the Kafka metaphor are of a different sort than the problems caused by surveillance. They often do not result in inhibition or chilling. Instead, they are problems of information processing—the storage, use, or analysis of data—rather than information collection. They affect the power relationships between people and the institutions of the modern state. They not only frustrate the individual by creating a sense of helplessness and powerlessness, but they also affect social structure by altering the kind of relationships people have with the institutions that make important decisions about their lives.

The whole essay is worth reading, as is—I hope—my essay on the value of privacy from 2006.

I have come to believe that the solution to all of this is regulation. And it’s not going to be the regulation of data collection; it’s going to be the regulation of data use.

EDITED TO ADD (6/18): A good rebuttal to the “nothing to hide” argument.

Posted on June 18, 2013 at 11:02 AM

Evidence that the NSA Is Storing Voice Content, Not Just Metadata

Interesting speculation that the NSA is storing everyone’s phone calls, and not just metadata. Definitely worth reading.

I expressed skepticism about this just a month ago. My assumption had always been that everyone’s compressed voice calls are simply too much data to move around and store. Now, I don’t know.

There’s a bit of a conspiracy-theory air to all of this speculation, but underestimating what the NSA will do is a mistake. General Alexander has told members of Congress that the NSA can record the contents of phone calls. And the agency has the technical capability.

Earlier reports have indicated that the NSA has the ability to record nearly all domestic and international phone calls—in case an analyst needed to access the recordings in the future. A Wired magazine article last year disclosed that the NSA has established “listening posts” that allow the agency to collect and sift through billions of phone calls through a massive new data center in Utah, “whether they originate within the country or overseas.” That includes not just metadata, but also the contents of the communications.

William Binney, a former NSA technical director who helped to modernize the agency’s worldwide eavesdropping network, told the Daily Caller this week that the NSA records the phone calls of 500,000 to 1 million people who are on its so-called target list, and perhaps even more. “They look through these phone numbers and they target those and that’s what they record,” Binney said.

Brewster Kahle, a computer engineer who founded the Internet Archive, has vast experience storing large amounts of data. He created a spreadsheet this week estimating that the cost to store all domestic phone calls a year in cloud storage for data-mining purposes would be about $27 million per year, not counting the cost of extra security for a top-secret program and security clearances for the people involved.

I believe that, to the extent that the NSA is analyzing and storing conversations, they’re doing speech-to-text as close to the source as possible and working with that. Even if you have to store the audio for conversations in foreign languages, or for snippets of conversations the conversion software is unsure of, it’s a lot fewer bits to move around and deal with.
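
This claim is easy to sanity-check with a back-of-the-envelope calculation. The sketch below is mine, and every parameter in it (daily call volume, average call length, voice-codec bitrate, transcript size) is an illustrative assumption, not a figure from the reports quoted above:

# Rough comparison of storing compressed call audio vs. speech-to-text
# transcripts. All parameters are illustrative assumptions.

CALLS_PER_DAY = 3_000_000_000    # assumed number of calls per day
AVG_CALL_MINUTES = 2             # assumed average call length
AUDIO_BITRATE_KBPS = 8           # assumed low-bitrate voice codec
TEXT_BYTES_PER_MINUTE = 900      # assumed transcript size: ~150 words/min at ~6 bytes/word

def audio_bytes_per_call():
    """One call stored as compressed audio."""
    return AVG_CALL_MINUTES * 60 * AUDIO_BITRATE_KBPS * 1000 / 8

def text_bytes_per_call():
    """One call stored as a plain-text transcript."""
    return AVG_CALL_MINUTES * TEXT_BYTES_PER_MINUTE

def petabytes_per_year(bytes_per_call):
    """Annual storage for all calls, in petabytes."""
    return bytes_per_call * CALLS_PER_DAY * 365 / 1e15

audio = audio_bytes_per_call()
text = text_bytes_per_call()
print(f"Per call: audio ~{audio / 1e3:.0f} KB, transcript ~{text / 1e3:.1f} KB")
print(f"Per year: audio ~{petabytes_per_year(audio):.0f} PB, "
      f"transcript ~{petabytes_per_year(text):.1f} PB")
print(f"Audio is roughly {audio / text:.0f} times larger than text")

Under these assumptions the audio works out to well over a hundred petabytes a year, while the transcripts fit in about two petabytes, a difference of more than a factor of sixty. Change the assumptions and the absolute numbers move, but the gap between audio and text stays large, which is the point.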

And, by the way, I hate the term “metadata.” What’s wrong with “traffic analysis,” which is what we’ve always called that sort of thing?

Posted on June 18, 2013 at 5:57 AM

More on Feudal Security

Facebook regularly abuses the privacy of its users. Google has stopped supporting its popular RSS reader. Apple prohibits all iPhone apps that are political or sexual. Microsoft might be cooperating with some governments to spy on Skype calls, but we don’t know which ones. Both Twitter and LinkedIn have recently suffered security breaches that affected the data of hundreds of thousands of their users.

If you’ve started to think of yourself as a hapless peasant in a Game of Thrones power struggle, you’re more right than you may realize. These are not traditional companies, and we are not traditional customers. These are feudal lords, and we are their vassals, peasants, and serfs.

Power has shifted in IT, in favor of both cloud-service providers and closed-platform vendors. This power shift affects many things, and it profoundly affects security.

Traditionally, computer security was the user’s responsibility. Users purchased their own antivirus software and firewalls, and any breaches were blamed on their inattentiveness. It’s kind of a crazy business model. Normally we expect the products and services we buy to be safe and secure, but in IT we tolerated lousy products and supported an enormous aftermarket for security.

Now that the IT industry has matured, we expect more security “out of the box.” This has become possible largely because of two technology trends: cloud computing and vendor-controlled platforms. The first means that most of our data resides on other networks: Google Docs, Salesforce.com, Facebook, Gmail. The second means that our new Internet devices are both closed and controlled by the vendors, giving us limited configuration control: iPhones, Chromebooks, Kindles, BlackBerry PDAs. Meanwhile, our relationship with IT has changed. We used to use our computers to do things. We now use our vendor-controlled computing devices to go places. All of these places are owned by someone.

The new security model is that someone else takes care of it—without telling us any of the details. I have no control over the security of my Gmail or my photos on Flickr. I can’t demand greater security for my presentations on Prezi or my task list on Trello, no matter how confidential they are. I can’t audit any of these cloud services. I can’t delete cookies on my iPad or ensure that files are securely erased. Updates on my Kindle happen automatically, without my knowledge or consent. I have so little visibility into the security of Facebook that I have no idea what operating system they’re using.

There are a lot of good reasons why we’re all flocking to these cloud services and vendor-controlled platforms. The benefits are enormous, from cost to convenience to reliability to security itself. But it is inherently a feudal relationship. We cede control of our data and computing platforms to these companies and trust that they will treat us well and protect us from harm. And if we pledge complete allegiance to them—if we let them control our email and calendar and address book and photos and everything—we get even more benefits. We become their vassals; or, on a bad day, their serfs.

There are a lot of feudal lords out there. Google and Apple are the obvious ones, but Microsoft is trying to control both user data and the end-user platform as well. Facebook is another lord, controlling much of the socializing we do on the Internet. Other feudal lords are smaller and more specialized—Amazon, Yahoo, Verizon, and so on—but the model is the same.

To be sure, feudal security has its advantages. These companies are much better at security than the average user. Automatic backup has saved a lot of data after hardware failures, user mistakes, and malware infections. Automatic updates have increased security dramatically. This is also true for small organizations; they are more secure than they would be if they tried to do it themselves. For large corporations with dedicated IT security departments, the benefits are less clear. Sure, even large companies outsource critical functions like tax preparation and cleaning services, but large companies have specific requirements for security, data retention, audit, and so on—and that’s just not possible with most of these feudal lords.

Feudal security also has its risks. Vendors can, and do, make security mistakes affecting hundreds of thousands of people. Vendors can lock people into relationships, making it hard for them to take their data and leave. Vendors can act arbitrarily, against our interests; Facebook regularly does this when it changes people’s defaults, implements new features, or modifies its privacy policy. Many vendors give our data to the government without notice, consent, or a warrant; almost all sell it for profit. This isn’t surprising, really; companies should be expected to act in their own self-interest and not in their users’ best interest.

The feudal relationship is inherently based on power. In Medieval Europe, people would pledge their allegiance to a feudal lord in exchange for that lord’s protection. This arrangement changed as the lords realized that they had all the power and could do whatever they wanted. Vassals were used and abused; peasants were tied to their land and became serfs.

It’s the Internet lords’ popularity and ubiquity that enable them to profit; laws and government relationships make it easier for them to hold onto power. These lords are vying with each other for profits and power. By spending time on their sites and giving them our personal information—whether through search queries, e-mails, status updates, likes, or simply our behavioral characteristics—we are providing the raw material for that struggle. In this way we are like serfs, tilling the land for our feudal lords. If you don’t believe me, try to take your data with you when you leave Facebook. And when war breaks out among the giants, we become collateral damage.

So how do we survive? Increasingly, we have little alternative but to trust someone, so we need to decide who we trust—and who we don’t—and then act accordingly. This isn’t easy; our feudal lords go out of their way not to be transparent about their actions, their security, or much of anything. Use whatever power you have—as individuals, none; as large corporations, more—to negotiate with your lords. And, finally, don’t be extreme in any way: politically, socially, culturally. Yes, you can be shut down without recourse, but it’s usually those on the edges that are affected. Not much solace, I agree, but it’s something.

On the policy side, we have an action plan. In the short term, we need to keep circumvention—the ability to modify our hardware, software, and data files—legal and preserve net neutrality. Both of these things limit how much the lords can take advantage of us, and they increase the possibility that the market will force them to be more benevolent. The last thing we want is the government—that’s us—spending resources to enforce one particular business model over another and stifling competition.

In the longer term, we all need to work to reduce the power imbalance. Medieval feudalism evolved into a more balanced relationship in which lords had responsibilities as well as rights. Today’s Internet feudalism is both ad hoc and one-sided. We have no choice but to trust the lords, but we receive very few assurances in return. The lords have a lot of rights, but few responsibilities or limits. We need to balance this relationship, and government intervention is the only way we’re going to get it. In medieval Europe, the rise of the centralized state and the rule of law provided the stability that feudalism lacked. The Magna Carta first forced responsibilities on governments and put humans on the long road toward government by the people and for the people.

We need a similar process to rein in our Internet lords, and it’s not something that market forces are likely to provide. The very definition of power is changing, and the issues are far bigger than the Internet and our relationships with our IT providers.

This essay originally appeared on the Harvard Business Review website. It is an update of this earlier essay on the same topic. “Feudal security” is a metaphor I have been using a lot recently; I wrote this essay without rereading my previous essay.

EDITED TO ADD (6/13): There is another way the feudal metaphor applies to the Internet. There is no commons; every part of the Internet is owned by someone. This article explores that aspect of the metaphor.

Posted on June 13, 2013 at 11:34 AM

Prosecuting Snowden

Edward Snowden broke the law by releasing classified information. This isn’t under debate; it’s something everyone with a security clearance knows. It’s written in plain English on the documents you have to sign when you get a security clearance, and it’s part of the culture. The law is there for a good reason, and secrecy has an important role in military defense.

But before the Justice Department prosecutes Snowden, there are some other investigations that ought to happen.

We need to determine whether these National Security Agency programs are themselves legal. The administration has successfully barred anyone from bringing a lawsuit challenging these laws, on the grounds of national secrecy. Now that we know those arguments are without merit, it’s time for those court challenges.

It’s clear to me that some of the NSA programs exposed by Snowden violate the Constitution and that others violate existing laws. Others disagree. The courts need to decide.

We need to determine whether classifying these programs is legal. Keeping things secret from the people is a very dangerous practice in a democracy, and the government is permitted to do so only under very specific circumstances. Reading the documents leaked so far, I don’t see anything that needs to be kept secret. The argument that exposing these documents helps the terrorists doesn’t even pass the laugh test; there’s nothing here that changes anything any potential terrorist would do or not do. But in any case, now that the documents are public, the courts need to rule on the legality of their secrecy.

And we need to determine how we treat whistle-blowers in this country. We have whistle-blower protection laws that apply in some cases, particularly when someone exposes fraud or other illegal behavior. NSA officials have repeatedly lied to Congress about the existence and details of these programs.

Only after all of these legal issues have been resolved should any prosecution of Snowden move forward. Because only then will we know the full extent of what he did, and how much of it is justified.

I believe that history will hail Snowden as a hero—his whistle-blowing exposed a surveillance state and a secrecy machine run amok. I’m less optimistic about how the present day will treat him, and hope that the debate right now is less about the man and more about the government he exposed.

This essay was originally published on the New York Times Room for Debate blog, as part of a series of essays on the topic.

EDITED TO ADD (6/13): There’s a big discussion of this on Reddit.

Posted on June 12, 2013 at 6:16 AM

