Online Activism and the Computer Fraud and Abuse Act
Good essay by Molly Sauter: basically, there is no legal avenue for activism and protest on the Internet.
Also note Sauter’s new book, The Coming Swarm.
This article reads like snake oil. But the company was founded by Lars Knudsen, so it can’t possibly be.
I’m curious.
Just the thing for smuggling data out of secure locations.
In July, I wrote about an unpatchable USB vulnerability called BadUSB. Code for the vulnerability has been published.
Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World is finished. I submitted it to my publisher, Norton, this morning. In a few weeks, I’ll get the copyedited manuscript back, and a few weeks after that, it’ll go into production. Stacks of printed books will come out the other end in February, and the book will be published on March 9. There’s already an Amazon page, but it’s still pretty preliminary. And I expect the price to go down.
Books are both a meandering and clarifying process for me, and I figure out what I’m writing about as I write about it. Data and Goliath started out being about security and power in cyberspace, and ended up being about digital surveillance and what to do about it.
This is the table of contents:

Part 1: The World We're Creating
Part 2: What's at Stake
Part 3: What to Do About It
Fundamentally, the issues surrounding mass surveillance are tensions between group interest and self-interest, a topic I covered in depth in Liars and Outliers. We’re promised great benefits if we allow all of our data to be collected in one place; at the same time, it can be incredibly personal. I see this tension playing out in many areas: location data, social graphs, medical data, search histories. Figuring out the proper balances between group and self-interests, and ensuring that those balances are maintained, is the fundamental issue of the information age. It’s how we are going to be judged by our descendants fifty years from now.
Anyway, the book is done and at the publisher. I’m happy with it; the manuscript is so tight you can bounce a quarter off of it. This is a complicated topic, and I think I distilled it down into 80,000 words that are both understandable by the lay reader and interesting to the policy wonk or technical geek. It’s also an important topic, and I hope the book becomes a flash point for discussion and debate.
But that’s not for another five months. You might think that’s a long time, but in publishing that’s incredibly fast. I convinced Norton to go with this schedule by stressing that the book becomes less timely every second it’s not published. (An exaggeration, I know, but they bought it.) Now I just hope that nothing major happens between now and then to render the book obsolete.
For now, I want to get back to writing shorter pieces. Writing a book can be all-consuming, and I generally don’t have time for anything else. Look at my essays. Last year, I wrote 59 essays. This year so far: 17. That’s an effect of writing the book. Now that it’s done, expect more essays on news websites and longer posts on this blog. It’ll be good to be thinking about something else for a change.
If anyone works for a publication, and wants to write a review, conduct an interview, publish an excerpt, or otherwise help me get the word out about the book, please e-mail me and I will pass you on to Norton’s publicity department. I think this book has a real chance of breaking out of my normal security market.
Last week, Apple announced that it is closing a serious security vulnerability in the iPhone. It used to be that the phone’s encryption only protected a small amount of the data, and Apple had the ability to bypass security on the rest of it.
From now on, all the phone’s data is protected. It can no longer be accessed by criminals, governments, or rogue employees. Access to it can no longer be demanded by totalitarian governments. A user’s iPhone data is now more secure.
To hear US law enforcement respond, you’d think Apple’s move heralded an unstoppable crime wave. See, the FBI had been using that vulnerability to get into people’s iPhones. In the words of cyberlaw professor Orin Kerr, “How is the public interest served by a policy that only thwarts lawful search warrants?”
Ah, but that’s the thing: You can’t build a backdoor that only the good guys can walk through. Encryption protects against cybercriminals, industrial competitors, the Chinese secret police and the FBI. You’re either vulnerable to eavesdropping by any of them, or you’re secure from eavesdropping from all of them.
Backdoor access built for the good guys is routinely used by the bad guys. In 2005, some unknown group surreptitiously used the lawful-intercept capabilities built into the Greek cell phone system. The same thing happened in Italy in 2006.
In 2010, Chinese hackers subverted an intercept system Google had put into Gmail to comply with US government surveillance requests. Back doors in our cell phone system are currently being exploited by the FBI and unknown others.
This doesn’t stop the FBI and Justice Department from pumping up the fear. Attorney General Eric Holder threatened us with kidnappers and sexual predators.
The former head of the FBI’s criminal investigative division went even further, conjuring up kidnappers who are also sexual predators. And, of course, terrorists.
FBI Director James Comey claimed that Apple’s move allows people to “place themselves beyond the law” and also invoked that now-overworked “child kidnapper.” John J. Escalante, chief of detectives for the Chicago police department, now holds the title of most hysterical: “Apple will become the phone of choice for the pedophile.”
It’s all bluster. Of the 3,576 major offenses for which warrants were granted for communications interception in 2013, exactly one involved kidnapping. And, more importantly, there’s no evidence that encryption hampers criminal investigations in any serious way. In 2013, encryption foiled the police nine times, up from four in 2012—and the investigations proceeded in some other way.
This is why the FBI’s scare stories tend to wither after public scrutiny. A former FBI assistant director wrote about a kidnapped man who would never have been found without the ability of the FBI to decrypt an iPhone, only to retract the point hours later because it wasn’t true.
We’ve seen this game before. During the crypto wars of the 1990s, FBI Director Louis Freeh and others would repeatedly use the example of mobster John Gotti to illustrate why the ability to tap telephones was so vital. But the Gotti evidence was collected using a room bug, not a telephone tap. And those same scary criminal tropes were trotted out then, too. Back then we called them the Four Horsemen of the Infocalypse: pedophiles, kidnappers, drug dealers, and terrorists. Nothing has changed.
Strong encryption has been around for years. Both Apple’s FileVault and Microsoft’s BitLocker encrypt the data on computer hard drives. PGP encrypts e-mail. Off-the-Record encrypts chat sessions. HTTPS Everywhere encrypts your browsing. Android phones already come with encryption built-in. There are literally thousands of encryption products without back doors for sale, and some have been around for decades. Even if the US bans the stuff, foreign companies will corner the market because many of us have legitimate needs for security.
Law enforcement has been complaining about “going dark” for decades now. In the 1990s, they convinced Congress to pass a law requiring phone companies to ensure that phone calls would remain tappable even as they became digital. They tried and failed to ban strong encryption and mandate back doors for their use. The FBI tried and failed again to ban strong encryption in 2010. Now, in the post-Snowden era, they’re about to try again.
We need to fight this. Strong encryption protects us from a panoply of threats. It protects us from hackers and criminals. It protects our businesses from competitors and foreign spies. It protects people in totalitarian governments from arrest and detention. This isn’t just me talking: The FBI also recommends you encrypt your data for security.
As for law enforcement? The recent decades have given them an unprecedented ability to put us under surveillance and access our data. Our cell phones provide them with a detailed history of our movements. Our call records, e-mail history, buddy lists, and Facebook pages tell them who we associate with. The hundreds of companies that track us on the Internet tell them what we’re thinking about. Ubiquitous cameras capture our faces everywhere. And most of us back up our iPhone data on iCloud, which the FBI can still get a warrant for. It truly is the golden age of surveillance.
After considering the issue, Orin Kerr rethought his position, looking at this in terms of a technological-legal trade-off. I think he’s right.
Given everything that has made it easier for governments and others to intrude on our private lives, we need both technological security and legal restrictions to restore the traditional balance between government access and our security/privacy. More companies should follow Apple’s lead and make encryption the easy-to-use default. And let’s wait for some actual evidence of harm before we acquiesce to police demands for reduced security.
This essay previously appeared on CNN.com.
EDITED TO ADD (10/6): Three more essays worth reading. As is this essay on all the other ways Apple and the government have to get at your iPhone data.
And a Washington Post editorial manages to say this:
How to resolve this? A police “back door” for all smartphones is undesirable—a back door can and will be exploited by bad guys, too. However, with all their wizardry, perhaps Apple and Google could invent a kind of secure golden key they would retain and use only when a court has approved a search warrant.
Because a “secure golden key” is completely different from a “back door.”
EDITED TO ADD (10/7): Another essay.
EDITED TO ADD (10/9): Three more essays that are worth reading.
EDITED TO ADD (10/12): Another essay.
McDonald’s has a Halloween-themed burger with a squid-ink bun. Only in Japan.
As usual, you can also use this squid post to talk about the security stories in the news that I haven’t covered.
Former NSA employee—not technical director, as the link says—explains how NSA bulk surveillance works, using some of the Snowden documents. Very interesting.
EDITED TO ADD (10/4): Apologies to Binney for downgrading his role at the NSA. He was not the technical director of the NSA, which is what I was thinking of, but he was a technical director at the NSA:
“In ’97, I became the technical director of the geopolitical—military geopolitical analysis and reporting shop for the world, which was about 6,000 people,” Binney told Frontline.
Whatever the case, he does know what he’s talking about when he talks about NSA surveillance.
The NSA is building a private cloud with its own security features:
As a result, the agency can now track every instance of every individual accessing what is in some cases a single word or name in a file. This includes when it arrived, who can access it, who did access it, downloaded it, copied it, printed it, forwarded it, modified it, or deleted it.
[…]
“All of this I can do in the cloud but—in many cases—it cannot be done in the legacy systems, many of which were created before such advanced data provenance technology existed.” Had this ability all been available at the time, it is unlikely that U.S. soldier Bradley Manning would have succeeded in obtaining classified documents in 2010.
Maybe.
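The kind of per-object audit trail described in that quote can be illustrated with a minimal sketch. Everything here — the class, field names, and event types — is hypothetical, invented for illustration; real data-provenance systems are vastly more elaborate:

```python
import json
from datetime import datetime, timezone

class AuditLog:
    """Toy append-only access log: one record per access event."""

    def __init__(self):
        self._records = []

    def record(self, user, obj, action):
        # Every touch of an object is logged with a timestamp.
        self._records.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "object": obj,
            "action": action,  # e.g. "read", "download", "print", "delete"
        })

    def accesses(self, obj):
        """All events that touched a given object, in order."""
        return [r for r in self._records if r["object"] == obj]

log = AuditLog()
log.record("alice", "report-2010-07.txt", "read")
log.record("bob", "report-2010-07.txt", "download")
print(json.dumps(log.accesses("report-2010-07.txt"), indent=2))
```

The point of the quoted claim is exactly this query: given any single object, enumerate everyone who ever read, copied, or forwarded it — something legacy systems built before such provenance tracking cannot answer.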
Firechat is a secure wireless peer-to-peer chat app:
Firechat is theoretically resistant to the kind of centralized surveillance that the Chinese government (as well as western states, especially the US and the UK) is infamous for. Phones connect directly to one another, establish encrypted connections, and transact without sending messages to servers where they can be sniffed and possibly decoded.
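The serverless model can be sketched with a toy Diffie-Hellman key exchange. This is not Firechat's actual protocol — just an illustration, using deliberately simple parameters, of how two peers can derive a shared secret locally without any server ever seeing it:

```python
import secrets

# Toy Diffie-Hellman group -- illustration only. Real protocols use
# large standardized groups or elliptic curves, not these parameters.
P = 2**127 - 1  # a Mersenne prime
G = 5

def keypair():
    priv = secrets.randbelow(P - 2) + 1  # private exponent, kept on the phone
    pub = pow(G, priv, P)                # the only value sent to the peer
    return priv, pub

# Two phones exchange nothing but their public values...
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# ...and each derives the same shared secret independently.
a_secret = pow(b_pub, a_priv, P)
b_secret = pow(a_pub, b_priv, P)
assert a_secret == b_secret  # no server was involved in deriving this
```

An eavesdropper who captures the exchanged public values still can't compute the secret, which is what makes the peer-to-peer design resistant to the centralized sniffing the article describes.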