Entries Tagged "FBI"

Another FBI Filing on the San Bernardino iPhone Case

The FBI’s reply to Apple is more of a character assassination attempt than a legal argument. It’s as if it only cares about public opinion at this point.

Notice, though, the threat in footnote 9 on page 22:

For the reasons discussed above, the FBI cannot itself modify the software on Farook’s iPhone without access to the source code and Apple’s private electronic signature. The government did not seek to compel Apple to turn those over because it believed such a request would be less palatable to Apple. If Apple would prefer that course, however, that may provide an alternative that requires less labor by Apple programmers.

This should immediately remind everyone of the Lavabit case, where the FBI did ask for the site’s master key in order to get at one user. Ladar Levison commented on the similarities. He, of course, shut his service down rather than turn over the master key. A company as large as Apple does not have that option. Marcy Wheeler wrote about this in detail.
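
Why can’t a “master key” be scoped to a single user? Below is a minimal sketch, not Lavabit’s actual code, assuming the non-forward-secret RSA key exchange that was common in TLS deployments at the time: every client wraps its session key to the same server key, so whoever holds that one private key can unwrap every user’s sessions, not just the target’s.

    # Illustrative sketch only -- not Lavabit's real protocol. With an RSA key
    # exchange, each client encrypts its session key to the same server key.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    server_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Three unrelated users each start a "session" with the same server.
    sessions = {user: os.urandom(32) for user in ("alice", "bob", "target")}
    wrapped = {u: server_key.public_key().encrypt(k, oaep) for u, k in sessions.items()}

    # Whoever is handed the one private key recovers them all, not just the target.
    for user, blob in wrapped.items():
        assert server_key.decrypt(blob, oaep) == sessions[user]
    print("one private key unwraps every user's session key")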

My previous three posts on this are here, here, and here, all with lots of interesting links to various writings on this case.

EDITED TO ADD: The New York Times reports that the White House might have overreached in this case.

John Oliver has a great segment on this. With a Matt Blaze cameo!

Good NPR interview with Richard Clarke.

Well, I don’t think it’s a fierce debate. I think the Justice Department and the FBI are on their own here. You know, the secretary of defense has said how important encryption is when asked about this case. The National Security Agency director and three past National Security Agency directors, a former CIA director, a former Homeland Security secretary have all said that they’re much more sympathetic with Apple in this case. You really have to understand that the FBI director is exaggerating the need for this and is trying to build it up as an emotional case, organizing the families of the victims and all of that. And it’s Jim Comey and the attorney general is letting him get away with it.

Senator Lindsey Graham is changing his views:

“It’s just not so simple,” Graham said. “I thought it was that simple.”

Steven Levy on the history angle of this story.

Benjamin Wittes on possible legislative options.

EDITED TO ADD (3/17): Apple’s latest response is pretty withering. Commentary from Susan Crawford. FBI and China are on the same side. How this fight risks the whole US tech industry.

EDITED TO ADD (3/18): Tim Cook interview. Apple engineers might refuse to help the FBI, if Apple loses the case. And I should have previously posted this letter from racial justice activists, and this more recent essay on how this affects the LGBTQ community.

EDITED TO ADD (3/21): Interesting article on the Apple/FBI tensions that led to this case.

Posted on March 16, 2016 at 6:12 AM

Lots More Writing about the FBI vs. Apple

I have written two posts on the case, and at the bottom of those essays are lots of links to other essays written by other people. Here are more links.

If you read just one thing on the technical aspects of this case, read Susan Landau’s testimony before the House Judiciary Committee. It’s very comprehensive, and very good.

Others are testifying, too.

Apple is fixing the vulnerability. The Justice Department wants Apple to unlock nine more phones.

Apple prevailed in a different iPhone unlocking case.

Why the First Amendment is a bad argument. And why the All Writs Act is the wrong tool.

Dueling poll results: Pew Research reports that 51% side with the FBI, while a Reuters poll reveals that “forty-six percent of respondents said they agreed with Apple’s position, 35 percent said they disagreed and 20 percent said they did not know,” and that “a majority of Americans do not want the government to have access to their phone and Internet communications, even if it is done in the name of stopping terror attacks.”

One of the worst possible outcomes from this story is that people stop installing security updates because they don’t trust them. After all, a security update mechanism is also a mechanism by which the government can install a backdoor. Here’s one essay that talks about that. Here’s another.
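
To make that concrete, here is a minimal sketch of a signed-update check; the names are made up, the choice of Ed25519 is arbitrary, and it is not any real vendor’s update protocol. The device installs whatever carries a valid signature from the one key it trusts, so whoever holds that signing key, or can compel its use, decides what counts as a “security update.”

    # Illustrative sketch only: a device that trusts a single update-signing key.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    vendor_key = Ed25519PrivateKey.generate()      # held by the vendor (or whoever can compel it)
    device_trusted_key = vendor_key.public_key()   # baked into the device

    def device_install(update: bytes, signature: bytes) -> bool:
        """The device's only gate: is the update signed by the key it trusts?"""
        try:
            device_trusted_key.verify(signature, update)
            return True    # installed
        except InvalidSignature:
            return False   # rejected

    legitimate = b"patch: fix certificate-validation bug"
    backdoored = b"patch: also record the user's passcode attempts"
    attacker_key = Ed25519PrivateKey.generate()

    print(device_install(legitimate, vendor_key.sign(legitimate)))    # True
    print(device_install(backdoored, vendor_key.sign(backdoored)))    # True -- same key, same trust
    print(device_install(backdoored, attacker_key.sign(backdoored)))  # False -- any other key is refused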

Cory Doctorow comments on the FBI’s math denialism. Yochai Benkler sees this as a symptom of a greater breakdown in government trust. More good commentary from Jeff Schiller, Julian Sanchez, and Jonathan Zdziarski. Marcy Wheeler’s comments. Two posts by Dan Wallach. Michael Chertoff and associates weigh in on the side of security over surveillance.

Here’s a Catholic op-ed on Apple’s side. Bill Gates sides with the FBI. And a great editorial cartoon.

Here’s high snark from Stewart Baker. Baker asks some very good (and very snarky) questions. But the questions are beside the point. This case isn’t about Apple or whether Apple is being hypocritical, any more than climate change is about Al Gore’s character. This case is about the externalities of what the government is asking for.

One last thing to read.

Okay, one more, on the more general back door issue.

EDITED TO ADD (3/2): Wall Street Journal editorial. And here’s video from the House Judiciary Committee hearing. Skip to around 34:50 to get to the actual beginning.

EDITED TO ADD (3/3): Interview with Rep. Darrell Issa. And at the RSA Conference this week, both Defense Secretary Ash Carter and Microsoft’s chief legal officer Brad Smith sided with Apple against the FBI.

EDITED TO ADD (3/4): Comments on the case from the UN High Commissioner for Human Rights.

EDITED TO ADD (3/7): Op ed by Apple. And an interesting article on the divide in the Obama Administration.

EDITED TO ADD (3/10): Another good essay.

EDITED TO ADD (3/13): President Obama’s comments on encryption: he wants back doors. Cory Doctorow reports.

Posted on March 1, 2016 at 6:47 AM

Underage Hacker Is behind Attacks against US Government

It’s a teenager:

British police have arrested a teenager who allegedly was behind a series of audacious—and, for senior U.S. national security officials, embarrassing—hacks targeting personal accounts of top brass at the CIA, FBI, Homeland Security Department, the White House and other federal agencies, according to U.S. officials briefed on the investigation.

[…]

The prominent victims have included CIA Director John Brennan, whose personal AOL account was breached, the then FBI Deputy Director Mark Giuliano, and James Clapper, the director of National Intelligence.

This week, the latest target became apparent when personal details of 20,000 FBI employees surfaced online.

By then a team of some of the FBI’s sharpest cyber experts had homed in on their suspect, officials said. They were shocked to find that a “16-year-old computer nerd” had done so well to cover his tracks, a U.S. official said.

Not really surprising, but it underscores how diffuse the threat is.

Posted on February 18, 2016 at 6:02 AM

Large-Scale FBI Hacking

As part of a child pornography investigation, the FBI hacked into over 1,300 computers.

But after Playpen was seized, it wasn’t immediately closed down, unlike previous dark web sites that have been shuttered by law enforcement. Instead, “the FBI ran Playpen from its own servers in Newington, Virginia, from February 20 to March 4,” reads a complaint filed against a defendant in Utah. During this time, the FBI deployed what is known as a network investigative technique (NIT), the agency’s term for a hacking tool.

While Playpen was being run out of a server in Virginia, and the hacking tool was infecting targets, “approximately 1300 true internet protocol (IP) addresses were identified during this time,” according to the same complaint.

The FBI seems to have obtained a single warrant, but it’s hard to believe that a legal warrant could allow the police to hack 1,300 different computers. We do know that the FBI is very vague about the extent of its operations in warrant applications. And surely we need actual public debate about this sort of technique.

Also, “Playpen” is a super-creepy name for a child porn site. I feel icky just typing it.

Posted on February 9, 2016 at 6:25 AM

Security vs. Surveillance

Both the “going dark” metaphor of FBI Director James Comey and the contrasting “golden age of surveillance” metaphor of privacy law professor Peter Swire focus on the value of data to law enforcement. As framed in the media, encryption debates are about whether law enforcement should have surreptitious access to data, or whether companies should be allowed to provide strong encryption to their customers.

It’s a myopic framing that focuses only on one threat—criminals, including domestic terrorists—and the demands of law enforcement and national intelligence. This obscures the most important aspects of the encryption issue: the security it provides against a much wider variety of threats.

Encryption secures our data and communications against eavesdroppers like criminals, foreign governments, and terrorists. We use it every day to hide our cell phone conversations from eavesdroppers, and to hide our Internet purchasing from credit card thieves. Dissidents in China and many other countries use it to avoid arrest. It’s a vital tool for journalists to communicate with their sources, for NGOs to protect their work in repressive countries, and for attorneys to communicate with their clients.

Many technological security failures of today can be traced to failures of encryption. In 2014 and 2015, unnamed hackers—probably the Chinese government—stole 21.5 million personal files of U.S. government employees and others. They wouldn’t have obtained this data if it had been encrypted. Many large-scale criminal data thefts were made either easier or more damaging because data wasn’t encrypted: Target, TJ Maxx, Heartland Payment Systems, and so on. Many countries are eavesdropping on the unencrypted communications of their own citizens, looking for dissidents and other voices they want to silence.

Adding backdoors will only exacerbate the risks. As technologists, we can’t build an access system that only works for people of a certain citizenship, or with a particular morality, or only in the presence of a specified legal document. If the FBI can eavesdrop on your text messages or get at your computer’s hard drive, so can other governments. So can criminals. So can terrorists. This is not theoretical; again and again, backdoor accesses built for one purpose have been surreptitiously used for another. Vodafone built backdoor access into Greece’s cell phone network for the Greek government; it was used against the Greek government in 2004-2005. Google kept a database of backdoor accesses provided to the U.S. government under CALEA; the Chinese breached that database in 2009.

We’re not being asked to choose between security and privacy. We’re being asked to choose between less security and more security.

This trade-off isn’t new. In the mid-1990s, cryptographers argued that escrowing encryption keys with central authorities would weaken security. In 2011, cybersecurity researcher Susan Landau published her excellent book Surveillance or Security?, which deftly parsed the details of this trade-off and concluded that security is far more important.

Ubiquitous encryption protects us much more from bulk surveillance than from targeted surveillance. For a variety of technical reasons, computer security is extraordinarily weak. If a sufficiently skilled, funded, and motivated attacker wants in to your computer, they’re in. If they’re not, it’s because you’re not high enough on their priority list to bother with. Widespread encryption forces the listener—whether a foreign government, criminal, or terrorist—to target. And this hurts repressive governments much more than it hurts terrorists and criminals.

Of course, criminals and terrorists have used, are using, and will use encryption to hide their planning from the authorities, just as they will use many aspects of society’s capabilities and infrastructure: cars, restaurants, telecommunications. In general, we recognize that such things can be used by both honest and dishonest people. Society thrives nonetheless because the honest so outnumber the dishonest. Compare this with the tactic of secretly poisoning all the food at a restaurant. Yes, we might get lucky and poison a terrorist before he strikes, but we’ll harm all the innocent customers in the process. Weakening encryption for everyone is harmful in exactly the same way.

This essay previously appeared as part of the paper “Don’t Panic: Making Progress on the ‘Going Dark’ Debate.” It was reprinted on Lawfare. A modified version was reprinted by the MIT Technology Review.

Posted on February 3, 2016 at 6:09 AM

Did Carnegie Mellon Attack Tor for the FBI?

There’s pretty strong evidence that the team of researchers from Carnegie Mellon University who cancelled their scheduled 2014 Black Hat talk deanonymized Tor users for the FBI.

Details are in this Vice story and this Wired story (and these two follow-on Vice stories). And here’s the reaction from the Tor Project.

Nicholas Weaver guessed this back in January.

The behavior of the researchers is reprehensible, but the real issue is that CERT Coordination Center (CERT/CC) has lost its credibility as an honest broker. The researchers discovered this vulnerability and submitted it to CERT. Neither the researchers nor CERT disclosed this vulnerability to the Tor Project. Instead, the researchers apparently used this vulnerability to deanonymize a large number of hidden service visitors and provide the information to the FBI.

Does anyone still trust CERT to behave in the Internet’s best interests?

EDITED TO ADD (12/14): I was wrong. CERT did disclose to Tor.

Posted on November 16, 2015 at 6:19 AM

Police Want Genetic Data from Corporate Repositories

Both the FBI and local law enforcement are trying to get the genetic data stored at companies like 23andMe.

No surprise, really.

As NYU law professor Erin Murphy told the New Orleans Advocate regarding the Usry case, gathering DNA information is “a series of totally reasonable steps by law enforcement.” If you’re a cop trying to solve a crime, and you have DNA at your disposal, you’re going to want to use it to further your investigation. But the fact that your signing up for 23andMe or Ancestry.com means that you and all of your current and future family members could become genetic criminal suspects is not something most users probably have in mind when trying to find out where their ancestors came from.

Posted on October 22, 2015 at 6:40 AM

Obama Administration Not Pursuing a Backdoor to Commercial Encryption

The Obama Administration is not pursuing a law that would force computer and communications manufacturers to add backdoors to their products for law enforcement. Sensibly, they concluded that criminals, terrorists, and foreign spies would use that backdoor as well.

Score one for the pro-security side in the Second Crypto War.

It’s certainly not over. The FBI hasn’t given up on an encryption backdoor (or other backdoor access to plaintext) since the early 1990s, and it’s not going to give up now. I expect there will be more pressure on companies, both overt and covert, more insinuations that strong security is somehow responsible for crime and terrorism, and more behind-closed-doors negotiations.

Posted on October 14, 2015 at 9:39 AM

No-Fly List Uses Predictive Assessments

The US government has admitted that it uses predictive assessments to put people on the no-fly list:

In a little-noticed filing before an Oregon federal judge, the US Justice Department and the FBI conceded that stopping US and other citizens from travelling on airplanes is a matter of “predictive assessments about potential threats,” the government asserted in May.

“By its very nature, identifying individuals who ‘may be a threat to civil aviation or national security’ is a predictive judgment intended to prevent future acts of terrorism in an uncertain context,” Justice Department officials Benjamin C Mizer and Anthony J Coppolino told the court on 28 May.

“Judgments concerning such potential threats to aviation and national security call upon the unique prerogatives of the Executive in assessing such threats.”

It is believed to be the government’s most direct acknowledgement to date that people are not allowed to fly because of what the government believes they might do and not what they have already done.

When you have a secret process that can judge and penalize people without due process or oversight, this is the kind of thing that happens.

Posted on August 20, 2015 at 6:19 AM

Another Salvo in the Second Crypto War (of Words)

Prosecutors from New York, London, Paris, and Madrid wrote an op-ed in yesterday’s New York Times in favor of backdoors in cell phone encryption. There are a number of flaws in their argument, ranging from how easy it is to get data off an encrypted phone to the dangers of designing a backdoor in the first place, but all of that has been said before. And since anecdote can be more persuasive than data, the op-ed started with one:

In June, a father of six was shot dead on a Monday afternoon in Evanston, Ill., a suburb 10 miles north of Chicago. The Evanston police believe that the victim, Ray C. Owens, had also been robbed. There were no witnesses to his killing, and no surveillance footage either.

With a killer on the loose and few leads at their disposal, investigators in Cook County, which includes Evanston, were encouraged when they found two smartphones alongside the body of the deceased: an iPhone 6 running on Apple’s iOS 8 operating system, and a Samsung Galaxy S6 Edge running on Google’s Android operating system. Both devices were passcode protected.

You can guess the rest. A judge issued a warrant, but neither Apple nor Google could unlock the phones. “The homicide remains unsolved. The killer remains at large.”

The Intercept researched the example, and it seems to be real. The phones belonged to the victim, and…

According to Commander Joseph Dugan of the Evanston Police Department, investigators were able to obtain records of the calls to and from the phones, but those records did not prove useful. By contrast, interviews with people who knew Owens suggested that he communicated mainly through text messages—the kind that travel as encrypted data—and had made plans to meet someone shortly before he was shot.

The information on his phone was not backed up automatically on Apple’s servers—apparently because he didn’t use wi-fi, which backups require.

[…]

But Dugan also wasn’t as quick to lay the blame solely on the encrypted phones. “I don’t know if getting in there, getting the information, would solve the case,” he said, “but it definitely would give us more investigative leads to follow up on.”

This is the first actual example I’ve seen illustrating the value of a backdoor. Unlike the increasingly common example of an ISIL handler abroad communicating securely with a radicalized person in the US, it’s an example where a backdoor might have helped. I say “might have,” because the Galaxy S6 is not encrypted by default, which means the victim deliberately turned the encryption on. If the native smartphone encryption had been backdoored, we don’t know if the victim would have turned it on nevertheless, or if he would have employed a different, non-backdoored, app.

The authors’ other examples are much sloppier:

Between October and June, 74 iPhones running the iOS 8 operating system could not be accessed by investigators for the Manhattan district attorney’s office—despite judicial warrants to search the devices. The investigations that were disrupted include the attempted murder of three individuals, the repeated sexual abuse of a child, a continuing sex trafficking ring and numerous assaults and robberies.

[…]

In France, smartphone data was vital to the swift investigation of the Charlie Hebdo terrorist attacks in January, and the deadly attack on a gas facility at Saint-Quentin-Fallavier, near Lyon, in June. And on a daily basis, our agencies rely on evidence lawfully retrieved from smartphones to fight sex crimes, child abuse, cybercrime, robberies or homicides.

We’ve heard that 74 number before. It’s over nine months, in an office that handles about 100,000 cases a year: less than 0.1% of the time. Details about those cases would be useful, so we can determine if encryption was just an impediment to investigation, or resulted in a criminal going free. The government needs to do a better job of presenting empirical data to support its case for backdoors. That they’re unable to do so suggests very strongly that an empirical analysis wouldn’t favor the government’s case.
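
A quick back-of-the-envelope check of that figure, using only the numbers above (74 phones over nine months, roughly 100,000 cases a year):

    # 74 inaccessible phones over nine months, against roughly 100,000 cases a year.
    cases_per_year = 100_000
    cases_in_nine_months = cases_per_year * 9 / 12     # about 75,000
    print(f"{74 / cases_in_nine_months:.4%}")          # about 0.0987%, just under 0.1%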

As to the Charlie Hebdo case, it’s not clear how much of that vital smartphone data was actual data, and how much of it was unable-to-be-encrypted metadata. I am reminded of the examples that then-FBI-Director Louis Freeh would give during the First Crypto Wars in the 1990s. The big one used to illustrate the dangers of encryption was Mafia boss John Gotti. But the surveillance that convicted him was a room bug, not a wiretap. Given that the examples from FBI Director James Comey’s “going dark” speech last year were bogus, skepticism in the face of anecdote seems prudent.

So much of this “going dark” versus the “golden age of surveillance” debate depends on where you start from. Referring to that first Evanston example and the inability to get evidence from the victim’s phones, the op-ed authors write: “Until very recently, this situation would not have occurred.” That’s utter nonsense. From the beginning of time until very recently, this was the only situation that could have occurred. Objects in the vicinity of an event were largely mute about the past. Few things, save for eyewitnesses, could ever reach back in time and produce evidence. Even 15 years ago, the victim’s cell phone would have had no evidence on it that couldn’t have been obtained elsewhere, and that’s if the victim had been carrying a cell phone at all.

For most of human history, surveillance has been expensive. Over the last couple of decades, it has become incredibly cheap and almost ubiquitous. That a few bits and pieces are becoming expensive again isn’t a cause for alarm.

This essay originally appeared on Lawfare.

EDITED TO ADD (8/13): Excellent parody/commentary: “When Curtains Block Justice.”

Posted on August 12, 2015 at 2:18 PM
