Entries Tagged "Apple"


Apple's Differential Privacy

At the Apple Worldwide Developers Conference earlier this week, Apple talked about something called “differential privacy.” We know very little about the details, but it seems to be an anonymization technique designed to collect user data without revealing personal information.
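Apple has not published details, but the textbook building block for this kind of private data collection is "randomized response": each device adds noise to its own answer before reporting it, so no individual report can be trusted, while aggregate statistics remain accurate. Here is a minimal sketch of that mechanism; it is purely illustrative and not a description of whatever Apple actually built:

```python
# A minimal sketch of "randomized response," a textbook local differential
# privacy mechanism. Purely illustrative: Apple has not published details,
# and this is not its actual implementation.
import random

def randomized_response(truth: bool, p: float = 0.5) -> bool:
    """With probability p, report the truth; otherwise report a fair coin.
    Any single report is deniable, but aggregates remain estimable."""
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports: list[bool], p: float = 0.5) -> float:
    # E[observed] = p * true_rate + (1 - p) * 0.5, so invert:
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p
```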

What we know about anonymization is that it’s much harder than people think, and it’s likely that this technique will be full of privacy vulnerabilities. (See, for example, the excellent work of Latanya Sweeney.) As expected, security experts are skeptical. Here’s Matt Green trying to figure it out.

So while I applaud Apple for trying to improve privacy within its business models, I would like some more transparency and some more public scrutiny.

EDITED TO ADD (6/17): Adam Shostack comments. And more commentary from Tom’s Guide.

EDITED TO ADD (6/17): Here’s a slide deck on privacy from the WWDC.

Posted on June 16, 2016 at 9:30 PM

GCHQ Discloses Two OS X Vulnerabilities to Apple

This is good news:

Communications and Electronics Security Group (CESG), the information security arm of GCHQ, was credited with the discovery of two vulnerabilities that were patched by Apple last week.

The flaws could allow hackers to corrupt memory and cause a denial of service through a crafted app or execute arbitrary code in a privileged context.

The memory handling vulnerabilities (CVE-2016-1822 and CVE-2016-1829) affect OS X El Capitan v10.11 and later operating systems, according to Apple’s 2016-003 security update. The memory corruption vulnerabilities allowed hackers to execute arbitrary code with kernel privileges.

There’s still a lot that needs to be said about this equities process.

Posted on May 24, 2016 at 2:12 PM

Lawful Hacking and Continuing Vulnerabilities

The FBI’s legal battle with Apple is over, but the way it ended may not be good news for anyone.

Federal agents had been seeking to compel Apple to break the security of an iPhone 5c that had been used by one of the San Bernardino, Calif., terrorists. Apple had been fighting a court order to cooperate with the FBI, arguing that the authorities’ request was illegal and that creating a tool to break into the phone was itself harmful to the security of every iPhone user worldwide.

Last week, the FBI told the court it had learned of a possible way to break into the phone using a third party’s solution, without Apple’s help. On Monday, the agency dropped the case because the method worked. We don’t know who that third party is. We don’t know what the method is, or which iPhone models it applies to. Now it seems like we never will.

The FBI plans to classify this access method and to use it to break into other phones in other criminal investigations.

Compare this iPhone vulnerability with another, one that was made public on the same day the FBI said it might have found its own way into the San Bernardino phone. Researchers at Johns Hopkins University announced last week that they had found a significant vulnerability in the iMessage protocol. They disclosed the vulnerability to Apple in the fall, and last Monday, Apple released an updated version of its operating system that fixed the vulnerability. (That’s iOS 9.3; you should download and install it right now.) The Hopkins team didn’t publish its findings until Apple’s patch was available, so devices could be updated to protect them from attacks using the researchers’ discovery.

This is how vulnerability research is supposed to work.

Vulnerabilities are found, fixed, then published. The entire security community is able to learn from the research, and—more important—everyone is more secure as a result of the work.

The FBI is doing the exact opposite. It has been given whatever vulnerability it used to get into the San Bernardino phone in secret, and it is keeping it secret. All of our iPhones remain vulnerable to this exploit. This includes the iPhones used by elected officials and federal workers and the phones used by people who protect our nation’s critical infrastructure and carry out other law enforcement duties, including lots of FBI agents.

This is the trade-off we have to consider: do we prioritize security over surveillance, or do we sacrifice security for surveillance?

The problem with computer vulnerabilities is that they’re general. There’s no such thing as a vulnerability that affects only one device. If it affects one copy of an application, operating system or piece of hardware, then it affects all identical copies. A vulnerability in Windows 10, for example, affects all of us who use Windows 10. And it can be used by anyone who knows it, be they the FBI, a gang of cyber criminals, the intelligence agency of another country—anyone.

And once a vulnerability is found, it can be used for attack—like the FBI is doing—or for defense, as in the Johns Hopkins example.

Over years of battling attackers and intruders, we’ve learned a lot about computer vulnerabilities. They’re plentiful: vulnerabilities are found and fixed in major systems all the time. They’re regularly discovered independently, by outsiders rather than by the original manufacturers or programmers. And once they’re discovered, word gets out. Today’s top-secret National Security Agency attack techniques become tomorrow’s PhD theses and the next day’s hacker tools.

The attack/defense trade-off is not new to the US government. It even has a process for deciding what to do when a vulnerability is discovered: whether it should be disclosed to improve all of our security, or kept secret to be used for offense. The White House claims that it prioritizes defense, and that general vulnerabilities in widely used computer systems are patched.

Whatever method the FBI used to get into the San Bernardino shooter’s iPhone is one such vulnerability. The FBI did the right thing by using an existing vulnerability rather than forcing Apple to create a new one, but the vulnerability should be disclosed to Apple and patched immediately.

This case has always been more about the PR battle and potential legal precedent than about the particular phone. And while the legal dispute is over, there are other cases involving other encrypted devices in other courts across the country. But while there will always be a few computers—corporate servers, individual laptops or personal smartphones—that the FBI would like to break into, there are far more such devices that we need to be secure.

One of the most surprising things about this debate is the number of former national security officials who came out on Apple’s side. They understand that we are singularly vulnerable to cyberattack, and that our cyberdefense needs to be as strong as possible.

The FBI’s myopic focus on this one investigation is understandable, but in the long run, it’s damaging to our national security.

This essay previously appeared in the Washington Post, with a far too click-bait headline.

EDITED TO ADD: To be fair, the FBI probably doesn’t know what the vulnerability is. And I wonder how easy it would be for Apple to figure it out. Given that the FBI has to exhaust all avenues of access before demanding help from Apple, we can learn which models are vulnerable by watching which legal suits are abandoned now that the FBI knows about this method.

Matt Blaze makes excellent points about how the FBI should disclose the vulnerabilities it uses, in order to improve computer security. That was part of a New York Times “Room for Debate” on hackers helping the FBI.

Susan Landau’s excellent Congressional testimony on the topic.

Posted on March 30, 2016 at 4:54 PM

FBI vs. Apple: Who Is Helping the FBI?

On Monday, the FBI asked the court for a two-week delay in a scheduled hearing on the San Bernardino iPhone case, because some “third party” approached it with a way into the phone. It wanted time to test this access method.

Who approached the FBI? We have no idea.

I have avoided speculation because the story makes no sense. Why did this third party wait so long? Why didn’t the FBI go through with the hearing anyway?

Now we have speculation that the third party is the Israeli forensic company Cellebrite. From its website:

Support for Locked iOS Devices Using UFED Physical Analyzer

Using UFED Physical Analyzer, physical and file system extractions, decoding and analysis can be performed on locked iOS devices with a simple or complex passcode. Simple passcodes will be recovered during the physical extraction process and enable access to emails and keychain passwords. If a complex password is set on the device, physical extraction can be performed without access to emails and keychain. However, if the complex password is known, emails and keychain passwords will be available.

My guess is that it’s not them. They have an existing and ongoing relationship with the FBI. If they could crack the phone, they would have done it months ago. This purchase order seems to be coincidental.

In any case, having a company name doesn’t mean that the story makes any more sense, but there it is. We’ll know more in a couple of weeks, although I doubt the FBI will share any more than they absolutely have to.

This development annoys me in every way. This case was never about the particular phone; it was about the precedent and the general issue of security vs. surveillance. This will just come up again another time, and we’ll have to go through this all over again—maybe with a company that isn’t as committed to our privacy as Apple is.

EDITED TO ADD: Watch former NSA Director Michael Hayden defend Apple and iPhone security. I’ve never seen him so impassioned before.

EDITED TO ADD (3/26): Marcy Wheeler has written extensively about the Cellebrite possibility.

Posted on March 24, 2016 at 12:34 PM

iMessage Encryption Flaw Found and Fixed

Matthew Green and team found and reported a significant iMessage encryption flaw last year.

Green suspected there might be a flaw in iMessage last year after he read an Apple security guide describing the encryption process and it struck him as weak. He said he alerted the firm’s engineers to his concern. When a few months passed and the flaw remained, he and his graduate students decided to mount an attack to show that they could pierce the encryption on photos or videos sent through iMessage.

It took a few months, but they succeeded, targeting phones that were not using the latest operating system on iMessage, which launched in 2011.

To intercept a file, the researchers wrote software to mimic an Apple server. The encrypted transmission they targeted contained a link to the photo stored in Apple’s iCloud server as well as a 64-digit key to decrypt the photo.

Although the students could not see the key’s digits, they guessed at them by a repetitive process of changing a digit or a letter in the key and sending it back to the target phone. Each time they guessed a digit correctly, the phone accepted it. They probed the phone in this way thousands of times.

“And we kept doing that,” Green said, “until we had the key.”

A modified version of the attack would also work on later operating systems, Green said, adding that it would likely have taken the hacking skills of a nation-state.
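The probing Green describes is a classic oracle-guided guessing attack: the target accepts or rejects each modified key, leaking it one position at a time. Here is a minimal sketch of the idea, with a hypothetical oracle() standing in for the mimicked Apple server plus the target phone; the researchers' real attack on iMessage's attachment encryption was considerably more involved:

```python
# A minimal sketch of oracle-guided key recovery. The hypothetical oracle()
# answers whether the target phone accepts a candidate key prefix; this is
# not the researchers' actual tooling.
HEX_DIGITS = "0123456789abcdef"

def recover_key(oracle, key_len: int = 64) -> str:
    known = ""
    for _ in range(key_len):
        for digit in HEX_DIGITS:
            if oracle(known + digit):  # the phone accepts a correct guess
                known += digit
                break
    return known  # worst case key_len * 16 queries: "thousands of times"
```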

This flaw is fixed in iOS 9.3. (You should download and install it now.)

I wrote about this flaw in IEEE Security and Privacy earlier this year:

Going back to the new vulnerability that you’ll learn about in mid-February, the lead researcher wrote to me: “If anyone tells you that [the vendor] can just ‘tweak’ the system a little bit to add key escrow or to man-in-the-middle specific users, they need to spend a few days watching the authentication dance between [the client device/software] and the umpteen servers it talks to just to log into the network. I’m frankly amazed that any of it works at all, and you couldn’t pay me enough to tamper with any of it.” This is an important piece of wisdom.

The designers of this system aren’t novices. They’re an experienced team with some of the best security engineers in the field. If these guys can’t get the security right, just imagine how much worse it is for smaller companies without this team’s level of expertise and resources. Now imagine how much worse it would be if you add a government-mandated back door. There are more opportunities to get security wrong, and more engineering teams without the time and expertise necessary to get it right.

Related: A different iOS flaw was reported last week. Called AceDeceiver, it is a Trojan that allows an attacker to install malicious software onto an iOS device, bypassing Apple’s DRM protections. I don’t believe that Apple has fixed this yet, although it seems as if Apple just has to add a certificate revocation list, or make the certs nonreplayable by having some mandatory interaction with the iTunes store.
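For illustration, here is a toy sketch of why replayable authorizations are the problem and how binding them to a fresh per-device nonce, via that mandatory interaction with the store, would stop this class of attack. This is an invented HMAC construction, not Apple's actual FairPlay/iTunes protocol:

```python
# A toy sketch of the replay problem and a nonce-style fix. Illustrative
# only; names, parameters, and the HMAC scheme are all invented here.
import hashlib
import hmac
import os

STORE_KEY = os.urandom(32)  # stands in for the store's signing secret

def issue_authorization(app_id: str, device_nonce: bytes) -> bytes:
    # Binding the authorization to a per-device nonce means a captured
    # copy cannot be replayed onto a different device.
    return hmac.new(STORE_KEY, app_id.encode() + device_nonce,
                    hashlib.sha256).digest()

def accept_install(app_id: str, auth: bytes, device_nonce: bytes) -> bool:
    expected = hmac.new(STORE_KEY, app_id.encode() + device_nonce,
                        hashlib.sha256).digest()
    return hmac.compare_digest(auth, expected)

nonce = os.urandom(16)  # device's fresh challenge (the mandatory interaction)
auth = issue_authorization("example.app", nonce)
assert accept_install("example.app", auth, nonce)               # legitimate
assert not accept_install("example.app", auth, os.urandom(16))  # replay fails
```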

EDITED TO ADD (4/14): The paper describing the iMessage flaw.

Posted on March 21, 2016 at 1:45 PM

Another FBI Filing on the San Bernardino iPhone Case

The FBI’s reply to Apple is more of a character assassination attempt than a legal argument. It’s as if it only cares about public opinion at this point.

Although notice the threat in footnote 9 on page 22:

For the reasons discussed above, the FBI cannot itself modify the software on Farook’s iPhone without access to the source code and Apple’s private electronic signature. The government did not seek to compel Apple to turn those over because it believed such a request would be less palatable to Apple. If Apple would prefer that course, however, that may provide an alternative that requires less labor by Apple programmers.

This should immediately remind everyone of the Lavabit case, where the FBI did ask for the site’s master key in order to get at one user. Ladar Levison commented on the similarities. He, of course, shut his service down rather than turn over the master key. A company as large as Apple does not have that option. Marcy Wheeler wrote about this in detail.

My previous three posts on this are here, here, and here, all with lots of interesting links to various writings on this case.

EDITED TO ADD: The New York Times reports that the White House might have overreached in this case.

John Oliver has a great segment on this. With a Matt Blaze cameo!

Good NPR interview with Richard Clarke.

Well, I don’t think it’s a fierce debate. I think the Justice Department and the FBI are on their own here. You know, the secretary of defense has said how important encryption is when asked about this case. The National Security Agency director and three past National Security Agency directors, a former CIA director, a former Homeland Security secretary have all said that they’re much more sympathetic with Apple in this case. You really have to understand that the FBI director is exaggerating the need for this and is trying to build it up as an emotional case, organizing the families of the victims and all of that. And it’s Jim Comey and the attorney general is letting him get away with it.

Senator Lindsey Graham is changing his views:

“It’s just not so simple,” Graham said. “I thought it was that simple.”

Steven Levy on the history angle of this story.

Benjamin Wittes on possible legislative options.

EDITED TO ADD (3/17): Apple’s latest response is pretty withering. Commentary from Susan Crawford. FBI and China are on the same side. How this fight risks the whole US tech industry.

EDITED TO ADD (3/18): Tim Cook interview. Apple engineers might refuse to help the FBI, if Apple loses the case. And I should have previously posted this letter from racial justice activists, and this more recent essay on how this affects the LGBTQ community.

EDITED TO ADD (3/21): Interesting article on the Apple/FBI tensions that led to this case.

Posted on March 16, 2016 at 6:12 AM

Lots More Writing about the FBI vs. Apple

I have written two posts on the case, and at the bottom of those essays are lots of links to other essays written by other people. Here are more links.

If you read just one thing on the technical aspects of this case, read Susan Landau’s testimony before the House Judiciary Committee. It’s very comprehensive, and very good.

Others are testifying, too.

Apple is fixing the vulnerability. The Justice Department wants Apple to unlock nine more phones.

Apple prevailed in a different iPhone unlocking case.

Why the First Amendment is a bad argument. And why the All Writs Act is the wrong tool.

Dueling poll results: Pew Research reports that 51% side with the FBI, while a Reuters poll reveals that “forty-six percent of respondents said they agreed with Apple’s position, 35 percent said they disagreed and 20 percent said they did not know,” and that “a majority of Americans do not want the government to have access to their phone and Internet communications, even if it is done in the name of stopping terror attacks.”

One of the worst possible outcomes from this story is that people stop installing security updates because they don’t trust them. After all, a security update mechanism is also a mechanism by which the government can install a backdoor. Here’s one essay that talks about that. Here’s another.

Cory Doctorow comments on the FBI’s math denialism. Yochai Benkler sees this as a symptom of a greater breakdown in government trust. More good commentary from Jeff Schiller, Julian Sanchez, and Jonathan Zdziarski. Marcy Wheeler’s comments. Two posts by Dan Wallach. Michael Chertoff and associates weigh in on the side of security over surveillance.

Here’s a Catholic op-ed on Apple’s side. Bill Gates sides with the FBI. And a great editorial cartoon.

Here’s high snark from Stewart Baker. Baker asks some very good (and very snarky) questions. But the questions are beside the point. This case isn’t about Apple or whether Apple is being hypocritical, any more than climate change is about Al Gore’s character. This case is about the externalities of what the government is asking for.

One last thing to read.

Okay, one more, on the more general back door issue.

EDITED TO ADD (3/2): Wall Street Journal editorial. And here’s video from the House Judiciary Committee hearing. Skip to around 34:50 to get to the actual beginning.

EDITED TO ADD (3/3): Interview with Rep. Darrell Issa. And at the RSA Conference this week, both Defense Secretary Ash Carter and Microsoft’s chief legal officer Brad Smith sided with Apple against the FBI.

EDITED TO ADD (3/4): Comments on the case from the UN High Commissioner for Human Rights.

EDITED TO ADD (3/7): Op ed by Apple. And an interesting article on the divide in the Obama Administration.

EDITED TO ADD (3/10): Another good essay.

EDITED TO ADD (3/13): President Obama’s comments on encryption: he wants back doors. Cory Doctorow reports.

Posted on March 1, 2016 at 6:47 AM

Decrypting an iPhone for the FBI

Earlier this week, a federal magistrate ordered Apple to assist the FBI in hacking into the iPhone used by one of the San Bernardino shooters. Apple will fight this order in court.

The policy implications are complicated. The FBI wants to set a precedent that tech companies will assist law enforcement in breaking their users’ security, and the technology community is afraid that the precedent will limit what sorts of security features it can offer customers. The FBI sees this as a privacy vs. security debate, while the tech community sees it as a security vs. surveillance debate.

The technology considerations are more straightforward, and shine a light on the policy questions.

The iPhone 5c in question is encrypted. This means that someone without the key cannot get at the data. This is a good security feature. Your phone is a very intimate device. It is likely that you use it for private text conversations, and that it’s connected to your bank accounts. Location data reveals where you’ve been, and correlating multiple phones reveals who you associate with. Encryption protects your phone if it’s stolen by criminals. Encryption protects the phones of dissidents around the world if they’re taken by local police. It protects all the data on your phone, and the apps that increasingly control the world around you.

This encryption depends on the user choosing a secure password, of course. If you had an older iPhone, you probably just used the default four-digit password. That’s only 10,000 possible passwords, making it pretty easy to guess. If the user enabled the more-secure alphanumeric option, the password becomes much harder to guess.
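The arithmetic here is worth making explicit. A minimal sketch, assuming roughly 80 milliseconds per hardware-speed guess; that figure is an illustration, not Apple's specification:

```python
# Back-of-the-envelope math for the passcode discussion above. The 80 ms
# per guess is an assumed, illustrative figure, not Apple's specification.
SECONDS_PER_GUESS = 0.08

for name, keyspace in [
    ("4-digit PIN", 10 ** 4),                   # the old default
    ("6-char lowercase alphanumeric", 36 ** 6)  # a modest alphanumeric code
]:
    hours = keyspace * SECONDS_PER_GUESS / 3600
    print(f"{name}: {keyspace:,} passcodes, ~{hours:,.1f} hours to exhaust")
# 4-digit PIN: 10,000 passcodes, ~0.2 hours to exhaust
# 6-char lowercase alphanumeric: 2,176,782,336 passcodes, ~48,372.9 hours
```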

Apple added two more security features to the iPhone. First, a phone can be configured to erase its data after too many incorrect password guesses. Second, the phone enforces a delay between password guesses. The delay is barely noticeable when you mistype your password once and have to retype it, but it’s a large barrier for anyone trying to guess password after password in a brute-force attempt to break into the phone.
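Here is a minimal sketch of those two protections; this is illustrative pseudocode, not Apple's implementation. The important point, which the next paragraph picks up, is that on this model both checks run in replaceable software:

```python
# A minimal sketch of the two protections just described: escalating delays
# between guesses and auto-erase after too many failures. Illustrative
# pseudocode, not Apple's implementation.
import time

class PasscodeGate:
    def __init__(self, correct: str, erase_after: int = 10):
        self.correct = correct
        self.failures = 0
        self.erase_after = erase_after
        self.erased = False

    def attempt(self, guess: str) -> bool:
        if self.erased:
            return False  # data keys are gone; brute force ends here
        if self.failures:
            time.sleep(min(2 ** self.failures, 3600))  # escalating delay
        if guess == self.correct:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= self.erase_after:
            self.erased = True  # wipe: destroy the data encryption keys
        return False
```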

But that iPhone has a security flaw. While the data is encrypted, the software controlling the phone is not. This means that someone can create a hacked version of the software and install it on the phone without the consent of the phone’s owner and without knowing the encryption key. This is what the FBI, and now the court, is demanding Apple do: it wants Apple to rewrite the phone’s software to make it possible to guess passwords quickly and automatically.

The FBI’s demands are specific to one phone, which might make its request seem reasonable if you don’t consider the technological implications: Authorities have the phone in their lawful possession, and they only need help seeing what’s on it in case it can tell them something about how the San Bernardino shooters operated. But the hacked software the court and the FBI want Apple to provide would be general. It would work on any phone of the same model. It has to.

Make no mistake; this is what a backdoor looks like. This is an existing vulnerability in iPhone security that could be exploited by anyone.

There’s nothing preventing the FBI from writing that hacked software itself, aside from budget and manpower issues. There’s every reason to believe, in fact, that such hacked software has been written by intelligence organizations around the world. Have the Chinese, for instance, written a hacked Apple operating system that records conversations and automatically forwards them to police? They would need to have stolen Apple’s code-signing key so that the phone would recognize the hacked software as valid, but governments have done that in the past with other keys and other companies. We simply have no idea who already has this capability.
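To see why the code-signing key is the crown jewel here, consider a minimal sketch of update acceptance. This is illustrative only, using Ed25519 via the Python `cryptography` package; iOS's real update and boot chain is far more involved:

```python
# A minimal sketch of signed-update verification: the phone installs only
# updates that verify under the vendor's public key. Illustrative only;
# not Apple's actual update chain.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

vendor_key = ed25519.Ed25519PrivateKey.generate()  # held by the vendor
trusted_pubkey = vendor_key.public_key()           # baked into the phone

def install_update(firmware: bytes, signature: bytes) -> bool:
    try:
        trusted_pubkey.verify(signature, firmware)
    except InvalidSignature:
        return False  # forged or unsigned update: refused
    return True       # a stolen signing key defeats this check entirely

hacked_os = b"record conversations and forward them to police"
assert not install_update(hacked_os, bytes(64))               # rejected
assert install_update(hacked_os, vendor_key.sign(hacked_os))  # key stolen
```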

And while this sort of attack might be limited to state actors today, remember that attacks always get easier. Technology broadly spreads capabilities, and what was hard yesterday becomes easy tomorrow. Today’s top-secret NSA programs become tomorrow’s PhD theses and the next day’s hacker tools. Soon this flaw will be exploitable by cybercriminals to steal your financial data. Everyone with an iPhone is at risk, regardless of what the FBI demands Apple do.

What the FBI wants to do would make us less secure, even though it’s in the name of keeping us safe from harm. Powerful governments, democratic and totalitarian alike, want access to user data for both law enforcement and social control. We cannot build a backdoor that only works for a particular type of government, or only in the presence of a particular court order.

Either everyone gets security or no one does. Either everyone gets access or no one does. The current case is about a single iPhone 5c, but the precedent it sets will apply to all smartphones, computers, cars and everything the Internet of Things promises. The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smart phones and computers, as well as the security of cars, medical devices, homes, and everything else that will soon be computerized. The FBI may be targeting the iPhone of the San Bernardino shooter, but its actions imperil us all.

This essay previously appeared in the Washington Post.

The original essay contained a major error.

I wrote: “This is why Apple fixed this security flaw in 2014. Apple’s iOS 8.0 and its phones with an A7 or later processor protect the phone’s software as well as the data. If you have a newer iPhone, you are not vulnerable to this attack. You are more secure – from the government of whatever country you’re living in, from cybercriminals and from hackers.” Also: “We are all more secure now that Apple has closed that vulnerability.”

That was based on a misunderstanding of the security changes Apple made in what is known as the “Secure Enclave.” It turns out that all iPhones have this security vulnerability: all can have their software updated without knowing the password. The updated code has to be signed with Apple’s key, of course, which adds a major difficulty to the attack.

Dan Guido writes:

If the device lacks a Secure Enclave, then a single firmware update to iOS will be sufficient to disable passcode delays and auto erase. If the device does contain a Secure Enclave, then two firmware updates, one to iOS and one to the Secure Enclave, are required to disable these security features. The end result in either case is the same. After modification, the device is able to guess passcodes at the fastest speed the hardware supports.

The recovered iPhone is a model 5C. The iPhone 5C lacks TouchID and, therefore, lacks a Secure Enclave. The Secure Enclave is not a concern. Nearly all of the passcode protections are implemented in software by the iOS operating system and are replaceable by a single firmware update.

EDITED TO ADD (2/22): Lots more on my previous blog post on the topic.

How to set a longer iPhone password and thwart this kind of attack. Comey on the issue. And a secret memo describes the FBI’s broader strategy to weaken security.

Orin Kerr’s thoughts: Part 1, Part 2, and Part 3.

EDITED TO ADD (2/22): Tim Cook’s letter to his employees, and an FAQ. How CALEA relates to all this. Here’s what’s not available in the iCloud backup. The FBI told the county to change the password on the phone—that’s why they can’t get in. What the FBI needs is technical expertise, not back doors. And it’s not just this iPhone; the FBI wants Apple to break into lots of them. What China asks of tech companies—not that this is a country we should particularly want to model. Former NSA Director Michael Hayden on the case. There is quite a bit of detail about Apple’s efforts to assist the FBI in the legal motion the Department of Justice filed. Two good essays. Jennifer Granick’s comments.

In my essay, I talk about other countries developing this capability without Apple’s knowledge or consent. Making it work requires stealing a copy of Apple’s code-signing key, something that has been done by the authors of Stuxnet (probably the US) and Flame (probably Russia) in the past.

Posted on February 22, 2016 at 6:58 AM

Judge Demands that Apple Backdoor an iPhone

A judge has ordered that Apple bypass iPhone security in order for the FBI to attempt a brute-force password attack on an iPhone 5c used by one of the San Bernardino killers. Apple is refusing.

The order is pretty specific technically. This implies to me that what the FBI is asking for is technically possible, and even that Apple assisted in the wording so that the case could be about the legal issues and not the technical ones.

From Apple’s statement about its refusal:

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks, from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers, including tens of millions of American citizens, from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

Congressman Ted Lieu comments.

Here’s an interesting essay about why Tim Cook and Apple are such champions for encryption and privacy.

Today I walked by a television showing CNN. The sound was off, but I saw an aerial scene which I presume was from San Bernardino, and the words “Apple privacy vs. national security.” If that’s the framing, we lose. I would have preferred to see “National security vs. FBI access.”

Slashdot thread.

EDITED TO ADD (2/18): Good analysis of Apple’s case. Interesting debate. Nicholas Weaver’s comments. And commentary from some other planet.

EDITED TO ADD (2/19): Ben Adida comments:

What’s probably happening is that the FBI is using this as a test case for the general principle that they should be able to compel tech companies to assist in police investigations. And that’s pretty smart, because it’s a pretty good test case: Apple obviously wants to help prevent terrorist attacks, so they’re left to argue the slippery slope argument in the face of an FBI investigation of a known terrorist. Well done, FBI, well done.

And Julian Sanchez’s comments. His conclusion:

These, then, are the high stakes of Apple’s resistance to the FBI’s order: not whether the federal government can read one dead terrorism suspect’s phone, but whether technology companies can be conscripted to undermine global trust in our computing devices. That’s a staggeringly high price to pay for any investigation.

A New York Times editorial.

Also, two questions: One, what do we know about Apple’s assistance in the past, and why is this one different? Two, has anyone speculated on how much this will cost Apple? The FBI is demanding that Apple give them free engineering work. What’s the value of that work?

EDITED TO ADD (2/20): Jonathan Zdziarski writes on the differences between the FBI compelling someone to provide a service versus build a tool, and why the latter will 1) be difficult and expensive, 2) will get out into the wild, and 3) set a dangerous precedent.

This answers my first question, above:

For years, the government could come to Apple with a subpoena and a phone, and have the manufacturer provide a disk image of the device. This largely worked because Apple didn’t have to hack into their phones to do this. Up until iOS 8, the encryption Apple chose to use in their design was easily reversible when you had code execution on the phone (which Apple does). So all through iOS 7, Apple only needed to insert the key into the safe and provide FBI with a copy of the data.

EFF wrote a good technical explainer on the case. My only complaint is with the last section. I have heard directly from Apple that this technique still works on current model phones using the current iOS version.

I am still stunned by what a good case the FBI chose for pushing this issue. They have all the sympathy in the media that they could hope for.

EDITED TO ADD (2/20): Tim Cook as privacy advocate. How the back door works on modern iPhones. Why the average American should care. The grugq on what this all means.

EDITED TO ADD (2/22): I wrote an op ed for the Washington Post.

Posted on February 17, 2016 at 2:15 PM

Mac OS X, iOS, and Flash Had the Most Discovered Vulnerabilities in 2015

Interesting analysis:

Which software had the most publicly disclosed vulnerabilities this year? The winner is none other than Apple’s Mac OS X, with 384 vulnerabilities. The runner-up? Apple’s iOS, with 375 vulnerabilities.

Rounding out the top five are Adobe’s Flash Player, with 314 vulnerabilities; Adobe’s AIR SDK, with 246 vulnerabilities; and Adobe AIR itself, also with 246 vulnerabilities. For comparison, last year the top five (in order) were: Microsoft’s Internet Explorer, Apple’s Mac OS X, the Linux Kernel, Google’s Chrome, and Apple’s iOS.

The article goes on to explain why Windows vulnerabilities might be counted higher, and gives the top 50 software packages for vulnerabilities.

The interesting discussion topic is how this relates to how secure the software is. Is software with more discovered vulnerabilities better because they’re all fixed? Is software with more discovered vulnerabilities less secure because there are so many? Or are they all equally bad, and people just look at some software more than others? No one knows.

Posted on January 11, 2016 at 2:33 PM

