Entries Tagged "cell phones"


Facebook Using Physical Location to Suggest Friends

This could go badly:

“People You May Know are people on Facebook that you might know,” a Facebook spokesperson said. “We show you people based on mutual friends, work and education information, networks you’re part of, contacts you’ve imported and many other factors.”

One of those factors is smartphone location. A Facebook spokesperson said, though, that shared location alone would not result in a friend suggestion: the two parents must have had something else in common, such as overlapping networks.

“Location information by itself doesn’t indicate that two people might be friends,” said the Facebook spokesperson. “That’s why location is only one of the factors we use to suggest people you may know.”
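The spokesperson’s claim, that location is only one signal among several and never sufficient on its own, can be sketched as a toy scoring rule. Everything here (the signal names, the threshold, the rule itself) is invented for illustration; Facebook’s actual algorithm is not public.

```python
# Toy sketch of a multi-signal "People You May Know" scorer. Signal names,
# the threshold, and the combination rule are hypothetical.

def suggest_friend(signals):
    """signals: dict mapping signal names (e.g. "mutual_friends",
    "shared_network", "co_location") to booleans."""
    others = [name for name, present in signals.items()
              if present and name != "co_location"]
    # Co-location alone never triggers a suggestion.
    if signals.get("co_location") and not others:
        return False
    # Co-location only adds to the score when another signal is present.
    score = len(others) + (1 if signals.get("co_location") and others else 0)
    return score >= 2
```

Under a rule like this, two parents who merely stood in the same room would never be suggested to each other, but co-location plus one overlapping network would be enough.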

The article goes on to describe situations where you don’t want Facebook to do this: Alcoholics Anonymous meetings, singles bars, some Tinder dates, and so on. But this is part of Facebook’s aggressive use of location data in many of its services.

BoingBoing post.

EDITED TO ADD: Facebook backtracks.

Posted on June 28, 2016 at 6:56 AM

FTC Investigating Android Patching Practices

It’s a known truth that most Android vulnerabilities don’t get patched. It’s not Google’s fault. It releases the patches, but the phone carriers don’t push them down to their smartphone users.

Now the Federal Communications Commission and the Federal Trade Commission are investigating, sending letters to major carriers and device makers.

I think this is a good thing. This is a long-existing market failure, and a place where we need government regulation to make us all more secure.

Posted on May 11, 2016 at 2:37 PM

Smartphone Forensics to Detect Distraction

The company Cellebrite is developing a portable forensics device that would determine if a smartphone user was using the phone at a particular time. The idea is to test phones of drivers after accidents:

Under the first-of-its-kind legislation proposed in New York, drivers involved in accidents would have to submit their phone to roadside testing from a textalyzer to determine whether the driver was using a mobile phone ahead of a crash. In a bid to get around the Fourth Amendment right to privacy, the textalyzer allegedly would keep conversations, contacts, numbers, photos, and application data private. It will solely say whether the phone was in use prior to a motor-vehicle mishap. Further analysis, which might require a warrant, could be necessary to determine whether such usage was via hands-free dashboard technology and to confirm the original finding.

This is interesting technology. To me, it feels no more intrusive than a breathalyzer, assuming that the textalyzer has all the privacy guards described above.

Slashdot thread. Reddit thread.

EDITED TO ADD (4/19): Good analysis and commentary.

Posted on April 13, 2016 at 6:51 AM

Bypassing Phone Security through Social Engineering

This works:

Khan was arrested in mid-July 2015. Undercover police officers posing as company managers arrived at his workplace and asked to check his driver and work records, according to the source. When they disputed where he was on a particular day, he got out his iPhone and showed them the record of his work.

The undercover officers asked to see his iPhone and Khan handed it over. After that, he was arrested. British police had 30 seconds to change the password settings to keep the phone open.

Reminds me about how the FBI arrested Ross William Ulbricht:

The agents had tailed him, waiting for the 29-year-old to open his computer and enter his passwords before swooping in.

That also works.

And, yes, I understand that none of this would have worked with the already dead Syed Farook and his iPhone.

Posted on April 7, 2016 at 6:39 AM

ISIS Encryption Opsec

Tidbits from the New York Times:

The final phase of Mr. Hame’s training took place at an Internet cafe in Raqqa, where an Islamic State computer specialist handed him a USB key. It contained CCleaner, a program used to erase a user’s online history on a given computer, as well as TrueCrypt, an encryption program that was widely available at the time and that experts say has not yet been cracked.

[…]

More than a year and a half earlier, the would-be Cannes bomber, Ibrahim Boudina, had tried to erase the previous three days of his search history, according to details in his court record, but the police were still able to recover it. They found that Mr. Boudina had been researching how to connect to the Internet via a secure tunnel and how to change his I.P. address.

Though he may have been aware of the risk of discovery, perhaps he was not worried enough.

Mr. Boudina had been sloppy enough to keep using his Facebook account, and his voluminous chat history allowed French officials to determine his allegiance to the Islamic State. Wiretaps of his friends and relatives, later detailed in French court records obtained by The Times and confirmed by security officials, further outlined his plot, which officials believe was going to target the annual carnival on the French Riviera.

Mr. Hame, in contrast, was given strict instructions on how to communicate. After he used TrueCrypt, he was to upload the encrypted message folder onto a Turkish commercial data storage site, from where it would be downloaded by his handler in Syria. He was told not to send it by email, most likely to avoid generating the metadata that records details like the point of origin and destination, even if the content of the missive is illegible. Mr. Hame described the website as “basically a dead inbox.”

The ISIS technician told Mr. Hame one more thing: As soon as he made it back to Europe, he needed to buy a second USB key, and transfer the encryption program to it. USB keys are encoded with serial numbers, so the process was not unlike a robber switching getaway cars.

“He told me to copy what was on the key and then throw it away,” Mr. Hame explained. “That’s what I did when I reached Prague.”

Mr. Abaaoud was also fixated on cellphone security. He jotted down the number of a Turkish phone that he said would be left in a building in Syria, but close enough to the border to catch the Turkish cell network, according to Mr. Hame’s account. Mr. Abaaoud apparently figured investigators would be more likely to track calls from Europe to Syrian phone numbers, and might overlook calls to a Turkish one.

Next to the number, Mr. Abaaoud scribbled “Dad.”

This seems like exactly the sort of opsec I would set up for an insurgent group.

EDITED TO ADD: Mistakes in the article. For example:

And now I’ve read one of the original French documents and confirmed my suspicion that the NYTimes article got details wrong.

The original French uses the word “boîte”, which matches the TrueCrypt term “container”. The original French didn’t use the words “fichier” (file), “dossier” (folder), or “répertoire” (directory). This makes so much more sense, and gives us more confidence we know what they were doing.

The original French uses the term “site de partage”, meaning a “sharing site”, which makes more sense than a “storage” site.

The document I saw says the slip of paper had login details for the file sharing site, not a TrueCrypt password. Thus, when the NYTimes article says “TrueCrypt login credentials”, we should correct it to “file sharing site login credentials”, not “TrueCrypt passphrase”.

MOST importantly, according to the subject, the login details didn’t even work. It appears he never actually used this method—he was just taught how to use it. He no longer remembers the site’s name, other than it might have the word “share” in its name. We see this a lot: ISIS talks a lot about encryption, but the evidence of them actually using it is scant.

Posted on March 31, 2016 at 6:10 AM

Lawful Hacking and Continuing Vulnerabilities

The FBI’s legal battle with Apple is over, but the way it ended may not be good news for anyone.

Federal agents had been seeking to compel Apple to break the security of an iPhone 5c that had been used by one of the San Bernardino, Calif., terrorists. Apple had been fighting a court order to cooperate with the FBI, arguing that the authorities’ request was illegal and that creating a tool to break into the phone was itself harmful to the security of every iPhone user worldwide.

Last week, the FBI told the court it had learned of a possible way to break into the phone using a third party’s solution, without Apple’s help. On Monday, the agency dropped the case because the method worked. We don’t know who that third party is. We don’t know what the method is, or which iPhone models it applies to. Now it seems like we never will.

The FBI plans to classify this access method and to use it to break into other phones in other criminal investigations.

Compare this iPhone vulnerability with another, one that was made public on the same day the FBI said it might have found its own way into the San Bernardino phone. Researchers at Johns Hopkins University announced last week that they had found a significant vulnerability in the iMessage protocol. They disclosed the vulnerability to Apple in the fall, and last Monday, Apple released an updated version of its operating system that fixed the vulnerability. (That’s iOS 9.3; you should download and install it right now.) The Hopkins team didn’t publish its findings until Apple’s patch was available, so devices could be updated to protect them from attacks using the researchers’ discovery.

This is how vulnerability research is supposed to work.

Vulnerabilities are found, fixed, then published. The entire security community is able to learn from the research, and, more important, everyone is more secure as a result of the work.

The FBI is doing the exact opposite. It has been given, in secret, whatever vulnerability it used to get into the San Bernardino phone, and it is keeping that vulnerability secret. All of our iPhones remain vulnerable to this exploit. This includes the iPhones used by elected officials and federal workers and the phones used by people who protect our nation’s critical infrastructure and carry out other law enforcement duties, including lots of FBI agents.

This is the trade-off we have to consider: do we prioritize security over surveillance, or do we sacrifice security for surveillance?

The problem with computer vulnerabilities is that they’re general. There’s no such thing as a vulnerability that affects only one device. If it affects one copy of an application, operating system or piece of hardware, then it affects all identical copies. A vulnerability in Windows 10, for example, affects all of us who use Windows 10. And it can be used by anyone who knows it, be they the FBI, a gang of cyber criminals, the intelligence agency of another country—anyone.

And once a vulnerability is found, it can be used for attack, as the FBI is doing, or for defense, as in the Johns Hopkins example.

Over years of battling attackers and intruders, we’ve learned a lot about computer vulnerabilities. They’re plentiful: vulnerabilities are found and fixed in major systems all the time. They’re regularly discovered independently, by outsiders rather than by the original manufacturers or programmers. And once they’re discovered, word gets out. Today’s top-secret National Security Agency attack techniques become tomorrow’s PhD theses and the next day’s hacker tools.

The attack/defense trade-off is not new to the US government. It even has a process for deciding what to do when a vulnerability is discovered: whether the vulnerability should be disclosed to improve all of our security, or kept secret to be used for offense. The White House claims that it prioritizes defense, and that general vulnerabilities in widely used computer systems are patched.

Whatever method the FBI used to get into the San Bernardino shooter’s iPhone is one such vulnerability. The FBI did the right thing by using an existing vulnerability rather than forcing Apple to create a new one, but it should be disclosed to Apple and patched immediately.

This case has always been more about the PR battle and potential legal precedent than about the particular phone. And while the legal dispute is over, there are other cases involving other encrypted devices in other courts across the country. But while there will always be a few computers (corporate servers, individual laptops or personal smartphones) that the FBI would like to break into, there are far more such devices that we need to be secure.

One of the most surprising things about this debate is the number of former national security officials who came out on Apple’s side. They understand that we are singularly vulnerable to cyberattack, and that our cyberdefense needs to be as strong as possible.

The FBI’s myopic focus on this one investigation is understandable, but in the long run, it’s damaging to our national security.

This essay previously appeared in the Washington Post, with a far too click-bait headline.

EDITED TO ADD: To be fair, the FBI probably doesn’t know what the vulnerability is. And I wonder how easy it would be for Apple to figure it out. Given that the FBI has to exhaust all avenues of access before demanding help from Apple, we can learn which models are vulnerable by watching which legal suits are abandoned now that the FBI knows about this method.

Matt Blaze makes excellent points about how the FBI should disclose the vulnerabilities it uses, in order to improve computer security. That was part of a New York Times “Room for Debate” on hackers helping the FBI.

Susan Landau’s excellent Congressional testimony on the topic.

Posted on March 30, 2016 at 4:54 PM

FBI vs. Apple: Who Is Helping the FBI?

On Monday, the FBI asked the court for a two-week delay in a scheduled hearing on the San Bernardino iPhone case, because some “third party” approached it with a way into the phone. It wanted time to test this access method.

Who approached the FBI? We have no idea.

I have avoided speculation because the story makes no sense. Why did this third party wait so long? Why didn’t the FBI go through with the hearing anyway?

Now we have speculation that the third party is the Israeli forensic company Cellebrite. From its website:

Support for Locked iOS Devices Using UFED Physical Analyzer

Using UFED Physical Analyzer, physical and file system extractions, decoding and analysis can be performed on locked iOS devices with a simple or complex passcode. Simple passcodes will be recovered during the physical extraction process and enable access to emails and keychain passwords. If a complex password is set on the device, physical extraction can be performed without access to emails and keychain. However, if the complex password is known, emails and keychain passwords will be available.

My guess is that it’s not them. They have an existing and ongoing relationship with the FBI. If they could crack the phone, they would have done it months ago. This purchase order seems to be coincidental.

In any case, having a company name doesn’t mean that the story makes any more sense, but there it is. We’ll know more in a couple of weeks, although I doubt the FBI will share any more than they absolutely have to.

This development annoys me in every way. This case was never about the particular phone; it was about the precedent and the general issue of security vs. surveillance. This will just come up again another time, and we’ll have to go through this all over again, maybe with a company that isn’t as committed to our privacy as Apple is.

EDITED TO ADD: Watch former NSA Director Michael Hayden defend Apple and iPhone security. I’ve never seen him so impassioned before.

EDITED TO ADD (3/26): Marcy Wheeler has written extensively about the Cellebrite possibility.

Posted on March 24, 2016 at 12:34 PM

Another FBI Filing on the San Bernardino iPhone Case

The FBI’s reply to Apple is more of a character assassination attempt than a legal argument. It’s as if it only cares about public opinion at this point.

Although notice the threat in footnote 9 on page 22:

For the reasons discussed above, the FBI cannot itself modify the software on Farook’s iPhone without access to the source code and Apple’s private electronic signature. The government did not seek to compel Apple to turn those over because it believed such a request would be less palatable to Apple. If Apple would prefer that course, however, that may provide an alternative that requires less labor by Apple programmers.

This should immediately remind everyone of the Lavabit case, where the FBI did ask for the site’s master key in order to get at one user. Ladar Levison commented on the similarities. He, of course, shut his service down rather than turn over the master key. A company as large as Apple does not have that option. Marcy Wheeler wrote about this in detail.

My previous three posts on this are here, here, and here, all with lots of interesting links to various writings on this case.

EDITED TO ADD: The New York Times reports that the White House might have overreached in this case.

John Oliver has a great segment on this. With a Matt Blaze cameo!

Good NPR interview with Richard Clarke.

Well, I don’t think it’s a fierce debate. I think the Justice Department and the FBI are on their own here. You know, the secretary of defense has said how important encryption is when asked about this case. The National Security Agency director and three past National Security Agency directors, a former CIA director, a former Homeland Security secretary have all said that they’re much more sympathetic with Apple in this case. You really have to understand that the FBI director is exaggerating the need for this and is trying to build it up as an emotional case, organizing the families of the victims and all of that. And it’s Jim Comey, and the attorney general is letting him get away with it.

Senator Lindsey Graham is changing his views:

“It’s just not so simple,” Graham said. “I thought it was that simple.”

Steven Levy on the history angle of this story.

Benjamin Wittes on possible legislative options.

EDITED TO ADD (3/17): Apple’s latest response is pretty withering. Commentary from Susan Crawford. FBI and China are on the same side. How this fight risks the whole US tech industry.

EDITED TO ADD (3/18): Tim Cook interview. Apple engineers might refuse to help the FBI, if Apple loses the case. And I should have previously posted this letter from racial justice activists, and this more recent essay on how this affects the LGBTQ community.

EDITED TO ADD (3/21): Interesting article on the Apple/FBI tensions that led to this case.

Posted on March 16, 2016 at 6:12 AM

Company Tracks Iowa Caucusgoers by their Cell Phones

It’s not just governments. Companies like Dstillery are doing this too:

“We watched each of the caucus locations for each party and we collected mobile device ID’s,” Dstillery CEO Tom Phillips said. “It’s a combination of data from the phone and data from other digital devices.”

Dstillery found some interesting things about voters. For one, people who loved to grill or work on their lawns overwhelmingly voted for Trump in Iowa, according to Phillips. There were some pretty unexpected characteristics that came up, too.

“NASCAR was the one outlier, for Trump and Clinton,” Phillips said. “In Clinton’s counties, NASCAR way over-indexed.”

Kashmir Hill wondered how:

What really happened is that Dstillery gets information from people’s phones via ad networks. When you open an app or look at a browser page, there’s a very fast auction that happens where different advertisers bid to get to show you an ad. Their bid is based on how valuable they think you are, and to decide that, your phone sends them information about you, including, in many cases, an identifying code (that they’ve built a profile around) and your location information, down to your latitude and longitude.

Yes, for the vast majority of people, ad networks are doing far more information collection about them than the NSA, but they don’t explicitly link it to their names.

So on the night of the Iowa caucus, Dstillery flagged all the auctions that took place on phones in latitudes and longitudes near caucus locations. It wound up spotting 16,000 devices on caucus night, as those people had granted location privileges to the apps or devices that served them ads. It captured those mobile IDs and then looked up the characteristics associated with those IDs in order to make observations about the kind of people that went to Republican caucus locations (young parents) versus Democrat caucus locations. It drilled down farther (e.g., ‘people who like NASCAR voted for Trump and Clinton’) by looking at which candidate won at a particular caucus location.
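The matching step Hill describes, flagging every auction whose coordinates fall near a caucus site and collecting the device IDs, can be sketched roughly as follows. The radius, the data shapes, and the function names are all assumptions for illustration; Dstillery’s actual pipeline is not public.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def flag_devices(bid_requests, caucus_sites, radius_km=0.2):
    """Collect device IDs whose ad auctions happened near any caucus site.

    bid_requests: iterable of (device_id, lat, lon), the location data a
    phone sends out during a real-time ad auction.
    caucus_sites: iterable of (lat, lon) for known caucus locations.
    """
    flagged = set()
    for device_id, lat, lon in bid_requests:
        if any(haversine_km(lat, lon, s_lat, s_lon) <= radius_km
               for s_lat, s_lon in caucus_sites):
            flagged.add(device_id)
    return flagged
```

Once the IDs are flagged, the profiling step is just a lookup: join each flagged ID against the behavioral profile an ad network has already built around it.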

Okay, so it didn’t collect names. But how much harder could that have been?

Posted on March 2, 2016 at 6:34 AM

