Entries Tagged "Android"


Personal Data Sharing by Mobile Apps

Interesting research:

“Who Knows What About Me? A Survey of Behind the Scenes Personal Data Sharing to Third Parties by Mobile Apps,” by Jinyan Zang, Krysta Dummit, James Graves, Paul Lisker, and Latanya Sweeney.

We tested 110 popular, free Android and iOS apps to look for apps that shared personal, behavioral, and location data with third parties.

73% of Android apps shared personal information such as email address with third parties, and 47% of iOS apps shared geo-coordinates and other location data with third parties.

93% of Android apps tested connected to a mysterious domain, safemovedm.com, likely due to a background process of the Android phone.

We show that a significant proportion of apps share data from user inputs such as personal information or search terms with third parties without Android or iOS requiring a notification to the user.
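
To give a concrete sense of what such a test involves, here is a minimal sketch, assuming captured requests are available as simple (host, body) pairs. The record format, hostnames, and matching patterns below are invented for illustration; this is not the authors' actual pipeline. It flags requests that carry an email address or geo-coordinates to hosts other than the app's own:

```python
# Toy sketch: scan captured HTTP requests from an app for personal data sent
# to third-party hosts. The record format and the set of "first-party" hosts
# are hypothetical; a real study would capture traffic with a proxy and use
# far more careful matching.
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
# Latitude/longitude pairs such as "42.0451,-87.6877"
GEO = re.compile(r"-?\d{1,3}\.\d{3,},\s*-?\d{1,3}\.\d{3,}")

def flag_third_party_sharing(requests, first_party_hosts):
    """requests: iterable of (host, body) pairs captured from one app."""
    findings = []
    for host, body in requests:
        if host in first_party_hosts:
            continue  # traffic to the app's own servers is not "sharing"
        kinds = []
        if EMAIL.search(body):
            kinds.append("email address")
        if GEO.search(body):
            kinds.append("geo-coordinates")
        if kinds:
            findings.append((host, kinds))
    return findings

# Hypothetical capture from a single app session:
captured = [
    ("api.example-app.com", "user=jane@example.com"),
    ("tracker.example-ads.net", "uid=jane@example.com&loc=42.0451,-87.6877"),
]
print(flag_third_party_sharing(captured, {"api.example-app.com"}))
```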

EDITED TO ADD: News article.

Posted on November 13, 2015 at 6:08 AM

Security Risks of Unpatched Android Software

A lot has been written about the security vulnerability resulting from outdated and unpatched Android software. The basic problem is that while Google regularly updates the Android software, phone manufacturers don’t regularly push updates out to Android users.

New research tries to quantify the risk:

We are presenting a paper at SPSM next week that shows that, on average over the last four years, 87% of Android devices are vulnerable to attack by malicious apps. This is because manufacturers have not provided regular security updates. Some manufacturers are much better than others, however, and our study shows that devices built by LG and Motorola, as well as those shipped under the Google Nexus brand, are much better than most. Users, corporate buyers and regulators can find further details on manufacturer performance at AndroidVulnerabilities.org.
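
A back-of-the-envelope way to see how an aggregate figure like that arises, not the paper's methodology and with invented numbers, is to weight each OS version's device share by whether that version still lacks a fix for some critical bug:

```python
# Back-of-the-envelope sketch (not the paper's methodology; all figures are
# invented): estimate the share of devices exposed to at least one critical
# bug by summing the usage share of every OS version that still lacks a fix.

# Hypothetical distribution of devices across Android versions.
version_share = {"4.1": 0.10, "4.4": 0.35, "5.0": 0.30, "5.1": 0.15, "6.0": 0.10}

# Hypothetical: versions in which some known critical vulnerability remains unpatched.
unpatched_versions = {"4.1", "4.4", "5.0"}

exposed = sum(share for version, share in version_share.items()
              if version in unpatched_versions)
print(f"Estimated share of vulnerable devices: {exposed:.0%}")  # 75% with these numbers
```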

Posted on October 21, 2015 at 6:22 AM

Regularities in Android Lock Patterns

Interesting:

Marte Løge, a 2015 graduate of the Norwegian University of Science and Technology, recently collected and analyzed almost 4,000 ALPs as part of her master’s thesis. She found that a large percentage of them — 44 percent — started in the top left-most node of the screen. A full 77 percent of them started in one of the four corners. The average number of nodes was about five, meaning there were fewer than 9,000 possible pattern combinations. A significant percentage of patterns had just four nodes, shrinking the pool of available combinations to 1,624. More often than not, patterns moved from left to right and top to bottom, another factor that makes guessing easier.
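
The 1,624 figure follows directly from the lock-screen rules: a pattern visits distinct dots, and a stroke may not skip over an unvisited dot. Here is a minimal Python sketch, not part of Løge's thesis, that enumerates valid patterns of a given length; it reproduces 1,624 four-node patterns and 7,152 five-node patterns.

```python
# Count valid Android lock patterns of a given length. Nodes are numbered
# 1-9 on the 3x3 grid; a move that passes over an unvisited node is illegal
# (the pattern would snap to that node instead).

# midpoint[(a, b)] = node that lies between a and b on a straight line
midpoint = {}
for (a, b), m in [((1, 3), 2), ((4, 6), 5), ((7, 9), 8),
                  ((1, 7), 4), ((2, 8), 5), ((3, 9), 6),
                  ((1, 9), 5), ((3, 7), 5)]:
    midpoint[(a, b)] = midpoint[(b, a)] = m

def count_patterns(length, current=None, visited=None):
    """Depth-first count of valid patterns with exactly `length` nodes."""
    if visited is None:
        # Try every starting node.
        return sum(count_patterns(length, start, {start}) for start in range(1, 10))
    if len(visited) == length:
        return 1
    total = 0
    for nxt in range(1, 10):
        if nxt in visited:
            continue
        between = midpoint.get((current, nxt))
        if between is not None and between not in visited:
            continue  # would jump over an unvisited node: not allowed
        total += count_patterns(length, nxt, visited | {nxt})
    return total

if __name__ == "__main__":
    for n in (4, 5):
        print(n, count_patterns(n))  # expect 1,624 and 7,152
```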

EDITED TO ADD (9/10): Similar research on this sort of thing.

Posted on August 26, 2015 at 6:24 AM

Another Salvo in the Second Crypto War (of Words)

Prosecutors from New York, London, Paris, and Madrid wrote an op-ed in yesterday’s New York Times in favor of backdoors in cell phone encryption. There are a number of flaws in their argument, ranging from how easy it is to get data off an encrypted phone to the dangers of designing a backdoor in the first place, but all of that has been said before. And since anecdote can be more persuasive than data, the op-ed started with one:

In June, a father of six was shot dead on a Monday afternoon in Evanston, Ill., a suburb 10 miles north of Chicago. The Evanston police believe that the victim, Ray C. Owens, had also been robbed. There were no witnesses to his killing, and no surveillance footage either.

With a killer on the loose and few leads at their disposal, investigators in Cook County, which includes Evanston, were encouraged when they found two smartphones alongside the body of the deceased: an iPhone 6 running on Apple’s iOS 8 operating system, and a Samsung Galaxy S6 Edge running on Google’s Android operating system. Both devices were passcode protected.

You can guess the rest. A judge issued a warrant, but neither Apple nor Google could unlock the phones. “The homicide remains unsolved. The killer remains at large.”

The Intercept researched the example, and it seems to be real. The phones belonged to the victim, and…

According to Commander Joseph Dugan of the Evanston Police Department, investigators were able to obtain records of the calls to and from the phones, but those records did not prove useful. By contrast, interviews with people who knew Owens suggested that he communicated mainly through text messages — the kind that travel as encrypted data — and had made plans to meet someone shortly before he was shot.

The information on his phone was not backed up automatically on Apple’s servers — apparently because he didn’t use wi-fi, which backups require.

[…]

But Dugan also wasn’t as quick to lay the blame solely on the encrypted phones. “I don’t know if getting in there, getting the information, would solve the case,” he said, “but it definitely would give us more investigative leads to follow up on.”

This is the first actual example I’ve seen illustrating the value of a backdoor. Unlike the increasingly common example of an ISIL handler abroad communicating securely with a radicalized person in the US, it’s an example where a backdoor might have helped. I say “might have,” because the Galaxy S6 is not encrypted by default, which means the victim deliberately turned the encryption on. If the native smartphone encryption had been backdoored, we don’t know whether the victim would nevertheless have turned it on, or whether he would have used a different, non-backdoored app.

The authors’ other examples are much sloppier:

Between October and June, 74 iPhones running the iOS 8 operating system could not be accessed by investigators for the Manhattan district attorney’s office — despite judicial warrants to search the devices. The investigations that were disrupted include the attempted murder of three individuals, the repeated sexual abuse of a child, a continuing sex trafficking ring and numerous assaults and robberies.

[…]

In France, smartphone data was vital to the swift investigation of the Charlie Hebdo terrorist attacks in January, and the deadly attack on a gas facility at Saint-Quentin-Fallavier, near Lyon, in June. And on a daily basis, our agencies rely on evidence lawfully retrieved from smartphones to fight sex crimes, child abuse, cybercrime, robberies or homicides.

We’ve heard that 74 number before. It’s over nine months, in an office that handles about 100,000 cases a year: less than 0.1% of the time. Details about those cases would be useful, so we can determine if encryption was just an impediment to investigation, or resulted in a criminal going free. The government needs to do a better job of presenting empirical data to support its case for backdoors. That they’re unable to do so suggests very strongly that an empirical analysis wouldn’t favor the government’s case.

As to the Charlie Hebdo case, it’s not clear how much of that vital smartphone data was actual data, and how much of it was unable-to-be-encrypted metadata. I am reminded of the examples that then-FBI-Director Louis Freeh would give during the First Crypto Wars in the 1990s. The big one used to illustrate the dangers of encryption was Mafia boss John Gotti. But the surveillance that convicted him was a room bug, not a wiretap. Given that the examples from FBI Director James Comey’s “going dark” speech last year were bogus, skepticism in the face of anecdote seems prudent.

So much of this “going dark” versus the “golden age of surveillance” debate depends on where you start from. Referring to that first Evanston example and the inability to get evidence from the victim’s phones, the op-ed authors write: “Until very recently, this situation would not have occurred.” That’s utter nonsense. From the beginning of time until very recently, this was the only situation that could have occurred. Objects in the vicinity of an event were largely mute about the past. Few things, save for eyewitnesses, could ever reach back in time and produce evidence. Even 15 years ago, the victim’s cell phone would have had no evidence on it that couldn’t have been obtained elsewhere, and that’s if the victim had been carrying a cell phone at all.

For most of human history, surveillance has been expensive. Over the last couple of decades, it has become incredibly cheap and almost ubiquitous. That a few bits and pieces are becoming expensive again isn’t a cause for alarm.

This essay originally appeared on Lawfare.

EDITED TO ADD (8/13): Excellent parody/commentary: “When Curtains Block Justice.”

Posted on August 12, 2015 at 2:18 PM

Stagefright Vulnerability in Android Phones

The Stagefright vulnerability for Android phones is a bad one. It’s exploitable via a text message (the details depend on the particular phone’s auto-download settings), it runs at an elevated privilege (again, the severity depends on the particular phone — on some phones it’s full privilege), and it’s trivial to weaponize. Imagine a worm that infects a phone and then immediately sends a copy of itself to everyone on that phone’s contact list.

The worst part of this is that it’s an Android exploit, so most phones won’t be patched anytime soon — if ever. (The people who discovered the bug alerted Google in April. Google has sent patches to its phone manufacturer partners, but most of them have not sent the patch to Android phone users.)
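
For readers who want to check where a particular handset stands, one rough approach is to read the device's build properties over adb. This sketch assumes the adb tool is installed and USB debugging is enabled; the dated security-patch property only exists on newer Android builds, so an empty value on an old phone is itself informative.

```python
# Rough sketch: query a connected device's build properties over adb to see
# how old its firmware is. Assumes adb is installed and USB debugging is on;
# ro.build.version.security_patch is only reported by newer Android builds.
import subprocess

def getprop(name):
    out = subprocess.run(["adb", "shell", "getprop", name],
                         capture_output=True, text=True, check=True)
    return out.stdout.strip()

if __name__ == "__main__":
    print("Android version:", getprop("ro.build.version.release"))
    print("Build fingerprint:", getprop("ro.build.fingerprint"))
    # Empty on older devices that never shipped a dated security patch level.
    print("Security patch level:", getprop("ro.build.version.security_patch"))
```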

Posted on July 28, 2015 at 6:37 AM

Using Secure Chat

Micah Lee has a good tutorial on installing and using secure chat.

To recap: We have installed Orbot and connected to the Tor network on Android, and we have installed ChatSecure and created an anonymous secret identity Jabber account. We have added a contact to this account, started an encrypted session, and verified that their OTR fingerprint is correct. And now we can start chatting with them with an extraordinarily high degree of privacy.

FBI Director James Comey, UK Prime Minister David Cameron, and totalitarian governments around the world all don’t want you to be able to do this.
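
The fingerprint check in that recap is the step that defeats a man-in-the-middle: each person compares, over some out-of-band channel, a short digest of the public key their client is actually encrypting to. A minimal sketch of the idea, using a generic key fingerprint rather than ChatSecure's exact OTR format:

```python
# Minimal sketch of why fingerprint verification matters: each side hashes
# the public key its client is actually using and the two people compare
# the digests out of band. This is a generic fingerprint, not ChatSecure's
# exact OTR format. Requires the third-party "cryptography" package.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def fingerprint(public_key):
    raw = public_key.public_bytes(encoding=serialization.Encoding.Raw,
                                  format=serialization.PublicFormat.Raw)
    digest = hashes.Hash(hashes.SHA256())
    digest.update(raw)
    # Grouped hex, e.g. "3f a1 09 ...", easy to read aloud or compare in person.
    return " ".join(f"{b:02x}" for b in digest.finalize()[:10])

alice = Ed25519PrivateKey.generate()
print("Compare this with your contact:", fingerprint(alice.public_key()))
# If a man in the middle substituted their own key, the two fingerprints
# would not match and the session should not be trusted.
```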

Posted on July 17, 2015 at 6:35 AM

Samsung Television Spies on Viewers

Earlier this week, we learned that Samsung televisions are eavesdropping on their owners. If you have one of their Internet-connected smart TVs, you can turn on a voice command feature that saves you the trouble of finding the remote, pushing buttons and scrolling through menus. But making that feature work requires the television to listen to everything you say. And what you say isn’t just processed by the television; it may be forwarded over the Internet for remote processing. It’s literally Orwellian.

This discovery surprised people, but it shouldn’t have. The things around us are increasingly computerized, and increasingly connected to the Internet. And most of them are listening.

Our smartphones and computers, of course, listen to us when we’re making audio and video calls. But the microphones are always there, and there are ways a hacker, government, or clever company can turn those microphones on without our knowledge. Sometimes we turn them on ourselves. If we have an iPhone, the voice-processing system Siri listens to us, but only when we push the iPhone’s button. Like Samsung, iPhones with the “Hey Siri” feature enabled listen all the time. So do Android devices with the “OK Google” feature enabled, and so does an Amazon voice-activated system called Echo. Facebook has the ability to turn your smartphone’s microphone on when you’re using the app.

Even if you don’t speak, your computers are paying attention. Gmail “listens” to everything you write, and shows you advertising based on it. It might feel as if you’re never alone. Facebook does the same with everything you write on that platform, and even listens to the things you type but don’t post. Skype doesn’t listen — we think — but as Der Spiegel notes, data from the service “has been accessible to the NSA’s snoops” since 2011.

So the NSA certainly listens. It listens directly, and it listens to all these companies listening to you. So do other countries like Russia and China, which we really don’t want listening so closely to their citizens.

It’s not just the devices that listen; most of this data is transmitted over the Internet. Samsung sends it to what was referred to as a “third party” in its policy statement. It later revealed that third party to be a company you’ve never heard of — Nuance — that turns the voice into text for it. Samsung promises that the data is erased immediately. Most of the other companies that are listening promise no such thing and, in fact, save your data for a long time. Governments, of course, save it, too.

This data is a treasure trove for criminals, as we are learning again and again as tens and hundreds of millions of customer records are repeatedly stolen. Last week, it was reported that hackers had accessed the personal records of some 80 million Anthem Health customers and others. Last year, it was Home Depot, JP Morgan, Sony and many others. Do we think Nuance’s security is better than any of these companies? I sure don’t.

At some level, we’re consenting to all this listening. A single sentence in Samsung’s 1,500-word privacy policy, the one most of us don’t read, stated: “Please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party through your use of Voice Recognition.” Other services could easily come with a similar warning: Be aware that your e-mail provider knows what you’re saying to your colleagues and friends and be aware that your cell phone knows where you sleep and whom you’re sleeping with — assuming that you both have smartphones, that is.

The Internet of Things is full of listeners. Newer cars contain computers that record speed, steering wheel position, pedal pressure, even tire pressure — and insurance companies want to listen. And, of course, your cell phone records your precise location at all times you have it on — and possibly even when you turn it off. If you have a smart thermostat, it records your house’s temperature, humidity, ambient light and any nearby movement. Any fitness tracker you’re wearing records your movements and some vital signs; so do many computerized medical devices. Add security cameras and recorders, drones and other surveillance airplanes, and we’re being watched, tracked, measured and listened to almost all the time.

It’s the age of ubiquitous surveillance, fueled by both Internet companies and governments. And because it’s largely happening in the background, we’re not really aware of it.

This has to change. We need to regulate the listening: both what is being collected and how it’s being used. But that won’t happen until we know the full extent of surveillance: who’s listening and what they’re doing with it. Samsung buried its listening details in its privacy policy — it has since amended the policy to be clearer — and we’re only having this discussion because a Daily Beast reporter stumbled upon it. We need a more explicit conversation about the value of being able to speak freely in our living rooms without our televisions listening, or having e-mail conversations without Google or the government listening. Privacy is a prerequisite for free expression, and losing that would be an enormous blow to our society.

This essay previously appeared on CNN.com.

EDITED TO ADD (2/16): A German translation by Damian Weber.

Posted on February 13, 2015 at 7:01 AM

Surveillance Detection for Android Phones

It’s called SnoopSnitch:

SnoopSnitch is an app for Android devices that analyses your mobile radio traffic to tell if someone is listening in on your phone conversations or tracking your location. Unlike standard antivirus apps, which are designed to combat software intrusions or attempts to steal personal info, SnoopSnitch picks up on things like fake mobile base stations or SS7 exploits. As such, it’s probably ideally suited to evading surveillance from local government agencies.

The app was written by German outfit Security Research Labs, and is available for free on the Play Store. Unfortunately, you’ll need a rooted Android device running a Qualcomm chipset to take advantage.

Download it here.
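
For a flavor of the kind of heuristic such a tool can apply, here is a toy detector, not SnoopSnitch's actual logic (which parses low-level Qualcomm radio diagnostics), that flags cells missing from a known-cell list or negotiating weak encryption; all the data below is made up:

```python
# Toy illustration (not SnoopSnitch's actual detection logic): flag observed
# cells that are absent from a list of known cell IDs or that negotiate weak
# or no encryption, two classic IMSI-catcher warning signs. All data is
# invented for the example.
KNOWN_CELLS = {("310", "260", 1234, 56789), ("310", "260", 1234, 56790)}
WEAK_CIPHERS = {"A5/0", "A5/1", "A5/2"}   # no or weak GSM encryption

def suspicious(observation):
    cell = (observation["mcc"], observation["mnc"],
            observation["lac"], observation["cid"])
    reasons = []
    if cell not in KNOWN_CELLS:
        reasons.append("cell not in known-cell list")
    if observation["cipher"] in WEAK_CIPHERS:
        reasons.append("weak cipher " + observation["cipher"])
    return reasons

obs = {"mcc": "310", "mnc": "260", "lac": 1234, "cid": 99999, "cipher": "A5/0"}
print(suspicious(obs))
```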

Posted on January 14, 2015 at 1:18 PM

WhatsApp Is Now End-to-End Encrypted

WhatsApp is now offering end-to-end message encryption:

WhatsApp will integrate the open-source software TextSecure, created by privacy-focused non-profit Open Whisper Systems, which scrambles messages with a cryptographic key that only the user can access and never leaves his or her device.

I don’t know the details, but the article talks about perfect forward secrecy. Moxie Marlinspike is involved, which gives me some confidence that it’s a robust implementation.
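
For readers unfamiliar with the term, perfect forward secrecy means that session keys come from throwaway key pairs, so later theft of a long-term identity key cannot decrypt recorded traffic. A toy sketch of that property, emphatically not WhatsApp's actual TextSecure ratchet, using ephemeral X25519 keys:

```python
# Toy sketch of why ephemeral Diffie-Hellman gives forward secrecy. This is
# NOT WhatsApp's TextSecure/Signal ratchet; it only shows the core idea:
# session keys come from throwaway key pairs, so later theft of long-term
# identity keys (used only for authentication, not shown here) cannot
# recover old session keys. Requires the third-party "cryptography" package.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def session_key(my_ephemeral, their_ephemeral_public):
    shared = my_ephemeral.exchange(their_ephemeral_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"pfs-demo").derive(shared)

# Each side generates a fresh ephemeral key pair for this session only.
alice_eph = X25519PrivateKey.generate()
bob_eph = X25519PrivateKey.generate()

# Both derive the same symmetric key from the ephemeral exchange.
alice_key = session_key(alice_eph, bob_eph.public_key())
bob_key = session_key(bob_eph, alice_eph.public_key())
assert alice_key == bob_key

nonce = os.urandom(12)
ciphertext = AESGCM(alice_key).encrypt(nonce, b"meet at noon", None)
print(AESGCM(bob_key).decrypt(nonce, ciphertext, None))

# Once the ephemeral private keys are deleted, nothing that persists can
# re-derive the session key, so recorded ciphertexts stay unreadable.
del alice_eph, bob_eph
```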

EDITED TO ADD (11/20): Slashdot thread.

Posted on November 18, 2014 at 12:35 PM
