Entries Tagged "cell phones"


Smart Phone Privacy App

MobileScope looks like a great tool for monitoring and controlling what information third parties get from your smart phone apps:

We built MobileScope as a proof-of-concept tool that automates much of what we were doing manually: monitoring mobile devices for surprising traffic and highlighting potentially privacy-revealing flows.

[…]

Unlike PCs, we have little control over the underlying privacy and security features of our mobile devices. They come pre-installed with locked-down operating systems that often restrict their owners from exercising meaningful control unless they’re willing to void their warranty and jailbreak the device.

Our current plans are to release MobileScope in the coming weeks and allow interested consumers, developers, regulators, and press to see what information their mobile devices can transmit.

Posted on May 11, 2012 at 6:42 AM

Stolen Phone Database

This article talks about a database of stolen cell phone IDs that will be used to deny service. While I think this is a good idea, I don’t know how much it would deter cell phone theft. As long as there are countries that don’t implement blocking based on the IDs in the databases—and surely there will always be—there will be a market for stolen cell phones.

Plus, think of the possibilities for a denial-of-service attack. Can I report your cell phone as stolen and have it turned off? Surely no political party will think of doing that to the phones of all the leaders of a rival party the weekend before a major election.
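The blocking works by checking a handset's device identifier (typically the IMEI, which carries a Luhn check digit) against the shared database. A minimal Python sketch of such a lookup, with a purely hypothetical blocklist; the sample IMEI is the standard documentation example:

```python
def luhn_valid(number: str) -> bool:
    """Validate the Luhn check digit used in 15-digit IMEIs."""
    total = 0
    # Double every second digit from the right, summing digit-wise
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

STOLEN_IMEIS = {"490154203237518"}  # hypothetical blocklist entry

def deny_service(imei: str) -> bool:
    """Refuse service for malformed or blocklisted IMEIs."""
    return not luhn_valid(imei) or imei in STOLEN_IMEIS
```

Note that the denial-of-service worry above lives entirely in how entries get *into* `STOLEN_IMEIS`: the lookup itself cannot distinguish a genuine theft report from a malicious one.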

Posted on April 18, 2012 at 6:49 AM

Lost Smart Phones and Human Nature

Symantec deliberately “lost” a bunch of smart phones with tracking software on them, just to see what would happen:

Some 43 percent of finders clicked on an app labeled “online banking.” And 53 percent clicked on a file named “HR salaries.” A file named “saved passwords” was opened by 57 percent of finders. Social networking tools and personal e-mail were checked by 60 percent. And a folder labeled “private photos” tempted 72 percent.

Collectively, 89 percent of finders clicked on something they probably shouldn’t have.

Meanwhile, only 50 percent of finders offered to return the gadgets, even though the owner’s name was listed clearly within the contacts file.

[…]

Some might consider the 50 percent return rate a victory for humanity, but that wasn’t really the point of Symantec’s project. The firm wanted to see if—even among what seem to be honest people—the urge to peek into someone’s personal data was just too strong to resist. It was.

EDITED TO ADD (4/13): Original study.

Posted on April 4, 2012 at 6:07 AM

Law Enforcement Forensics Tools Against Smart Phones

Turns out the password can be easily bypassed:

XRY works by first jailbreaking the handset. According to Micro Systemation, no ‘backdoors’ created by Apple are used; instead it makes use of security flaws in the operating system the same way that regular jailbreakers do.

Once the iPhone has been jailbroken, the tool then goes on to ‘brute-force’ the passcode, trying every possible four digit combination until the correct password has been found. Given the limited number of possible combinations for a four-digit passcode—10,000, ranging from 0000 to 9999—this doesn’t take long.

Once the handset has been jailbroken and the passcode guessed, all the data on the handset, including call logs, messages, contacts, GPS data and even keystrokes, can be accessed and examined.

One of the morals is to use an eight-digit passcode.
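The arithmetic behind that advice is straightforward: each extra digit multiplies the keyspace by ten. A rough sketch, assuming a hypothetical fixed guessing rate (real tools are limited by the device's unlock throttling, not by the math):

```python
def brute_force_worst_case(num_digits, guesses_per_second=10):
    """Worst-case seconds to exhaust an all-numeric passcode keyspace."""
    keyspace = 10 ** num_digits           # e.g. 10,000 for four digits
    return keyspace / guesses_per_second

four_digit = brute_force_worst_case(4)   # 1,000 s -- under 17 minutes at 10 guesses/sec
eight_digit = brute_force_worst_case(8)  # 10,000,000 s -- roughly 116 days at the same rate
```

Going from four to eight digits turns an over-coffee attack into a months-long one, which is the whole moral.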

EDITED TO ADD (4/13): This has been debunked. The 1Password blog has a fairly lengthy post discussing the details of the XRY tool.

Posted on April 3, 2012 at 2:01 PM

NSA's Secure Android Spec

The NSA has released its specification for a secure Android.

One of the interesting things it’s requiring is that all data be tunneled through a secure VPN:

Inter-relationship to Other Elements of the Secure VoIP System

The phone must be a commercial device that supports the ability to pass data over a commercial cellular network. Standard voice phone calls, with the exception of emergency 911 calls, shall not be allowed. The phone must function on US CDMA & GSM networks and OCONUS on GSM networks with the same functionality.

All data communications to/from the mobile device must go through the VPN tunnel to the VPN gateway in the infrastructure; no other communications in or out of the mobile device are permitted.

Applications on the phone additionally encrypt their communications to servers in infrastructure, or to other phones; all those communications must be tunneled through the VPN.

The more I look at mobile security, the more I think a secure tunnel is essential.

Posted on March 7, 2012 at 1:35 PM

Mobile Malware Is Increasing

According to a report by Juniper, mobile malware is increasing dramatically.

In 2011, we saw unprecedented growth of mobile malware attacks with a 155 percent increase across all platforms. Most noteworthy was the dramatic growth in Android Malware from roughly 400 samples in June to over 13,000 samples by the end of 2011. This amounts to a cumulative increase of 3,325 percent. Notable in these findings is a significant number of malware samples obtained from third-party applications stores, which do not enjoy the benefit or protection from Google’s newly announced Android Market scanning techniques.

We also observed a new level of sophistication of many attacks. Malware writers used new and novel ways to exploit vulnerabilities. Malware seen in 2011, like Droid KungFu, which used encrypted payloads to avoid detection, and Droid Dream, which cleverly disguised itself as a legitimate application, is a sign of things to come.

News story.

I don’t think this is surprising at all. Mobile is the new platform. Mobile is a very intimate platform. It’s where the attackers are going to go.

Posted on February 23, 2012 at 6:27 AM

Improving the Security of Four-Digit PINs on Cell Phones

The author of this article notices that it’s often easy to guess a cell phone PIN because of smudge marks on the screen. Those smudge marks indicate the four PIN digits, so an attacker knows that the PIN is one of 24 possible permutations of those digits.

Then he points out that if your PIN has only three different digits—1231, for example—the PIN can be one of 36 different possibilities.

So it’s more secure, although not much more.
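Both counts are easy to verify by enumeration: with four distinct smudged digits the candidates are the 4! = 24 orderings; with three smudged digits, one of them must appear twice, giving 3 × 4!/2! = 36. A small Python sketch (the digit sets are just examples):

```python
from itertools import product

def candidate_pins(smudged_digits):
    """All 4-digit PINs that use exactly the smudged digits, each at least once."""
    return [p for p in product(smudged_digits, repeat=4)
            if set(p) == set(smudged_digits)]

len(candidate_pins("1234"))  # 24 -- four distinct smudges
len(candidate_pins("123"))   # 36 -- three smudges, one digit repeated
```

So the repeated-digit PIN does force the attacker to try half again as many guesses, but against a 10,000-PIN keyspace, 36 versus 24 is a rounding error.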

Posted on January 6, 2012 at 6:30 AM

Recent Developments in Full Disclosure

Last week, I had a long conversation with Robert Lemos over an article he was writing about full disclosure. He had noticed that companies have recently been reacting more negatively to security researchers publishing vulnerabilities about their products.

The debate over full disclosure is as old as computing, and I’ve written about it before. Disclosing security vulnerabilities is good for security and good for society, but vendors really hate it. It results in bad press, forces them to spend money fixing vulnerabilities, and comes out of nowhere. Over the past decade or so, we’ve had an uneasy truce between security researchers and product vendors. That truce seems to be breaking down.

Lemos believes the problem is that because today’s research targets aren’t traditional computer companies—they’re phone companies, or embedded system companies, or whatnot—they’re not aware of the history of the debate or the truce, and are responding more viscerally. For example, Carrier IQ threatened legal action against the researcher that outed it, and only backed down after the EFF got involved. I am reminded of the reaction of locksmiths to Matt Blaze’s vulnerability disclosures about lock security; they thought he was evil incarnate for publicizing hundred-year-old security vulnerabilities in lock systems. And just last week, I posted about a full-disclosure debate in the virology community.

I think Lemos has put his finger on part of what’s going on, but that there’s more. I think that companies, both computer and non-computer, are trying to retain control over the situation. Apple’s heavy-handed retaliation against researcher Charlie Miller is an example of that. On one hand, Apple should know better than to do this. On the other hand, it’s acting in the best interest of its brand: the fewer researchers looking for vulnerabilities, the fewer vulnerabilities it has to deal with.

It’s easy to believe that if only people wouldn’t disclose problems, we could pretend they didn’t exist, and everything would be better. Certainly this is the position taken by the DHS over terrorism: public information about the problem is worse than the problem itself. It’s similar to Americans’ willingness to give both Bush and Obama the power to arrest and indefinitely detain any American without any trial whatsoever. It largely explains the common public backlash against whistle-blowers. What we don’t know can’t hurt us, and what we do know will also be known by those who want to hurt us.

There’s some profound psychological denial going on here, and I’m not sure of the implications of it all. It’s worth paying attention to, though. Security requires transparency and disclosure, and if we willingly give that up, we’re a lot less safe as a society.

Posted on December 6, 2011 at 7:31 AM

Carrier IQ Spyware

Spyware on many smart phones monitors your every action, including collecting individual keystrokes. The company that makes and runs this software on behalf of different carriers, Carrier IQ, freaked when a security researcher outed them. It initially claimed it didn’t monitor keystrokes—an easily refuted lie—and threatened to sue the researcher. It took EFF getting involved to get the company to back down. (A good summary of the details is here. This is pretty good, too.)

Carrier IQ is reacting really badly here. Threatening the researcher was a panic reaction, but I think it’s still clinging to the notion that it can keep the details of what it does secret, or hide behind statements such as:

Our customers select which metrics they need to gather based on their business need—such as network planning, customer care, device performance—within the bounds of the agreement they form with their end users.

Or hair-splitting denials it’s been giving to the press.

In response to some questions from PCMag, a Carrier IQ spokeswoman said “we count and summarize performance; we do not record keystrokes, capture screen shots, SMS, email, or record conversations.”

“Our software does not collect the content of messages,” she said.

How then does Carrier IQ explain the video posted by Trevor Eckhart, which showed an Android-based phone running Carrier IQ in the background and grabbing data like encrypted Google searches?

“While ‘security researchers’ have identified that we examine many aspects of a device, our software does not store or transmit what consumers view on their screen or type,” the spokeswoman said. “Just because every application on your phone reads the keyboard does not make every application a key-logging application. Our software measures specific performance metrics that help operators improve the customer experience.”

The spokeswoman said Carrier IQ would record the fact that a text message was sent correctly, for example, but the company “cannot record what the content of the SMS was.” Similarly, Carrier IQ records where you were when a call dropped, but cannot record the conversation, and can determine which applications drain battery life but cannot capture screen shots, she said.

Several things matter here: 1) what data the Carrier IQ app collects on the handset, 2) what data the Carrier IQ app routinely transmits to the carriers, and 3) what data the Carrier IQ app can transmit to the carrier if asked. Can the carrier enable the logging of everything in response to a request from the FBI? We have no idea.

Expect this story to unfold considerably in the coming weeks. Everyone is pointing fingers of blame at everyone else, and Sen. Franken has asked the various companies involved for details.

One more detail is worth mentioning. Apple announced it no longer uses Carrier IQ in iOS 5. I’m sure this means that they have their own surveillance software running, not that they’re no longer conducting surveillance on their users.

EDITED TO ADD (12/14): This is an excellent round-up of everything known about Carrier IQ.

Posted on December 5, 2011 at 6:05 AM

