Entries Tagged "Google"


New Attacks on CAPTCHAs

Nice research:

Abstract: We report a novel attack on two CAPTCHAs that have been widely deployed on the Internet, one being Google’s home design and the other acquired by Google (i.e., reCAPTCHA). With a minor change, our attack program also works well on the latest reCAPTCHA version, which uses a new defence mechanism that was unknown to us when we designed our attack. This suggests that our attack works at a fundamental level. Our attack appears to be applicable to a whole family of text CAPTCHAs that build on top of the popular segmentation-resistant mechanism of “crowding characters together” for security. Next, we propose a novel framework that guides the application of our well-tested security engineering methodology for evaluating CAPTCHA robustness, and we propose a new general principle for CAPTCHA design.
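To see why “crowding characters together” resists segmentation, consider the naive attack it defeats: split the image wherever the vertical projection profile (ink pixels per column) drops to zero. The sketch below is an illustration of that baseline technique, not the paper’s actual attack; the function name and thresholds are invented for the example.

```python
def segment_by_projection(columns, gap_thresh=0):
    """columns: list of per-column ink-pixel counts (the vertical
    projection profile of a binarized CAPTCHA image). Returns
    (start, end) column ranges for each candidate character --
    the naive segmentation that crowding characters together defeats."""
    in_char, start, slices = False, 0, []
    for x, ink in enumerate(columns):
        if ink > gap_thresh and not in_char:
            in_char, start = True, x           # a character begins
        elif ink <= gap_thresh and in_char:
            in_char = False
            slices.append((start, x))          # a character ends at a gap
    if in_char:
        slices.append((start, len(columns)))
    return slices

# Well-spaced characters leave zero-ink gaps between them...
spaced = [3, 5, 0, 4, 6, 0, 5, 5]
# ...while crowded characters merge into one unsegmentable blob.
crowded = [3, 5, 2, 4, 6, 1, 5, 5]
```

On `spaced`, the profile yields three clean character slices; on `crowded`, it yields a single blob, which is exactly the failure mode the defence aims for, and which the paper’s attack apparently overcomes.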

Posted on October 12, 2011 at 6:57 AM

Three Emerging Cyber Threats

On Monday, I participated in a panel at the Information Systems Forum in Berlin. The moderator asked us what the top three emerging threats were in cyberspace. I went last, and decided to focus on the top three threats that are not criminal:

  1. The Rise of Big Data. By this I mean industries that trade on our data. These include traditional credit bureaus and data brokers, but also data-collection companies like Facebook and Google. They’re collecting more and more data about everyone, often without their knowledge or explicit consent, and selling it far and wide: both to other corporations and to governments. Big data is becoming a powerful industry, resisting any calls to regulate its behavior.
  2. Ill-Conceived Regulations from Law Enforcement. We’re seeing increasing calls to regulate cyberspace in the mistaken belief that this will fight crime. I’m thinking about data retention laws, Internet kill switches, and calls to eliminate anonymity. None of these will work, and they’ll all make us less safe.
  3. The Cyberwar Arms Race. I’m not worried about cyberwar, but I am worried about the proliferation of cyber weapons. Arms races are fundamentally destabilizing, especially when their development can be so easily hidden. I worry about cyberweapons being triggered by accident, cyberweapons getting into the wrong hands and being triggered on purpose, and the inability to reliably trace a cyberweapon leading to increased distrust. Plus, arms races are expensive.

That’s my list, and they all have the potential to be more dangerous than cybercriminals.

Posted on September 23, 2011 at 6:53 AM

Forged Google Certificate

There’s been a forged Google certificate out in the wild for the past month and a half. Whoever has it—evidence points to the Iranian government—can, if they’re in the right place, launch man-in-the-middle attacks against Gmail users and read their mail. This isn’t Google’s mistake; the certificate was issued by a Dutch CA that has nothing to do with Google.

This attack illustrates one of the many security problems with SSL: there are too many single points of trust.
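One mitigation is certificate pinning: instead of trusting whatever leaf certificate any of the hundreds of trusted CAs will sign, the client records the expected certificate’s fingerprint out of band and rejects anything else. (This is essentially how Chrome, which ships pins for Google domains, caught this forgery.) A minimal sketch in Python, assuming the pin is distributed with the client; the function names are illustrative:

```python
import hashlib
import socket
import ssl

def matches_pin(der_cert, pinned_hex):
    """Compare a DER-encoded certificate against a stored SHA-256 pin."""
    return hashlib.sha256(der_cert).hexdigest() == pinned_hex

def check_host(host, pinned_hex, port=443):
    """Fetch the leaf certificate the server actually presents and
    test it against the pin. A forged certificate fails this check
    even when a trusted CA signed it, because the pin sidesteps the
    CA trust model entirely."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return matches_pin(tls.getpeercert(binary_form=True), pinned_hex)
```

The trade-off is operational: the pin must be updated whenever the site legitimately rotates its certificate, which is why pinning is typically applied to public keys rather than full certificates in practice.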

EDITED TO ADD (9/1): It seems that 200 forged certificates were generated, not just for Google.

EDITED TO ADD (9/14): More news.

Posted on September 1, 2011 at 5:46 AM

Smartphone Keystroke Logging Using the Motion Sensor

Clever:

“When the user types on the soft keyboard on her smartphone (especially when she holds her phone by hand rather than placing it on a fixed surface), the phone vibrates. We discover that keystroke vibrations on touch screens are highly correlated to the keys being typed.”

Applications like TouchLogger could be significant because they bypass protections, built into both Android and Apple’s competing iOS, that prevent a program from reading keystrokes unless it is active and has keyboard focus. TouchLogger was designed to work on an HTC Evo 4G smartphone, where it achieved an accuracy rate of more than 70 percent on input typed into the device’s number-only soft keyboard. The app works by using the phone’s accelerometer to gauge the motion of the device each time a soft key is pressed.
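The core idea is a standard supervised-learning loop: summarize the accelerometer samples around each key-press as a feature vector, learn one prototype per key from labeled typing, then classify new vibrations by nearest prototype. The sketch below is a minimal stand-in for that pipeline, not TouchLogger’s actual features or classifier; all names are invented for illustration.

```python
import math

def features(window):
    """Collapse one accelerometer window (a list of (x, y, z) samples
    captured around a key-press vibration) into a small feature
    vector: mean and peak magnitude per axis."""
    n = len(window)
    feats = []
    for axis in range(3):
        vals = [abs(sample[axis]) for sample in window]
        feats += [sum(vals) / n, max(vals)]
    return feats

def train_centroids(labeled_windows):
    """labeled_windows: list of (key, window) pairs from labeled typing.
    Average the feature vectors per key to get one prototype each."""
    sums, counts = {}, {}
    for key, win in labeled_windows:
        f = features(win)
        acc = sums.setdefault(key, [0.0] * len(f))
        for i, v in enumerate(f):
            acc[i] += v
        counts[key] = counts.get(key, 0) + 1
    return {k: [v / counts[k] for v in acc] for k, acc in sums.items()}

def guess_key(window, centroids):
    """Classify a new vibration window: nearest prototype wins."""
    f = features(window)
    return min(centroids, key=lambda k: math.dist(f, centroids[k]))
```

Even this crude nearest-centroid scheme separates keys whose presses tilt the phone in measurably different ways, which is all a number-pad attack needs.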

Paper here. More articles.

Posted on August 23, 2011 at 2:09 PM

Pseudonymity

Long essay on the value of pseudonymity. From the conclusions:

Here lies the huge irony in this discussion. Persistent pseudonyms aren’t ways to hide who you are. They provide a way to be who you are. You can finally talk about what you really believe; your real politics, your real problems, your real sexuality, your real family, your real self. Much of the support for “real names” comes from people who don’t want to hear about controversy, but controversy is only a small part of the need for pseudonyms. For most of us, it’s simply the desire to be able to talk openly about the things that matter to every one of us who uses the Internet. The desire to be judged—not by our birth, not by our sex, and not by who we work for—but by what we say.

[…]

I leave you with this question. What if I had posted this under my pseudonym? Why should that have made a difference? I would have written the same words, but ironically, I could have added some more personal and perhaps persuasive arguments which I dare not make under this account. Because I was forced to post this under my real name, I had to weaken my arguments; I had to share less of myself. Have you ever met “Kee Hinckley”? Have you met me under my other name? Does it matter? There is nothing real on the Internet; all you know about me is my words. You can look me up on Google, and still all you will know is my words. One real person wrote this post. It could have been submitted under either name. But one of them is not allowed to. Does that really make sense?

Behind every pseudonym is a real person. Deny the pseudonym and you deny the person.

This is, of course, a response to the Google+ names policy.

Posted on August 22, 2011 at 6:01 AM

Google Detects Malware in its Search Data

This is interesting:

As we work to protect our users and their information, we sometimes discover unusual patterns of activity. Recently, we found some unusual search traffic while performing routine maintenance on one of our data centers. After collaborating with security engineers at several companies that were sending this modified traffic, we determined that the computers exhibiting this behavior were infected with a particular strain of malicious software, or “malware.” As a result of this discovery, today some people will see a prominent notification at the top of their Google web search results….

There’s a lot that Google sees as a result of its unique and prominent position on the Internet. Some of it is going to be stuff they never considered. And while they use a lot of it to make money, it’s good of them to give this one back to the Internet users.

Posted on July 20, 2011 at 6:23 AM

Protecting Private Information on Smart Phones

AppFence is a technology—with a working prototype—that protects personal information on smart phones. It does this by either substituting innocuous information in place of sensitive information or blocking attempts by the application to send the sensitive information over the network.

The significance of systems like AppFence is that they have the potential to change the balance of power in privacy between mobile application developers and users. Today, application developers get to choose what information an application will have access to, and the user faces a take-it-or-leave-it proposition: users must either grant all the permissions requested by the application developer or abandon installation. Take-it-or-leave it offers may make it easier for applications to obtain access to information that users don’t want applications to have. Many applications take advantage of this to gain access to users’ device identifiers and location for behavioral tracking and advertising. Systems like AppFence could make it harder for applications to access these types of information without more explicit consent and cooperation from users.
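The two interception strategies the paragraph describes, substituting innocuous data versus blocking the access outright, can be pictured as a policy-driven proxy between the app and the real data source. This is a hypothetical sketch of that idea, not AppFence’s actual API; the class, policy table, and shadow values are all invented for illustration.

```python
# Innocuous stand-ins returned instead of the real values ("shadowing").
SHADOW_VALUES = {
    "device_id": "0000000000000000",   # fake identifier instead of the IMEI
    "location":  (0.0, 0.0),           # Null Island instead of real GPS
    "contacts":  [],                   # empty address book
}

class PrivacyProxy:
    """Sits between an app and the real data source. Per-resource
    policy: 'allow' passes real data through, 'shadow' substitutes an
    innocuous value, 'block' raises as if the data were unavailable."""
    def __init__(self, real_source, policy):
        self.real = real_source
        self.policy = policy

    def get(self, resource):
        action = self.policy.get(resource, "shadow")  # private by default
        if action == "allow":
            return self.real[resource]
        if action == "shadow":
            return SHADOW_VALUES[resource]
        raise PermissionError(resource)               # action == "block"
```

The appeal of shadowing over blocking is that many apps keep working: an ad library that receives a fake device identifier usually cannot tell the difference, whereas a hard failure can crash the app and push the user back to take-it-or-leave-it.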

The problem is that the mobile OS providers might not like AppFence. Google probably doesn’t care, but Apple is one of the biggest consumers of iPhone personal information. Right now, the prototype only works on Android, because it requires flashing the phone. In theory, the technology can be made to work on any mobile OS, but good luck getting Apple to agree to it.

Posted on June 24, 2011 at 6:37 AM

Ebook Fraud

Interesting post—and discussion—on Making Light about ebook fraud. Currently there are two types of fraud. The first is content farming, discussed in these two interesting blog posts. People are creating automatically generated content, web-collected content, or fake content, turning it into a book, and selling it on an ebook site like Amazon.com. Then they use multiple identities to give it good reviews. (If it gets a bad review, the scammer just relists the same content under a new name.) That second blog post contains a screen shot of something called “Autopilot Kindle Cash,” which promises to teach people how to post dozens of ebooks to Amazon.com per day.

The second type of fraud is stealing a book and selling it as an ebook. So someone could scan a real book and sell it on an ebook site, even though he doesn’t own the copyright. It could be a book that isn’t already available as an ebook, or it could be a “low cost” version of a book that is already available. Amazon doesn’t seem particularly motivated to deal with this sort of fraud. And it too is suitable for automation.

Broadly speaking, there’s nothing new here. All complex ecosystems have parasites, and every open communications system we’ve ever built gets overrun by scammers and spammers. Far from making editors superfluous, systems that democratize publishing have an even greater need for editors. The solutions are not new, either: reputation-based systems, trusted recommenders, white lists, takedown notices. Google has implemented a bunch of security countermeasures against content farming; ebook sellers should implement them as well. It’ll be interesting to see what particular sort of mix works in this case.

Posted on April 4, 2011 at 9:18 AM

