Blog: July 2011 Archives
Also, I’m going to try something new. Let’s use this weekly squid post to talk about the security stories in the news that I didn’t cover. I’ll be doing this every Friday, so please save any stories you want to post about for squid threads.
Security researcher Charlie Miller, widely known for his work on Mac OS X and Apple’s iOS, has discovered an interesting method that enables him to completely disable the batteries on Apple laptops, making them permanently unusable, and to perform a number of other unintended actions. The method, which involves accessing and sending instructions to the chip housed on smart batteries, could also be used for more malicious purposes down the road.
What he found is that the batteries are shipped from the factory in a state called “sealed mode” and that there’s a four-byte password that’s required to change that. By analyzing a couple of updates that Apple had sent to fix problems in the batteries in the past, Miller found that password and was able to put the battery into “unsealed mode.”
From there, he could make a few small changes to the firmware, but not what he really wanted. So he poked around a bit more and found that a second password was required to move the battery into full access mode, which gave him the ability to make any changes he wished. That password is a default set at the factory and it’s not changed on laptops before they’re shipped. Once he had that, Miller found he could do a lot of interesting things with the battery.
“That lets you access it at the same level as the factory can,” he said. “You can read all the firmware, make changes to the code, do whatever you want. And those code changes will survive a reinstall of the OS, so you could imagine writing malware that could hide on the chip on the battery. You’d need a vulnerability in the OS or something that the battery could then attack, though.”
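The sealed/unsealed/full-access progression Miller describes can be sketched as a simple state machine. This is a toy model only: the two key values below are hypothetical placeholders, not the actual passwords Miller recovered from Apple's battery updates.

```python
# Toy model of the smart-battery access levels described above.
# The key values are illustrative stand-ins, not real passwords.

SEALED, UNSEALED, FULL_ACCESS = "sealed", "unsealed", "full_access"

class SmartBattery:
    _UNSEAL_KEY = bytes.fromhex("deadbeef")       # 4-byte unseal password (hypothetical)
    _FULL_ACCESS_KEY = bytes.fromhex("cafef00d")  # factory default, never changed (hypothetical)

    def __init__(self):
        # Batteries ship from the factory in "sealed mode."
        self.state = SEALED

    def unseal(self, key: bytes) -> bool:
        """Move from sealed to unsealed mode, given the 4-byte password."""
        if self.state == SEALED and key == self._UNSEAL_KEY:
            self.state = UNSEALED
            return True
        return False

    def grant_full_access(self, key: bytes) -> bool:
        """A second, factory-default password unlocks full access mode."""
        if self.state == UNSEALED and key == self._FULL_ACCESS_KEY:
            self.state = FULL_ACCESS
            return True
        return False

    def write_firmware(self, blob: bytes) -> bool:
        """Arbitrary firmware changes are only possible with full access."""
        return self.state == FULL_ACCESS
```

The point of the two-tier design is defense in depth; Miller's finding is that the second tier is defeated because the full-access password is a never-changed factory default.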
As components get smarter, they also get more vulnerable.
ShareMeNot is a Firefox add-on for preventing tracking from third-party buttons (like the Facebook “Like” button or the Google “+1” button) until the user actually chooses to interact with them. That is, ShareMeNot doesn’t disable/remove these buttons completely. Rather, it allows them to render on the page, but prevents the cookies from being sent until the user actually clicks on them, at which point ShareMeNot releases the cookies and the user gets the desired behavior (i.e., they can Like or +1 the page).
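That click-to-release behavior amounts to a simple filter on outgoing third-party requests: strip the cookies until the user clicks the button. A minimal sketch of the decision logic in Python (the tracker list, host names, and function are illustrative, not ShareMeNot's actual code):

```python
# Illustrative sketch of click-to-release cookie blocking for
# third-party social buttons. Tracker hosts here are examples only.

def filter_cookie_header(headers, request_host, page_host, clicked_hosts,
                         tracker_hosts=frozenset({"facebook.com", "plus.google.com"})):
    """Return headers with the Cookie header removed when the request
    goes to a third-party tracker the user has not clicked on."""
    is_third_party = request_host != page_host
    if (is_third_party
            and request_host in tracker_hosts
            and request_host not in clicked_hosts):
        return {k: v for k, v in headers.items() if k.lower() != "cookie"}
    # First-party requests, non-trackers, and clicked buttons pass through.
    return headers
```

Once the user clicks a button, its host is added to `clicked_hosts` and subsequent requests carry cookies normally, so the Like or +1 works as intended.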
Companies would be better off if they all provided meaningful privacy protections for consumers, but privacy is a collective action problem for them: many companies would love to see the ecosystem fixed, but no one wants to put themselves at a competitive disadvantage by imposing unilateral limitations on what they can do with user data.
The solution — and one endorsed by the essay — is a comprehensive privacy law. That reduces the incentive to defect.
Matt Blaze analyzes the 2010 U.S. Wiretap Report.
In 2000, government policy finally reversed course, acknowledging that encryption needed to become a critical part of security in modern networks, something that deserved to be encouraged, even if it might occasionally cause some trouble for law enforcement wiretappers. And since that time the transparent use of cryptography by everyday people (and criminals) has, in fact, exploded. Crypto software and algorithms, once categorized for arms control purposes as a “munition” alongside rocket launchers and nuclear triggers, can now be openly discussed, improved and incorporated into products and services without the end user even knowing that it’s there. Virtually every cellular telephone call is today encrypted and effectively impervious to unauthorized over-the-air eavesdropping. Web transactions, for everything from commerce to social networking, are now routinely encrypted end-to-end. (A few applications, particularly email and wireline telephony, remain stubbornly unencrypted, but they are increasingly the exception rather than the rule.)
So, with this increasing proliferation of eavesdrop-thwarting encryption built in to our infrastructure, we might expect law enforcement wiretap rooms to have become quiet, lonely places.
But not so fast: the latest wiretap report identifies a total of just six (out of 3194) cases in which encryption was encountered, and that prevented recovery of evidence a grand total of … (drumroll) … zero times. Not once. Previous wiretap reports have indicated similarly minuscule numbers.
I second Matt’s recommendation of Susan Landau’s book: Surveillance or Security: The Risks Posed by New Wiretapping Technologies (MIT Press, 2011). It’s an excellent discussion of the security and politics of wiretapping.
Halderman argued that secure software tends to come from companies that have a culture of taking security seriously. But it’s hard to mandate, or even to measure, “security consciousness” from outside a company. A regulatory agency can force a company to go through the motions of beefing up its security, but it’s not likely to be effective unless management’s heart is in it.
This is a key advantage of using liability as the centerpiece of security policy. By making companies financially responsible for the actual harms caused by security failures, lawsuits give management a strong motivation to take security seriously without requiring the government to directly measure and penalize security problems. Sony allegedly laid off security personnel ahead of this year’s attacks. Presumably it thought this would be a cost-saving move; a big class action lawsuit could ensure that other companies don’t repeat that mistake in the future.
The access control provided by a physical lock is based on the assumption that the information content of the corresponding key is private — that duplication should require either possession of the key or a priori knowledge of how it was cut. However, the ever-increasing capabilities and prevalence of digital imaging technologies present a fundamental challenge to this privacy assumption. Using modest imaging equipment and standard computer vision algorithms, we demonstrate the effectiveness of physical key teleduplication — extracting a key’s complete and precise bitting code at a distance via optical decoding and then cutting precise duplicates. We describe our prototype system, Sneakey, and evaluate its effectiveness, in both laboratory and real-world settings, using the most popular residential key types in the U.S.
The design of common keys actually makes this process easier. There are only ten possible positions for each pin, any single key uses only half of those positions, and the positions of adjacent pins are deliberately set far apart.
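Those design constraints translate into a much smaller search space for an optical decoder. A rough, illustrative calculation (pin counts and depth conventions vary by manufacturer, so the numbers are examples, not a spec):

```python
def keyspace(pins: int, depths: int) -> int:
    """Number of possible bitting codes for a key with `pins` cut
    positions, each of which can take one of `depths` depths."""
    return depths ** pins

# A typical 5-pin residential key with 10 possible depths per pin:
full = keyspace(pins=5, depths=10)    # 100,000 theoretical bittings

# If, as noted above, any single key effectively uses only half of
# those depths, the space the decoder must resolve shrinks sharply:
halved = keyspace(pins=5, depths=5)   # 3,125
```

Rules about the spacing of adjacent cuts prune the space further still, which is part of what makes decoding a key's bitting from a photograph taken at a distance practical.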
EDITED TO ADD (7/26): I seem to have written about this in 2009. Apologies.
No indication about how well it works:
The smartphone-based scanner, named Mobile Offender Recognition and Information System, or MORIS, is made by BI2 Technologies in Plymouth, Massachusetts, and can be deployed by officers out on the beat or back at the station.
An iris scan, which detects unique patterns in a person’s eyes, can reduce to seconds the time it takes to identify a suspect in custody. The technique is also significantly more accurate than the fingerprinting technology long in use by police, BI2 says.
When attached to an iPhone, MORIS can photograph a person’s face and run the image through software that hunts for a match in a BI2-managed database of U.S. criminal records. Each unit costs about $3,000.
Roughly 40 law enforcement units nationwide will soon be using the MORIS, including Arizona’s Pinal County Sheriff’s Office, as well as officers in Hampton City in Virginia and Calhoun County in Alabama.
Sometimes too much security isn’t good.
After observing children on playgrounds in Norway, England and Australia, Dr. Sandseter identified six categories of risky play: exploring heights, experiencing high speed, handling dangerous tools, being near dangerous elements (like water or fire), rough-and-tumble play (like wrestling), and wandering alone away from adult supervision. The most common is climbing heights.
“Climbing equipment needs to be high enough, or else it will be too boring in the long run,” Dr. Sandseter said. “Children approach thrills and risks in a progressive manner, and very few children would try to climb to the highest point for the first time they climb. The best thing is to let children encounter these challenges from an early age, and they will then progressively learn to master them through their play over the years.”
By gradually exposing themselves to more and more dangers on the playground, children are using the same habituation techniques developed by therapists to help adults conquer phobias, according to Dr. Sandseter and a fellow psychologist, Leif Kennair, of the Norwegian University of Science and Technology.
“Risky play mirrors effective cognitive behavioral therapy of anxiety,” they write in the journal Evolutionary Psychology, concluding that this “anti-phobic effect” helps explain the evolution of children’s fondness for thrill-seeking. While a youthful zest for exploring heights might not seem adaptive — why would natural selection favor children who risk death before they have a chance to reproduce? — the dangers seemed to be outweighed by the benefits of conquering fear and developing a sense of mastery.