Entries Tagged "Apple"
While there is an enormous variety of operating systems to choose from, only four “core” lineages exist in the mainstream — Windows, OS X, Linux and UNIX. Each system carries its own baggage of vulnerabilities, ranging from local exploits and user-introduced weaknesses to remotely available attack vectors.
As far as “straight-out-of-box” conditions go, both Microsoft’s Windows and Apple’s OS X are rife with remotely accessible vulnerabilities. Even before enabling the servers, Windows-based machines contain numerous exploitable holes that allow attackers not only to access the system but also to execute arbitrary code. Both OS X and Windows become susceptible to additional vulnerabilities once the built-in services are enabled. Once patched, however, both companies support a product that is secure, at least from the outside. The UNIX and Linux variants present a much more robust exterior. Even when the pre-configured server binaries are enabled, each system generally maintains its integrity against remote attacks. Compared with the Microsoft and Apple products, however, UNIX and Linux systems tend to have a steeper learning curve for acceptance as desktop platforms.
When it comes to business, most systems have the benefit of trained administrators and IT departments to properly patch and configure the operating systems and their corresponding services. Things are different with home computers. The esoteric nature of UNIX and Linux systems tends to produce home users with an increased understanding of security concerns. An already “hardened” operating system therefore has the benefit of a knowledgeable user base. The more consumer-oriented operating systems made by Microsoft and Apple are each hardened in their own right, but as soon as users begin to arbitrarily enable remote services or fiddle with the default configurations, the systems quickly become open to intrusion. Without diligence in applying the appropriate patches or enabling automatic updates, owners of Windows and OS X systems are the most susceptible to quick and thorough remote violations by hackers.
Good essay on “faux disclosure”: disclosing a vulnerability without really disclosing it.
You’ve probably heard of full disclosure, the security philosophy that calls for making public all details of vulnerabilities. It has been the subject of debates among researchers, vendors, and security firms. But the story that grabbed most of the headlines at the Black Hat Briefings in Las Vegas last week was based on a different type of disclosure. For lack of a better name, I’ll call it faux disclosure. Here’s why.
Security researchers Dave Maynor of ISS and Johnny Cache — a.k.a. Jon Ellch — demonstrated an exploit that allowed them to install a rootkit on an Apple laptop in less than a minute. Well, sort of; they showed a video of it, and also noted that they’d used a third-party Wi-Fi card in the demo of the exploit, rather than the MacBook’s internal Wi-Fi card. But they said that the exploit would work whether the third-party card — which they declined to identify — was inserted in a Mac, Windows, or Linux laptop.
How is that for murky and non-transparent? The whole world is at risk — if the exploit is real — whenever the unidentified card is used. But they won’t say which card, although many sources presume the card is based on the Atheros chipset, which Apple employs.
It gets worse. Brian Krebs of the Washington Post, who first reported on the exploit, has updated his original story to report that, according to Maynor, “Apple had leaned on Maynor and Ellch pretty hard not to make this an issue about the Mac drivers — mainly because Apple had not fixed the problem yet.”
That’s part of what is meant by responsible disclosure these days — giving the vendor a chance to fix the vulnerability before letting the whole world know about it. That way, the thinking goes, the only people who get hurt by it are the people who get exploited by it. But damage to the responsible vendor’s image is mitigated somewhat, and many in the security business seem to think that damage control is more important than anything that might happen to any of the vendor’s customers.
Big deal. Publicly traded corporations like Apple and Microsoft and all the rest have been known to ignore ethics, morality, any consideration of right or wrong, or anything at all that might divert them from their ultimate goal: to maximize profits. Because of this, some corporations only speak the truth when it is in their best interest. Otherwise, they lie or maintain silence.
Full disclosure is the only thing that forces vendors to fix security problems. The further we move away from full disclosure, the less incentive vendors have to fix problems and the more at-risk we all are.
What happens if you distribute 50 million small, valuable, and easily sellable objects into the hands of men, women, and children all over the world, and tell them to walk around the streets with them? Why, people steal them, of course.
“Rise in crime blamed on iPods”, yells the front page of London’s Metro. “Muggers targeting iPod users”, says ITV. This is the reaction to the government’s revelation that robberies across the UK have risen by 8 per cent in the last year, from 90,747 to 98,204. The Home Secretary, John Reid, attributes this to the irresistible lure of “young people carrying expensive goods, such as mobile phones and MP3 players”. A separate British Crime Survey, however, suggests robbery has risen by 22 per cent, to 311,000.
This shouldn’t come as a surprise, just as it wasn’t a surprise in the 1990s when there was a wave of high-priced sneaker thefts, or that there’s now a wave of laptop thefts as well.
What to do about it? Basically, there’s not much you can do except be careful. Muggings have long been a low-risk crime, so it makes sense that we’re seeing an increase in them as the value of what people are carrying on their person goes up. And people carrying portable music players have an unmistakable indicator: those ubiquitous ear buds.
The economics of this crime are such that it will continue until one of three things happens. One, portable music players become much less valuable. Two, the costs of the crime become much higher. Three, society deals with its underclass and gives them a better career option than iPod thief.
And on a related topic, here’s a great essay by Cory Doctorow on how Apple’s iTunes copy protection screws the music industry.
EDITED TO ADD (8/5): Eric Rescorla comments.
I’ve previously written about the risks of small portable computing devices; how more and more data can be stored on them, and then lost or stolen. But there’s another risk: if an attacker can convince you to plug his USB device into your computer, he can take it over.
Plug an iPod or USB stick into a PC running Windows and the device can literally take over the machine: search for confidential documents, copy them to the device’s internal storage, and hide them as “deleted” files. Alternatively, the device can simply plant spyware, or even compromise the operating system. Two features that make this possible are the Windows AutoRun facility and the ability of peripherals to use something called direct memory access (DMA). The first attack vector you can and should plug; the second is the result of a design flaw that’s likely to be with us for many years to come.
The article has the details, but basically you can configure a file on your USB device to automatically run when it’s plugged into a computer. That file can, of course, do anything you want it to.
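The mechanism is a plain text file named `autorun.inf` in the root of the device. A minimal sketch of what such a file looks like — the program and icon filenames here are hypothetical, chosen only for illustration:

```ini
[autorun]
; Program to launch when the device is inserted (or offered as the
; default AutoPlay action). "payload.exe" is a made-up name.
open=payload.exe
; Cosmetic entries that make the prompt look innocuous.
icon=drive.ico
action=Open folder to view files
```

Because the `action` and `icon` entries control what the user sees, a malicious drive can present itself as an ordinary storage device while `open` points at whatever program the attacker chose.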
Recently I’ve been seeing more and more written about this attack. The Spring 2006 issue of 2600 Magazine, for example, contains a short article called “iPod Sneakiness” (unfortunately, not on line). The author suggests that you can innocently ask someone at an Internet cafe if you can plug your iPod into his computer to power it up — and then steal his passwords and critical files.
And here’s an article about someone who used this trick in a penetration test:
We figured we would try something different by baiting the same employees that were on high alert. We gathered all the worthless vendor giveaway thumb drives collected over the years and imprinted them with our own special piece of software. I had one of my guys write a Trojan that, when run, would collect passwords, logins and machine-specific information from the user’s computer, and then email the findings back to us.
The next hurdle we had was getting the USB drives in the hands of the credit union’s internal users. I made my way to the credit union at about 6 a.m. to make sure no employees saw us. I then proceeded to scatter the drives in the parking lot, smoking areas, and other areas employees frequented.
Once I seeded the USB drives, I decided to grab some coffee and watch the employees show up for work. Surveillance of the facility was worth the time involved. It was really amusing to watch the reaction of the employees who found a USB drive. You know they plugged them into their computers the minute they got to their desks.
I immediately called my guy that wrote the Trojan and asked if anything was received at his end. Slowly but surely info was being mailed back to him. I would have loved to be on the inside of the building watching as people started plugging the USB drives in, scouring through the planted image files, then unknowingly running our piece of software.
There is a defense. From the first article:
AutoRun is just a bad idea. People putting CD-ROMs or USB drives into their computers usually want to see what’s on the media, not have programs automatically run. Fortunately you can turn AutoRun off. A simple manual approach is to hold down the “Shift” key when a disk or USB storage device is inserted into the computer. A better way is to disable the feature entirely by editing the Windows Registry. There are many instructions for doing this online (just search for “disable autorun”) or you can download and use Microsoft’s TweakUI program, which is part of the Windows XP PowerToys download. With Windows XP you can also disable AutoRun for CDs by right-clicking on the CD drive icon in the Windows explorer, choosing the AutoPlay tab, and then selecting “Take no action” for each kind of disk that’s listed. Unfortunately, disabling AutoPlay for CDs won’t always disable AutoPlay for USB devices, so the registry hack is the safest course of action.
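For reference, the registry hack mentioned above amounts to setting one value. A sketch of a `.reg` file that disables AutoRun for every drive type (the widely documented `NoDriveTypeAutoRun` bitmask, with `0xFF` covering all drive classes):

```ini
Windows Registry Editor Version 5.00

; Disable AutoRun on all drive types for the current user.
; Use HKEY_LOCAL_MACHINE instead to apply the setting machine-wide.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoDriveTypeAutoRun"=dword:000000ff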
In the 1990s, the Macintosh operating system had this feature, which was removed after a virus made use of it in 1998. Microsoft needs to remove this feature as well.
EDITED TO ADD (6/12): In the penetration test, they didn’t use AutoRun.
Forget RFID. Well, don’t, but National Scientific Corporation has a prototype of a WiFi tagging system that, like RFID, lets you track things in real time and space. The advantage that the WiFi Tracker system has over passive RFID tracking is that you can keep tabs on objects with WiFi Tracker tags (which can hold up to 256K of data) from as far as a few hundred meters away (the range of passive RFID tags is just a few meters). While you can do something similar with active RFID tags, with WiFi Tracker companies can use their pre-existing WiFi network to track things rather than having to build a whole new RFID system.
In other news, Apple is adding WiFi to the iPod.
And, of course, you can be tracked from your cellphone:
But the FBI and the U.S. Department of Justice have seized on the ability to locate a cellular customer and are using it to track Americans’ whereabouts surreptitiously — even when there’s no evidence of wrongdoing.

A pair of court decisions in the last few weeks shows that judges are split on whether this is legal. One federal magistrate judge in Wisconsin on Jan. 17 ruled it was unlawful, but another nine days later in Louisiana decided that it was perfectly OK.

This is an unfortunate outcome, not least because it shows that some judges are reluctant to hold federal agents and prosecutors to the letter of the law.

It’s also unfortunate because it demonstrates that the FBI swore never to use a 1994 surveillance law to track cellular phones — but then, secretly, went ahead and did it, anyway.
The Hymn Project exists to break the iTunes mp4 copy-protection scheme, so you can hear the music you bought on any machine you want.
The purpose of the Hymn Project is to allow you to exercise your fair-use rights under copyright law. The various software provided on this web site allows you to free your iTunes Music Store purchases (protected AAC / .m4p) from their DRM restrictions with no loss of sound quality. These songs can then be played outside of the iTunes environment, even on operating systems not supported by iTunes and on hardware not supported by Apple.
Initially, the software recovered your iTunes password (your key, basically) from your hard drive. In response, Apple obfuscated the format, and no one has yet figured out how to recover the keys cleanly. To get around this, the Hymn developers created a program called FairKeys that impersonates iTunes and retrieves your keys from Apple’s servers. Since the iTunes client can still get your password, this works.
FairKeys … pretends to be a copy of iTunes running on an imaginary computer, one of the five computers that you’re currently allowed to authorize for playing your iTMS purchases. FairKeys logs into Apple’s web servers to get your keys the same way iTunes does when it needs to get new keys. At least for now, at this stage of the cat-and-mouse game, FairKeys knows how to request your keys and how to decode the response which contains your keys, and once it has those keys it can store them for immediate or future use by JHymn.
More security by inconvenience, and yet another illustration of the neverending arms race between attacker and defender.