Apple JailBreakMe Vulnerability

Good information from Mikko Hyppönen.

Q: What is this all about?
A: It's about a site called jailbreakme.com that lets you jailbreak your iPhone or iPad just by visiting the site.

Q: So what's the problem?
A: The problem is that the site uses a zero-day vulnerability to execute code on the device.

Q: How does the vulnerability work?
A: Actually, it's two vulnerabilities. The first uses a corrupted font embedded in a PDF file to execute code; the second uses a kernel vulnerability to escalate that code execution to unsandboxed root.

Q: How difficult was it to create this exploit?
A: Very difficult.

Q: How difficult would it be for someone else to modify the exploit now that it's out?
A: Quite easy.

Here's the JailBreakMe blog.

EDITED TO ADD (8/14): Apple has released a patch. It doesn't help people with old model iPhones and iPod Touches, or work for people who've jailbroken their phones.

EDITED TO ADD (8/15): More info.

Posted on August 10, 2010 at 12:12 PM • 48 Comments

Comments

Rich • August 10, 2010 12:40 PM

What is awesome is that most iPhone users will see this as a good thing. They will think "HEY! Now I can jailbreak my phone! And it's ssoo easy!". They won't even realize what else can be done. And Apple won't care. The only way people will learn is if something REALLY bad happens. And all we need for that is some nerdy hacker to have a bad day.

Just imagine: You're a hacker, chilling out at the coffee shop, toying around with Airpwn, and you notice an iPhone user on the Wi-Fi... You know where I'm going with this.

kashmarek • August 10, 2010 12:45 PM

"Q: How difficult was it to create this exploit?
A: Very difficult."

Apparently, not difficult enough. Or, are these exploits there by design? Microsoft has been patching exploits (vulnerabilities) for 10 years or more. Who put those in? Just bad programming? I no longer think so.

Otto • August 10, 2010 12:59 PM

The last time jailbreakme.com was working (back in the iOS 2.x days), it also used a remote exploit to install itself; the interesting thing, however, was that it also patched the vulnerability behind it. This led to the quite entertaining situation that it was actually safer to jailbreak than not to jailbreak.

This new one doesn't patch the hole behind it (probably because they wanted to get it working quickly), but the PDF Loading Warner can be installed via Cydia; it acts as a confirmation screen whenever a site, or anything else, tries to load a PDF, which is the step needed to trigger the vulnerability.
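Conceptually, a warner like that just gates the (exploitable) PDF parser behind an explicit user decision, so the vulnerable code never touches untrusted bytes without approval. A minimal sketch in Python, with every name invented for illustration (the real Cydia package obviously hooks the phone's actual PDF-loading path, not code like this):

```python
# Hypothetical sketch of a "PDF Loading Warner"-style gate: every request
# to render a PDF must pass a user-confirmation callback before the
# (potentially exploitable) parser ever sees the file. All names invented.

def make_pdf_gate(render_pdf, confirm):
    """Wrap a PDF renderer so it only runs after explicit user approval."""
    def gated_render(source, data):
        if not confirm(f"{source} wants to load a PDF. Allow?"):
            return None  # blocked: the vulnerable parser never touches the bytes
        return render_pdf(data)
    return gated_render

# Example wiring with stand-in functions:
def fake_renderer(data):
    return f"rendered {len(data)} bytes"

def always_deny(prompt):
    return False

def always_allow(prompt):
    return True

blocked = make_pdf_gate(fake_renderer, always_deny)("evil.example", b"%PDF-1.4...")
allowed = make_pdf_gate(fake_renderer, always_allow)("trusted.example", b"%PDF-1.4...")
print(blocked)   # None
print(allowed)   # rendered 11 bytes
```

The design point is that the gate sits in front of the parser, not behind it: blocking happens before any untrusted data is parsed.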

t3knomanser • August 10, 2010 1:02 PM

@Rich: Apple is already working on a fix, so they apparently care; I'm sure a coming OS update will resolve it. And for people who use this to jailbreak, there's a band-aid: a plugin that prevents PDFs from downloading unless you explicitly allow it.

@Kashmarek: Software is complex. An operating system and all of the components that surround it in a modern computer is pretty much as complex as software gets. The more complex a system is, the more likely it is to have unforeseen consequences. These can be bugs, novel behavior, or security flaws.

Brandon Thomson • August 10, 2010 1:02 PM

Secure systems have a higher entropy than insecure systems. They are hard to build and you don't get there by accident. Thus, you see and will continue to see lots of vulnerabilities everywhere.

noah • August 10, 2010 1:03 PM

Q: Are you telling me that it would be safer now to jailbreak my phone so I could install a PDF Warner?

A: Yes, sort of.

M • August 10, 2010 1:25 PM

@Otto There is an app/fixup that warns you before allowing PDFs to be loaded, and allows you to stop them. It is not available via Apple, however. So again, the most secure configuration available requires that you first jailbreak the device.

Odalchini • August 10, 2010 2:38 PM

Brandon Thomson:  “Secure systems ... are hard to build ...”

Only if you try to build the ‘secure’ after you’ve built the ‘system’ – in other words, paste on security after you’ve designed the product – as invariably seems to happen.

Imagine if Apple had designed security into the iPhone hardware and OS from day one.  Imagine if the hardware and the OS between them simply *prohibited* futzing with the kernel so as to escalate app code execution to unsandboxed root.  Impossible?  No – just never done.

HJohn • August 10, 2010 3:33 PM

@Odalchini: "Only if you try to build the ‘secure’ after you’ve built the ‘system’ – in other words, paste on security after you’ve designed the product – as invariably seems to happen."
____________

Even when there is good security built in, function creep seems to follow. In other words, since something is deemed to have "good" security for its function, it will absorb more functions for which the security is inadequate.

Adam • August 10, 2010 3:44 PM

If Apple wants to kill jailbreaks stone dead, they could simply add a preference that allows a user to do it legally and easily. Perhaps doing so shuts you out of the App Store or has other implications, but it should still let you do it.

Brian • August 10, 2010 4:19 PM

I was a bit shocked when I heard people calling this a good thing. To me, this was clearly a bad thing. I don't think it applies to all versions, just 3.2 and 4.0, but perhaps it does. In any case, Apple will have a fix shortly. In the meantime, I am avoiding "unsafe" websites. Mobile Safari does have "fraud warning," which uses something like Google's malware site registry. So it's minimally safe.

And I agree with Adam. Apple should give users the option with a button in the control panel. If you accept the agreement, it sends a "ping" to Apple and immediately cancels any AppleCare you may have had. Then you can go merrily on your way with a phone that has no warranty and a 3-year contract to burn. And provide some trip wires in the OS or hardware that trigger "not valid for AppleCare." Don't brick it, just make it useless if it gets physically damaged.

And I disagree with Tim. I love my Apple devices, but I think Android would have a fix within a day or two. I am a bit dismayed that there is no fix yet for iOS (not to be confused with Cisco IOS...).

kog999 • August 10, 2010 4:54 PM

I personally don't own an iPhone and never will, but here is my take on it.

Security Exploits that allow someone else to control your device = Bad

Security Exploits that allow you to control the device you paid for = Good

It's risk vs. benefit. The risk is that your phone is open to a particular security exploit that could be run by a malicious website to do who knows what. The benefit is that you now have full control over your iPhone and aren't locked in to Apple's crap. The risk can further be mitigated by running the app that warns you about PDFs. The chance of actually experiencing this risk, assuming you have halfway decent browsing habits, is minimal. I'm sure there will be some website that exploits this, but most malicious sites are still going to be targeting computer browsers. For the people who see the benefit outweighing the risk, this is a good thing.

Patrick Henry • August 10, 2010 5:01 PM

Q: After all the fighting between Apple and Adobe (regarding Flash), isn't this a bit ironic?
A: Yeah.

WRONG!

Q: So as an iPhone user, what should I do to protect myself?
A: You should be careful. And you should install the patch when it becomes available.

WRONG! Jailbreak your phone and fix it, because Apple hasn't.

Same story with the original jailbreakme.com. In fact, the hackers fixed the vulnerability as part of the jailbreak process.

If anything, this is a lesson that having Apple control your phone soup-to-nuts is NOT more secure than having control of it yourself.

Nick P • August 10, 2010 5:02 PM

@ Adam

Jailbreaking is legal now in the US. Jailbreaking smartphones was added in as an exemption under the DMCA. They also added certain forms of CD/DVD copying for academia and hacking console games if you are just looking for vulnerabilities (???). Apple doesn't have to make jailbreaking easier and they can try to prevent it, but it's no longer a crime in the US. That's the good news. If you're not in the US, that's the bad news. ;)

Patrick Henry • August 10, 2010 5:05 PM

Further, I'd go so far as to say your iPhone/iPod/iPad cannot be secure unless you crack it.

Mine has an outbound firewall that prevents apps (even and especially officially App Store ones) from stealing my data.

Clive Robinson • August 10, 2010 5:35 PM

@ Patrick Henry,

"Mine has an outbound firewall that prevents apps(even and especially officially App Store ones) from stealing my data"

You are probably one of the few who have considered that Apple will at some point start being another Google, hoovering up whatever data they can.

Oddly, nobody seems overly bothered by some aspects of Android...

Clive Robinson • August 10, 2010 5:46 PM

Brian Krebs posted about this a couple of days ago and has a little "for fun" competition to find a name for it 8)

http://krebsonsecurity.com/2010/08/...

His current posting is a bit more alarming for some,

http://krebsonsecurity.com/2010/08/...

He comments on the very large number of critical bug fixes Microsoft pushed out today. Oh, and mentions Adobe's big patch/fix in passing.

In both cases it marks a significant upturn in the numbers. Why this might be is a matter of conjecture at the moment, but I have a feeling it is more likely to be bad news than good...

Davi Ottenheimer • August 10, 2010 7:47 PM

"Jailbreaking is legal now in the US. Jailbreaking smartphones was added in as an exemption under the DMCA."

1) Jailbreaking is used as a term specific to Apple but it is really just unlocking a SIM from the IMSI

2) It varies by country but jailbreaking/unlocking a phone was *never illegal* in the US.

I repeat, it was never illegal to jailbreak/unlock a phone. AT&T refused to unlock the iPhone, but they could have done so legally.

Apple only threatened to pick a fight and try to make it illegal -- suing those who made unlock software (under Section 1201) -- because they did not want anyone but ATT to unlock the phone, and since ATT refused...

3) Apple is obviously a huge proponent of DMCA and suing people so it makes sense they threw it into their warning/threat letters from the legal dept

4) However, permanently locking the SIM is entirely counter to telecommunications policy in the US (Congress and the FCC have said locks are anti-competitive) as well as around the world

http://cyberlaw.stanford.edu/attachments/...

I've seen the jailbreak exemption called a win and a change, but it looks to me more like a continuation of what is the status quo. It would be very odd if the US govt had agreed in any way with Apple's position. Rivers would flow uphill, we'd be back in the USSR, etc.

How soon we might forget that before the 1960s in the US even home telephones were owned by the telephone companies and regulated. The "Hush-a-Phone" suit from the 1950s, as hilarious as it might seem today, as well as the subsequent Carter Electronics suit opened the door to unlock home phones and the rest is history (thank you Al Gore)

I still have some pre-1960s Bell wiring I rescued when I rewired an old home. Their home-call engineers had some cool craftsmanship with RJ11 but it was unnecessary and definitely not worth the huge disadvantages of being locked-in.

Tom T. • August 10, 2010 9:54 PM

@ t3knomanser:
"Software is complex. An operating system and all of the components that surround it in modern computers is pretty much as complex as software gets. The more complex a system is, the more likely it is to have unforseen consequences. This can be bugs, novel behavior, or security flaws."

and @ Patrick Henry:
"If anything, this is a lesson that having Apple control your phone soup-to-nuts is NOT more secure than having control of it yourself."
*********************
Both are among the reasons why I've deleted about 95% of the XP system on which this is being written, and why I don't use MS Auto-Update. Less complexity = smaller attack surface, fewer vulns. Choose my own updates from the list offered. Most don't affect this system, since most of the files have been deleted.

%windir% = 150 MB vs. about 4 GB; about 650 files vs. 7-10,000 or more. Side benefits: faster, quicker boots, faster and more compact backups, faster defrags, etc. Yep, I'd rather control it than have MS control it, and it's about as simple as it can get while still being fully functional.

(Yes, I know that there can be vulns in the remaining files. They get updated as needed.)

Maybe some day, simplicity will once again take precedence over feeping creaturitis and bullet-point marketing, but the trend is definitely in the other direction. So take control yourself.

Nick P • August 11, 2010 1:08 AM

@ Davi Ottenheimer

Good points, but it's hard to totally agree. DMCA has been used to nail people for all kinds of stuff. The key part of the DMCA that's disconcerting is the portion that prevents one from circumventing copy protection technologies. Under those portions, hacking an iPhone through a software flaw and circumventing their application control mechanisms was a DMCA violation. So, a court *could* have ruled in their favor.

Now, they can't. I've seen a lot of insane DMCA cases work out in the plaintiff's favor, and many people settle out of fear of being bankrupted by litigation costs. So, I think this is quite significant. It's also a step in the right direction in government policy, as far as I'm concerned.

Clive Robinson • August 11, 2010 4:18 AM

@ Davi Ottenheimer,

"I still have some pre-1960s Bell wiring I rescued when I rewired an old home. Their home-call engineers had some cool craftsmanship..."

Yes, when I was (a lot) younger, I studied "electronics & communications," a large part of which was "craft skills," and I could "wire wrap," "lace," "silver solder," "wipe solder," "wire solder," "cut," "file," and "polish" steel with the best of them (maybe not so well these days; I'm well out of practice).

The most sadistic thing they made us do was turn a stick of chalk down on the lathe and then put a thread on it. And believe it or not, it came in handy: doing microwave engineering, I had to make some dielectric screws to correct somebody else's cock-up.

Also, are you and @Nick P talking about the same thing?

As far as I was aware, "unlocking" (the service provision) and "jailbreaking" the OS so you can load "non-approved apps" were two different things (though, abstracted a little, both are aimed at stopping a tied market that encourages illegal monopolistic activities).

As far as I'm aware, in the UK "unlocking" is not illegal when you "own the phone" (it's only a contractual issue when not owned); "jailbreaking" is more difficult due to "M'learned friends'" interpretation of technology and its capabilities.

My viewpoint however is: if I'm paying for a tangible good that I walk out of the shop with, then it's mine to do with as I see fit. It's one reason I've never owned a "games console".

With intangible goods such as software the water is more muddied by "attractive nuisance", which effectively mandates the "greater good" over "individual freedom" (the example usually given is the unfenced swimming pool that a neighbour's child drowns in). It puts responsibility on end users not to make their phones / computers "cracker bait" that can then be used to harm others. However, in turn the end user has a defence of "unknowingly using defective goods," which moves the liability up to the supplier of the goods and ultimately back to the manufacturer.

This enables phone manufacturers and suppliers to say they have "jailed" their phones "for the greater good" and to prevent "attractive nuisance".

However, their behaviour suggests the real reason is at best "brand control" verging on "monopolistic behaviour," which as we know is illegal but can take a lifetime to resolve...

Kevin Eshbach • August 11, 2010 8:21 AM

Now, can this exploit be used to downgrade from iOS 4 to iOS 3.x, so a GPS app that got broken by the iOS 4 update can be used again?

Rich • August 11, 2010 8:49 AM

Little known fact: Apple doesn't give a rat's ass about its users. Proof: antenna issues. Now this. How is it not fixed yet?

ed • August 11, 2010 1:05 PM

@Rich
Apple has a fix. It hasn't been publicly released yet.


@ Brian at August 10, 2010 4:19 PM
"And I disagree with Tim. I love my Apple devices, but I think Android would have a fix within a day or two."

The question isn't just whether there's a fix or not. The question is whether users are able to update their phone's firmware with the fix. Some Android phones lack this capability, or there is no later version of Android available for them.

agb • August 12, 2010 5:27 AM

>> How is it not fixed yet?

It got fixed pretty quickly, if you think about how much work is involved in correctly assessing the problem, determining the correct fix, implementing/reviewing/testing the fix, integrating the changes into the system, doing a round of proper "we just changed part of the system and now we're going to release an OS update" testing, staging the deployment, testing the deployment . . .

Getting the picture? Either you want it done immediately or you want it done right.

moo • August 12, 2010 9:53 AM

@kashmarek:

It's very unlikely that anyone would put vulnerabilities like this into their own software "on purpose". If they made a deliberate backdoor, it would not take this form. On the other hand, accidental vulnerabilities just like these are very common in any moderately complex software.

Someone who has never written software cannot appreciate how complicated a piece of software really is internally. Software is not bound by constraints of the physical world (such as 3-dimensional geometry or physical limits of materials, heat, power draw etc). Software can be as arbitrarily complicated as the humans making it can handle, and we're always pushing our limits complexity-wise. Modern software applications consist of hundreds of thousands, sometimes even millions, of lines of code. They often build on libraries of code written by other people, sometimes dozens of them.

When trying to explain software bugs to people, I use the following analogy. Imagine you are working on part of a machine that has more than three million moving parts. Dozens of other people (let's say 50) are also working on this machine, every day, for months or years. They are ripping out pieces of it and changing them or replacing them with other pieces. So are you. The requirements and goals of the machine as a whole are constantly being modified by outside stakeholders; nobody has a "giant blueprint" of the whole machine to consult to see if it is correct or not. It is a giant experimental mass that is incredibly complicated. When you try to use it in unusual ways (edge cases), sometimes it does the wrong thing or causes some paper to catch fire. Then a team member will find the few dozen of those millions of moving parts which appear to be responsible for the fire, rip them out, and replace them with different, more sophisticated parts. They will do this thousands of times before the machine is pronounced "done".

But software is rarely if ever completely "done": fixing EVERY bug would take so long and be so incredibly difficult that most software would never actually get shipped. Instead, we try to fix as many as possible and then pronounce the final thing "good enough". In some cases the software has to be ready on a defined schedule, so whatever state it's in when the deadline rolls around will be considered "good enough" unless it's completely unusable. Anyway, that's why so much modern software sucks (i.e. has bugs or security vulnerabilities).

There is a famous quote from Brian Kernighan: "Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it."

Marc Espie • August 12, 2010 9:56 AM

The cool thing is, jailbreaking is now permitted.
So we're probably going to see a long stream of Apple vulnerabilities exploited to jailbreak iPhones.

If Jobs weren't such a control freak, he would eventually just give up and open the iPhone.

As things stand, things are going to be interesting. Imagine: you're allowed to publish zero-day exploits, as long as you craft them as jailbreaking tools!

Matt Perkins • August 12, 2010 11:26 AM

Some people are clueless Apple fanboys who will blame anyone but Apple for Apple's security issues.

Apple can lie, be caught lying, and deceive users, and the fanboys will still take Apple's side. Like mindless zombies who need Apple to tell them what to do, where to go, what to eat and what to buy.

Why would we want a ping sent to Apple if the jailbreak happens? Currently if your jailbroken phone becomes "bricked" you can still take it into the Apple Support Center and they'll fix your iPhone or iPod Touch, because there is no way for them to tell off the bat that it has been jailbroken unless you're stupid enough to tell them you jailbroke it.

But yes, let's let Apple know when an iPhone or iPod Touch is jailbroken, so we can really take away a user's freedom. Apple hasn't taken away enough of the user's freedom as it is. What next? Let Jobs tell the fanboys when they can sleep and eat? And if they don't do everything Jobs says when he says it, he'll break all their Apple products.

Nick P • August 12, 2010 8:31 PM

@ moo

Nice description of the problem's scope, but that's not why software is as bad as it is. Precise formalisms and domain-modeling tools can eliminate many requirements issues. Certain languages and platforms don't have issues like buffer overflows. Libraries and tools exist that automatically prevent or quickly find many errors. Additionally, development processes like Cleanroom and the Fagan Software Inspection Process have been shown to significantly reduce bugs while adding little to the cost or labor.

It's not that it's inherently unsolvable or even contradictory to market forces. It's just that companies don't do it. Call it mismanagement, call it holding onto legacy... many potential reasons. But numerous software firms produce low defect software regularly, on or ahead of schedule, with acceptable cost. Sometimes it's even cheaper due to less validation and testing, as often happens in Cleanroom and SIP. It's being done right now.
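As a tiny illustration of the "certain languages don't have buffer overflows" point (a generic sketch, not code from any product discussed here): in a bounds-checked language, an off-by-one write raises an error instead of silently corrupting whatever sits next to the buffer.

```python
# In a bounds-checked language, writing one element past the end of a
# buffer is caught at runtime rather than corrupting adjacent memory.

buf = [0] * 4                 # a 4-element "buffer"

def write_byte(buffer, index, value):
    buffer[index] = value     # the index is checked on every access

write_byte(buf, 3, 0x41)      # in range: fine

overflow_caught = False
try:
    write_byte(buf, 4, 0x42)  # one past the end: a classic off-by-one
except IndexError:
    overflow_caught = True    # the bug is surfaced, not silently exploited

print(buf, overflow_caught)   # [0, 0, 0, 65] True
```

In C the same off-by-one write would succeed silently, which is exactly the raw material of the font-parsing bugs this thread is about.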

Developers must quit making excuses for their management's bad decisions or their own poor processes and trade-offs. This is why I favor software liability legislation that mandates lifecycle defect prevention, though not necessarily a specific method. The method should be chosen based on the company's needs and the problem at hand. Internal quality-control measurements can let them monitor the bug rate.

Clive Robinson • August 14, 2010 12:55 AM

@ moo, Nick P,

The problem is, as I've said many times before, that code cutters are more artist than engineer in temperament and methodology.

There is a "macho" behaviour pattern in many code cutters; the worst forms are usually seen around those using the Microsoft Foundation Classes... It's often a case of "It's my knowledge! You want to know, go figure it out for yourself; that's what I did!" with self-appointed MFC experts when asked by others for help. It is considerably more than just a "job protection" attitude.

Compare this to the almost as bad "Unix macho," where geeks will explain in depth the ins and outs of ioctl() and fcntl() (till your brain hemorrhages ;)

Fundamentally the underlying issue is the same; it's just exhibited in different ways.

It's like you used to see with hospital consultants: a "rite of passage" machismo, where a counterproductive way of working is seen as a way to "sort the men from the boys," and the "shared pain" of having gone through it is the price of membership in a closed shop.

We all know that the Microsoft Foundation Classes and Unix I/O device control should be consigned to the dustbin (garbage / trash / waste) of history, but we keep them alive on the excuse of "backwards compatibility" or "legacy code"...

But underneath this is the real issue of "the tools I was taught" and the attendant "learning curve" of the new.

As humans we will always try to use the tools we learnt in our teens and twenties, and we pretend we are "more efficient" using them, because "learning new tools" takes time we feel we don't have...

Which is the old issue of "short-term gain vs. long-term loss": we'd rather produce a little now than stop, learn a better way, and produce far more in the long term. And it's something that managers encourage with a "what did you do today" attitude.

BSc 40 years ago • August 14, 2010 4:50 PM

@Brandon Thomson: I understand you probably intended it as a metaphor, but higher security surely corresponds with lower, not higher, entropy.

Nick P • August 15, 2010 1:18 PM

@ BSc 40 years ago

"higher security surely corresponds with lower, not higher, entropy"

Depends on what you're securing against. Nature increases entropy in the gene pool to accelerate evolution and prevent one germ/condition/etc. from wiping out a whole population. Higher entropy in an RNG, and anything that depends on it, increases security. In a given system, there's always a sweet spot for how much entropy it should have. You can't have too much or too little, but lower is surely not always better.
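The RNG half of this point is easy to make concrete: empirical Shannon entropy (bits per byte) of a CSPRNG's output sits near the 8-bit maximum, while a heavily biased source sits far below it. A small illustrative sketch (the thresholds and the biased source are arbitrary choices, not a statistical randomness test):

```python
# Compare empirical Shannon entropy of a CSPRNG vs. a heavily biased
# source. A good RNG should be near 8 bits/byte; the biased one far below.
import os
from collections import Counter
from math import log2

def shannon_entropy(data):
    """Empirical entropy in bits per symbol of a byte string."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * log2(c / n) for c in counts.values())

random_bytes = os.urandom(65536)              # OS cryptographic RNG
biased_bytes = bytes([0, 0, 0, 1]) * 16384    # 3/4 zeros, 1/4 ones

h_random = shannon_entropy(random_bytes)      # ~8.0 bits/byte
h_biased = shannon_entropy(biased_bytes)      # ~0.81 bits/byte
print(f"urandom: {h_random:.2f} bits/byte, biased: {h_biased:.2f} bits/byte")
```

High empirical entropy alone doesn't prove a source is secure (a counter fed through a hash also looks flat), but low entropy is a reliable sign something is wrong.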

sle • August 15, 2010 6:10 PM

I guess that the vulnerabilities used would have been neutralized if the iPhone OS were protected by http://en.wikipedia.org/wiki/...

In fact I'm surprised to see most designers of closed systems (Xbox 1, Wii, iPhone...) learning this the hard way. In the iPhone's case it is even more surprising, as the open Mac OS is protected but the closed OS is not.

Clive Robinson • August 15, 2010 11:24 PM

@ sle,

"In fact I'm surprised to see most designers of closed systems (Xbox 1, wii, iPhone...) learning it the hard way"

I think that they will never actually learn the lesson properly...

The issue is one that is fundamental to the computer hardware: not just the CPU but all the memory as well.

To subvert a computer you need to do two basic things,

1, Have code to execute.
2, Get the CPU to execute it.

Of the two, the second is generally the easier to do, and there are well recognised ways to achieve it. Put simply, you modify the Program Counter to point to memory where there is valid code.

This gives rise to the notion of "what is valid code" and "how does it become available".

Valid Code is basically data that is meaningful in some way to the CPU as "instructions".

In the general von Neumann architecture there is no difference between code and data: they both exist in a single linear memory space, so there is no implicit protection in the architecture design. Contrast this with the strict Harvard architecture, where there are two separate memory spaces, one for code and the other for data; this gives a high degree of protection in the architecture.

The "only" significant advantage of the von Neumann architecture in ordinary usage is that it can "self boot" a program via a bootstrap loader or equivalent. In non-ordinary usage the von Neumann architecture allows code to be "self modifying," which, apart from malware, is usually undesirable (there are a limited number of uses for self-modifying code in AI and other research).

The major downside of the strict Harvard architecture is that it cannot load or modify code in any way; it requires the code memory either to be immutable or to be loaded by another method. A secondary issue is the duplication of resources in having two linear memory spaces.

Thus from a security aspect the strict Harvard architecture is way ahead of the von Neumann architecture. However, as a stand-alone CPU the strict Harvard architecture is of little use in general-purpose computing precisely because of this security strength.
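The von Neumann / Harvard distinction above can be sketched with a toy interpreter (an invented two-instruction mini-ISA, not any real hardware): in the von Neumann variant, code and data share one memory, so a store can overwrite an instruction and change what runs next; in the Harvard variant, stores can only reach a separate data memory, so the program text is immutable once loaded.

```python
# Toy illustration of the code/data separation argument. "STORE" writes
# into "memory" slot 2; only in the unified (von Neumann) case can that
# write land on top of an instruction that later executes.

def run(code, data, harvard):
    mem = list(code)              # unified code+data memory (von Neumann case)
    pc, out = 0, []
    while pc < len(code):
        op, arg = (code if harvard else mem)[pc]   # Harvard fetches from code memory
        if op == "PRINT":
            out.append(arg)
        elif op == "STORE":
            target = data if harvard else mem      # Harvard stores only touch data
            target[2] = ("PRINT", arg)
        pc += 1
    return out

program = [("STORE", "pwned"), ("PRINT", "hello"), ("PRINT", "world")]

print(run(program, [None] * 8, harvard=False))  # ['hello', 'pwned']  <- code was rewritten
print(run(program, [None] * 8, harvard=True))   # ['hello', 'world']  <- code untouched
```

Real mitigations like W^X / DEP approximate the Harvard property on von Neumann hardware by marking pages as writable or executable, but not both.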

So on the face of it the silicon designer has two choices: go for a high-performance, secure architecture (Harvard) or a lower-performance, low-security architecture (von Neumann).

Actually there is a "third way" for moderately complex and above CPU designs, which is to "modify the Harvard" architecture. The usual path taken by silicon designers results in high performance but low security.

Worse, they then bolt security back on in various ways involving significant use of expensive "silicon real estate".

Until CPU designers look beyond this, the ability of malware to load data and make it valid executable code will persist...

However, if you re-visit the Harvard architecture, but not as a "single CPU design," you realise that you can get the high performance and high security of the Harvard architecture and still have the ability to load programs from an external single linear memory space.

What you need is a second, limited-capacity CPU that controls the MMU, thus becoming a hardware security hypervisor. It effectively runs the OS whilst the strict Harvard CPU does the heavy lifting of application code and data.

There is a cost of course: you lose the ability to have self-modifying code... Is this a significant issue? Not in by far the majority of cases. Nor is it likely to be in the minority of cases, as there are other ways to do it.

Davi Ottenheimer • August 16, 2010 3:14 AM

@ Nick P

I think we agree on most points.

I'm just wondering aloud if the tone of victory could be more like "once again the bridge did not fall when we drove across it"; another day, another anti-competitive practice denied. 1956 was a huge and hard-won victory of values. This was more a regular common-sense victory.

Here's the opposition statement from Apple.

http://www.copyright.gov/1201/2008/responses/...

It's so full of hubris it's hard to distinguish it from a marketing brief. You can almost hear the authors patting themselves on the back as they wrote about the iPhone as "stunning", "dramatic" and "high powered".

The crux of the issue for them seems to be in this phrase:

"a proprietary mobile computing platform protected by copyright can be transformed into one on which any third party application can be run, without taking account of the undesirable consequences that would ensue from the transformation"

IMHO they completely miss the mark on jailbreak. First and foremost it allows the phone to use a different carrier. I could not find a single reference or answer to this point, which is really odd. It is almost as if their harm is 100% focused on apps, tweaks, utils and media. I would have no need to even consider the jailbreak if the SIM could be unlocked without it. If there is no harm from the unlock, then they should add that feature and reduce the jailbreak harm that they claim.

You are very right that this is also about the money of litigation, but that is different from a question of what is legal. Note that the Hush-a-Phone suit forced the maker into bankruptcy. Jailbreak was never found illegal -- just potentially expensive.

Clive RobinsonAugust 16, 2010 5:52 AM

@ Davi Ottenheimer,

In the PDF, the first complaint about the EFF submission is that the EFF seeks to change Apple's business model...

Thus one can fairly assume that Apple's submission is ONLY about maintaining a closed market over which they have full control (i.e. a monopoly to enforce anti-competitive practices), and that they are trying to disguise this by claiming that the EFF proposal would allow "copyright infringement".

I'm not sure about the US, but as far as I'm aware, in the UK and Europe you cannot arbitrarily claim copyright or infringement over somebody else's original work because they attach it to your (supposedly) original work. It would be like GM claiming copyright infringement because you put a bumper sticker on your GM car.

BSc 40 years agoAugust 16, 2010 7:06 AM

@Nick P

"Can't have too much or too little, but lower is surely not always better."

Point taken. Thank you.

OdoAugust 16, 2010 8:42 AM

Excellent discussion!

I would have one question:
What would happen if there weren't any bugs to exploit? (i.e. a strict Harvard architecture)

So the devices or consoles would be bulletproof: no hacking, jailbreaking or anything. Just the device itself, as Apple, Sony, Microsoft etc. so "kindly" designed it for us!
Then we would all be caught in the monopoly, and all of us would be Apple fanboys/fangirls, since we bought their products!!!

But I don't think it would ever happen!
I think that even a Harvard-architecture computer could be tricked into executing arbitrary code by hacking the loader, though obviously it would be much harder to do. For example, if you open the device up and use something like a JTAG interface...
So there will always be something to hack.

However, it is quite odd that we have to rely on this hacking rather than having the device perform at its full potential (as we want) from day one!

David ThornleyAugust 16, 2010 9:12 AM

@Clive: Why would the Harvard architecture give all that much higher security?

Consider the interpreter: it's a piece of executable code that executes non-executable code. There's been plenty of malware in strictly interpreted code. Nor can you put up a barrier, as there's no sharp line between a program that processes data and one that interprets it.

Moreover, people will insist on running programs of their own choosing, and some of us insist on being able to execute programs that we write. This means that there has to be a loader of some sort that's able to load executables, and it has to be invokable by the user. Think of how much malware runs because the user was tricked into running it. As I understand it, true viruses are out of favor because modern OSs have reasonably good defenses.

I also think you understate the impact of banning any form of self-modifying code. This would remove "eval" from all programming languages, and permanently limit their development. It would make JIT compilers impossible, as well as any form of run-time optimization.

SleAugust 16, 2010 2:47 PM

@Clive

In this case, if iOS had used the NX feature of its ARM processor, this jailbreak would have been practically impossible.

Harvard architecture can be more secure, but to my knowledge correct usage of NX would have been enough in this case. The iPhone OS, being based on Mac OS, should have inherited this protection from its parent. It is surprising to see a recent, closed OS lacking this old feature.

Clive RobinsonAugust 17, 2010 8:11 AM

@ sle,

I'll answer your points in reverse as it will be easier,

"It is surprising to see a recent and closed OS lacking this old feature"

The feature is not that old in many CPU architectures (say around ten years), and OS support for it is less than ten years old.

Closed OSs, especially on systems that are effectively "embedded", are not known for using the more "esoteric features" of the CPU, simply because run-of-the-mill development tools don't make using them easy.

"The iPhone OS based on MacOS should have inherited of this protection from its parent"

In general, yes, you would expect a later OS to inherit the features of its predecessors, but...

You also have to consider that it was a "translation" of an OS for a general-purpose platform into an OS for what was effectively an embedded platform with considerably fewer resources. You have to consider what would be "lost in translation" and why. For starters, the CPU is of a different architecture and so is all the IO. Thus I would expect the low-level side of the OS to be entirely re-written, and the next couple of layers up (memory management etc.) to be vastly simplified to fit within the resources.

Thus many features of the OS would face triage, and as it's not "bells and whistles", an esoteric security feature is not likely to be high on the Marketing Dept's list of "wants", let alone "must haves". Nor, for that matter, would it be high on the developers' lists, as it is one of those "invisible features" that only get attention when they fail in action... (yes, I might sound cynical, but I've been through this sort of thing a number of times).

"... to my knowledge correct usage of NX would have been enough in that case."

For this attack, yes, but that is with hindsight...

If you consider that all attacks follow the "low-hanging fruit" principle, then had the NX feature been implemented, they would have found another route to "jailbreaking".

Now, I'm not familiar with the ins and outs of the ARM CPU or the Apple iPhone OS, but I am aware of some issues on other architectures and their consequences, one of which is why W^X is another option.

The first issue, and it can be a real deal-breaker on limited-resource systems, is NX page size. The NX 'bit' can apply to individual memory bytes or to memory blocks; the smaller this granularity is, the more CPU, MMU, memory or battery resources it requires.

In some architectures the granularity can be page sizes of 64K down to 4K, both of which are OK for general-purpose platforms such as a PC or Mac, but not for an embedded device, where even a page size of 128 bytes might be too big. Further, there is the issue of the number of "open applications" (although I believe the iPhone OS is not strictly "multi-tasking", so this might not apply), in that each app needs a code space (RO), a heap space (RW) and a persistent or constant data space (RO). Due to the poor writing of applications and the likes of "malloc" and friends, heap space is a real unknown and thus could be spread across many non-sequential pages (depending on the MMU), which can really chew up memory.

Then there is the issue of when data is code, and when it is code that looks like data to the CPU.

This is the "interpreted language" issue, and it's really messy. Neither NX nor W^X handles it well, although arguably W^X does it marginally better when all the resource issues are taken into consideration.

The likes of Java byte-code interpreters are a real problem. Effectively, each byte code represents many native CPU instructions, so byte-code programs are very efficient users of memory space. The trouble is that neither the CPU nor the OS has any way to know what is data embedded in the byte code, or byte code that looks like data passed from another application, etc.

This means that the byte-code interpreter has to tell the OS, and also lose a lot of optimisation strategies, many of which are very, very useful not just in byte-code languages but in other areas.

For instance, a language like Forth or a scripting language (sh, bash, perl, etc.) can have real issues. Although there are well-known solutions, they all have significant trade-offs.

Finally, there are "Ring 0" issues. Running the bulk of the OS in Ring 0 may not be a good idea, and applications should never be even close. The problem is that ring-to-ring activities (an application calling library code, the kernel calling an IO driver) are fraught with issues, not just for security but for efficiency. In some cases they are as expensive as a full context switch.

Although not a real issue (say a couple of percent) on general-purpose platforms (PC/Mac), it's going to be a major issue on a resource-limited platform, especially one running on batteries.


Photo of Bruce Schneier by Per Ervland.

Schneier on Security is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc..