DROPOUTJEEP: NSA Exploit of the Day

Today's item from the NSA's Tailored Access Operations (TAO) group implant catalog:

DROPOUTJEEP

(TS//SI//REL) DROPOUTJEEP is a STRAITBIZARRE based software implant for the Apple iPhone operating system and uses the CHIMNEYPOOL framework. DROPOUTJEEP is compliant with the FREEFLOW project, therefore it is supported in the TURBULENCE architecture.

(TS//SI//REL) DROPOUTJEEP is a software implant for the Apple iPhone that utilizes modular mission applications to provide specific SIGINT functionality. This functionality includes the ability to remotely push/pull files from the device, SMS retrieval, contact list retrieval, voicemail, geolocation, hot mic, camera capture, cell tower location, etc. Command, control, and data exfiltration can occur over SMS messaging or a GPRS data connection. All communications with the implant will be covert and encrypted.

(TS//SI//REL) The initial release of DROPOUTJEEP will focus on installing the implant via close access methods. A remote installation capability will be pursued for a future release.

Unit Cost: $0

Status: (U) In development

Page, with graphics, is here. General information about TAO and the catalog is here.

In the comments, feel free to discuss how the exploit works, how we might detect it, how it has probably been improved since the catalog entry in 2008, and so on.

Posted on February 12, 2014 at 2:06 PM • 26 Comments

Comments

Kai Howells • February 12, 2014 3:43 PM

Now this is an interesting exploit. From the description, it requires physical access to the device to install. The question, however, is: do they jailbreak the phone as part of the installation process, or do they have another backdoor into iOS to install their app?
Or did Apple cooperate and actually sign their app with a proper distribution certificate? Or have they compromised whatever CA Apple is using and made their own code-signing cert?

Of these, I'd say it's most likely they use a jailbreak-style exploit to sideload this implant onto the target device. If this is the case, however, it probably requires rebooting the device, so if you have a lock (either a device lock or a SIM lock) that potentially complicates things. In particular, if you have a SIM lock, you'd have to unlock your SIM again once you get your device back.

With the older iOS upgrades, particularly those in existence around 2007/2008, the updater basically wiped the phone's OS partition and installed a whole new OS, so it's likely that a software update would remove this.

With newer devices and newer versions of iOS, however, minor updates (e.g. from 7.0.0 to 7.0.1) are installed as delta updates, so it's unlikely that such an update would result in the removal of such an implant from a current device.

J. Peterson • February 12, 2014 4:33 PM

If you read Apple's "denial" of NSA cooperation:

Apple has never worked with the NSA to create a backdoor in any of our products, including iPhone. Additionally, we have been unaware of this alleged NSA program targeting our products. ...

You see there's nothing there that precludes Apple tossing the source code to iOS over the wall to the NSA, and thus being "unaware" of whatever mechanisms the NSA subsequently found to hack the iPhone.

Greg Slepak • February 12, 2014 5:44 PM

@J. Peterson: I disagree with your assertion. I believe the statement: "Apple has never worked with the NSA to create a backdoor in any of our products" necessarily means that they never gave their code over.

IMO, giving code to the NSA falls under the definition of working with them.

That said, I have no idea how this works, and I'm disappointed in this post; I was expecting new info.

Only someone familiar with Apple's hardware or iOS implementation would be able to give a good answer. That list would include:

- Jailbreakers
- Apple employees
- The NSA (probably falls under "jailbreakers")

Greg Slepak • February 12, 2014 5:51 PM

Re: "remote installation capability", this already existed I believe. Search your favorite search engine for "iphone jailbreak pdf".

My guess is that they pursued something similar (get the target to visit a link on their phone and exploit a similar vulnerability).

Nick P • February 12, 2014 9:11 PM

@ Kai Howells

It actually says "close access methods," not physical access. The wording might mean close enough for Bluetooth, Wi-Fi, etc. We know they already have attacks on these. They might also look at Darwin code for such exploits. So, we can't be sure if it's an actual physical attack or a wireless attack.

@ ALL

CHIMNEYPOOL is supposed to be a malware kit. Figuring out what STRAITBIZARRE is will help in understanding DROPOUTJEEP. All I know is that it's made by "Data Network Technologies," whoever they are. So, here's a list of other references to these codenames so an enterprising reader can try to figure it out based on their described functions.

DROPOUTJEEP implant. STRAITBIZARRE-based software implant for the Apple iPhone. Uses the CHIMNEYPOOL framework. Compliant with FREEFLOW so as to be supported in TURBULENCE. Requires being close to the target.

IRATEMONK software rootkit for desktops/laptops. Implants the hard drive firmware to gain execution via MBR substitution. UNITEDRAKE or STRAITBIZARRE is used with SLICKERVICAR to upload the HD firmware to the target machine to implant IRATEMONK and its payload.

WISTFULTOLL is a UNITEDRAKE and STRAITBIZARRE plugin for harvesting forensic information from a target using [Windows stuff]. If used remotely, the extracted information is sent back through UNITEDRAKE or STRAITBIZARRE.

TOTEGHOSTLY 2.0 is a STRAITBIZARRE-based implant for the Windows Mobile OS. Uses the CHIMNEYPOOL framework. Compliant with FREEFLOW and therefore TURBULENCE. Command and control can be via SMS messages or a GPRS connection.

CROSSBEAM is a reusable CHIMNEYPOOL-compliant GSM comms module capable of collecting and compressing voice data.

COTTONMOUTH-1 USB hardware implant. "Will communicate with Data Network Technologies (DNT) software (STRAITBIZARRE) through covert channel on the USB." GENIE-compliant, CHIMNEYPOOL-based.

William Lee • February 13, 2014 2:16 AM

"All communications with the implant will be covert and encrypted."
Am I the only one who finds this somewhat ironic, bordering on hypocrisy? (Not that just about anything to do with the NSA isn't already hypocritical)

SchneieronSecurityFan • February 13, 2014 2:31 AM

Wouldn't the communication via SMS or GPRS be reflected in a monthly phone bill?

In the U.S., the exclusive carrier was AT&T in 2007-8. An iPhone customer also purchased an unlimited data plan. Maybe that would help hide GPRS communication, but I don't think that would help pay for SMS.

A. Snare • February 13, 2014 2:47 AM

@J. Peterson, @Greg Slepak: US export regulations basically stipulate that Apple is required to hand over source code[1] to the US Government in order to get CCATS approval and allow export from the US.

The rules and regulations here are quite complex. It's possible that Apple may have found a way to weave through the exemptions, but I doubt it.

To quote information provided by Apple to developers for the App Store:


It is not necessary to provide Apple’s source code to the government because it has already been reviewed and approved by the U.S. Bureau of Industry and Security (BIS).

(Source: "World Wide Trade Compliance for the App Store", inside Apple's iTunes Connect site.)

This all notwithstanding, source code isn't necessarily required for this implant: so far pretty much every version of iOS has proven to be jailbreakable given physical access.

[1] It's not clear to me what the scope of this would be.

ReferencePlease • February 13, 2014 3:21 AM

@A. Snare: "US export regulations basically stipulate that Apple is required to hand over source code^1 to [...] allow export from the US."

Can you provide a link or a reference about that ?

Thanks.

Marcos El Malo • February 13, 2014 8:30 AM

@Nick P

IIRC, in 2008, iOS wasn't capable of over-the-air (OTA) updates. This would point to jailbreaking or some exploit having to do with the syncing component of iTunes.

iTunes used to be scriptable (it might still be, for all I know), meaning it had APIs for interacting with other code. I don't know if this would be exploitable for NSA purposes, but it might be a path to research.

@J. Peterson

There's nothing that precludes you from spreading disinformation. I don't know who you are or what your motivations are. I invite you to disclose your identity and posting history on other websites.

Marcos El Malo • February 13, 2014 8:33 AM

@Anonymoose

If you were a hipster math major, but not talented enough to go work for a hedge fund, would you rather work for Starbucks or the NSA? (Remember to factor student loans into your calculation).

Nick P • February 13, 2014 10:58 AM

@ Marcos

There's a difference between the OTA "feature" and the process of injecting code into a phone. One can do the latter without the former. To clarify my previous post, I was suggesting they exploited a 0-day in the wireless drivers or OS via a wireless protocol. However, it's safe to think any modern phone can be subverted via OTA if they have it.

re iTunes

Good thinking. The extra benefit of iTunes is it's one of those programs users update a lot. A faux iTunes update saying it's for "bug fixes or security issues" would be a nice way to deliver a malicious payload.

Wael • February 13, 2014 12:29 PM

@ Nick P,

"However, it's safe to think any modern phone can be subverted via OTA if they have it."

It's more than safe to think, and depending on what "it" refers to, "having it" is not a requirement either, in case "it" refers to the phone. If the phone has coverage (one or more bars), it's close enough. If "it" refers to the phone having OTA capabilities, then the "if" is just about invariably true.

To give an analogy for the level of vulnerability a cell phone has: if non-mobile computing devices such as desktops require a few attack trees to describe the possible attacks, cell phones would require an attack-forest diagram the size of the Siberian forest for the same purpose. You can take that with a grain of salt, the size of ___ (insert your favorite part of Lot's wife here that @Clive Robinson left for you).

33jnf3kjfn3fkj • February 13, 2014 12:48 PM

TRANSLATION: iOS RAT with encrypted comms over SMS and GPRS (probably Xcode-based, so it doesn't matter what PHY the connection uses; maybe some dynamics around metered-connection stealth). Does typical iOS RAT stuff.

Again, these unpublished-spec propagation kits mentioned in these popular consumer-platform packages are where the news is for the software kits. They are usually zero-days, either found in-house or bought from firms like VUPEN.

I dig through these published kits and notice all the propagation-kit code names are unpublished. Those are where the news is, and I suspect they are waiting for vendor patches before they publish, so people don't go looking for the vulnerabilities before patch cycles.

The iOS vulnerabilities in STRAITBIZARRE (or whatever exploits its parent kits use) are probably extremely valuable. Even if they are waiting for vendor patches before publishing, they'll still be used in an exploit kit because of how iOS updates are sometimes avoided. Maybe used for jailbreaking too.

33jnf3kjfn3fkj • February 13, 2014 12:59 PM

UPDATE: SLICKERVICAR contains an unpublished tethered exploit for iOS, which it uses to write this to memory through iOS encryption processes that seem to have been defeated or shared. It also contains x86 privilege-escalation exploits to load NT, OS X, and Linux stuff, like the HDD firmware packages.

Greg Slepak • February 13, 2014 2:44 PM

@A. Snare, thanks for bringing up the exports / CCATS thing. I had to deal with that myself actually, so I'm somewhat familiar with what you're talking about.

From what I can remember, I'm fairly certain (nearly positive) that they are not required to hand over all of their code, only the portions that could be considered novel or non-standard crypto.

So... I don't know what they gave, but I think it might be on public record if they did, so you may actually be able to see exactly what they gave if you try hard enough.

Greg Slepak • February 13, 2014 2:52 PM

For the CCATS thing, it might be that you have to submit the source to any crypto that you wrote, and then describe the crypto that you're using that you did not write (perhaps provide links to open source projects).

DB • February 13, 2014 3:12 PM

I'd just like to point out that Apple's denial does not state:

"Apple has never worked with the NSA [period]"

Instead it states:

"Apple has never worked with the NSA [with the specific and known purpose] to create a backdoor in any of our products"

There's a very big difference there... This is the classic non-denial denial. Of course Apple has worked with the NSA, they're just feigning ignorance of what the NSA was doing with it! And wording it in such a way that they're purposefully trying to lead you to believe something that simply isn't true, without technically "telling a lie."

DB • February 13, 2014 3:26 PM

Another case:

"It is not necessary to provide Apple’s source code to the government because [blah blah]"

Ahh... just because it's NOT NECESSARY does NOT mean it HASN'T BEEN DONE... you see the little logic loophole there? Another non-denial denial. They didn't come up with these kinds of wordings because they were lazy; they took days, weeks, or even months to carefully craft this...

DB • February 13, 2014 3:29 PM

...and so... the solution of course is to only use open source... and get involved in making more open source.

Nick P • February 13, 2014 3:39 PM

@ Wael

"insert your favorite part of Lot's wife here"

Lol. Razor sharp wit as always.

"if it has coverage"

Great point. I must have been in a hurry to forget to include that very obvious attack vector: baseband stack. We've discussed here how insecure they are and Clive pointed out mandated functionality that benefits intelligence agencies.

The current trend is to consolidate the system CPU and baseband stack using microvirtualization. I was advocating that for a while when I was into separation kernels. More recently, especially after seeing the TAO catalog, I think moving back to physical separation is better. My idea is to use an MMU/IOMMU to partition the memory so that the baseband stack can't read/write internal data of the system. There will be a shared memory space for moving data to and fro. The MMU/IOMMU will be controlled by the main processor. I might also add a restriction that its policy can be set only once after it's turned on: it goes into enforcement mode afterward until it's powered off.

I know the design is possible because the old phone approach involved at least two chips. So, adding one chip for the baseband and an IOMMU somewhere to these phones that reliably keeps any and all device firmware in check seems doable. The "secure" phones I've seen lately haven't inspired confidence. Even the non-smartphone offerings give me no assurance that they protect from firmware-level attacks.

Jethro rose • February 16, 2014 10:22 PM

Why bother with an exploit when they could just buy or subpoena the CA infrastructure?

A. Snare • February 17, 2014 2:10 PM

@ReferencePlease, @Greg Slepak: I know it's a late followup, sorry.

Firstly, I'm not a lawyer, and it's been a few years since I've looked into the whole EAR/CCATS thing in any detail. However, my assessment of the situation was that EAR covers anything that uses crypto. It's not just the crypto implementation itself that is covered, but anything that uses crypto[1].

As I noted in my original post, the rules are quite complicated[2]. I freely admit (and must stress) that I am not qualified to understand them fully. However, there's this lovely little nugget which helps bypass a lot of the logic:


Items described in 740.17(b)(3) include: […] Cryptographic libraries, modules, development kits and toolkits, including for operating systems and cryptographic service providers (CSPs).

Source: BIS, Classification

I believe that iOS itself would fall under this, at the very least due to it shipping encryption frameworks for apps to use.

Items under 740.17(b)(3) are not eligible for self-classification, which means that a classification request must be submitted. I can't find references any more, but this used[3] to mean source code review by the NSA. It now seems to mean a 30-day review, the exact scope of which is unclear from the public documentation.

Finally, the Googles reveal this PowerPoint, produced for internal use by HP. Their compliance folks note that the NSA has access to the SNAP-R system (used to implement the EAR/CCATS regulations) and is involved in the 30-day review process.

[1] So, for example, a native mobile app which uses OS-supplied web services to access a resource via https itself falls under and is controlled by EAR. It need not implement any encryption itself.
[2] I think they're far more complicated than implementing a modern cipher or hash function. Let's think about that a bit, shall we? :)
[3] The rules appear to have changed in June 2010.



Schneier on Security is a personal website. Opinions expressed are not necessarily those of Co3 Systems, Inc..