NSA's Secure Android Spec

The NSA has released its specification for a secure Android.

One of the interesting things it’s requiring is that all data be tunneled through a secure VPN:

Inter-relationship to Other Elements of the Secure VoIP System

The phone must be a commercial device that supports the ability to pass data over a commercial cellular network. Standard voice phone calls, with the exception of emergency 911 calls, shall not be allowed. The phone must function on US CDMA & GSM networks and OCONUS on GSM networks with the same functionality.

All data communications to/from the mobile device must go through the VPN tunnel to the VPN gateway in the infrastructure; no other communications in or out of the mobile device are permitted.

Applications on the phone additionally encrypt their communications to servers in infrastructure, or to other phones; all those communications must be tunneled through the VPN.
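The double-wrapping requirement in the excerpt can be sketched as follows (Python, standard library only; the XOR keystream is a toy stand-in for real ciphers, and all keys/nonces are made up):

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy CTR-style keystream built from SHA-256 -- illustration only,
    # NOT a real cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_layer(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # XOR the data against the keystream (symmetric: same call decrypts).
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

plaintext = b"voice frame 0001"

# Layer 1: the application encrypts end-to-end with its own key...
inner = xor_layer(b"app-key", b"app-nonce", plaintext)
# Layer 2: ...and the result is wrapped again inside the VPN tunnel.
outer = xor_layer(b"vpn-key", b"vpn-nonce", inner)

# Breaking only the VPN layer yields ciphertext, not voice:
assert xor_layer(b"vpn-key", b"vpn-nonce", outer) == inner
assert inner != plaintext
# The far endpoint peels both layers to recover the frame:
assert xor_layer(b"app-key", b"app-nonce",
                 xor_layer(b"vpn-key", b"vpn-nonce", outer)) == plaintext
```

The point of the two independent layers is exactly what the spec requires: a compromise of the tunnel alone still leaves the application-layer ciphertext intact.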

The more I look at mobile security, the more I think a secure tunnel is essential.

Posted on March 7, 2012 at 1:35 PM • 52 Comments


Raouf March 7, 2012 2:05 PM

This can only work if you have complete control over your trusted certificates or static keys.
So far I don’t see that anyone has solved that problem on cell phones.
This requires a hardware and firmware infrastructure that protects the keys or certificates, with control and access only by the phone user.
Such solutions do exist on the market today; why are they not being adopted?
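Raouf's point about controlling your own trust anchors is essentially certificate pinning. A minimal sketch (Python, stdlib; the certificate bytes and pin set here are made-up placeholders):

```python
import hashlib

def fingerprint(der_cert: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded certificate."""
    return hashlib.sha256(der_cert).hexdigest()

def check_pin(der_cert: bytes, pinned: set) -> bool:
    # Accept the connection only if the presented certificate matches a
    # fingerprint provisioned out of band -- no reliance on public CAs.
    return fingerprint(der_cert) in pinned

# In a real client the DER certificate would come from the TLS handshake,
# e.g. ssl.SSLSocket.getpeercert(binary_form=True).
fake_cert = b"\x30\x82placeholder-der-bytes"   # placeholder, not a real cert
pins = {fingerprint(fake_cert)}

assert check_pin(fake_cert, pins)          # provisioned cert accepted
assert not check_pin(b"attacker cert", pins)  # anything else rejected
```

The hard part, as Raouf says, is not this check but provisioning and protecting the pin set itself, which is where hardware key storage comes in.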

bullethead March 7, 2012 2:15 PM

There are no secure Android devices because OS upgrades are generally unavailable and thus exploits are never patched.

John F March 7, 2012 3:43 PM

There are no secure Android devices because OS upgrades are generally unavailable and thus exploits are never patched.

This is only true of the stock images provided by the cellular providers, and there’s no reason to assume that the US Govt would use those images, particularly if the government is buying the phones.

If you’re willing to roll up your sleeves and roll custom code you can do plenty to secure a mobile device against the stock attacks. My own phone is running a non-carrier provided kernel right now, and it’s working just fine on the cellular networks. I mostly stop at the kernel, but there’s no reason that the userland utilities can’t be treated in the same fashion, especially if you don’t need, e.g., android market access.

Stephen March 7, 2012 4:42 PM

Where does it refer to a “Secure Android Device” policy? This is a generic mobile capability package for secure VoIP. It only references Android (seven times) as an example OS whose type, device, and configuration the device-management and policy-management services should be able to identify…

Curmudgeon March 7, 2012 4:43 PM

There would be less of a need to overlay a VPN over cellular data transfers if the NSA hadn’t successfully lobbied to cripple over-the-air encryption standards in the name of protecting their freedom to wiretap anyone anytime anywhere.

Anton March 7, 2012 5:09 PM

Not sure what the target market is.

If business applications, that would be a good thing, as there are few/no alternatives to Blackberries for large Corporations and Government.

Giovanni March 7, 2012 5:21 PM

Note, as Stephen did, that this NSA spec is in reference to voice over IP only.

The NSA is NOT specifying that all data from and to mobile device must go through a VPN. Only voice data.

ChristianO March 7, 2012 6:00 PM

Telephones were never meant to be secure … were they?

You only prove to the system that you are real, and since 3G the system proves to you that it is real and not an attacker.

I see no mention of end-to-end encryption. Only the tunnel gives you that, it seems.

jake March 7, 2012 6:37 PM

another reminder that SRTP is crap and you need to use IPSec to properly encrypt voice comms. the padding is important

Alex March 7, 2012 7:07 PM


Yes, I would also say that telephones were never meant to be secure. Sometimes they help us in many ways, but there are instances where they lead us to mishaps.

Luke March 7, 2012 8:25 PM

What about the behavioural inhibitors? We all remember what happened on the Nostromo…

Kent March 7, 2012 9:35 PM

Anyone see the demo of extracting a private key from a mobile device via side-channel at RSA? I missed that session. Seems like that sort of attack would be of interest to the NSA.

Mark March 7, 2012 10:26 PM

I seem to remember an app called Red Phone that encrypts the packets but uses gsm as the transport. Seems like a much better solution to me, as VPN latency may be an issue.

JohnK March 7, 2012 10:41 PM

“Telephones were never meant to be secure … were they?”


“Zfone™ is a new secure VoIP phone software product which lets you make encrypted phone calls over the Internet. Its principal designer is Phil Zimmermann, the creator of PGP, the most widely used email encryption software in the world. Zfone uses a new protocol called ZRTP, which has a better architecture than the other approaches to secure VoIP.

Doesn’t depend on signaling protocols, PKI, or any servers at all. Key negotiations are purely peer-to-peer through the media stream

Interoperates with any SIP/RTP phone, auto-detects if encryption is supported by other endpoint

Available as a “plugin” for existing soft VoIP clients, effectively converting them into secure phones

Available as an SDK for developers to integrate into their VoIP applications

IETF has published the protocol spec as RFC 6189, and source code is published”
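A rough sketch of how ZRTP-style peer-to-peer key agreement plus a short authentication string (SAS) detects a man in the middle. The DH parameters, word list, and derivation below are toy stand-ins, not ZRTP's actual ones (see RFC 6189 for the real protocol):

```python
import hashlib

# Toy Diffie-Hellman parameters -- far too small/simple for real use.
P = 2**127 - 1          # a Mersenne prime
G = 5

SAS_WORDS = ["alpha", "bravo", "charlie", "delta",
             "echo", "foxtrot", "golf", "hotel"]

def sas(shared_secret: int) -> str:
    # Short Authentication String: two words derived from the shared
    # secret, read aloud by both parties to detect a man in the middle.
    h = hashlib.sha256(b"SAS" + shared_secret.to_bytes(16, "big")).digest()
    return f"{SAS_WORDS[h[0] % 8]}-{SAS_WORDS[h[1] % 8]}"

# Honest exchange: both sides derive the same secret, so the SAS matches.
a_priv, b_priv = 6, 15                       # toy private keys
a_pub, b_pub = pow(G, a_priv, P), pow(G, b_priv, P)
assert pow(b_pub, a_priv, P) == pow(a_pub, b_priv, P)

# MITM: Mallory substitutes her own public key in each direction, so the
# two sides end up with *different* secrets...
m_priv = 10
alice_secret = pow(pow(G, m_priv, P), a_priv, P)   # Alice <-> Mallory
bob_secret   = pow(pow(G, m_priv, P), b_priv, P)   # Mallory <-> Bob
assert alice_secret != bob_secret
# ...so (with high probability) the spoken SAS words disagree, exposing her.
```

No PKI or server is involved; the human comparison of the SAS is what authenticates the purely peer-to-peer key negotiation.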

Peter E Retep March 7, 2012 11:11 PM

comment on DHS word list also applies here:
@ emace: Curious: Could not go to the site you named,
as its word ‘spook’ is blocked by our school district web filter,
for its most suspect meaning,
as is “Christchurch”, “breast cancer”, crimes by names, male peafowl, etc.
Yet the Google tunnel lets text out.

RobertT March 8, 2012 12:20 AM

Wait a minute…”Secure Android”, isn’t that an oxymoron!

“The more I look at mobile security, the more I think a secure tunnel is essential.”

What’s the tunnel for, burying the Android device of the frustrated engineer tasked with securing it?

Gweihir March 8, 2012 3:31 AM

I agree with Bruce. A secure telephone has no business being more than a terminal. But a terminal does not need more than a VPN connection home.

This is incidentally the normal way I use my netbook: As an SSH console. (O.k. I do more on it, but different from a phone, it actually has a real user interface, not a pathetic simulation of one.)

JT March 8, 2012 4:58 AM

I know the NSA and CSS Logos… What are the other two on there? Anyone know? Those two are totally new to me.
I’m assuming IAD stands for “Information Assurance Directorate”. But does anyone know what the shieldish-looking eagle is for?

jeff March 8, 2012 6:53 AM

I remember looking over ZRTP a few years ago. Some of the key exchange seemed intentionally weak, using predictable values for parts of the exchange.
Zimmermann has also made comments in a Forbes interview to the effect that we need enough security to keep safe from hackers, but that the government must be able to fight terrorism.
This attitude is far different from that of the man who fought the government over PGP.

David March 8, 2012 7:45 AM


do you have a link to the Forbes article? It’s a surprising statement from Zimmermann and I’d love to read the full piece.

Larry March 8, 2012 8:26 AM

You can do secure voice today using Asterisk with SIP-TLS for the signalling traffic and SRTP for the voice traffic. The Bria softphone is available for Android and supports both. For additional security, you can use the OpenVPN client on Android. However, key storage and certificate spoofing will still be issues until there’s a smart card reader available for Android. (Which makes it less portable, especially if you want to insist on a pin-pad card reader. Blackberry has a smart card reader available.)
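A minimal sketch of the configuration Larry describes, assuming Asterisk 1.8-era `chan_sip` option names; the certificate paths and section names are placeholders, not a tested config:

```ini
; sip.conf sketch -- SIP over TLS for signalling, SRTP required for media
[general]
tlsenable=yes
tlsbindaddr=0.0.0.0
tlscertfile=/etc/asterisk/keys/asterisk.pem   ; placeholder path
tlscafile=/etc/asterisk/keys/ca.crt           ; placeholder path

[secure-handset]
type=friend
transport=tls        ; signalling goes over SIP-TLS only
encryption=yes       ; require SRTP for the voice stream
```

A softphone such as Bria would then register over TLS, optionally inside an OpenVPN tunnel as Larry suggests, giving the same two-layer arrangement the NSA spec calls for.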

Mike B March 8, 2012 9:17 AM

@JT The shield-eagle and the IAD are part of the same IAD logo.

Before people start making fun of Android OS security: some people I talked with at the conference told me that some of the ideas being tossed about include use of a hypervisor that would put the Android part that people interact with/attack and the VPN terminal in different virtual machines, and then use SELinux at both the hypervisor and Android OS levels. If TPMs are ever rolled out on mobile devices you could add various levels of measurement to the mix.

Leeroy Jenkins March 8, 2012 11:46 AM

Look up WhisperCore by Moxie Marlinspike of Thoughtcrime Labs. It has IDS, a firewall, FDE, and is a completely rewritten ROM. Problem is Twitter bought the project and may not release it again.

Leeroy Jenkins March 8, 2012 11:49 AM

@Mark Redphone is being released open source soon according to @moxie

Still doesn’t help if your underlying Android system is full of Google spyware, Carrier IQ, and who knows what other backdoors.

Wish somebody would code the OpenBSD phone.. no java and no buffer overflow attacks

Tony H. March 8, 2012 1:26 PM

This whole document is a curious one, not least for the poor quality of its editing (e.g. points OCK.6 and OCK.8 appear to be identical), spelling, and overall consistency and style. Maybe the NSA has adopted a “release early and often” approach.

It’s also the first I’ve heard — not that I pay close attention to this stuff — of NSA proposing commercial open standards encryption (in two layers, granted) for use with classified (not merely sensitive) data. They require that the same vendor not provide both crypto layers (and a few other combinations of two components), but they say nothing about making themselves a vendor for even one of the layers.

And it’s almost quaint that they will provide access to E911 service, i.e. fail safe rather than fail secure, thus overturning decades of operational practice. I mean, here’s a secure phone for government types to use to make calls to other government types, presumably as allowed/mediated by their central infrastructure, which will be making decisions about read up/write down, Need To Know, and all that. This phone won’t be able to call ordinary PSTN lines, presumably, so any user is going to be carrying a normal phone as well. Yet this secure phone has E911, complete with carrier-based location stuff.

Winn Schwartau March 8, 2012 2:46 PM

For any ‘secure’ mobile device, IMHO, an IPSec VPN is mandatory, and should help ‘lock down’ the mobile device with a provisioning process including CAs and other unique identifiers.

For security sensitive organizations, all tunneled traffic should go through a central managed server(s) with firewall controls, content filtering, FDE, white/black controls and be able to tie to DLP & SIEM.

Essentially, what reason is there for a mobile device and its data to be any less secure than they are in the fixed enterprise? Especially with the added risk of mobility. Most critical, though, is hostile-activity detection, such as jailbreaking and apps behaving badly, and a high-speed remediation path.

In many ways, Larry Ellison was right in 1996 when he said that the network is the computer and advocated thin clients. Turning mobile devices into GUI’d ‘dumb’ terminals with little or no long term data residence seems a really attractive approach to me. Lastly, keep in mind, MDM is NOT security.

Joseph March 8, 2012 4:40 PM

This can only work if you have complete control over your trusted certificates or static keys.
So far I don’t see that anyone has solved that problem on cell phones.
This requires a hardware and firmware infrastructure that protects the keys or certificates, with control and access only by the phone user.
Such solutions do exist on the market today; why are they not being adopted?

Certificate management is easier if you’re the NSA, working only with very well vetted users, and have the capacity to send keys to the handset outside of the operating channel (by manually setting them per device, for example).

David March 8, 2012 4:46 PM

@MarkH – yes, I’d discovered that one before I’d posed the question (with the same conclusion as you), hoping there was another one that better represented the views stated.

Jonathan Wilson March 8, 2012 7:46 PM

Even if the over-the-air security was better, you couldn’t trust that the data was safe after it was decrypted by the carrier inside the cellular head end equipment and sent to the VoIP server.

As for zfone, anyone know how vulnerable it would be to someone carrying out a man-in-the-middle attack on the key exchange?

Nick P March 8, 2012 9:14 PM

@ RobertT

“Wait a minute…”Secure Android”, isn’t that an oxymoron!”

Darn you for stealing my post (almost word for word) before I posted it! Darn you to an imaginary hell!

Nick P March 8, 2012 9:27 PM

@ MikeB

A SELinux and TPM model would be a very bad idea. SELinux was an experimental prototype thrown into production that is very complex & has had too many flaws. The Orange Book that inspired it also had a rule saying the TCB must be small & easily verified. A better approach for the software side would be to use OKL4 (in almost a billion phones already) or a separation kernel to isolate security critical functionality. Hardware wise, an architecture like SecureCore or SecureMe is a better idea as they cover more threats. They’re also not DRM disguised as security. 😉

RobertT March 8, 2012 11:48 PM

@Nick P
try reading the report; the conclusions are pathetic. Even my teenage son’s comment was …”but that does not fix the real problem”

It’s a start, I guess, but it ignores the value of Android to the consumer. Unless this value is maintained, NSA’s version will be sitting at home on the shelf. What’s in the target’s pocket is what counts; everything else is nonsense.

If you follow these NSA guidelines you create a product with zero appeal to Joe Average. If I was given this, I know what I’d do with it…

There is a similar problem that has plagued mil-comms for the past 10 years. In theory these military radios are great performers; in practice they suck. So we have the equally silly case of deployed troops communicating during an operation using iPhones and some commercial conference call facility.

Clive Robinson March 9, 2012 4:05 AM

@ RobertT, Nick P,

Robert, with

“In theory these military radios are great performers; in practice they suck, so we have the equally silly case of deployed troops communicating during an operation using iPhones and some commercial conference call facility”

“thud”, you have squarely hit the nail on the head…

Back in the 1980s I was involved with a project for a military development agency that was looking at using the original Motorola cell phone system, with UK-developed Racal enhancements, as a “fly in and use” system to replace combat radio systems, which were a mess back then (remember the Falklands War and later the 1st Gulf War).

The system worked like a dream in comparison to what was around. One big Racal enhancement was the ability to locate a user’s position by triangulation with greater accuracy than the then GPS (civilian + SA). Another Racal enhancement, done with BT Martlesham, was the ability to send 9600-baud data reliably, and 2400-baud at a fixed rate, across cell swapping, thus allowing the voice encryptors developed by the likes of Plessey to work very reliably (unlike the high-HF / low-VHF equipment that predated the almost-as-bad Clansman kit).

What killed it all off was Motorola, who refused to allow it to be used, as they saw it as competing with their Trunked Radio System (now called TETRA, which was and still is a complete disaster for the UK and many other civil authorities).

The result was that Motorola lost out twice instead of winning twice, and the UK troops ended up with Clansman and a goddamn-awful trunked system that fell over so often the word “trunk” was replaced by “drunk”, not just by the army technicians who had to work on it but by the manufacturer’s design engineers and technicians as well. Oh, and then there was Mould, but best not go there; there may be a few UK amateur radio boffs reading who would like to vent their spleens against that piece of stupidity.

But it gets worse. From what I’m told, the replacement system used in the current Afghan and Iraq actions is not much better, and this has had two consequences:

Firstly, the old Clansman kit that had been in the process of being scrapped and sold off has been “bought back” at considerable expense; and secondly, as you noted, the likes of the iPhone are being used with commercial services.

It is interesting to note that the US military hierarchy appear to be “embracing the iPhone” and also that Apple are in the process of building a new manufacturing plant in Texas.

The question is why…

Likely answers are,

1, To avoid “import restriction” battles that have kicked off over IP in phones, as Microsoft, Apple and Motorola, ably assisted by the likes of HTC and Samsung, have demonstrated in recent times.

2, To develop an “out of China” production facility to stop the rip-offs Apple products have suffered over there (take serious note, other manufacturers thinking of going down the “cheap China” route: it could cost you way more than you save).

3, To develop the first steps of a “secure product” line in the US for the benefit of US politicos, the military, and major corps, as an alternative to RIM products.

Or some combination of the above and/or other reasons not yet identified.

All of that aside, “military radio” systems are a dead concept these days: they take too long to develop, they are way, way too expensive, and their functionality, even with the best will in the world, is usually way below that of PMR or other commercial offerings. Military thinking has moved on from “single channel nets”, with all the problems that involves, to “dial direct” and “teleconferencing”. On the wish list of most troops is a ruggedised iPhone with a slot, so they can whack in their smart card ID with its PKI and go straight into secure mode with a change of screen colour etc., then take it out and be back in civilian mode. Simple to understand, low stress, no real thinking involved, and, importantly, relatively easy to do, with the advantages of COTS.

Will it happen? Well it should, and it should not be that difficult to do, but… I’m not holding my breath. There is way too much vested interest in “old, slow and mega expensive” by the current incumbents, and we know from experience they will cut off their nose to spite their face.

Z.T. March 9, 2012 2:22 PM

Don’t such phones have binary-only firmware (for the baseband etc.) that basically has root on the machine? What is the point of securing the app or even the OS when you’re not in control of the hardware?

Clive Robinson March 9, 2012 4:56 PM

@ Z.T.,

Don’t such phones have binary-only firmware (for the baseband etc.) that basically has root on the machine

Err not in the way you appear to be thinking about it.

Think of a smart phone as three separate but connected computers:

1, The Subscriber Identity Module (SIM).
2, The mobile communications system.
3, The media support system.

The SIM is a smart card and provides a whole load of services to the phone user, and this is controlled over the “over the air” interface by the network operator; apart from storing a few numbers in it, there is little the subscriber can do to it.

The SIM has a fairly intimate relationship with the mobile communications system, and the master/slave relationship can be quite confusing at times.

The media support system is almost entirely separate from the SIM and can only get very limited access through the mobile communications system. Often the connection between the media support system and the mobile communications system is via an upgraded version of the old Rockwell AT command set for modems. And in essence that is what a smart phone is: a stripped-down personal computer connected to a modem that just happens to also be a mobile phone.
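The AT-command link Clive describes can be pictured with a representative exchange; the commands follow the standard cellular AT command sets (3GPP TS 27.005/27.007), but the number, responses, and annotations here are illustrative placeholders:

```
AT+CMGF=1                  ; select SMS text mode
OK
AT+CMGS="+15550100"        ; send an SMS (placeholder number)
> Hello from the media system
+CMGS: 4
OK
ATD+15550100;              ; originate a voice call (";" = voice, not data)
OK
```

The media support system never touches the radio directly; it only issues these serial commands, which is exactly the narrow interface Clive is describing.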

Unfortunately the clear delineation between some hardware, such as the microphone and earpiece, has become blurred in the name of “efficiency”, and it is at these points that the system isolation is weakest. Sadly the usual way is to connect the microphone and earpiece directly to the media support system and send “digital audio” etc. directly to the mobile communications system. Likewise the SMS system (which has always been a kludge) is a weak barrier interface, as SMS is just a side effect of the “over the air” interface, which was designed to let the network operator control the SIM and mobile communications system. The “over the air” interface has been steadily “overloaded” with new functions, such as downloading ringtones, and this area is often extremely vulnerable, especially as the network operators tend not to be very cautious about who they allow access to the “over the air” interface.

One crazy thing phone designers did, and it has come back to haunt people, is to include the equivalent of the phone number in the web browser identity string…

You kind of have to be careful when talking about a mobile phone’s “number”, because it’s an illusion: the number you dial is mapped by the network supplier’s database to the actual electronic identity information held in the SIM and in the mobile communications system. In this respect it’s a bit like DNS names being mapped onto IP addresses, which in turn get mapped onto Ethernet MAC addresses.

Clive Robinson March 9, 2012 5:09 PM

@ Tony H,

And it’s almost quaint that they will provide access to E911 service, i.e. fail safe rather than fail secure thus overturning decades of operational practice

E911 “analog” access is a legal requirement, not a security one. You get the same issues with VoIP.

Thus it is “something they are required to do” rather than “something they want to do”.

Such requirements are a complete bane of any security design, in the same way as “building codes” for fire escapes etc. are a nightmare when designing building security for banks, repositories and other secure areas.

A classic example is EmSec and “fire alarms”, “fire detectors”, “break glasses” and “emergency lighting”, and again “fire codes” for the old Halon dump systems and their modern equivalents in CommCens and secure data-processing environments.

sweerek March 10, 2012 6:28 PM

VPN, absolutely, but to ensure ALL data is ONLY going to the VPN something more is needed… Remember Carrier IQ, the pre-rooted phones that sent keystrokes and such back to the provider? You’ll need trust in the ’droid build itself from the bootloader on up, not just in app space.

* March 10, 2012 6:38 PM

“Information Assurance Directorate”…

what are they assuring? That the government has access to users information?

sweerek March 10, 2012 6:44 PM

@ Z.T.

I agree. Security starts at hardware, which means TPM-like embedded chips. Apple was an early leader of TPM but no iOS device has them now. Adding one could be another great & low-cost (but subtle) competitive advantage Apple could roll out for their phones & pads.

SnallaBolaget March 11, 2012 3:30 AM

“The more I look at mobile security, the more I think a secure tunnel is essential.”

Well, real security companies reached that conclusion several years ago, with companies such as Securitas requiring that ALL connections to internal systems (such as their own “SPS” system) be made through a VPN. That was maintained when they started using PDAs in the field some 6-7 years ago.

Might be an idea to look at what’s actually being done in the real security industry before you “innovate”…

jake March 11, 2012 7:20 AM

@ sweerek

TPM is no magic bullet, it is still vulnerable to cold boot attacks and the only real barrier it provides is accessibility to keys via the hardware. having your keys properly encrypted on-disk means that someone can more easily access them but it does not mean they can be decrypted.

i’d much sooner trust encryption of my keys to open source software than rely on trusting the TPM to store my sensitive data.
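jake's point, that keys encrypted at rest can substitute for TPM storage, can be sketched with passphrase-based key wrapping (Python, stdlib; the XOR "wrap" is a toy stand-in for a real AEAD or AES-KW, and all values are made up):

```python
import hashlib

def derive_kek(passphrase: bytes, salt: bytes) -> bytes:
    # Key-encryption key from a passphrase via PBKDF2-SHA256.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)

def xor32(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def wrap_key(kek: bytes, key: bytes) -> bytes:
    # Toy wrap: XOR against a KEK-derived pad. Symmetric, so the same
    # call unwraps. Real designs authenticate the wrapped blob too.
    return xor32(key, hashlib.sha256(b"wrap" + kek).digest())

storage_key = bytes(range(32))            # the key we want to protect
salt = b"per-device-salt"
kek = derive_kek(b"correct horse", salt)
blob = wrap_key(kek, storage_key)         # this is what hits the disk

assert wrap_key(kek, blob) == storage_key                 # right passphrase
assert wrap_key(derive_kek(b"wrong", salt), blob) != storage_key
```

Nothing secret lives in hardware here; the protection comes entirely from the passphrase and the slow KDF, which is jake's trade-off versus trusting a TPM.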

DIY March 11, 2012 9:23 AM

Anyone know how to build a secure smartphone-to-smartphone VoIP channel on top of an existing VPN server? Build asterisk on the server? Some other SIP server? What droid/iOS voip apps will work to talk to your private SIP server? Googling around, it looks like VPN is the easy part. Zfone doesn’t live on smartphones yet. iChat doesn’t live on smartphones, and probably never will, so AAPL FaceTime haz all yur paketz. What does?

Clive Robinson March 11, 2012 9:52 AM

@ Jake, Sweerek,

TPM is no magic bullet, it is still vulnerable to cold boot attacks and the only real barrier it provides is accessibility to keys via the hardware

Err, actually TPM is a fully busted flush for security, and serves only minimal purpose for DRM.

The problem is that TPM sits quite a way up the computing stack, closer to the application layer than to the fundamental physics of the underlying hardware.

Thus for TPM to work, the hardware has to be 100% trusted in all places and in all ways; the only ways known to get even vaguely close are so grossly inefficient and eye-wateringly expensive that they will not happen outside of very, very specialist devices.

But plain and simple, we know that it is not possible for a single Turing machine to be secure; the best we can hope for is “probabilistically secure”, and we actually had this knowledge before the first electronic computers were ever built.

The interesting thing about “probabilistic security” is that it’s a trade-off between time spent achieving a desired task and time spent checking it has not become rogue in some way. Thus you can trade security against efficiency in the right architecture.

RobertT March 12, 2012 3:44 AM

I wonder if these NSA guys have ever heard of “covert communications channels”. The problem is your average smart phone has more possible covert channels than I’ve had hot dinners. These channels only need to leak at bits-per-second to transfer the session encryption keys.

Who is going to identify, let alone check, all the possible side-channel information leak methods? My gut feeling is that it is only even thinkable if you limit the Android device to executing one active program and nothing else, which kinda defeats the purpose of a smartphone… but what do I know.

Clive Robinson March 12, 2012 8:55 AM

@ RobertT,

These channels only need to leak at bits-per-second to transfer the session encryption keys

Not even that high a data rate if all the keys are derived from a hidden, unchanging master key; a bit or two a day would do, especially if the master key was not truly randomly generated (i.e. the SIM manufacturer uses AES in CTR mode to generate a few bits that get put in the SIM, the phone’s electronic serial number a few more, etc.).

The problem is that you are effectively “black box testing” quite a way up the stack (i.e. chip level up to app, not chip level down to physics level), which gives rise to your question of,

Who is going to identify, let alone check, all the possible side channel information leak methods?

Well… this falls under the “known knowns”, “known unknowns”, “unknown knowns” and “unknown unknowns” problem, defined by classes of attack and specific methods within a class.

Known knowns should not be too much of an issue if they are detectable, which unfortunately not all are with a black-box system (the same problem as identifying whether a random-looking stream of bits is truly random or deterministic).

Known unknowns are new variants within a “class” of attacks. That is, if you identify a basic class of attack (say a timing attack based on adding jitter to the data TX sequence), then finding a generic solution to that attack (clock the outputs) solves the problem for the whole class within known bounds.

Unknown knowns are specific types of attack known in one domain (class or channel) that are moved into another domain. For instance, spread-spectrum modulation can be applied to the carrier channel (RF signal) by modulating its amplitude, frequency, phase and sequence (if SS itself). But as was seen with watermarking, it could also hide a covert channel in the data channel, as normal data has its own noise to hide in. In the general case, as with known unknowns, you can apply a generic mitigation, but it has to be applied in each applicable class, and you may not know all the classes or how to come up with a mitigation. For instance, amplitude modulation can be mitigated (in part) by some kind of limiting circuit, but frequency modulation has to be mitigated in an entirely different way.

Unknown unknowns are basically unknown vectors in unknown domains. Generally the only way to find these is by careful examination of energy signals in all the basic channels (carrier, modulation, timing, data, metadata, meta-metadata, etc.). Of course you have to be able to identify all the basic channels, and that may not be possible; likewise the energy signature.

Then there is the question of “transparency”: whilst you might be able to mitigate some covert channels in a domain, it may not be possible to eliminate them all and still have a functioning system. Take “clocking the output”: that has two limitations that can easily be seen. Firstly, it is effectively frequency dependent, in that data rates below the basic data rate can get through; and secondly, any channels within the data etc. get through. As an example, any system unavoidably leaks one bit of data: it is either in operation or it is not, and this applies at all channel levels within the system. So one technique might be to generate a covert channel in the error-correction mechanism of a block within the system.

Thus if the input to a block within the system was delayed deliberately, then clocking the output would require a pause or re-sync before transmission could continue. And the pause / re-sync provides the covert channel.

The usual solution to this is to clock the “input” from a master clock. That is, you pipeline all the system blocks together as a fully synchronous system. Unfortunately this also has limitations, and an even slower covert channel can still be formed by simply stopping and starting data transmission into the first block of a system.

The solutions to this are twofold. Firstly, fail hard on errors and back off retransmission by a very long random delay (but this only goes so far). The second is to “store and forward randomly”: all data input to the system is stored and then sent out of sequence randomly, be it as whole “messages” or “fractions of messages”. Providing the selection process is not deterministic and the system can reassemble fractions correctly, you will disrupt the covert channel in the error system. Effectively what you are doing is taking the covert channel and randomly modulating it. However, after all of this there will still be a very low bandwidth channel you cannot stop (i.e. system on or system off).
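Clive's "clock the outputs" mitigation can be simulated in a few lines (Python; delays are abstract units rather than real time, and all names are illustrative):

```python
def leak_via_jitter(bits, base=10, jitter=2):
    # Covert sender: a delayed packet encodes 1, an on-time packet 0.
    return [base + (jitter if b else 0) for b in bits]

def decode_jitter(delays, base=10):
    # Covert receiver: recover bits from inter-packet timing.
    return [1 if d > base else 0 for d in delays]

def clock_outputs(delays, clock_period=12):
    # Mitigation: release every packet on the next fixed clock edge,
    # quantizing away the jitter that carried the data.
    return [clock_period] * len(delays)

secret = [1, 0, 1, 1, 0, 0, 1]
delays = leak_via_jitter(secret)
assert decode_jitter(delays) == secret    # unmitigated channel works

clocked = clock_outputs(delays)
assert len(set(clocked)) == 1             # all timing variation is gone
# As the text notes, slower channels (e.g. start/stop of the whole
# stream) survive; only this per-packet jitter class is closed.
```

This is the "generic solution per class" point: one mitigation kills the whole jitter class, but says nothing about channels in other domains.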

With regards your comment,

My gut feeling is that it is only even thinkable if you limit the Android device to executing one active program and nothing else

No, this does not work; it just lowers the opportunity.

Think of the transparency and store-and-forward issues. If a rogue app runs at some point, it can load up its chosen “covert channel” with data, which will just sit there. At some point a non-rogue app will establish communications; as long as the data stored by the rogue app gets used, the covert channel is established and the leaked data forwarded.

For instance, let us say the rogue app is an offline email client. It can establish a meta-metadata covert channel by the use/abuse of standard headers in individual messages it puts into the spool directory/file. When the mail transfer agent sends the emails in the spool, it is passively and unknowingly establishing the covert channel and sending data down it.

This works well because many headers are effectively redundant for any given message (think of the 7/8-bit data identifier or the extended character set identifier: most systems will process the email correctly even if the identifier is not included, and will ignore a set identifier if it is included but not necessary).
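A toy sketch of the header-based covert channel described above: a rogue app encodes one secret bit per queued message by including or omitting a header that is redundant for delivery. The helper names and header choice are hypothetical, for illustration only:

```python
def embed_bit(headers: dict, bit: int) -> dict:
    # Rogue app: presence of a redundant header encodes 1, absence 0.
    out = dict(headers)
    if bit:
        out["MIME-Version"] = "1.0"
    return out

def extract_bit(headers: dict) -> int:
    # Observer on the network path recovers the bit from headers alone.
    return 1 if "MIME-Version" in headers else 0

secret = [1, 0, 1, 1]
spool = [embed_bit({"From": "user@example.com", "Subject": "hi"}, b)
         for b in secret]
# The mail transfer agent later sends these messages quite normally,
# passively carrying the channel, exactly as described above.
assert [extract_bit(m) for m in spool] == secret
```

Every message still parses and delivers correctly, which is why redundancy in a protocol is so hard to close off without breaking the protocol itself.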

And at the end of the day the real issue is, any system that has some form of redundancy in it can have a covert channel tucked away somewhere. However without redundancy the system could not function… thus “Catch 22”.

RobertT March 12, 2012 6:38 PM

@Clive R
thanks for expanding on the general message I started. I agree with most of what you are saying, although some attacks are clearly more difficult to implement than others. But again that depends largely on the skill set / system access of the attacker, which is one of your Unknown Unknowns.

I guess it is to my personal advantage to join the Android cheerleading squad and congratulate these guys for coming up with such a magnificent solution. However, something tells me they’re also not drinking the Kool-Aid. This leaves me wondering for whom this proposal is actually intended.

Mark Fowler March 13, 2012 6:26 PM

Basically this will be done using virtual machines. You cannot ever be ‘certain’ (from a user app) that the OS doesn’t already contain a virtual machine that sits between your user app and the OS. So you do your stream encryption, using a bit-wise encryptor, within the virtual machine, using some certificated method of identity; non-repudiation and integrity can be handled within the encryptor. NSA has a ‘High Assurance IPSec’ spec ready to publish.

me March 15, 2012 8:47 AM

Paranoid Android would go over huge w/Radiohead fans for sure. Especially if it had the
macbot monotone…”like a pig, in a cage, on antibiotics.” lol

don’t they have lines for different types of voice data? (classified, secret, t.s.)

wait! don’t answer that, I don’t want to know…just saying that maybe some data should not be relayed ‘overland’ if it is really all that sensitive…Hollywood requires that they meet face to face and have very tense, but very subtle, conversations….that’s where the sensitive voice data should start and end…this way, more people could just think about what they’re going to eat for dinner…

rajesh March 18, 2012 7:10 PM

Wireless communication channels are also “wires”, but these invisible wires can easily be captured or manipulated anonymously. The concept of a tunnel seems cool.

Wael June 20, 2012 2:06 AM

@ Nick P

“The Orange Book that inspired it also had a rule saying the TCB must be small & easily verified.”

Welcome to the TPM 🙂
