Intercepting Predator Video

Sometimes mediocre encryption is better than strong encryption, and sometimes no encryption is better still.

The Wall Street Journal reported this week that Iraqi, and possibly also Afghan, militants are using commercial software to eavesdrop on U.S. Predators, other unmanned aerial vehicles, or UAVs, and even piloted planes. The systems weren’t “hacked”—the insurgents can’t control them—but because the downlink is unencrypted, they can watch the same video stream as the coalition troops on the ground.

The naive reaction is to ridicule the military. Encryption is so easy that HDTVs do it—just a software routine and you’re done—and the Pentagon has known about this flaw since Bosnia in the 1990s. But encrypting the data is the easiest part; key management is the hard part. Each UAV needs to share a key with the ground station. These keys have to be produced, guarded, transported, used and then destroyed. And the equipment, both the Predators and the ground terminals, needs to be classified and controlled, and all the users need security clearance.

The command and control channel is, and always has been, encrypted—because that’s both more important and easier to manage. UAVs are flown by airmen sitting at comfortable desks on U.S. military bases, where key management is simpler. But the video feed is different. It needs to be available to all sorts of people, of varying nationalities and security clearances, on a variety of field terminals, in a variety of geographical areas, in all sorts of conditions—with everything constantly changing. Key management in this environment would be a nightmare.

Additionally, how valuable is this video downlink to the enemy? The primary fear seems to be that the militants watch the video, notice their compound being surveilled and flee before the missiles hit. Or that they notice a bunch of Marines walking through a recognizable area and attack them. This might make a great movie scene, but it’s not very realistic. Without context, and just by peeking at random video streams, the risk caused by eavesdropping is low.

Contrast this with the additional risks if you encrypt: A soldier in the field doesn’t have access to the real-time video because of a key management failure; a UAV can’t be quickly deployed to a new area because the keys aren’t in place; we can’t share the video information with our allies because we can’t give them the keys; most soldiers can’t use this technology because they don’t have the right clearances. Given this risk analysis, not encrypting the video is almost certainly the right decision.

There is another option, though. During the Cold War, the NSA’s primary adversary was Soviet intelligence, and it developed its crypto solutions accordingly. Even though that level of security makes no sense in Bosnia, and certainly not in Iraq and Afghanistan, it is what the NSA had to offer. If you encrypt, they said, you have to do it “right.”

The problem is, the world has changed. Today’s insurgent adversaries don’t have KGB-level intelligence gathering or cryptanalytic capabilities. At the same time, computer and network data gathering has become much cheaper and easier, so they have technical capabilities the Soviets could only dream of. Defending against these sorts of adversaries doesn’t require military-grade encryption only where it counts; it requires commercial-grade encryption everywhere possible.

This sort of solution would require the NSA to develop a whole new level of lightweight commercial-grade security systems for military applications—not just office-data “Sensitive but Unclassified” or “For Official Use Only” classifications. It would require the NSA to allow keys to be handed to uncleared UAV operators, and perhaps read over insecure phone lines and stored in people’s back pockets. It would require the sort of ad hoc key management systems you find in internet protocols, or in DRM systems. It wouldn’t be anywhere near perfect, but it would be more commensurate with the actual threats.

And it would help defend against a completely different threat facing the Pentagon: The PR threat. Regardless of whether the people responsible made the right security decision when they rushed the Predator into production, or when they convinced themselves that local adversaries wouldn’t know how to exploit it, or when they forgot to update their Bosnia-era threat analysis to account for advances in technology, the story is now being played out in the press. The Pentagon is getting beaten up because it’s not protecting against the threat—because it’s easy to make a sound bite where the threat sounds really dire. And now it has to defend against the perceived threat to the troops, regardless of whether the defense actually protects the troops or not. Reminds me of the TSA, actually.

So the military is now committed to encrypting the video … eventually. The next-generation Predators, called Reapers—Who names this stuff? Second-grade boys?—will have the same weakness. Maybe we’ll have encrypted video by 2010, or 2014, but I don’t think that’s even remotely possible unless the NSA relaxes its key management and classification requirements and embraces a lightweight, less secure encryption solution for these sorts of situations. The real failure here is the failure of the Cold War security model to deal with today’s threats.

This essay originally appeared on

EDITED TO ADD (12/24): Good article from The New Yorker on the uses—and politics—of these UAVs.

EDITED TO ADD (12/30): Error corrected—“uncleared UAV operators” should have read “uncleared UAV viewers.” The point is that the operators in the U.S. are cleared and their communications are encrypted, but the viewers in Asia are uncleared and the data is unencrypted.

Posted on December 24, 2009 at 5:24 AM


Vladimir December 24, 2009 6:09 AM

Another solution is to make the video from the UAV less trustworthy.
Technology that can add virtual objects to a real-time video stream is, to some extent, available now. Show a nice fake movie a few times and soon no one will trust information from a UAV.

It is also a wide-open field for practicing good old-fashioned disinformation.

Paul Renault December 24, 2009 6:13 AM

Another option is to broadcast a bunch of fake Predator feeds, showing all sorts of ‘successful’ kills. And in the process, sow havoc.

Which the ‘Reapers’ can harvest…

GregW December 24, 2009 6:33 AM

Bruce says there is no/minimal threat to adversaries seeing the video feeds, but when I heard about the unencrypted feeds, I wondered whether the following scenario has been happening for a while…

If Al Qaeda has been looking at our drone videos for a while, they have a pretty good idea what looks like a “target” to our guys since they can see through our eyes.

They can then arrange for civilian gatherings to look like targets to us (or ensure their own gatherings lack the signature visuals that distinguish them from civilian gatherings), and we get the bad rap for killing civilians amongst the local populace (and the global community).

GregW December 24, 2009 7:12 AM

Bruce raises an interesting point that there was a reason the military didn’t encrypt the video. A) Key management is a tactical burden on the warfighter and B) if you aren’t going to do it right, then don’t do it at all.

While I was kind of flabbergasted by the military not encrypting the signals, I once postponed implementing encryption for reason B) myself, so it’s true I can’t blame the military completely for taking that approach. Since I couldn’t figure out how to truly keep encryption keys secure in web-facing environments while allowing programmatic access, I didn’t implement encryption at all.

But is that really the right approach, for either me or the military?

As the military’s dilemma indicates, is half a loaf sometimes better than none? Or in industry jargon, wouldn’t a partially adequate solution be part of a wise “defense in depth”?

I’ve received the impression over time, reading Bruce and others, that if encryption can’t be done right, it’s better not to do it. Better to do nothing than to XOR, better to avoid building a copy protection/DRM scheme since you have to give the keys to whoever/whatever plays the content, etc. Rarely do people actually state this so baldly, but it does seem part of the ethos.

As a result, I don’t think security philosophy, as propounded by Bruce or others, is very clear on this point: when is the reasoning “if you aren’t going to do it right, then don’t do it at all” valid, and when is it not?

(One tentative answer to my own question triggered by Bruce’s final sentence about threat models: “Partially adequate encryption is better than none when the threat model indicates a reduction in risk for a reasonable scenario, and when that reduction in risk exceeds the risks of management/customers assuming things are secure.” Is this the right way of thinking??)

Dom De Vitto December 24, 2009 7:53 AM

Too many issues here to list – replay attacks, triangulation, etc. etc.

Just because there are problems doesn’t mean the solutions aren’t justifiable.

“I don’t have a password, as people can guess passwords…”

sooth sayer December 24, 2009 7:58 AM

I agree — this is a brouhaha over nothing. There is little to no harm in terrorists having the feed — all the harm if the good guys don’t.

Bruce Schneier December 24, 2009 8:14 AM

@Dom De Vitto

Agreed. I’m not arguing that the downlink shouldn’t be encrypted, only that it’s not at all obvious that it should be encrypted. I don’t have nearly enough information to do a complete threat analysis.

But my guess is that the military made the right decision. And my guess is that, now that they’re going to encrypt the video, they’re going to use a complex heavyweight system instead of a more agile lightweight system.

Bruce Schneier December 24, 2009 8:16 AM

@ Clive Robinson

Yes, I saw that. (There was a link to that essay in my essay — I don’t know if Wired removed it, and I’m too lazy to check.)

This is why I’d like to see a general lightweight encryption system for this entire class of information.

Trichinosis USA December 24, 2009 8:19 AM

People forget that the insurgents (I am not going to call innocent people who have had their country occupied based on a lie “terrorists”) have the home team advantage. They’ll instantly recognize scenery and buildings and be able to react accordingly. They don’t NEED maps or triangulation to figure out what’s going on.

Not encrypting this traffic wasn’t based on mission critical thinking. It was based on arrogance. It is a fail, and nothing but a fail. The Emperor’s feed has no clothes and no amount of purple prose will ever make that less true.

Second grade boys, Bruce? Yes, you nailed that one. It takes a special kind of creature to pretend to themselves and the world that they need to whiff billions of our tax dollars and waste thousands of American lives chasing some old nutter on a dialysis machine hiding in a mountain cave for almost a decade now. Speaking of which, they STILL can’t find him. “Predator”. “Reaper”. Oh PLEASE! Put down the Marcincko books, kiddies. Produce bin Laden and I’ll be impressed. Oh, but then the wartime gravy train would be all over and done with, wouldn’t it? And we can’t have that…

Josh O. December 24, 2009 8:25 AM

@Paul Renault

That is brilliant. I wonder if they were already doing this a la Operation Bodyguard.

John Campbell December 24, 2009 8:31 AM

HDTV is encrypted for ECONOMIC reasons and is not a fair comparison, since the value of a Predator feed is very, very immediate and short-term to someone intercepting the signal.

Encryption only pays off over the longer period during which the data is still information.

The video “data” from a Predator has a short half-life as “information” before it becomes “data” again, with little value.

Information theory plays a role in this.

And, yeah, encryption, in this instance, would keep information out of the hands that could USE it as well as those who may mis-use it.

SeanT December 24, 2009 8:35 AM

I can’t believe the blatant prejudicial statements in this article– regarding second-grade boys. Most second-graders I know could come up with much more original and clever names than “Predator” and “Reaper”. I would guess it’s marketing majors making millions on Madison Avenue who are making up the names. And my bet is that they are targeting their vision of the typical taxpayer.

Jeff Schroeder December 24, 2009 8:37 AM


As a former UAV operator in the US Army (RQ-7 Shadow 200, for those interested), I can tell you that all UAV personnel have, at a very minimum, secret security clearances. The clearance is required to complete the 96U/15W training. For certain missions, i.e. anything working with the CIA’s spooks, you needed a top secret clearance, but anything else just requires secret. The keys used for something like that would be just the same as those used to encrypt radio communications. Those require nothing more than a secret security clearance.

You bring up great points however regarding soldiers on the ground needing to see the video. Ditto for allies who we might not want to share crypto with.

Cerebus December 24, 2009 8:58 AM

At least someone understands the design decisions that went into this.

Two points to consider: First is that some of the ground stations that rely on UAV/SUAS video can receive but are prohibited from transmitting; and second, you won’t (and can’t) know who those people are because they can’t transmit.

Asymmetric key establishment with such users is impossible, and out-of-band methods (e.g., time-scheduled symmetric key broadcast or code-books) are simply too cumbersome to warrant given the shelf life of the intel.

In short, the risk of non-availability to the intended users far outweighs the availability to the unintended users.

bcline December 24, 2009 8:59 AM

@John Campbell — my thoughts run similar to yours. So far, the fear scenarios have been that the real-time video would let the enemy escape or bring harm to allied soldiers. If that is the case, then all that is needed is a way to delay what the enemy sees. Whether this is done by some cheap encryption or some other video trick, if the enemy cannot tell whether the video they are watching is real-time or delayed, then they will not be able to make a good decision.

Max Kim Tobiasen December 24, 2009 9:06 AM

The argument here is that encryption is not a viable option because key management would be a nightmare. The insurgents get the raw video stream by intercepting it, using some sort of antenna to pick up the signal directly from the Predators. Since, presumably, the military actors that need the video stream don’t actually pull it out of the air themselves but get it through some kind of military channel (Internet, VPN or whatever they use), wouldn’t the obvious thing to do be to encrypt the raw stream from the Predators to the command post, where the signal is picked up by the military, and then distribute it in unencrypted form to whoever needs it? That way the insurgents wouldn’t be able to pick the signal out of the air, and everyone associated with the military who needs it would have access to it. No key management nightmare needed.

Jason December 24, 2009 9:07 AM

What about the possibility of just having the crypto set up in such a way that the receiving devices decrypt the signal? Also, given the way the Predator streaming works, is it possible to tell at any time how many devices are pulling the stream, and which ones?

Nicholas Weaver December 24, 2009 9:17 AM

Actually, the key management problem for this is solved, IF you don’t do it by the NSA’s playbook but by the civilian world’s playbook.

This is exactly the satellite TV broadcast encryption problem, where you don’t distribute new keys to the endpoints, but instead have each receiver hold its own key and do revocation of any lost/stolen/unaccounted-for devices.

Cerebus December 24, 2009 9:25 AM

@ Max — No, UAV/SUAS video is transmitted directly to ground stations. If it were transmitted back to the controller and then redistributed, you would still have the exact same problem, but with an additional failure mode inserted.

@ Jason — It’s physically impossible for a transmitter to count passive receivers.

@ Jason, Nicholas — The problem with your schemes (which are variations of each other) is that you’re relying on knowledge of to whom you’re transmitting. Satellite systems register the crypto devices so the broadcaster has knowledge of the recipient’s keys.

In the UAV scenario, you don’t always know. How do I choose the recipient’s key from a key schedule if I don’t know who he is and he can’t tell me?

Kevin S. December 24, 2009 9:35 AM

To me, this highlights the “over-education” of the public on information security issues. While I agree some encryption would be appropriate for this application, key management is a significant issue. The general public might be aghast to find some military systems have no passwords, weak passwords, group passwords, no firewalls or antivirus – by DESIGN.
My point being: Some controls you would deem critical for your desktop are not only difficult to implement in a strategic or tactical environment, but are often useless (AV on *nix) or present legitimate safety concerns (complex passwords with MOPP gear).

StCredZero December 24, 2009 9:36 AM

Max is right. If you have to redistribute the video, don’t do that by leaving it unencrypted up on the satellite downlink, accessible to just about the whole freaking surface of the Earth. At least encrypt it to the military base, and stream it from there. The uplink/downlink key management would be the same key management problem as the control signal, which they are already solving. And it would mean that an insurgent with some second-hand equipment, a Russian program, sitting just about anywhere on the surface of the earth couldn’t just peek in.

Clive Robinson December 24, 2009 9:54 AM

One point that should be made about “pay TV” encryption.

They almost never got it right the 1st/2nd/3rd…10th time around. Even with digital-only video.

The NSA and others have traditionally used conservative designs, often based on stream encryption, for high-bandwidth data feeds.

Due to various issues, the encryption equipment to be “built in” can be extremely “physically” heavy (supposedly some of the stuff the NSA put up on recon satellites weighed in at over 100 lb), and it can require specialised power supplies and a lot of power.

Then there is the question of “maintainability”: most UAV equipment these days is the aeronautic equivalent of COTS wherever possible. Crypto kit is most definitely not, and has historical handling procedures that are a nightmare for use in the field by small teams.

Then there is the question of “usability”: troops take time to train on new equipment, and you don’t need them having to use “different grades of equipment” depending on the perceived security requirements of a particular Op.

Likewise you have to consider equipment “lost to the enemy”: for many types of behind-the-lines activities, you would not want to be doing this with “state of the art” equipment.

Then there is the question of “mental mindset” with secure comms of different grades. How do you decide what is operational and what is strategic, or some point in between? It is most likely people would “over-rate” the security (think back to Vietnam and USAF VHF radio systems).

Then think about it with more modern equipment: consider the recent experiences of the IDF with regard to supposedly technically unsophisticated Arab militias, which appeared to read the IDF’s “secure tactical radio” traffic with little difficulty.

So there are many, many non-obvious reasons not to use encryption, which are far more difficult problems than KeyMat handling, such as mindset, training and the dreaded “inventory” issues.

Team America December 24, 2009 10:08 AM

The military is a government-controlled monopoly whose main function is to hurt people and wreck stuff. Adjust your expectations accordingly.

Brian December 24, 2009 10:13 AM

@Cerebus: If what you’re saying is true, then this is an impossible problem to solve. Sending video to unknown and unauthorized recipients requires that insurgents also be able to receive it.

However, I think you misunderstand the satellite radio/TV scheme. Devices have individual keys, and those keys are registered with the broadcasting system (by device serial number, not the key itself). The system then encrypts its broadcasts in such a way that any device with an authorized key can decrypt it at any time. You don’t need to choose the recipient’s key, because all previously authorized recipient devices can receive the data all the time. Sure, at SOME point you need to know what devices should be allowed to receive data, but I can’t imagine a scenario where there would be devices in the field that we wouldn’t know about that we would still want to send video feeds to.

This solves most of the problems Bruce mentioned. Since device registration is by serial number, the key remains secret…and serial numbers can be sent over open channels. If the serial numbers/keys are baked into each device, all the management would be at a centralized point, with minimal effort required of soldiers in the field.

I started to agree with Bruce in his essay, this IS a hard problem, and I would agree that soldiers NOT seeing the video is probably worse than insurgents SEEING the video. But I think this is clearly a solvable problem, and while I think some critics of the military overstate how easy it would be to do, the military does deserve a little criticism for not figuring out how to solve it.
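The conditional-access scheme Brian describes can be sketched in a few lines of Python. This is a toy illustration, not any fielded system: the device serials are invented, and the XOR-with-a-hash “wrap” stands in for a real key-wrap algorithm such as AES Key Wrap.

```python
import hashlib
import os

def toy_wrap(device_key: bytes, content_key: bytes) -> bytes:
    # Stand-in for a real key-wrap algorithm (e.g. AES Key Wrap):
    # XOR the content key with a hash of the device key.
    pad = hashlib.sha256(device_key).digest()[:len(content_key)]
    return bytes(a ^ b for a, b in zip(content_key, pad))

toy_unwrap = toy_wrap  # XOR is its own inverse

# Central registry: device serial number -> device key, provisioned
# at the factory (serials here are invented for illustration).
registry = {
    "ROVER-001": os.urandom(16),
    "ROVER-002": os.urandom(16),
    "ROVER-003": os.urandom(16),
}

# The broadcaster picks one content key for the video stream and
# transmits it wrapped once per authorized device.
content_key = os.urandom(16)
key_block = {serial: toy_wrap(key, content_key)
             for serial, key in registry.items()}

# Revocation: a lost or stolen device is simply dropped from
# future key blocks; no field re-keying is needed.
revoked = "ROVER-002"
key_block = {s: w for s, w in key_block.items() if s != revoked}

# Any remaining device recovers the content key from its own entry.
recovered = toy_unwrap(registry["ROVER-001"], key_block["ROVER-001"])
assert recovered == content_key
```

The point of the structure is that only the key block changes over time; the devices themselves never need new key material delivered to the field.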

Chris Wysopal December 24, 2009 10:19 AM

There is also embedded mission data such as altitude, latitude, and longitude. This makes the video more valuable. See Wikileaks for the details.

Every soldier is issued 2 private keys today. The military already has PKI. It wouldn’t be hard to build a ROVER receiver that used this PKI infrastructure now. I can see why they didn’t do it in the ’90s.


Douglas December 24, 2009 10:20 AM

RE: PR issues. If this is a consideration, then using commercial (e.g., less secure) encryption puts you at risk of headlines reading, “Terrorists break encryption deployed by US military,” and you are left to put out the sound bite, “but that wasn’t military-grade encryption, honest, it was just the stuff used to encrypt your credit card number.”

Scott December 24, 2009 10:23 AM

I believe it was George Patton who said that using un-coded messages was actually preferred if there was no time for the enemy to do anything about the message’s contents. In other words, if you were sending a message that you were going to attack in 10 minutes and you knew it would take your enemy an hour to mobilize, don’t bother with the code book.

The same principle holds here – the UAV data is most valuable in real time, and only to those people who immediately know which camera was looking at which target and can react in a timely manner.

Cerebus December 24, 2009 10:59 AM

@ Brian–

If you think about the prevailing “coalition of the moment” operations you’ll realize that having unknown and unknowable fielded receivers is not only possible, it’s common practice today.

@ Chris–

You can’t use asymmetric key establishment with unknown passive receivers.

Cerebus December 24, 2009 11:13 AM

@ Brian–

BTW, conditional access systems (which is what you’re describing) still rely on knowing the individual subscriber and his key. These schemes boil down to key transport systems, where the content key is wrapped with the subscriber’s specific key.

In the end, you still need to know who’s receiving and you need to know what key he holds.

Now, conditional access systems can work with groups, but then they use complex key trees to derive group keys. But these schemes still rely on knowing all the groups (at some level–groups can be nested) ahead of time so the tree can be generated.

Neither of these is particularly applicable to the problem at hand.
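For readers unfamiliar with the “complex key trees” Cerebus mentions, here is a toy sketch of a logical key hierarchy (LKH), a standard construction for group keying. The seed and group size are invented for illustration: each member holds only the keys on its leaf-to-root path, and revoking one member forces re-keying O(log N) nodes rather than O(N).

```python
import hashlib

def node_key(seed: bytes, node_id: int) -> bytes:
    # Derive a deterministic key for each tree node (illustration only;
    # a real system would generate and distribute these securely).
    return hashlib.sha256(seed + node_id.to_bytes(4, "big")).digest()

def path_to_root(leaf_index: int, n_leaves: int):
    # Complete binary tree stored heap-style: root is node 1,
    # leaves occupy nodes n_leaves .. 2*n_leaves - 1.
    node = n_leaves + leaf_index
    path = []
    while node >= 1:
        path.append(node)
        node //= 2
    return path

N = 8                       # group members (must be a power of two here)
seed = b"demo-master-seed"  # placeholder value for illustration

# Member i holds the keys of every node on its leaf-to-root path.
member_keys = {i: [node_key(seed, n) for n in path_to_root(i, N)]
               for i in range(N)}

# Each member holds log2(N) + 1 keys, not one key per peer...
assert all(len(ks) == 4 for ks in member_keys.values())

# ...and revoking one member means replacing only the keys on its
# path above the leaf, i.e. O(log N) keys instead of O(N).
to_rekey = path_to_root(3, N)[1:]
assert len(to_rekey) == 3
```

The catch Cerebus identifies survives in the sketch: the whole tree has to be laid out, and every group known, before anyone takes the field.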

aikimark December 24, 2009 11:53 AM

Skygrabber is somewhat of a legacy of the Soviet Union — it is Russian software.

I don’t see why the video receivers couldn’t be upgraded in the field of operations. Distributing GUID-length keys through the existing secure channels shouldn’t be too difficult. Symmetric keys could be used, switching at midnight GMT.

EH December 24, 2009 12:04 PM

Am I dumb? Instead of this clearance-oriented infrastructure for key management, why not just build the key into the drone at the factory, where there is certainly an existing clearance regime, then have a small cleared implementation team that handles the key import to the C&C system? Deliver the drone, have it report its key to the system, then deploy. It’s the same as if a company employee submitted their laptop to IT in order to have a VPN shared secret installed.

I suppose there’s an issue with key stealing if bad guys got a hold of one of the drones, but if that’s the case then there are larger problems afoot and the drone is inoperable in that case anyway.

Chris M December 24, 2009 2:22 PM

Bruce, thank you for your very insightful analysis. While you haven’t changed my opinion of this matter, you have introduced some new insight (particularly WRT coalition allies)

I am afraid that many people are underestimating the damage that tireless analysis of ROVER streams can do. This information does not necessarily have a short lifespan. I hate to make the comparison to a video game, but this is very, very similar to the “Parasite” unit in StarCraft. Any one of the gazillions of players can tell you that being able to follow one of your enemy’s units around and see what they see is of immense value. ROVER units are attached to most US aircraft; this issue has little or nothing to do with the Predator or Reaper. Analysis of a decade of these feeds could:

-Provide detailed plans of our air bases. ROVERs continually transmit, so just watch the feed and wait for the aircraft to return to base. Look around at what is in the base, who’s there, where the ammunition is stored, etc.
-Provide massive improvements to camouflage. Being able to watch your activities through the eyes of your enemies will allow you to camouflage your movements, antennas, cave entrances, convoys, etc. much better. You can then tune your camouflage to the resolution of your enemy’s surveillance.
-Improve defenses. Determine the angle of attack of missiles, bombs, illumination, etc.
-Survey weaknesses in your enemy’s operational security. No one is perfect at OpSec, and this is an excellent opportunity to recognize significant weaknesses.
-Diminish the “fear weapon” effect of our surveillance. I’ve got to imagine that being attacked by an enemy with excellent surveillance capabilities is rather frightening. Being able to see what that enemy can see will relieve a significant portion of that fear.

There are more long-term uses, but I’m not going to continue listing them here. IMHO, this gaffe has certainly caused coalition casualties and has allowed the enemy to improve their ability to avoid detection and capture, prolonging the conflict. If I were being hunted by someone I would love to have this sort of video available to me.

Dave December 24, 2009 2:28 PM

“I started to agree with Bruce in his essay, this IS a hard problem”

It’s not a hard problem if you treat it as a security engineering problem rather than a crypto problem. The problem now is that you can take a random piece of COTS video gear and intercept the feed. To prevent this, make some change to the stream so that you can’t do it with COTS gear any more. If you really want to use crypto, throw in opportunistic DH, which requires no key management and gives you the same “security” as SSL (as used on the WWW) does. You don’t even need to do that though; just switching from an MPEG video stream to some obscure format that no one uses much (Theora, for example 🙂) would stop them.

(And as for the person who suggested PKI, the point is to make it harder for the bad guys to misuse your assets, not to DoS yourself. Whose side are you on anyway?).
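Dave’s opportunistic DH really does need no pre-shared key material: each side improvises a keypair and both derive the same secret. A toy sketch follows; the group parameters here are chosen for illustration only (a real system would use a standardized group or elliptic curves), and, like anonymous SSL, this defeats only passive eavesdropping.

```python
import secrets

# Toy parameters: P is a known prime (the Mersenne prime 2**127 - 1).
# A real deployment would use a standardized group, such as an
# RFC 3526 MODP group, or elliptic-curve Diffie-Hellman.
P = 2**127 - 1
G = 5

def dh_keypair():
    # Each side picks a random secret exponent and publishes G^x mod P.
    x = secrets.randbelow(P - 2) + 2
    return x, pow(G, x, P)

# Opportunistic, unauthenticated exchange: neither side needs any
# pre-shared key material, which is the whole appeal here.
drone_secret, drone_public = dh_keypair()
ground_secret, ground_public = dh_keypair()

# Both sides derive the same shared secret from the other's public value.
k_drone = pow(ground_public, drone_secret, P)
k_ground = pow(drone_public, ground_secret, P)
assert k_drone == k_ground
```

The trade-off is exactly the one Dave implies: this stops the Skygrabber-style passive listener, but an active man-in-the-middle can still sit between the two sides unnoticed.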

Chris December 24, 2009 2:39 PM

I am not sure everyone’s assumptions about the method of interception are correct. Skygrabber software depends on a DVB-S tuner, so obviously, they must be downlinking an unencrypted DVB-S signal. I seriously doubt it is coming directly from the Predator; DVB-S is not a very good point-to-point modulation scheme. Most likely the Predator is uplinking to a geostationary satellite, which then rebroadcasts the data on a particular transponder frequency. It probably has a small-bandwidth return channel to downlink control signals. This architecture is what TV networks and their affiliates use for uplinking live events via satellite, except the content would be encoded live video instead of IP-encoded video. Broadcasters also use DVB-S IP data broadcasting to receive commercial content.

Alternatively, the Predator could be using a more suitable modulation scheme (DVB-T) and broadcasting point-to-point to a local uplink facility. This is similar to what you would see watching a car chase live in another state. The local broadcaster’s helicopter microwave relays to a local receive site; that signal is then uplinked (DVB-S or DVB-S2) to a satellite, and then anyone with a commercial agreement and the correct receive equipment can downlink the video and rebroadcast it.

Sam December 24, 2009 3:16 PM

Here is an excellent point made by a commenter on the Wired page, that clears up some confusion:

Posted by: Armchair_Warlord | 12/17/09 | 7:40 pm
If I could make an observation, it seems like people have been talking about two separate problems here. The first of these is intercepting video signals off of the satellite downlink between the drone and, say, Creech AFB. This appears to be what sparked the original WSJ story and something the military has known about since a year ago – it also seems to have been fixed for some time (ref the Yahoo article) and only applicable to certain models of drones that both sent unencrypted video and used satellite links for communication – namely, early-model Predator drones that featured electronics designed in the 1990s. This was presumably never a problem with newer systems featuring encrypted video and could be fixed without a lot of hassle by simply updating systems at two points – the drone and the operating station. The WSJ seems to have heard about the security breach well after it was corrected.

Then there’s a separate issue with unencrypted video feeds being sent between aerial platforms and receivers on the ground without passing through an intervening satellite. Encrypting these systems would be more difficult due to the need to disseminate the upgraded receivers to all the ground forces that would need them – on the other hand, you need to be pointing an antenna exactly at the source to get a signal.

Think of it this way – Ravens are small and fly pretty low. If you’re seriously interested in not being seen, do you really want to be standing out in the open waving around an antenna trying to tap into a video feed that is going to consist of you looking at yourself and knowing that the whole US Army is going to be chasing you starting right now? It’s kind of a postmodern way to die, but it’s still a really good way to get yourself killed.

Karl Koscher December 24, 2009 3:44 PM

One of the articles you linked actually proposed what I think is a decent solution, although they call it a problem.

Buddenberg says that adding link-level encryption won’t solve the problem — you need end-to-end encryption. But, in this scenario, link-level encryption seems like the right approach. You can have only a handful of downlink stations that decrypt and relay the data to the DoD’s internal networks. Now the key management and security clearance issues become a lot easier to deal with.

Sure, people with access to the DoD networks could intercept the traffic. But it would substantially raise the bar above what it is right now (BROADCAST to the entire planet IN THE CLEAR).

The drones already have their C&C encrypted, so adding video encryption shouldn’t change the classification level of the drones, or the security clearance requirements to handle them.
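Karl's link-level scheme is easy to sketch. In the toy Python below (all names are invented, and an HMAC-SHA256 counter keystream stands in for a real cipher such as AES-GCM), the platform and a small set of downlink stations share one link key; a station decrypts before relaying plaintext onward to the internal network.

```python
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Illustrative counter-mode keystream built from HMAC-SHA256."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def link_encrypt(key: bytes, frame: bytes) -> bytes:
    """Encrypt one video frame for the air-to-ground link."""
    nonce = os.urandom(12)
    pad = keystream(key, nonce, len(frame))
    return nonce + bytes(a ^ b for a, b in zip(frame, pad))

def link_decrypt(key: bytes, blob: bytes) -> bytes:
    """Run at one of the few downlink stations before relaying onward."""
    nonce, body = blob[:12], blob[12:]
    pad = keystream(key, nonce, len(body))
    return bytes(a ^ b for a, b in zip(body, pad))

# One key shared between the platform and a handful of fixed stations:
link_key = os.urandom(32)
frame = b"one video frame"
assert link_decrypt(link_key, link_encrypt(link_key, frame)) == frame
```

Because only the stations hold the link key, clearance and key-distribution problems shrink to a handful of fixed sites, which is exactly the point of the comment above.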

supachupa December 24, 2009 3:49 PM

hmm. I smell bs on this.

First, I doubt that any military communications would be unencrypted.

Secondly, even if it were not encrypted, why would the "terrorists" announce to the world that they had the capability to see this video? Doing so would trigger an immediate design change and block their access.

BCS December 24, 2009 4:56 PM

Maybe what is needed is a cryptographic system for "time-sensitive data": material where the enemy getting access at some point in the future is of little value to them, or little threat to you, and where the only real threat is real-time snooping.
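One way to realize such a "time-sensitive" scheme is sketched below in Python, with invented names: each time window of video gets its own key, and the broadcaster simply publishes old keys once the window is stale, so only real-time snooping is blocked.

```python
import os
import time

class TimeSensitiveFeed:
    """Hypothetical sketch of the idea above: every time window gets its
    own key, and keys are published once the window is old enough that
    the footage no longer helps a real-time eavesdropper."""

    def __init__(self, disclosure_delay: float) -> None:
        self.delay = disclosure_delay
        self.windows: dict[int, tuple[bytes, float]] = {}

    def key_for_window(self, window_id: int) -> bytes:
        # The broadcaster encrypts each window's video under its own key.
        if window_id not in self.windows:
            self.windows[window_id] = (os.urandom(32), time.time())
        return self.windows[window_id][0]

    def published_keys(self) -> dict:
        # Only windows older than the disclosure delay are revealed.
        now = time.time()
        return {w: k for w, (k, t) in self.windows.items()
                if now - t >= self.delay}

feed = TimeSensitiveFeed(disclosure_delay=0.1)   # seconds, for the demo
feed.key_for_window(0)
assert feed.published_keys() == {}               # too fresh: still secret
time.sleep(0.2)
assert 0 in feed.published_keys()                # stale: its value has decayed
```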

Thomas December 24, 2009 6:23 PM

“Secondly, even if it was not encrypted, why would the “terrorists” announce to the world that they had the capability to see this video?”

The terrorists didn’t announce it.

It was discovered when a UAV operator zoomed in on a terrorist with a laptop only to see… a terrorist with a laptop!

Clive Robinson December 24, 2009 9:28 PM

@ Karl Koscher,

” …in this scenario, link-level encryption seems like the right approach. You can have only a handful of downlink stations that decrypt and relay the data to the DoD’s internal networks.”

Err, which link are you referring to?

Also, what do you mean by "link level" encryption?

The problem with the story is that it lacks the details needed to make a rational judgment.

Other articles indicate it may not be the UAV system at all but a "re-broadcast" via an IP data network on a commercial satellite system, as a more general part of the defence network.

For true link-level encryption you are talking "point to point", which at its simplest would imply retrofitting a satellite with the required system.

If it's a commercial satellite that has been "rented", then it may not be capable of carrying encrypted traffic, for various reasons.

Then there is the issue of the receivers; again, it may not be possible to modify these, as they may just be COTS equipment in a green equipment rack.

The lack of encryption is a well-known problem with COTS equipment. Think back to the SA code for GPS in a previous conflict in the Middle East: it was not possible to supply the number of military GPS units required, so COTS units had to be purchased at short notice and the SA code turned off for the duration.

Likewise a lot of military communications is moving over to GSM or CDMA mobile phones for similar reasons.

What has happened in the last thirty years is that commercial systems now outperform military systems in many, many ways. The policy up to the end of the Clinton era was to keep encryption technology out of civilian kit at all costs.

As the US Gov is discovering, you cannot have it both ways. If they want to use COTS to get reliable working equipment as and when they need it, then they have to do without high-level encryption.

If, however, they allow high-level encryption to be built in as standard in COTS, they lose the LEO advantage wanted by the likes of the FBI.

The problem they have at the moment is the worst of both worlds. It is possible to get GSM phones with very high-level encryption, but it is not "standard" because it is not in great commercial demand, so supplies are limited and expensive. The result is that the criminal gangs and others whose comms the LEOs want access to can buy secure kit, while the military has to make do with insecure kit.

The question boils down to "who are our future enemies?" and "who is going to fight them, and how?"

The answer is almost certainly going to be the LEOs, not the military.

However, an interesting series of coincidences has happened in the open-science world of PETs (privacy-enhancing technologies). A lot of clever minds have shown that message content is actually less important than traffic flow. Likewise, most "anonymous" systems are not truly anonymous, nor can they be if they are to have reasonable utility. Arguably the open-science community has caught the likes of the NSA/GCHQ unawares yet again.

Thus the question of treating encryption as a munition is now not a moot point but one where the dog has turned around and sunk its teeth into its owner.

The problem the Gov has now is how to get encryption of the right type back into COTS equipment. Arguably it is too late for GSM & CDMA, so what about up-and-coming IP systems?

It is one of the reasons I argue that NIST, amongst others, should move away from "solutions-based" standards to "framework-based" standards. AES etc. is nice, but we need a proper framework to put such primitives in, so that various "solutions" can be swapped in and out as needed. However, this brings up new problems to be faced (such as negotiation protocols and MITM attacks etc.) 😉

Just to make life really interesting, various standards bodies are waking up to the issue of Software Defined Radio (SDR). There has always been an implicit assumption of certifying communications hardware against a specification, with different pieces of equipment required for different tasks. SDR has been a bit of a wake-up call, because assumptions based on tangible goods do not hold in the intangible world of information, of which software is just one small part.

Anyway, I wish all who are still reading the best for their respective Solstice celebrations, and a peaceful time.

Hoopla Bub December 25, 2009 12:03 AM

The information gathered from a UAV is more valuable than the machine itself; hence the decision to build these things in the first place. Stating that C&C is more important than the video feed is ludicrous. Just because you can't imagine what an adversary does with the advantage that real-time information provides doesn't mean that advantage doesn't exist. I guess that's what all these years of having idiots grovel at your every word does to a man.

And no encryption is better than weak encryption? Really? Because Iraqi insurgents have cryptanalysis capabilities, right. Even if they did, delayed video feeds lose all their value. So yeah, wrong again.

As for encryption being too costly, well… yes and no. It'd be prohibitively expensive, but that's also why the rest of the planet doesn't start random wars — they tend to be expensive. If UAV technology is too costly to implement properly, it shouldn't be implemented at all. As for key management, it's the same problem as that of radio comms. The fact that every grunt on the ground has encrypted comms kind of puts a dent in your little theory, doesn't it, Bruce?

UAVs are fancy toys sold to idiots in Washington with a yacht to pay for. You speak as if the US government were manufacturing these. Get out of the office a bit, and you'll learn war is a racket. The UAVs being used now are a scam.

You’ve just lost all credibility to anyone who works in intelligence…

Tracy Reed December 25, 2009 2:01 AM

“But encrypting the data is the easiest part; key management is the hard part. Each UAV needs to share a key with the ground station. These keys have to be produced, guarded, transported, used and then destroyed.”

Really? Wouldn't AES with a simple passphrase communicated over the phone (or something similarly simple) likely be good enough? By the time the RPV has done its job, do we care if they can somehow decode the video?

It seems that we have gone from underestimating the insurgents to overestimating them, which leaves us paralyzed and unable to fix anything.

Dan December 25, 2009 3:05 AM

Key management is largely a solved problem in the military. Every day all around the world US military communicators properly key their crypto gear and have been doing so for decades.

This is a goofy failure, and terming people who see this risk as "naive" is similarly naive (not to mention the concept of "uncleared UAV operators").

This video has significant intelligence value and as such should be properly protected.

Jay Vaughan December 25, 2009 6:08 AM

Ask yourself this question: if YOU were the target of these horrendous killing machines and the cowards that operate them, what would YOU do with the feed to protect yourself and your family from murder? I can think of plenty of uses for this intelligence, personally and I sure hope I never get the chance to test the case.

Nick P December 25, 2009 10:10 PM

I agree with the critics that the video feed should definitely be protected, for both short-term and long-term reasons. I disagree that this would be too hard to pull off. My proposed solution involves centralized key management in a more commercial, rather than military, fashion. The military already has many authentication channels, including DMS, Firefly and its regular tactical comms. The UAV could simply broadcast the traffic encrypted with one key, which is granted to anyone who asks with strong or weak authentication. The UAV control station would make an accept-or-reject decision on who gets the key, which can be automated if the request comes via strong (and authorized) authentication. The key would be changed regularly between missions. Commercial-grade encryption modules could be used to cheaply retrofit video modules with high performance and decent security. This would be relatively painless to implement and would reduce risks until a better solution arrives. At the least, the video of classified missions wouldn't be broadcast as plaintext to our enemies for real-time, delayed or future analysis.
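A rough sketch of this proposal in Python, with invented names: the control station holds one broadcast key per mission, hands it out only on an accept decision, and rotates it between missions.

```python
import secrets
from typing import Optional

class ControlStation:
    """Hypothetical sketch of the scheme above: one broadcast key per
    mission, granted only to authenticated, authorized field units."""

    def __init__(self) -> None:
        self.authorized: set[str] = set()   # unit IDs cleared for the feed
        self.mission_key = secrets.token_bytes(32)

    def authorize(self, unit_id: str) -> None:
        self.authorized.add(unit_id)

    def request_key(self, unit_id: str) -> Optional[bytes]:
        # Accept-or-reject decision; automatable when the request arrives
        # over a strong channel (DMS, Firefly, tactical comms).
        return self.mission_key if unit_id in self.authorized else None

    def rotate(self) -> None:
        # Changing the key between missions limits the value of any key
        # that leaks with a lost or captured receiver.
        self.mission_key = secrets.token_bytes(32)

station = ControlStation()
station.authorize("ROVER-17")      # a hypothetical field terminal ID
k1 = station.request_key("ROVER-17")
assert k1 is not None
assert station.request_key("unknown-laptop") is None
station.rotate()
assert station.request_key("ROVER-17") != k1
```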

PackagedBlue December 25, 2009 10:49 PM

Predator, some background: see

A recent book mentions that Clarke was instrumental in getting the rollout going, while others were a bit lax in going after terrorist hits.

So, given the need to roll, unencrypted feeds might have been a good compromise.

Rushing new tech into a Predator is not a good idea. A Predator is not your home PC.

Corrupted and tampered-with chips have hit the military, and a lassoed Predator would not be cool. As to whether the feed comes from the UAV or a satellite: who cares? There are qualified people making the decisions on US weapon systems; at least they have earned a solid enough history to be trusted and given the benefit of the doubt, in spite of the recent issues being reported.

Roger December 25, 2009 10:51 PM

The emphasis on SkyGrabber is misleading. The official Pentagon statements on this issue did say that SkyGrabber was installed on the captured laptops, and the Press (no doubt seeing an opportunity to kick the military with the $26 figure) latched on to that. However, the same statement also said that the laptops were fitted with a suite of military grade electronic warfare hardware.

It should be pointed out that the “insurgents” thought to have been doing this are Shiite militias. Many of these militias receive extensive technical support, arms and equipment from the Iranian government, which has a significant history of SIGINT capability [1].

Reading between the lines of the press release, it seems that the intercepted feeds were unencrypted because they were centimetre-band line of sight microwave systems transmitting between airborne platforms, and it was thought impossible to intercept these highly directional signals. However, the “Shiite insurgents” (read, Iranian intelligence) have managed to use very sophisticated hardware to pick up enough of a signal to decode, presumably from either a sidelobe or backscatter.

The idea that the live feed directly from the UAV is unencrypted seems hard to believe, and several posters have denied it. Nevertheless, it appears to be true. The Pentagon's official statement on the matter explicitly admits it, while detailed specifications for ROVER III have been published and describe video formats in fair detail; there is no mention of any security features at all.

The idea that interception of UAV footage is not harmful, or that all footage is highly time-sensitive, is frankly (and I apologise if this seems rude, but it has to be said) very naïve. I am currently reading “Delusions of Intelligence” [2] which explores the reasons why Germany failed to realise the extent of its COMSEC failures during World War 2, and why the Allies suffered much less from COMSEC failures even though their low-level field ciphers were much weaker than the German equivalent. Some of the arguments now being made about this issue are strikingly close to arguments made by X-B-Dienst and OKW/Chi during WW2 !!

Quite simply, any such interception is strategic intel of considerable value because it reveals the exact capabilities and limitations of the fleet of platforms — not just Predator / Reaper or Raven, but any platform with the same or similar surveillance package. And it doesn’t just reveal it, it allows continuous on-going testing: a sort of adaptive chosen plaintext attack against reconnaissance and surveillance capabilities. From the point of view of Iranian intelligence, this breach has enabled them to conduct multiple no-cost live tests of counter-surveillance systems against their enemy’s premier surveillance platform!!

There are numerous other troubling aspects to an enemy ability to intercept these signals, but that one is the real kicker. Hundreds of millions in R&D has been partially neutralised due to this carelessness.

The idea that there is a technical roadblock to fixing the issue is impossible to believe. In 1994 when they were developing this thing, there already existed commercial-off-the-shelf ASIC cipher chips that were fast enough to encrypt video in real time. Maybe not with a classified cipher (or maybe so, I don’t know), but heck, three key triple DES is a very long march beyond plaintext. True, key management is a complicated problem but it is one that military signals agencies have been solving with a fair degree of success for decades. Including with multiple levels of clearance, in multi-national coalitions, in hostile EW environments, and in scenarios where loss of service is (arguably) worse than COMSEC failure. It’s a difficult problem, but it’s a solved difficult problem.

But now that it has finally dawned on someone that this is a really, really serious problem, they are really hauling arse to fix it fast. I mean that seriously; the claim that it will take until 2014 to fix the issue seems to refer to dotting all the Is and crossing the Ts on every old pile of spare parts. They actually expect to have fixed many frontline systems before the end of THIS MONTH, some two orders of magnitude faster. This makes perfect sense since both the drones and the receivers are software controlled devices with conventional operating systems; throwing a well-tested encryption module into the chain is just a software patch. In fact it could be done in much less than a month but I expect they’ll want to do a bit of field testing.

  1. In the 1950s, the Shah’s Secret Service thwarted a Soviet-backed coup attempt by cryptanalysing — without any foreign assistance — the communications of the communist cells. It would be careless to assume that the Islamic revolutionaries completely discarded this capability. More recently, during the 2006 Lebanon War, Hezbollah, with the assistance of Iranian military (at least two of whose bodies were captured) not only defeated Israeli jamming but successfully jammed Israeli radar and penetrated at least some Israeli communications networks. See, for example:
  2. R.A. Ratcliff, Cambridge University Press, 2006. ISBN 0-521-85522-5

Clive Robinson December 26, 2009 7:00 AM

@ Roger,

The issue with "SkyGrabber" is twofold, and there is no information to say which it is.

The problem is that SkyGrabber is mainly used for pulling down data channels such as Sky's open broadband-Internet downlink. This is a commercial service and as such has no built-in encryption, on either data or routing information.

The advantage of a service like Sky's is that the downstream equipment is very, very low cost and highly available compared to military-grade equipment. Also, although there are stories of Sky boxes drawing 100W of power, the relevant receiver parts actually use quite little power, and there are "camping" versions available that will run quite happily off a small motorbike battery and a fold-up solar cell.

That is, for around 500 USD you can have a complete setup to receive live news feeds and all IP traffic broadcast from a very high-bandwidth commercial satellite. It will fit in a box little bigger than a pizza box, and thus more than happily in a "student" backpack.

Are the insurgents using such a system for general background, or for more specifically directed intel?

That is, are they just pulling down the likes of CNN (in which case they don't need SkyGrabber), or are they trawling through IP traffic with it?

If the latter, is it just general traffic from civilians, or is it military traffic?

If military, what sort of traffic: just soldiers' emails home etc., or actual operational traffic?

It is known that the US and coalition forces have a real comms problem, and that the military systems cannot cope.

An open commercial system with high capacity would be of real benefit to the coalition, at very low cost and virtually straight off the shelf.

As I noted earlier, it may not be possible to put "link encryption" on such systems, only end-to-end data encryption. Thus, at a minimum, the system would be open to traffic analysis.

However, what if it is being used for sending unencrypted video feeds to coalition troops?

The question then arises as to what else is being sent on such links…

Clive Robinson December 26, 2009 7:29 AM

@ Roger,

I forgot to mention the second issue with SkyGrabber, which is covert communications.

The data downlinks are sent as a general broadcast, and SkyGrabber is reputed to be used for getting "free porn" by hunting out other people's downloads that might be of interest.

It could be that they are using it to "get orders" via "stego pictures" etc. downloaded by others in the satellite footprint, either deliberately or by coincidence.

So irrespective of the UAV issues, there is a very real question as to what the commercial satellite coverage in that part of the world is being used for by the insurgents.

Jonathan Wilson December 26, 2009 9:13 AM

The answer is to have a single key for the stream, and to issue receivers with the key embedded to anyone with a need to see the feed. Use a well-tested, off-the-shelf, unclassified algorithm like AES (it's not like al Qaeda has the computing grunt to crack an AES key).

The devices would hold the key inside a microcontroller that can't be read without decapping it (again, the bad guys don't exactly have clean rooms hidden away in mountain caves to decap chips and read out the secret key).

The devices would be reprogrammable, in the event of a lost device, by inserting some kind of simple flash-memory device. Make that device easy to destroy once it's been used (e.g. an erase feature on the receiver), and add some simple security in case it is lost before it's used (e.g. a PIN pad on top: enter the PIN to allow the storage device to be used). This would only be needed in the case that the device is somehow stolen by the bad guys.

The signal would then be transmitted under both the old and new keys until everyone has upgraded (just as satellite-TV companies do when they need to change encryption to defeat hackers), and then the old key is dropped.

Should there be groups to whom new keys cannot be delivered (either physically or over some form of data link), only the particular surveillance systems those people need to access would keep sending with the old key.

All this of course assumes that the people who need to see the feed are able to get the devices somehow.
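The dual-key rollover described above can be illustrated with a few lines of Python (the repeating-key XOR is a deliberately toy stand-in for AES, and the key IDs are invented). During the transition each frame is broadcast under every active key, so receivers that have not yet been rekeyed keep working.

```python
import secrets

def encrypt(key: bytes, frame: bytes) -> bytes:
    # Repeating-key XOR: a toy stand-in for a real cipher such as AES.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(frame))

decrypt = encrypt  # XOR is its own inverse

old_key, new_key = secrets.token_bytes(32), secrets.token_bytes(32)

# During the rollover the platform broadcasts each frame under every
# active key, tagged with a key ID, exactly as satellite-TV systems do.
frame = b"frame-0042"
broadcast = {"k-old": encrypt(old_key, frame),
             "k-new": encrypt(new_key, frame)}

# A not-yet-upgraded receiver uses the old key; an upgraded one the new.
assert decrypt(old_key, broadcast["k-old"]) == frame
assert decrypt(new_key, broadcast["k-new"]) == frame
# Once every receiver holds the new key, the "k-old" copy is dropped.
```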

Jonathan Wilson December 26, 2009 9:42 AM

For the key management, an even better idea popped into my head: new keys are sent over the downlink, encrypted using some kind of password or pass phrase.

All of this would be done automatically, as a broadcast, with each receiver simply seeing a "new key material" blob mixed in with the data stream (sent many, many times for redundancy, to ensure every device gets the key).

Then the pass phrase is given to all legitimate users of the system (being a pass phrase, it can even be transmitted to special forces, spooks and others who can't use normal comms channels, via the same method as any other orders or instructions to those forces). The pass phrase can be picked so as to be meaningless to anyone who doesn't know what to look for.

Plus, with new key material only being required if and when a device is lost or stolen, it's not as if there are going to be daily transmissions of new pass phrases to worry about.
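A minimal Python sketch of this rekeying idea, with invented names: PBKDF2 turns the memorable pass phrase into a key-encryption key, and XOR with a derived pad stands in for a real key-wrap such as AES-KW.

```python
import hashlib
import os

def kek_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    # PBKDF2 stretches the pass phrase into a 32-byte key-encryption key.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def make_rekey_blob(passphrase: str, new_session_key: bytes) -> bytes:
    salt = os.urandom(16)
    pad = kek_from_passphrase(passphrase, salt)
    # XOR with the derived pad is a toy stand-in for AES-KW key wrapping.
    return salt + bytes(a ^ b for a, b in zip(new_session_key, pad))

def recover_session_key(passphrase: str, blob: bytes) -> bytes:
    salt, wrapped = blob[:16], blob[16:]
    pad = kek_from_passphrase(passphrase, salt)
    return bytes(a ^ b for a, b in zip(wrapped, pad))

# The blob is mixed into the broadcast stream many times for redundancy;
# only receivers whose operators know the pass phrase can unwrap it.
new_key = os.urandom(32)
blob = make_rekey_blob("correct horse battery staple", new_key)
assert recover_session_key("correct horse battery staple", blob) == new_key
assert recover_session_key("wrong guess", blob) != new_key
```

The blob itself reveals nothing without the pass phrase, so repeating it in the clear stream costs nothing.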

Marian Kechlibar December 26, 2009 9:44 AM

ZRTP + SRTP seems to me an (almost) ideal choice of encryption protocols. After all, the traffic is analogous to a VoIP call with audio and video streams. Are you familiar with the ZRTP key-exchange protocol, Bruce?

The main problem with ZRTP is man-in-the-middle. Hmm. The drone could record the SASes for every transmission, and they could be compared after the drone lands back at base. An active MITM would then be detected only post factum, but still detected.

And since ZRTP is opportunistic encryption, the drone could still be borrowed by allies who lack the necessary software and hardware.

Mike December 26, 2009 11:01 AM

The solution to this "problem" is already in effect all across the military. If they need encryption but don't want to invoke all of the NSA crypto regulation, they just DON'T CALL IT ENCRYPTION. There are all sorts of euphemisms like "wrapping" and "scrambling" and "authentication solution". As long as the crypto isn't called that, the keys aren't called keys, and what you're transmitting isn't sufficiently classified, you can use whatever solution you want.

Other issues include the cost of "encryption": just a few years ago I saw that a ruggedized, military-grade HD video-stream encryptor cost $50,000 and wasn't exactly featherweight. If field troops need a separate box to decrypt the stream, it's going to demand more power (which means they have to carry more batteries), and the equipment has to be both soldier-proof and Afghanistan-proof, which is not easy to accomplish.

Another problem with a low-grade COTS solution is that it will probably also get hacked, and the military will end up looking even MORE stupid. What you've said they need is something like a satellite-TV DRM product, but we all know what happens to DRM 😉

Ted December 26, 2009 1:03 PM

The analysis seems to have focused mostly on the short-term value of capturing tactical data in real time. Over time, doesn’t the eavesdropper also gain information on common mission patterns?

For example, a thoughtful analyst could learn to distinguish pre-attack behavior from surveillance, learn the common flight times and routes, and notice areas of repeated interest.

David MacQuigg December 26, 2009 1:53 PM

The problem with encrypting the video: “A soldier in the field doesn’t have access to the real-time video because of a key management failure.”

The solution: Use a public-key system. The problems with public-key systems (untrusted chains of Certificate Authorities) result from the need to distribute public keys to anyone in the world. This is not a problem in a military organization.

There are many things that can fail in the heat of battle, and no system is immune from failure. It should be easy to design a public-key system for a military command that has less chance of failure than most other technologies in the battle.

Then there should be a backup, or maybe two. The soldier should be able to call for an alternate key. His CO, upon recognizing his voice, could either send the key or turn off encryption for that one session. This will do the enemy no good, because an occasional absence of encryption will not be enough for them to justify keeping monitoring systems running all the time.
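The core of such a public-key arrangement can be shown with a toy Diffie–Hellman exchange in Python (parameters chosen purely for demonstration; a deployed system would use vetted RSA or elliptic-curve implementations, with the command's own key hierarchy rather than third-party CAs).

```python
import hashlib
import secrets

# Demonstration-only group: the Mersenne prime 2**521 - 1 as modulus.
P = 2 ** 521 - 1
G = 3

# HQ and the soldier's terminal each generate a keypair; within a single
# organization the public halves can be pre-loaded, so no CA chain is needed.
hq_secret = secrets.randbelow(P - 2) + 1
hq_public = pow(G, hq_secret, P)

terminal_secret = secrets.randbelow(P - 2) + 1
terminal_public = pow(G, terminal_secret, P)

# Both sides derive the same session key for the video feed, without the
# key itself ever crossing the air.
k_hq = hashlib.sha256(str(pow(terminal_public, hq_secret, P)).encode()).digest()
k_term = hashlib.sha256(str(pow(hq_public, terminal_secret, P)).encode()).digest()
assert k_hq == k_term
```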

Iain Anderson December 26, 2009 4:47 PM

It’s likely they got the new “Reaper” name from a video game. The flying alien creatures in the Gears of War 1&2 are called “reapers” and the lead character in Unreal Tournament 3 is “Reaper”.

Kirt December 26, 2009 4:58 PM

Nice write-up…. Yes, my principle is to go against the "if you're going to do it, do it right" principle and develop a pseudo-random, lightweight scheme that is functional for the purpose. The same applies to developing security for a business: start with the objectives.
UAV operators are cleared, I thought.

Clive Robinson December 26, 2009 7:02 PM

Most people talking about "key management" have missed the point.

The military have had a simple solution to emergency key issue with maximum security for many, many years: it's called the one-time pad (OTP).

If you lose your keyfill for some reason, the solution is to encode an agreed key-request phrase against the first part of an OTP page, putting the page identifier first. At HQ, or wherever, they pull the OTP with that serial number and check who it was issued to and what their role is. They then look up the agreed phrase and check whether it is correct, a duress code, etc.

If all is OK, the correct key is then encoded against the remaining part of the OTP page and sent back, along with another agreed phrase to indicate the key status etc.

After decoding, the OTP page is burnt, or eaten, or dissolved in water, etc. (rice paper and easily soluble vegetable dye is fun).

For modern equipment, each piece of kit can have its own OTP stored inside it in case the session key is lost. The operator types the pass phrase from memory into the unit; it displays a message, which the operator sends in whatever way is appropriate. The reply the operator gets back, either directly or through the downlink data, goes into the unit, and the check phrase is displayed.

The used portion of the OTP is destroyed. If the unit gets captured, then provided the operator presses the "fill clear" button or pulls the "ignition key" to delete the current session key, the unit is effectively useless, as the required pass phrases are in the operator's head. If tortured, the operator can enter the duress code instead, and the appropriate action is taken by HQ.
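A toy rendering of the OTP key-request step in Python (the page ID, phrases and lengths are invented; a real pad would of course never be generated and stored on the same machine):

```python
import os

def otp_encode(pad_page: bytes, message: bytes) -> bytes:
    # XOR against pad bytes; the pad must never be reused or overrun.
    assert len(message) <= len(pad_page)
    return bytes(m ^ p for m, p in zip(message, pad_page))

otp_decode = otp_encode  # XOR with the same pad bytes recovers the text

pad_page = os.urandom(64)           # one page of the unit's embedded pad
page_id = b"PAGE-0117"              # sent in clear so HQ pulls the right pad
request = b"REQ NEW SESSION KEY"    # the agreed key-request phrase

# The unit encodes the phrase against the first part of the page and
# sends the page identifier first, as described above.
wire = page_id + b"|" + otp_encode(pad_page[:32], request)

# HQ looks up the pad by serial number, checks the phrase (or a duress
# code), and replies against the remaining part of the same page.
ident, body = wire.split(b"|", 1)
assert ident == page_id
assert otp_decode(pad_page[:32], body) == request

pad_page = pad_page[32:]            # the used portion is then destroyed
```

Because each pad byte is used exactly once and then destroyed, interception of the wire message reveals nothing beyond the page identifier.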

A similar system can be done with two unrelated PK certs, one for encoding and one for decoding. However, if the equipment is lost, the use of certificates will allow previous messages to be decoded. There are ways to deal with this, but it can get messy (the simplest is for the system to send a new decode cert each time a key update is sent to the unit, overwriting the previous cert).

Technology-wise, the message to HQ can go via line-of-sight infrared laser etc. to the UAV, to another airborne or orbital platform, or via other covert technology. Such laser systems are already in use for "smart weapon" deployment, where a covert operator waits for a drop signal and then "paints the target". Thus the UAV only has to overfly a known point at a known time, which again can be carried in the operative's head.

The danger the NSA has faced in the past is loss of crypto equipment. However, as Bruce has pointed out, in reality it's session keys, key fill or ignition keys that are the main concern.

The NSA already has an "inline HD encryptor" based on AES that would do the job of encoding the video without any real issue (a very minor design change: add a video-to-SATA encoder at the UAV end and a SATA-to-video decoder at the other).

However, this leaves other issues such as EmSec and side channels, which are the real Achilles' heels of such systems these days.

Nick P December 26, 2009 9:40 PM

@ Clive

You're taking this too far and making it too hard. Laser-sighted transmission, side channels, TEMPEST protections, etc.: that's all unnecessary, at least for an interim solution. Maybe all of that would be necessary to truly protect the information against sophisticated attackers. However, if we want a good partial solution now, the interim solution I posted a few posts ago solves this problem, uses cheap (but reliable) COTS hardware, and only requires a simple software update for anyone viewing the feed. The solution should work fine while providing these assurances: the whole world can't see the video; the insurgents in the Middle East probably won't see the video; and the principle of least authority is implicit, due to frequent key changes. The total cost to implement my interim solution should be a few million, which ain't bad.

As for the final solution, like the one they are to deliver in 2014, one would have to consider many more threats and many countermeasures, like those you mentioned. The encryption will likely be Type 1 or FIPS certified. They will probably use SCIP or HAIPE for crypto layer, with Firefly and regular key fill devices for the rest. Upgrade field receivers. Blah blah blah. Complicated. Costly. I don’t dare to even try to offer a solution. However, if one wants military grade crypto that’s easy to integrate, then I know a good product: General Dynamics AIM chip. It’s an awesome programmable cryptochip with many nice features. If one integrates properly, then their software inherits the strength of that processor and its security properties. That’s awesome. COTS chips of that quality could make the Type 1 or Type 2 version of the video encryption so much easier to build and integrate. Yo Clive, look up the AIM and tell me what you think of it. I love the thing. It doesn’t provide instant system or crypto security, but it’s a nice building block.

Dave December 26, 2009 10:27 PM


"Key management is largely a solved problem in the military. Every day all around the world US military communicators properly key their crypto gear and have been doing so for decades."

You've never had to watch how military crypto works under field conditions, have you? Either that, or you mistyped "erratically" for "properly".

PackagedBlue December 26, 2009 11:24 PM

One might read about the Predator in recent books: Tenet's At the Center of the Storm, Kessler's The CIA at War, and Woodward's Bush at War. These books add some important context not found on the Wikipedia page on the Predator, or maybe in these comments.

While Wikipedia lists a unit cost of $4.5 million, one of the books cites $1 million each, with Congress forcing reorders in very small production runs, pushing costs up. Still, these are dirt cheap, put together with "spit and glue."

The Predator was a novel tool, especially once Hellfires were added and it went after terrorists.

Considering what it means to fly an armed UAV in other countries, I can understand initially having unencrypted feeds, much like the ICBM issues over unencrypted data feeds.

Other issues, but really, who cares?

Mike December 27, 2009 1:29 AM

“The reason the U.S. military didn’t encrypt video streams from drone aircraft flying over war zones is that soldiers without security clearances needed access to the video, and if it were encrypted, anyone using it would require security clearance, a military security expert says.”

How unbelievably silly. By this logic you would need a clearance to watch HBO or DirecTV.

Personnel security theater is like regular security theater, except instead of having to take your shoes off you lose your job.

Clive Robinson December 27, 2009 5:16 AM

@ Nick P,

“That’s all unnecessary, at least for an interim solution. Maybe all of that kind of stuff would be necessary to truly protect the information against sophisticated attackers and the like.”

Sorry, I did not make it clear: the problem is the self-imposed (and historically thoroughly justified) rules the NSA et al. operate under.

As you are probably aware, nothing lasts longer than an "interim solution", simply because it's pragmatic and cheap.

All its faults and failings get built in from that point onwards, as new systems are designed to be compatible, and it can take several human generations to get rid of pragmatic problems 8(

The intel community is full of real-life stories where security is lost to a series of events that all start from a very temporary or pragmatic solution that has developed an existence of its own.

The rules are generally brutally simple, to prevent under- or over-estimation of risk in design and use.

Preditor unfortunatly like humans breaks the mold of the NSA rules, that kind of assume each piece of equipment has a clearly designed use and role.

It was fairly deliberatly considered to be a field weapon, and as such does not have a requirment to be “secure”. This was to alow it to get off of the ground (quite literaly) in the first place.

The NSA/GCHQ/DWS et al have a history of very conserative design they know that all their longterm problems arise from incorect threat evaluation and equipment longevity thus they try to protect against “unknown unknowns” in a very conservative maximal way… (metal “safe” technology with inbuilt destruct capability and carefully “physicaly” controled interfaces enforcing segregation policy).

The old conservative rules have an inbuilt chicken and egg problem, maximal security means maximum cost which means maximal life expectancy on equipment which means maximal reliability etc etc. Which is why the cost spirals upwards. Look at the price of the POTUS CrackBerry to see what happens with COST equipment and the issues of ComSec (EmSec on it is still an open question).

The problem with that is the NSA has not yet come up with a general purpose high security COST system which would solve the problem (and would not be to disimilar to what you describe).

Although to give them their due the NSA et al have actualy started in on the issue (the In Line Disk Encryptor you “like” being a case in point).

The reason for the slow progress is complex and actualy not due to burecratic inertia, vested interests or turf wars. It is more due to the speed of change of technology and lack of appropriate resources (there are some problems only time will solve).

Bruce’s Xmas Eve wish of,

“This is why I’d like to see a general lightweight encryption system for this entire class of information.”

Sounds good, but it has a significant issue.

Many people think the NATO tactical radio system design is overly complex and overly secure. And in many ways they are correct, but for one issue: “humans”.

Think of a general or diplomat on a general-service transport aircraft who has a sudden need to communicate with those of similar rank. What they need to say may involve very high-level strategic or intel information, and it may be very, very urgent…

It is one of the obvious (with hindsight) reasons why “just secure enough” is not good enough, now or in the future, for the task the NSA has been landed with: properly protecting the nation’s secrets…

With regard to the covert comms via IR laser: the point is that the equipment exists and it works, but for a different task.

The addition of a simple “through link” turns it from a limited covert weapons-targeting system for “smart weapons” into a general-purpose covert device with many, many uses…

And I mentioned it to show that a new system does not of necessity mean a new box/battery/etc. that must be carried in addition to existing systems (an objection another poster raised).

As always, the issue comes down to perspective, and the NSA et al’s remit unfortunately has to take the long-term view.

Josh DeFrain December 27, 2009 9:47 AM

The military already has very robust and efficient systems in place for key management. Quite often they have to change the encryption keys for their own radio transmissions.

This requires the new key to be distributed to all ground personnel who need to use any radio communication.

This is just one example, but I don’t think that key management and distribution is the major hurdle to overcome, given the management systems already in place.

Blackacre December 27, 2009 1:16 PM

I do not agree with the idea that you can just choose not to do something because the key management requirements are too difficult. There are always compromises, and going clear doesn’t seem like a good one from my Monday-morning-quarterback point of view. Transmitting in the clear turns our equipment into an intelligence collection platform for the enemy. Shared-key encryption would have been better than nothing (putting the video through an SSH tunnel would have been good enough, imo), and I think any trooper in the field could deal with something as simple as a pre-configured PuTTY session and still change keys at a reasonable interval. With a little sophistication the enemy might be able to detect the presence of the drone and triangulate it anyway… That’s where we could really turn a weakness into a weapon.
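The shared-key idea above can be sketched with nothing beyond the Python standard library. This is a toy illustration, not a vetted design: the epoch-based derivation stands in for “change key at a reasonable interval”, and the HMAC keystream stands in for a real cipher such as AES-GCM.

```python
import hashlib
import hmac

def epoch_key(master_secret: bytes, epoch: int) -> bytes:
    """Derive a per-interval key from the pre-shared master secret.

    Rotating `epoch` (e.g. hours since mission start) changes the key at a
    reasonable interval without redistributing any secret.
    """
    return hmac.new(master_secret, b"epoch:%d" % epoch, hashlib.sha256).digest()

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with an HMAC-SHA256 counter keystream.

    Illustrative only -- a real system would use a vetted AEAD cipher.
    """
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Ground terminal and UAV both hold the same pre-shared master secret.
master = hashlib.sha256(b"pre-shared mission secret").digest()

frame = b"video frame 0042"
nonce = b"frame-0042"                       # must be unique per frame
k = epoch_key(master, epoch=7)              # both sides compute the hour-7 key
ciphertext = keystream_xor(k, nonce, frame)
plaintext = keystream_xor(k, nonce, ciphertext)  # XOR is its own inverse
assert plaintext == frame
```

The master secret only has to be distributed once; rotating the epoch limits how long a captured terminal remains useful, which is the capture concern raised further down the thread.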

As for the rapidly changing environment scenario: troops spend a lot of time decompressing and training for war, usually outside the theater of operations. I’ve never bought into the notion that they are too hurried to master their environment. It’s part of the reason we have the most powerful force on the planet.

…airforce guy joking around now…
Even Army guys are smart enough to be trained. It’s been proven…lol
…airforce guy joking around now…

These systems don’t just show up in the middle of a firefight… real people don’t outrun fire like our esteemed all-American actors do in the movies.

/agree with the statement, “The real failure here is the failure of the Cold War security model to deal with today’s threats.” It could be argued that the agency was built just for the Cold War purpose. It may not be a popular statement, but if you’ve read The Puzzle Palace then you’d consider just building a mandate to address the threat and wrapping it around a new secret agency… Hopefully this has already been done. Since it hasn’t been reported by our media yet, maybe it’s doing its job well and helping to put rounds down range and boots onto the doorsteps of the bad guys.

And yes, second grade boys actually do name this stuff…thankfully. Otherwise they would be named things like Happy Pony and they would be dedicated to providing Internet access so Al Qaeda cells could watch YouTube, update their facebook status, and learn how to become homosexuals.

Grande Mocha December 27, 2009 9:45 PM

@Cerebus: I believe that you are correct and that many of the other posters are missing a very important point: special operations may frequently be run radio-silent to protect them from direction finding. (Low-probability-of-intercept radios can still be intercepted if you’re close enough to the transmitter.)

Therefore, many of these proposed public-key based key management schemes are unusable because the special ops teams can’t transmit to perform the key exchange. They are “receive only”. Sure, you could, in theory, hand out keys before the mission, but then if something goes wrong there is no way for the troops on the ground to re-key. Further complicating the picture is interoperability between the allies. In this situation, it seems to me that the correct approach is not to encrypt.

@Bruce: When you said, “This sort of solution would require the NSA to develop a whole new level of lightweight commercial-grade security systems for military applications…” did you mean to imply that Suite B is inadequate? It certainly seems to me that the Suite B algorithms are strong enough to use, and there are already COTS products supporting them.

b December 28, 2009 1:44 AM

There is something extremely valuable in the video feed: routes. Especially routes home, but also search patterns, and not just where it searches but how it searches.

The point isn’t the individual feed, but an enemy watching every feed. They can gather enormously important data over time. Where are the drones flying out of? What allied installations do they see? Where are they searching? (Possibly even giving warning well ahead of time.) This is also most likely not just raw video, but metadata on the screen as well, especially coordinates.

It’s just like the NSA collecting routing data: they make a heck of a lot out of who’s talking to whom. The enemy forces in northern Pakistan could get a tremendous amount of intelligence out of these feeds that would help them avoid being found and targeted in the future.

If they didn’t bother to encrypt this at all, can we really be certain they encrypted the command and control properly? Ever since I heard about the first time they loaded a Hellfire missile onto one of these, I’ve been imagining someone commandeering one and sending it our way.

Nick P December 28, 2009 2:25 AM

@ Grande Mocha

I disagree with the idea that the spec-ops scenario prevents encryption. It certainly doesn’t prevent the use of far more complicated Type 1 solutions. The rekeying could be done between missions; that would expand the potential viewing time for enemies, but would prevent situations like the one you mentioned.

@ Clive

Good point about how an interim solution could screw everything up for compatibility’s sake. I overlooked that in my design, mainly because it’s a human rather than technical issue. Such issues must be considered, though. I figure if we put an expiration date on the interim solution and forced vendors to switch to the future high-assurance version, we would be fine. The DOD has done this in many other areas, improving security overall while using quick fixes until the good stuff came out. The use of solutions like Trusted Solaris, and later General Dynamics’ (admittedly nice) TVE multi-level workstation based on NSA’s HAP project, is a good example. These are actually interim solutions used until we get fully high-assurance MILS workstations. We already have separation kernels and some middleware, but a lot is lacking, and these give us something to use until then. They might stick around due to inertia or compatibility, but they will probably be forced out, just as TVE forced out (or substituted for) many Trusted Solaris systems, and just as DOD’s PKI and CAC forced out password-based authentication on many networked systems. I think the DOD could use the same approach to get an interim solution in 3-6 months and then force deployment of the real (read: truly secure) solution a few years later. What do you think of this approach?

Brad December 28, 2009 8:34 AM

To encrypt or to send in the clear? Easy: if our guys need the information sooner than the enemy could act on an intercept of it, send it in the clear. Encrypting information in such a circumstance imposes a cost on ourselves without any benefit at all. By the same token, the information in this situation is of no use to the enemy.

Swedge December 28, 2009 12:22 PM

Add to the list of key management/distribution issues the uncertainty of the battlefield. The military must assume that a receiving unit on the battlefield will be captured intact; a battlefield operator may not have time to destroy the encryption keys on the unit before he or she is captured or killed.

Receiving units could also be stolen or sold.

In the case of capture, theft, or outright sale, you might not discover that a unit is in unauthorized hands for quite a long time.

Nick P December 28, 2009 12:32 PM

@ Swedge

It’s not really the problem it seems to be. My interim solution features regular key changes, so intercepted gear would soon be out of the loop. Real military crypto gear uses a key management infrastructure with the ability to revoke keys: if a soldier were captured or killed, all sensitive keys would be revoked. So, either way, the soldier still gets the information he needs and the enemy gets little to nothing.

Clive Robinson December 28, 2009 3:00 PM

@ Grande Mocha,

“Therefore, many of these proposed public-key based key management schemes are unusable because the special ops teams can’t transmit to perform the key exchange. They are “receive only”. ”

Sorry, you have a false assumption there.

Keys can be securely updated without the need for the “special ops” end to transmit.

As you note,

“Sure, you could, in theory hand out keys before the mission, but then if something goes wrong then there is no way for the troops on the ground to re-key.”

Again, an incorrect assumption.

Here is a simple walk-through of a system you might want to implement for a commercial entity, such as an oil exploration company, doing something similar with what are effectively untrusted field operatives.

If a unit of equipment holds a number of PK private (decrypt) certs, some or all of which are quickly erasable, then session keys can be transmitted with the video stream, encrypted against the corresponding PK public (encrypt) keys held by the video-sending entity.

If the PK private keys are held in a hierarchy, with the keys encrypted against “pass phrases”, then they are effectively useless to anyone who does not know the pass phrase for the required key. (The pass-phrase holder does not need to be trusted beyond not handing it over to the opposition, and it might require three or four pass phrases to be entered if a higher level of trust is required.)
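A minimal sketch of the pass-phrase idea, using only Python’s standard library: the stored key is wrapped with a key derived from every holder’s pass phrase via PBKDF2. The XOR wrapping and the parameter choices are illustrative assumptions, not a production design.

```python
import hashlib
import os

def wrap_key(stored_key: bytes, passphrases: list) -> tuple:
    """Encrypt a stored (e.g. PK private) key against one or more pass phrases.

    Every pass phrase feeds the PBKDF2 input, so all holders must contribute
    theirs before the key becomes usable. XOR wrapping is illustrative only.
    """
    salt = os.urandom(16)
    material = "|".join(sorted(passphrases)).encode()
    kek = hashlib.pbkdf2_hmac("sha256", material, salt, 200_000,
                              dklen=len(stored_key))
    wrapped = bytes(a ^ b for a, b in zip(stored_key, kek))
    return salt, wrapped

def unwrap_key(salt: bytes, wrapped: bytes, passphrases: list) -> bytes:
    """Reverse of wrap_key; wrong pass phrases yield useless random-looking bytes."""
    material = "|".join(sorted(passphrases)).encode()
    kek = hashlib.pbkdf2_hmac("sha256", material, salt, 200_000,
                              dklen=len(wrapped))
    return bytes(a ^ b for a, b in zip(wrapped, kek))

private_key = os.urandom(32)                  # stand-in for a PK private cert
salt, blob = wrap_key(private_key, ["alpha phrase", "bravo phrase"])

assert unwrap_key(salt, blob, ["bravo phrase", "alpha phrase"]) == private_key
assert unwrap_key(salt, blob, ["alpha phrase", "wrong"]) != private_key
```

Without the right pass phrases the captured blob decrypts to noise, which is the “effectively useless to anyone who does not know the pass phrase” property described above.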

Likewise, you could hold a rather large amount of OTP in a modern tamper-proof chip; this could be used to occasionally decode either a session key or a PK private key that is sent repeatedly, for each unit, with the video feed.

My preference, as I said earlier, is to use OTP; this can be further encoded by one or more pass phrases etc.

At lower levels of secrecy there are things like BBS generators that can be used to do a similar job.

Importantly, the whole thing can be done in software very, very easily.
Which brings us around to your other worry,

“Further complicating the picture is interoperability between the allies.”

Again, not a problem: as long as the resulting video/audio/data can be handled by the rest of the unit (Ogg/CELP, MPEG for instance), the encryption/decryption system within the unit of equipment would require only a known language (say, Java) and certain efficient primitives in a DSP-like engine.

All the equipment owner would have to do is hand over the unit’s “code” PK public key, to be given back an appropriate encrypted code module to load into the unit.

If people are really paranoid, then the unit can be reset so that it generates a new key pair via an internal TRNG, handing over the public key.

Effectively “end of problem”.

Some or all of this is done in existing equipment, used either commercially or by various quasi-military/police units.

Which brings me around to your conclusion,

“In this situation, it seems to me that the correct approach is not to encrypt.”

That is incorrect, due to your assumption that the covert/special ops team needs to transmit to get session keys. All they have to do is leave the unit of equipment on long enough to receive the session key being broadcast specifically to it periodically, and enter their pass phrases from memory when the unit asks for them.

There are further fiddles you can do to a general broadcast system such that no KeyMat is actually required to be pre-loaded into the field receiver units, yet it is still equally as secure as an OTP, all without requiring a TX from the field receiver end (I will leave it as a mental exercise for the readers who don’t know how to do it, but I can assure you it can be done and has been done since WWII).

Nick P December 28, 2009 5:33 PM

@ Clive

Yes, the OTP approach to key exchange is quite useful for this stuff. A friend and I once initiated secure communications using an OTP for the key exchange, because I distrust asymmetric crypto and PKI in general. We would exchange, face to face, a few GB of key material in a TrueCrypt volume (Serpent encryption with a strong passphrase). Our methodology was to use the OTP material for secure email, chat, etc. on a Windows, Linux, or hardened LiveCD machine. It was designed, however, to be integrated into the crypto coprocessor I described to you in the past, greatly simplifying key exchange and the randomness of key properties. I love these huge USB drives, and in spite of all the hating on OTP crypto, I still find it useful.
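A tiny sketch of the pad-consumption discipline described above, in plain Python. The point is the strictly advancing offset, since (as the VENONA discussion below shows) reused pad material destroys the guarantee; file handling, authentication, and two-way traffic are all omitted.

```python
import os

class OneTimePad:
    """Consume pre-exchanged random key material strictly once -- never reuse.

    Mirrors the face-to-face exchange described above: both parties load the
    same pad, then advance through it in lockstep.
    """
    def __init__(self, pad: bytes):
        self.pad = pad
        self.offset = 0          # next unused byte; reuse is what sank VENONA

    def _take(self, n: int) -> bytes:
        if self.offset + n > len(self.pad):
            raise RuntimeError("pad exhausted -- meet again to exchange more")
        chunk = self.pad[self.offset:self.offset + n]
        self.offset += n         # burn the material so it can never be reused
        return chunk

    def encrypt(self, plaintext: bytes) -> bytes:
        return bytes(p ^ k for p, k in zip(plaintext, self._take(len(plaintext))))

    decrypt = encrypt            # XOR with the same key bytes inverts itself

# Both sides hold identical pads (exchanged in person, e.g. on a USB drive).
shared = os.urandom(1 << 16)
alice, bob = OneTimePad(shared), OneTimePad(shared)

msg = b"rendezvous at 0300"
ct = alice.encrypt(msg)
assert bob.decrypt(ct) == msg
assert alice.offset == bob.offset == len(msg)   # pads stay in lockstep
```

A real deployment would also need separate send/receive pads per direction so the two parties never draw from the same region.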

As for your intellectual challenge, I was going to take it up, but I need clarification. You said this was a way to rekey field officers who were in receive-only mode and didn’t preload keys, and that it was as secure as OTP. Did you mean information-theoretic security like OTP, or security via a hard problem like Diffie-Hellman? For preloading, did you mean that just symmetric keys weren’t preloaded (leaving room for a pub/priv pair), or no keys whatsoever? I mean, if it’s rekeying with no preloaded key-encryption keys, no transmission by the client, and equal to OTP, then I’m out of ideas already.

Grande Mocha December 28, 2009 7:22 PM


You are very intelligent, and your proposals are clever. I guess I was internally assuming some operational parameters which I neglected to mention.

Here is what I am envisioning: a UAV is flying over a certain area delivering a (hopefully) uninterrupted stream of video. The UAV does not know who is receiving this video, and there may be many units in the field simultaneously using that live video stream. So the UAV is essentially operating in broadcast mode.

Now, in that model, if the UAV is encrypting the data stream, then all of the recipients need to have the same key, and a rekey of the UAV would require all recipients to rekey.

Certainly you could use a PK scheme where the UAV periodically generates a new session key and transmits it using PK, with the decrypting key stored in all the receivers, locked with a passphrase. However, there are some operational problems with that:

  1. I have never been in the military, and certainly not in a special forces unit, but I would imagine that they would not be excited about entering passphrases in the field. I can just imagine how many retries I would need to correctly enter a passphrase on a portable, ruggedized keyboard in the dark, possibly while wearing gloves, maybe while taking fire. It seems like something they wouldn’t want to do.
  2. Does every member of the ops team have to memorize the passphrase? If not, what happens if the radio guy gets killed?
  3. I believe that, by DoD rules, the passphrase would be at the same classification level as the PK it protects. So that just moves the “key management” problem into being a “passphrase management” problem.

I guess that I view the initial loading of the PK split into the receive device as equivalent to preloading a shared key. Sure, it’s technically different because different algorithms are involved, but operationally it’s the same, because if that PK split is compromised then all future communication from the UAV is available to anyone who has it. Likewise with any OTP stored in a tamper-resistant chip. From a long-term operational standpoint, these seem problematic, because the compromise of a single receive unit would require a maintenance cycle on all receive units and transmitters (to replace the OTP or PK).

Ultimately, I’m sure that a technical solution exists, but it may take several years and many millions of dollars to design, verify, test, and deploy. Hopefully by that time the current conflicts will be over! So, I certainly don’t have any problem with the DoD’s decision to just broadcast this ephemeral video data in the clear.

PackagedBlue December 28, 2009 9:57 PM

I wonder if there is ALSO an encrypted video feed; it sure wouldn’t take much to implement.

Hopefully, the extra command and control options would allow extras such as looping the open feed, replaying whatever, etc…

Perspectives around the early Predator would probably rule this out, but it wouldn’t take much to add, and it is something I would expect exists. Only one Predator at a time, hmm, also a safety mechanism. That makes the open-feed issue relevant. Not a bad design if you assume more serious stuff could be present.

You do not have to be a rocket scientist to assume that more things might be going on.

Clive Robinson December 29, 2009 12:39 AM

@ Nick P,

“mainly because it’s a human rather than technical issue. Such issues must be considered, though. I figure if we put an expiration date on the interim solution”

I’m going to give this a very long answer, not because you need it but because others might misunderstand what the issues are (especially article writers who now use Bruce’s blog instead of doing their own research).

One of the biggest anti-spy intel successes that became “known” is also one of the NSA/GCHQ et al’s greatest fears:

Project VENONA (sometimes incorrectly written “Verona”).

Put simply, the untrusted WWII friend Russia, which became the Cold War foe, issued their “overseas operatives” (diplomats and agents) with OTP material that was hand-generated during WWII.

Somebody on their side made a mistake / made an assumption / got lazy / got complacent / got insufficient resources / got corrupt / got whatever human failing, and caused some of the OTP pages to be “REUSED”…

The NSA is jokingly said to stand for “Never Say Anything”; it could equally mean “Never Stops Analysing”. The OTP reuse was discovered in 1943, and analysis went on until the 1980s.

The overall results must have been quite upsetting for the Cold War opposition (something like 3/4 of the nearly 400 sources in NATO countries positively identified), especially for some of their operatives, who lost one heck of a sight more than just some old intel…

Key reuse, deliberate or otherwise, is a fear the NSA/GCHQ et al have, and believe me when I say they test each other all the time, and they know it.

Such are the joys of damage limitation and the resulting security “segmentation”: you just don’t know whose traffic you are recording and analysing, be it friend or foe.

Then again, it all helps, as today’s friend could be tomorrow’s foe, and they are, like our “current friends”, human at the end of the day…

You also sometimes get the feeling that the likes of the NSA design new systems because they feel they never put in enough “key space” to be happy that “reuse” issues won’t arise.

That is, there are three ways to prevent key reuse:

1. Use a deterministic method to pick your keys.
2. Store every key you have ever used with a system, and check each key before you use it.
3. Use a new system where old keys are incompatible and thus cannot be reused.

The first option is an obvious “no no”, as once a friend or foe “knows” the method, “all your secrets past, present and future” are “known to them” (I’m ignoring theoretical security here, as the comms agencies are a conservative bunch at the best of times).

The second option is at first sight an example of a 0.5(n^2-n) problem. Only it’s not; it’s actually worse, due to multiple key-issuing entities, over-allocation of key space, and all sorts of other “human issues”.

And even if a single agency could be trusted and issued 100% of the keys in use, the problem would fairly quickly reach a size where checking would take longer than the growing issuing requirements allowed.

So the pragmatic, and possibly only, solution is the third one: dump the system before the number of used keys “n” gets too large…

Unfortunately this has human issues as well, called resource appropriations: the tax pot is only so big…

As Bruce has noted in the past, the likes of AES are thought (i.e., theoretically) in the open crypto community to be strong enough for our current needs, and the design of new systems to replace them is sufficiently well understood to deal with the practical problems as they arise. So… they want to move on from algorithm design/breaking to more difficult problems such as “key management”…

But is that the NSA et al’s viewpoint?

Possibly not (it certainly hasn’t been in the past).

A thought experiment for you, to see why that might be…

Just assume for now that 128-bit AES keys are tomorrow’s passwords…

If you make a simple current assumption of 2.5 billion users, with 10 services each requiring a new password every month,

that is an upper approximation of the current password problem: 300 billion passwords a year, which is approximately 2^38.

Now add in a little growth information. The US population has tripled since 1960; Africa and Asia are supposed to be growing at 2-4 times the US rate. Many think this is going to be a lower bound on population expansion (personally I think they are wrong, but that’s a different subject).

However, there is the double-every-18-months rule, loosely based on Gordon Moore’s observation about transistors in ICs.

The Internet population is predicted to double every 18 months, the number of services we each use is predicted to double every 18 months or so, and the number of electronic toys etc. likewise.

So the growth in needed passwords is 2^3 every 1.5 years, meaning the number of passwords used by the next Olympics is going to be 2^44, and 2^50 by 2015.

So what, the password only has a life of a month, so who cares? Unfortunately, unlike passwords, keys have a longevity problem in that you should not re-use them, ever…

So at that (admittedly very upper-bound-ish 8) growth rate, in 2022 we are going to see the 2^64 mark passed, or a better-than-even chance that a 128-bit AES key has been re-used.
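The arithmetic above can be checked mechanically. This snippet just re-derives the estimates: 2.5 billion users × 10 services × 12 months ≈ 2^38, three doublings per 18 months (two extra bits per year), and the ~2^64 birthday bound at which a repeat among random 128-bit keys becomes better than even.

```python
import math

# Baseline (2009): 2.5 billion users x 10 services x 12 new passwords/year.
per_year_2009 = 2.5e9 * 10 * 12
assert 38 <= math.log2(per_year_2009) < 39          # ~= 2^38, as stated above

def log2_keys_per_year(year: int) -> float:
    """Demand grows 2^3 every 1.5 years (users, services, and devices each
    doubling every 18 months), i.e. two extra bits per calendar year."""
    return math.log2(per_year_2009) + 2.0 * (year - 2009)

# Spot-check the intermediate figures in the comment above.
assert round(log2_keys_per_year(2012)) == 44        # "by the next Olympics"
assert round(log2_keys_per_year(2015)) == 50

# Birthday bound: ~2^64 random 128-bit keys give an even chance of a repeat.
first_risky_year = next(y for y in range(2009, 2100)
                        if log2_keys_per_year(y) >= 64)
print(first_risky_year)   # -> 2022, matching the estimate above
```

The model is deliberately crude (it tracks the yearly rate, not the cumulative total, so the true crossover is if anything earlier), but it reproduces the 2^44, 2^50, and 2022 figures exactly.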

That’s just over 12 years away… The NSA thinks very conservatively, in periods of a minimum of 30 or 100 years, and our legal brethren in just over 1,000 years, which is the length of time some legal contracts, such as land leases, last.

You are looking at quite a large number of bits, certainly more than AES has, by a very, very long way.

But does it matter? Well, yes and no. In 30 years I, like many others, will possibly not be here to care, but our children most likely will (if obesity and global warming don’t claim them first).

However, I do know one thing: there are cars on the road that are considerably more than 12 years old, and I still use electronic test kit that is more than 20 years old on working equipment that is more than 30 years old. Oh, and I still travel occasionally on train rolling stock that is more than 60 years old, and in my younger days flew in commercial aircraft that were WWII vintage (the DC-3). And most years I go on board one or more of the “little ships” that got the British troops out of France early in WWII.

Oh, and just last year I was in the London Science Museum, where my son rather embarrassingly pointed at some of the exhibits and said, “Daddy, you’ve got one of those and one of those on your desk”…

All of these items had “best before” dates and planned obsolescence. So even in the pragmatic world,

“I figure if we put an expiration date on the interim solution”

is only a figure of speech.
But unlike data-processing equipment, which does not broadcast to the world (or shouldn’t 😉 ), radio communications equipment is designed to do so. And as the NSA/GCHQ et al are acutely aware, when it comes to ComSec, “the elephant in the room” that nobody talks about has not just a perfect memory but ears that hear everything. VENONA is a very tangible ghost of an elephant that has long since gone to the graveyard.

But it still frightens those tasked with protecting a nation’s most intimate secrets. Why? Because in the case of the NSA/GCHQ, they know quite literally where the bodies are buried, because they put them there. And there is nothing so real as the nightmare you live with.

So the fact that they are trying quite hard to overcome this fear, and can now put trust in AES for data-storage security, does not mean they are ready now (or ever) to certify AES for ComSec.

The advantage of not having the Predator feed encrypted is that it does not engender a false sense of security. Field commanders and their controlling staff officers know full well that the enemy can see what they are doing, and thus plan to mitigate this from the outset.

On this score it’s a bit like a hand grenade: when you pull the pin, you had better have made up your mind what you are going to do with it; you just can’t put it down whilst you think about it. And for certain types of fighting (in woodland at night) you don’t just chuck them about, as they have a habit of hurting you more than the enemy.

So there are things for which the Predator might be used that it currently cannot be, specifically because it does not have encryption.

As the IDF found out just a year ago, ComSec can give you a false sense of security and hurt you, even with tactical field comms. Their previous activities suggest that they would have behaved differently with non-secure tactical radio systems. They badly underestimated the capabilities of those whose land they invaded, and paid a high price for it.

They obviously did not think about what the US learnt in Vietnam about supposedly secure VHF/UHF comms systems…

I’m sure it’s a lesson today’s field commanders are more than a little concerned about.

The simple fact is that ComSec can and does hurt more than it helps in a tactical situation.

Even more so when it engenders a false sense of security, which those you are fighting are happy to take advantage of to best effect.

Clive Robinson December 29, 2009 2:24 AM

@ Grande Mocha,

Your point 1 (HCI), the “muddy boot issue”, is probably the most vexing problem, one that has been, and I suspect always will be, “nearly but not quite” solved in the military and other contexts.

Your point 2, about who needs to remember what pass phrase, is actually a solved problem, as an “M of N shared secret” issue. Say you have a squad of six people (realistically the smallest non-special-ops size): all six could have two pass phrases that they know, “the correct” and “the duress” phrases. Each person picks their own phrases. So: six pass phrases, with a selectable number from 1 to 6 required to make it work.
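The “M of N shared secret” construction named above is classic Shamir secret sharing; here is a toy version in pure Python. The field size and the use of `random` rather than a crypto-grade generator are simplifications for illustration.

```python
import random

# Toy Shamir split over GF(p): any M of the N squad members can jointly
# recover the secret; fewer than M learn essentially nothing.
P = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte secret

def split(secret: int, n: int, m: int):
    """Deal n shares of `secret`, any m of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(m - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the constant term (the secret)."""
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

secret = 0xC0FFEE_DEADBEEF
shares = split(secret, n=6, m=3)           # six squad members, any three suffice
assert reconstruct(random.sample(shares, 3)) == secret
assert reconstruct(shares[:2]) != secret   # two shares are (almost surely) useless
```

A real deployment would derive each member’s share from their pass phrase and add duress variants; the scheme above only shows the M-of-N threshold mechanics.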

The problem with pass phrases is that they are usually some kind of memorable plain text, which would normally be a bit of a problem.

However, think about an OTP-encrypted message. If you supply the plain text you get the key; if you supply the key you get the plain text. Provided one of them is sufficiently random, the OTP is provably secure. You can stretch this point a little further, in that the plain text in a normal OTP-encrypted message is protected on a bit-by-bit basis even if it is repeated a number of times. Thus the unit itself can store the shared secrets, if the format is sufficiently random, OTP-encrypted against the individual pass phrases. There are all sorts of gotchas involved, but it can be (and has been) done in commercial equipment.

Your point 3, about DoD and other rules on classification equivalence: yes, the pass phrase would be of the same classification as the PK key.

However, this is where the fun starts 8)

The key-handling procedures don’t apply in quite the way many people think they do.

Think of the pass phrase as the combination to the lock on the door of the safe holding the classified information.

Then apply the rules for safe-combination handling to the pass phrase. Yes, I know that appears to be bending the rules, but it really isn’t (it’s the sort of thing that happens when you can abstract information from a tangible to an intangible form 8)

You see this with NSA Crypto Ignition Keys and their memorable PINs.

With regards to,

“I guess that I view the initial loading of the PK split into the receive device as being equivalent to preloading a shared key.”

It is and it is not…

Yes the idea is to get a session key that is shared between all units.

However importantly each unit has it’s own PK/OTP keys and these are not shared with other units.

The master control unit (MCU) needs to be aware of all the units individual keys but it is not in the UAV, it sits next to the UAV operators way behind the front line (even on a different continent).

The MCU generates a shared sesion key via a TRNG and appends random fill and the time the key becomes valid as a message to all field units.

It encrypts this message with all the individual “field unit” and UAV keys. It sends all the “field unit encrypted sesion” keys up to the UAV which rebroadcasts them allthe units decode their individual keys and at the appropriate time all units switch to the new session key.

Capturing the drone or a field unit is not going to get you anything other than the current session key and the unit’s own private keys encrypted against the shared secret pass phrases.

Provided you can’t get the right pass phrases in the minimum required number, all you have is a unit that will stop working the next time the session key is changed.

If the unit is assumed lost, then the unit’s encryption key is removed from the MCU and a new session key is issued to the other units. From that point on the lost unit is just a lost piece of equipment, like an empty petrol drum: it has scrap value but not a lot else.
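The rekeying scheme described above — the MCU wrapping a fresh session key under each unit’s individual key, the UAV rebroadcasting the bundle, and a lost unit being dropped at the next rekey — can be sketched in a few lines. This is a toy illustration, not the actual system: the SHA-256 counter-mode keystream, the unit names, and the message layout are my assumptions, and a real design would use a vetted AEAD cipher for the key wrapping.

```python
import hashlib
import secrets
import time

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. Illustration only,
    # not a vetted cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def wrap(key: bytes, payload: bytes):
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(payload, keystream(key, nonce, len(payload))))
    return nonce, ct

def unwrap(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# MCU holds every unit's individual key; each unit holds only its own.
unit_keys = {uid: secrets.token_bytes(32) for uid in ("uav-1", "field-1", "field-2")}

# MCU generates a fresh session key plus its validity time...
session_key = secrets.token_bytes(32)
valid_from = int(time.time()) + 60
message = session_key + valid_from.to_bytes(8, "big")

# ...and wraps one copy per unit, for rebroadcast via the UAV.
broadcast = {uid: wrap(k, message) for uid, k in unit_keys.items()}

# Each unit decodes only its own copy with its own key.
nonce, ct = broadcast["field-2"]
assert unwrap(unit_keys["field-2"], nonce, ct)[:32] == session_key

# Revocation: a lost unit's key is simply dropped at the MCU,
# so it never receives the next session key.
del unit_keys["field-1"]
```

The point of the structure is that compromising any single box yields only that box’s wrapped keys and the current session key, exactly as described above.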

As you say,

“Ultimately, I’m sure that a technical solution exists,”

Yup, there are one or two commercial units out there with bits of the problem solved in them. It’s just a matter of putting it together.

“but it may take several years and many millions of dollars to design, verify, test, and deploy.”

I hope not. Most of the bits are out there: a prototype could be knocked up in days, production “prototype units” within a week or two with the right incentive, and field testing of the hardware could begin within a month. The big unknown is “security testing”; that could hopefully be just software updating if the hardware is done right the first time.

But I’m with you on,

“Hopefully by that time the current conflicts will be over! So, I certainly don’t have any problem with the DoD’s decision to just broadcast this ephemeral video data in the clear.”

And, I suspect importantly, neither do the field-savvy commanders or their commanding staff officers.

As has oft been observed, you “can’t make an egg breakfast without breaking shells”.

Kevin December 29, 2009 11:07 AM

I’m sorry to say this, but you are a tactical idiot. There is a reason that, as a prisoner of war, you only give them name, rank, and serial number. It’s called the code of conduct, and it exists for a reason. The reason is information.

In fact, the most valuable commodity in war is not bombs, nor guns, nor oil. It is information.

Information is why the allies won WWII… Information is why anyone wins any war.

Clive Robinson December 29, 2009 2:06 PM

@ Kevin,

“I’m sorry to say this, but you are a tactical idiot.”

You forgot to say to whom your comment was referring.

With regard to your statement,

“Information is why anyone wins any war.”

Err, not exactly: a lack of information can lose you a war, as can false information, or correct information you choose to disbelieve.

But information alone will never win you a war; it’s what you do with it, given your available resources and your choice of available locales.

This has been known and written about quite clearly for the last 6000 years or so by various people at various times.

SM2 January 1, 2010 8:09 AM

‘Security analysts’ often make the case that for encryption, etc. to be effective it has to be 100% perfect, at least to the standard they study academically. There is surely a middle ground where the stream is not ‘hackable’, i.e. totally open to anyone, and where it is encrypted to the theoretical standard and is ‘provably secure’.

Sometimes the theoretical standard is needed, but sometimes the concepts of ‘security’, for example key management, do not translate into practical use as much as the people studying them might like.

Clive Robinson January 1, 2010 1:37 PM

@ SM2,

“There is surely a middle ground where the stream is not ‘hackable’, i.e. totally open to anyone, and where it is encrypted to the theoretical standard and is ‘provably secure’.”

The answer is both yes and no.

From a technical-only perspective the answer is yes: you can have lightweight protocols and key management that work, and work well.

We see such things with commercial systems. They are more than adequate (after ten or eleven attempts) for the likes of commercial TV broadcast by satellite.

But “technology” and “commercial value” are not the issue. The issue is “human” in nature, and has a large “unknown unknowns” component.

You usually do not know the real value of intel, in and of itself or within various contexts, at the point in time it is gathered.

Sometimes small bits of apparently very low-level or irrelevant intel can become very, very important (such that they will allow an asset to be unmasked, etc.).

From the NSA/GCHQ et al’s perspective it is safer to assume all intel has one of only two values, “Public” or “Code Word Secret”. That is, unencrypted or maximally encrypted.

This is for a couple of reasons. The first is that they cannot be held responsible for anything others regard as “Public”, but everything else they can be, if it falls within their remit.

Humans are pathologically incapable of assessing risk even where the complexity is fully known, and obviously cannot accurately assess risk where there is high complexity and several unknowns.

If an operative knows there is no security, then they know the responsibility falls on them, and they may act accordingly. If, however, there is some degree of security, then pressing need will overcome caution and the risk will more than likely be downplayed, to some future disadvantage.

As the NSA/GCHQ et al have the responsibility of assessing the level of communications security, the long-term security issue becomes their problem. BUT they have little or no control over what an operative will do, thus the only pragmatic approach with ComSec is to supply “the best we’ve got”…

Which, for obvious reasons, has its downsides.

As I said, as a technical issue an intermediate solution is easily possible, but as a human issue it is not even close to being solved. So an intermediate solution is still some way off with ComSec.

However, there are intermediate solutions for non-Comms DataSec, such as storage. The reason for this is that even the encrypted data is an “inventory item”, and thus its loss or disclosure is going to be known in a timely fashion, and is also limited to the missing inventory items…

As the old Chinese curse says,

“May you live in interesting times.”

And I’m sure the IntelSec agencies feel like they’ve been cursed 😉

Dave January 3, 2010 1:50 AM

The problem isn’t making crypto secure, it’s making it effective and usable. From that point of view, the Common Criteria is Al Qaeda’s secret weapon.

Tim January 5, 2010 11:50 AM

I disagree about the points on key management. You could build a common key management system that allowed the data to be encrypted without all the overhead of managing keys. Even having a static HW key is better than no encryption at all.

Bruce Schneier January 5, 2010 11:58 AM

“I disagree about the points on key management. You could build a common key management system that allowed the data to be encrypted without all the overhead of managing keys. Even having a static HW key is better than no encryption at all.”

Of course. Of course you could do it. Of course a static hardware key is a reasonable solution.

But it’s not an option in the NSA’s Cold War solution space.

Nick P January 6, 2010 1:54 AM

@ Bruce

Plenty of workable solutions have been discussed. The problem you mention is the real problem and it’s a bureaucratic one. However, I think it can be solved over time. Notice how they’ve changed their methods in response to asymmetric warfare. With enough prodding, they might loosen up on crypto requirements in certain situations, like “Sensitive but Unclassified” type of transmissions. They already do in areas like logistics, which are actually security-critical. So, there is hope that some restrictions will be lifted on even less secret activities.

Clive Robinson January 6, 2010 5:31 AM

@ Nick P,

“So, there is hope that some restrictions will be lifted on even less secret activities.”

As both you and I have seen, the NSA et al have loosened up on “inventory” but not on “broadcast”.

And I think this might be the crucial difference.

With “inventory” items like hard disks, backup tapes, etc., you know where your secret is, which allows you the assumption that the cryptanalysis clock (i.e. brute force) has not started until the item is lost.

With “broadcast” there is no such assumption about the loss of a secret item; it is beyond control the moment it is broadcast. Thus the “brute force” clock is already running…

It will be interesting to observe the transition from “conservative” to “liberal” on information protection.

For one thing, it will actually tell us just how secure the NSA et al believe (know) the likes of AES to be.

If they say “yeah, AES is OK for top-flight diplomatic and intel open broadcast comms”, not just “codeword on HD”, it will be saying a great deal about more than just the perceived security of AES 😉

Rene Bastien January 15, 2010 8:22 AM

I do not think the fact that insurgents can intercept the video downlink is the real issue here. The video channel is not protected; I would worry much more about the insurgents actually using the weakness to inject their own video, thus wreaking havoc on operations.

Nick P January 18, 2010 2:54 AM

@ Rene

Which brings us to the question: is the video authenticated? Plenty of software supports authenticated, but unencrypted, communication. I wonder if this applies to Predator. Otherwise, you are right: things can get interesting.
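Nick P January 18, 2010 2:55 AM

The authenticated-but-unencrypted idea can be illustrated with a standard MAC over each frame. A minimal sketch, assuming a pre-shared authentication key and per-frame sequence numbers (both my assumptions, not details of the Predator system): the feed stays in the clear for anyone to watch, but a receiver holding the key can detect injected or altered frames.

```python
import hashlib
import hmac
import secrets

# Pre-shared authentication key between the UAV and receivers
# (hypothetical; how such a key would be distributed is exactly
# the key-management problem discussed above).
auth_key = secrets.token_bytes(32)

def tag_frame(frame: bytes, seq: int) -> bytes:
    # HMAC over the sequence number and the cleartext frame. The
    # sequence number stops an attacker replaying old valid frames.
    return hmac.new(auth_key, seq.to_bytes(8, "big") + frame, hashlib.sha256).digest()

def verify_frame(frame: bytes, seq: int, tag: bytes) -> bool:
    return hmac.compare_digest(tag_frame(frame, seq), tag)

frame = b"\x00" * 1024          # stand-in for one cleartext video frame
tag = tag_frame(frame, seq=42)

assert verify_frame(frame, 42, tag)          # genuine frame passes
assert not verify_frame(frame, 43, tag)      # replay at wrong seq fails
assert not verify_frame(b"X" + frame, 42, tag)  # altered frame fails
```

Eavesdroppers still see everything, so this addresses Rene’s injection worry but not interception.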
