Autonomous Vehicles as Bombs

Good discussion of the issues. Now we need to think about solutions.

Posted on October 6, 2015 at 7:18 AM • 48 Comments

Comments

65535 • October 6, 2015 8:41 AM

Although a “Driverless Car” would be the weapon of choice [it could carry a large bomb], as pointed out on this blog there are airborne drones that could fire a gun into a crowd of people. This weaponized drone could be launched from a hidden location near a stadium and fire upon the crowd, causing panic. That's not a pleasant thought.

And, yes, we need to think about solutions. I will say the US military probably has experience here, and simply won't fly drones in defended airspace [only in 3rd-world airspace].

Winter • October 6, 2015 9:14 AM

Somehow, I am reminded of all the post-9/11 predictions of terrorism disrupting society. It only happened in near-civil-war situations.

All these "threats" are less lethal than a guy with ample firepower walking into a crowd. The Bombay massacre showed that conclusively, as do the near monthly school shootings in the US.

paul • October 6, 2015 9:15 AM

@64K:

There are (nonmilitary) drones that have been modified to carry guns, but the weight limits and flight-time restrictions for current drones make it a difficult sell (and the recoil would play hob with the stability and structure). Might as well outfit a Piper Cub with weapons.

A lot might depend on whether the destination restrictions for a driverless car could be defeated while still maintaining the autonomous-driving capabilities, e.g. navigation through the doors of buildings. (Or, if we're talking about robot uprisings, whether the collision-avoidance routines could be subverted.)

Smirk • October 6, 2015 9:19 AM

This isn't anything new. A few years ago, a guy somewhere built a tank with a gun on top, controlled from the inside with a PlayStation controller and a TV. You can make almost anything remote-controlled, or you can store the route in advance if you already know where you want to go. And of course you can add something nefarious to almost anything.

Some "drones" already have a follow me option, so you could follow someone with 30 drones.

Solutions Inc. • October 6, 2015 9:21 AM

Umm, how about we use the same solutions as we do with cars that have drivers in them?

Crude • October 6, 2015 9:28 AM

I know!

How about implementing the ultimate panopticon, i.e., the total surveillance and absolute police state?

Oh, wait a minute...

jayson • October 6, 2015 9:34 AM

Driverless cars will remain a myth. Maybe, just maybe, we'll have a bus system, but I doubt it. The ability to hack them outpaces the ability to drive them. I believe it was here that I first read about the liability issues of a driverless car involved in a death. Who's liable? Manufacturer lawsuits would increase substantially in number and cost.

What this article prompts me to think is that individuals will not be able to control their own car's programming, as that could lead to weaponization of the vehicle. The comparison to cars with drivers isn't the same, since it's far safer to drive a remote-control car with a bomb than to convince someone to drive it. The comparison with drones isn't the same, since the payload is far greater in a car/van/truck.

Christian • October 6, 2015 9:48 AM

Daemon by Daniel Suarez is a nice novel that already makes use of self-driving cars as weapons.

A good read, despite its pre-NSA/Snowden worldview.

LRS • October 6, 2015 9:56 AM

So what ?

I remember someone saying recently : "Since 9/11, the US has increasingly become Yellowland, a place where we assume danger is imminent. It's damaging to us individually and as a society."

Oh wait... ;-)

r • October 6, 2015 10:10 AM

Driverless cars currently have eyes; a nose would go a long way here. They need some sort of DRM to stop the software and explosive-sniffing hardware from being tampered with. The good thing? Detection is a function of time: if the explosive weren't heavily hidden or masked, it wouldn't take long for the software to realize that for the last GPS mile it has been smelling ammonium nitrate, etc.

The really good thing? Radiological stuff should be the hardest to hide from the software and sensors I think?

I don't think it's appropriate to include immediate shut off software, as somebody could trigger a freeway pile up etc.
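A sketch of how that "last GPS mile" check might look in software. Everything here is hypothetical: the class name, thresholds, and units are invented for illustration, and real chemical sensors are far noisier than this.

```python
from collections import deque

MILE_M = 1600   # window length in metres (~1 mile, rounded for the sketch)

class SnifferWindow:
    """Rolling window of chemical-sensor readings keyed to odometer distance."""

    def __init__(self, threshold=0.8, window_m=MILE_M):
        self.threshold = threshold   # invented detector level for a target compound
        self.window_m = window_m
        self.readings = deque()      # (odometer_m, reading) pairs

    def update(self, odometer_m, reading):
        """Record a reading; return True if the whole last mile stayed above threshold."""
        self.readings.append((odometer_m, reading))
        # drop readings older than one window of travel
        while self.readings and odometer_m - self.readings[0][0] > self.window_m:
            self.readings.popleft()
        # alert only on a sustained signal over the full window, which
        # filters out brief false positives (e.g. passing a fertilizer truck)
        full_window = odometer_m - self.readings[0][0] >= self.window_m
        return full_window and all(r >= self.threshold for _, r in self.readings)

w = SnifferWindow()
alerts = [w.update(i * 100, 0.9) for i in range(20)]   # hot sensor for 1.9 km
print(alerts[0], alerts[-1])   # False True
```

The full-mile requirement is what makes detection "a function of time": a brief spike clears the window, while a bomb riding along keeps the whole window hot.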

CallMeLateForSupper • October 6, 2015 10:23 AM

While I was struggling mightily with my adolescence, my grandmother, a teacher of music, math and Latin, and ever the instructor, often chided me, "Feste lente [1]; hasten slowly". Hastening slowly - indeed, doing *anything* slowly - is something Americans do very poorly, to our detriment.

Innovate; market; repeat. Let the devil take the hindmost. "No time to say hello... goodbye!" That tight focus leaves too little time for fundamental problems to surface; we are well into the river of adoption before we can even notice that its depth and speed are threats.

Transportation infrastructure in America - to be clear, the roads and bridges that bear our must-have toys from Amazon to our doors - is crumbling now. Today! So we do... what? Fix that problem? Naw; gushing about self-driving cars and imagining sleeping through the morning commute are "sweeter".


[1] That is how the words sounded to my ear; I've never seen the phrase in writing, so I might have got the spelling wrong.

r • October 6, 2015 10:38 AM

Upon deeper inspection, I believe it's a good idea to mandate that those 2 types of sensors be included in the manufacturing of all automated vehicles.

Then, we just have to figure out how to defend the sensor/software from attack.

Carmel Ackison • October 6, 2015 11:27 AM

Now that's a strong selling point: apart from spying on you and beaming your 24-7 location to Google, your car will automatically report you to the police and lock you inside the vehicle if it smells any acetone or hydrogen peroxide on your hands, it will be easily hackable by any spotty teenager, it will slam on the brakes when someone waves a Barbie doll in front of it, and it will refuse to drive you beyond the entrance gates to your building block, which are 50 meters away from your actual door. Where can I get one?!

mark • October 6, 2015 11:41 AM

Bruce,

I really, really, REALLY do not want self-driving cars.

When Vint Cerf was here where I work, he was saying Google was looking at cars with no steering wheel and no pedals. My first reaction was to think of a street that I take when I do drive to work: one lane of parking, two-way, wide enough for two vehicles, assuming both are tolerably competent (and not nervous) drivers... oh, and *NO* center line. AND buses use the street. There is *NO* *WAY* that they're going to have something to navigate that.

Then, of course, let's see it navigate an Interstate... not in winter, but the other season: road destruction.

Finally, and if the first two weren't self-driving car killers, here's the big one: they start selling self-driving cars today. Ten years from now... half of them are more than five years old, probably half of them have been resold, used. Now, how many of them have been maintained properly, and had the recall service performed? Would *you* want to be driving in that traffic, even assuming that the car control system wasn't hit with malware, or its networking cracked?

mark

wandering quilt • October 6, 2015 12:11 PM

Hey, I know how we could make driverless cars more popular. Stick a pair of google glasses in front of them!

Jason • October 6, 2015 12:25 PM

"Solutions Inc." has it right: you stop autonomous car bombs the same way you stop human-piloted car bombs. There is one new twist, though: with an ordinary car bomb, the bombers always have at least one casualty.

One annoying thing about this article is that it treats "drones" and "autonomous vehicles" as synonyms. Most things we call drones are nowhere near autonomous.

Clive Robinson • October 6, 2015 12:58 PM

@ Paul, 64K,

... and the recoil would play hob with the stability and structure...

Ever heard of a recoilless gun?

Put simply, a low-mass, high-velocity bullet goes in one direction and a considerably higher mass at proportionately lower velocity goes in the opposite direction.
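The balance being described is plain conservation of momentum; a quick numeric sketch (all masses and velocities invented for illustration):

```python
def countermass_velocity(bullet_mass_kg, bullet_velocity_ms, countermass_kg):
    """Rear-ejecta velocity that zeroes the launcher's net momentum:
    m_bullet * v_bullet = m_countermass * v_countermass."""
    return bullet_mass_kg * bullet_velocity_ms / countermass_kg

# A 10 g bullet at 900 m/s balanced by a 1.8 kg countermass:
v = countermass_velocity(0.010, 900.0, 1.8)
print(v)   # 5.0 -- the higher mass moves at proportionately lower velocity
```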

Further, modern barrels can be made extremely thin and light if made out of materials similar to the leading edges of multi-Mach aircraft wings and designed to fire only a very few rounds. This leaves only the overpressure and heat in the breech to be dealt with, and I'm told there are viable designs for those.

The question thus becomes: how long before SWAT teams start using such designs...

Martin • October 6, 2015 1:33 PM

@Clive Robinson

...a low mass high velocity bullet goes in one direction and a considerably higher mass at proportionately lower velocity goes in the opposite direction.

What you described, as I read/understand it, is not a recoilless rifle (gun). A recoilless rifle simply does not restrict the "opposite reaction" of firing a projectile, as most standard rifles do. (As in Physics 101: equal but opposite reactions.)

Most, if not all, recoilless armament simply has an open-ended breech block that vents, without restriction, the "opposite reaction" gases that result from firing the projectile.

r • October 6, 2015 1:41 PM

@carmel
I wasn't saying tie it into the ignition; my mind is on the cellular chipset... tie it in the same way we are complaining about existing cellular SoCs and modems. Have it snitch; have the ignition disabled only if the cellular head or sensors are disabled, and keep it separate from the main driving computer. Any extra barrier to an entry-level guided missile makes the general public safer by increasing cost and time to market. I think it would be a really good idea to make this sort of behavior mandatory...

What do we do if after-market autonomous add-ons appear later on?

Bob F • October 6, 2015 1:42 PM

I guess I have looked at the situation from the perspective of a world with computer-guided flying cars. In such a world, a computer system would need to be created to control all the flying cars (a new air traffic control system), routing them to their destinations.

Once self-driving cars become ubiquitous, it should be required to have them controlled by the computer system. You enter a destination and the computer drives to it. Turning off the computer system (user driving mode) would be prohibited unless there is an emergency, which would notify emergency services (or perhaps on specially designated roads).

I figured the same should be done with a maglev highway system as well, though with the increased speeds there it would definitely be needed.

AJWM • October 6, 2015 1:49 PM

A few heavy-duty servos and some off the shelf R/C gear and you've got a driverless (ok, remotely piloted) car. The Mythbusters guys have been doing this kind of thing since their first season.

Shrug. Considering there's no real shortage of suicide bombers, it's not something specifically worth getting your panties in a bunch over. 9/11 didn't use drones or RPVs.

Jesse • October 6, 2015 1:50 PM

You know, I've read materially the same as the following probably 100 times by now:

> For example, if an autonomous vehicle were to cause an accident, who would be
> responsible? The passengers, even if they had no influence? The vehicle manufacturer?
> The software developer?

The answer always stymied me, but for some reason (I blame the Vyvanse :P) this time I read the question and the answer somehow fell out of my mouth as though it was something I was already sick to death of repeating... even though in actuality it's as much news to me as anyone else.

*The insurance company* (potentially in tandem with the car manufacturer) would be responsible. That's who.

How? Why? What?

Well, today drivers are only found at fault when there is an actual driver fault: you crash into something because you steered wrong, and it can be proven that you should have done differently or that you were breaking a traffic rule during the collision (speeding, ignoring traffic control devices, wrong side of the road, etc.). However, there are already a thousand things a *car* itself can do to cause a collision. Remember Firestone tires getting blowouts? Well, like that.

The only difference between insuring an ordinary vehicle and insuring a "driverless" vehicle is that the latter's hardware is responsible for a larger percentage of whatever might go wrong. You still have to insure your car on the road today (though in fully driverless scenarios you are no longer insuring a driver, it's just a car full of passengers :P), so the insurance companies bill the car owner as the primary party reaping rewards from its utility, they study everything they can about this model of vehicle and firmware, and they use their magic bookie formulas to calculate exactly what cost offers what coverage, based on the probability of different kinds of accidents occurring in different situations.

If there comes a spate of bad behavior out of automobile model/version X (either in the wild or caught during a round of testing), then the insurer either retracts coverage for it (grounding all of those vehicles on the road, possibly even with a remote software lockout if available) or increases its rates, either of which puts severe pressure on the manufacturer to perform a recall or patch or whatever is required to completely satisfy the insurer once more.

Done and done. Human owner pays the insurance, can only operate the vehicle when covered by insurance, insurance (and interplay with manufacturer) pick up all tab on responsibility. Loop closed. :3
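The "magic bookie formula" above is, at its core, expected-loss pricing; a toy sketch with entirely invented probabilities, claim sizes, and loading factor:

```python
def annual_premium(p_accident, avg_claim, loading=1.25):
    """Expected annual loss times a loading factor for overhead and profit."""
    return p_accident * avg_claim * loading

# Hypothetical figures only: a 4%/year accident probability with an average
# $15,000 claim, versus a driverless model with a better safety record.
human_driven = annual_premium(0.04, 15_000)
driverless = annual_premium(0.01, 15_000)
print(round(human_driven, 2), round(driverless, 2))   # 750.0 187.5
```

A firmware defect that raises the accident probability for one model then shows up directly as a rate hike or a coverage retraction, which is exactly the pressure-on-the-manufacturer loop described above.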

Grauhut • October 6, 2015 2:40 PM

@Bob F "Once self-driving cars become ubiquitous it should be required to have them controlled by the computer system."

Bad idea; it gives hackers too much kinetic energy right at the keyboard.

Bob S. • October 6, 2015 3:02 PM

I assume the police, military, and powerful corporations of the entire world will demand and get access to the most granular tracking data for auto-drive vehicles, and the most influential of them will demand and get total remote control of vehicles, for our convenience and safety of course.

IF I am right, then ordinary and extraordinary criminals will have that control, too.

I would agree that my view is extreme.

So, backing up a bit, maybe there will be delivery trucks, taxis and certain service vehicles operable within the above conditions, i.e. total access by those with the power.

That might work in some places in some circumstances.

As for solutions to abuse and weaponization, no doubt some very smart people under the color of law, and not, are working at this moment to have total control over auto-drive vehicles. Of course, arming them is child's play.

The only way to stop it is via a sincere commitment by the various governments to do whatever is necessary to keep them from becoming weapons. That absolutely could not happen in the USA right now, and I suspect that is the same for the vast majority of governments of the world.

IF all of that is right, then it's reasonable to conclude the race to have the best and most remote controlled land weapons is on. Way on.

Then the limited solutions for self preservation remain the same as always: run, hide or fight.

r • October 6, 2015 3:02 PM

Another question about self driving cars- HOW will they respond to sensor blinding?

Clive Robinson • October 6, 2015 3:22 PM

@ Martin,

As Anura has indicated, there are countermass recoilless rifles. What he did not mention is that unlike the propellant-gas-deflection type, which are most definitely not recoilless, countermass recoilless rifles are not just recoilless, they are also balanced, which is also necessary for use on drones and other small airframes.

Alien Jerky • October 6, 2015 3:37 PM

hmm... Just a rambling thought (and based on some projects from my past)

What about taking the magnetron out of a microwave oven. Make a parabolic reflector dish. Point in direction of enemy. turn on. Not instant like a bullet. but....

Wm • October 6, 2015 5:17 PM

There was a TV show in the 50s about someone who set up a car that was driven remotely. I can't remember the plot, but it was something sinister as I remember. The heroes managed to stop it before any harm was done.

Bob F • October 6, 2015 6:13 PM

@Grauhut

Assuming flying cars one day become a reality, such a system will be necessary to move them all harmoniously and prevent crashes.

Building the system now and working the kinks out seems like a good idea.

Brad Templeton • October 6, 2015 6:14 PM

While this is a challenging problem, it must be kept in context. Car accidents kill 1.2M each year around the world. More have died in US car crashes than all who died in all the wars in the history of the USA back to the revolution. Terrorism deaths are but a tiny blip compared to this.

So while we do want to reduce the ability to use these vehicles as weapons and bomb delivery vehicles, we must always keep that in mind. Every step that reduces the utility of these vehicles and delays their deployment comes with a cost in lives, a cost in lives that can easily exceed all terrorist death and injury tolls in many environments.
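A back-of-envelope version of that cost-in-lives argument (the 1.2M figure is the one cited above; the risk-reduction and delay figures are purely illustrative assumptions, not data):

```python
annual_road_deaths = 1_200_000      # worldwide toll cited in the comment above
assumed_risk_reduction = 0.10       # suppose autonomy eventually prevents 10%
delay_years = 2                     # suppose a safety restriction delays rollout

lives_lost_to_delay = annual_road_deaths * assumed_risk_reduction * delay_years
print(int(lives_lost_to_delay))    # 240000 -- far beyond typical terrorism tolls
```

Under those (hypothetical) numbers, even a modest regulatory delay costs orders of magnitude more lives than the attacks it is meant to prevent.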

Radical steps, such as forbidding vehicles from dropping off passengers at the entrances of buildings, would reduce their utility. Requiring manual driving for this last distance would be less harmful, but is highly discriminatory against the disabled who can't perform that act -- and there are millions of them who will be liberated by this technology in a way you can't grasp if you are not among them. Particularly the vision impaired.

Having to pass a Turing test to get close to a crowd or building is still a bad impediment, but in the end it doesn't let the vehicle come to pick you UP at the building or crowded place, and nothing short of costly inspection can easily prevent a bomb-laden robot. And it could be a hand-built robot, unless you want a police-state level of control so that only blessed robots can ever go somewhere useful. The cost of this is much more than the cost of terrorism today.

Clive Robinson • October 6, 2015 6:16 PM

Why do you need a driverless car?

All you need is a vehicle with the explosives in it and a driver, and while you would think the driver needs to be a "suicide bomber", that is not necessary.

Back in the days of the Provisional Irish Republican Army, putting bombs in cars was a fairly normal attack method. The bomb would be driven there, and the driver would set the timer, get out, and walk away.

The thing about home-made bombs is that occasionally the setting of a timer would cause the bomb to go off...

It has been said that some early "backpack bombs" that exploded in Israel were due to faulty timers, but it was suggested that the timers had been deliberately rigged so the bomber would become a "dead end" as far as being hunted down by the Israeli authorities.

So it does not take much thinking to realise that a suicide bomber may actually not be one, but someone who has been duped by others.

A little further thinking and you realise that it's possible to have a completely unwitting bomber.

All you need to do is conceal a bomb in a vehicle with a remote-controlled detonator and arrange for some totally unwitting person to drive it to where you want to set it off. The easiest way to do this is with a taxi that is not owned by the driver: you simply phone the company and request the driver of that taxi when you know they are on duty. When you see the taxi turn up, you trigger the bomb (as long as you are far enough away not to get hurt), disappear into the crowd, and make sure you are out of the jurisdiction within a few hours.

The simple fact is the only way to stop vehicle bombings is not to allow vehicles near potential targets. But as others have noted in the past, there are too many potential targets for this to be practical as a defensive method.

Matt • October 6, 2015 7:10 PM

You don't need explosives to make a car bomb.

Remember the recent hack through the cell modems of the Chrysler Jeep, causing it to veer off the road or stop dead in the middle of the highway? Terrorist autonomous cars won't need any explosives. The threat will be viruses spreading from car to car, locking the doors and ramming into any other moving object the car sees until it is too damaged to drive. Gasoline and propane trucks and gas stations would be the preferred ramming targets if encountered.

All it would take is a few thousand vulnerable cars and it would shut down the US economy, as no one would want to risk their lives driving.

tyr • October 6, 2015 8:55 PM


Just build driverless cars with a bomb built in and have them detonate at random during the process of loading. Publicize this feature; the one thing the average bomb dumbass wants to avoid is dying before his glorious martyrdom. We had a revolutionary bunch from Berkeley blow themselves up while driving because they were careless about pipe-bomb handling. Zeal isn't a substitute for using your brains to think with.

If you look around the modern world, there are far too many dangerous things easily available to those who want to cause problems, but incidents are rare. That should tell you something about your fellow man: for all the government squealing that the sky is falling, most have no interest in crapping in their own nest.

Of course, if you are in Syria the sky really is falling the instant the Rus ID you as a troublemaker. There is a slight problem with deciding all new tech is suddenly a new problem to solve: all the old unsolved problems get ignored in the rush to go after the new ones. Decreasing the attack surface by keeping all vehicles away from large groups of people might do wonders. Clearing the area around train tracks might also. Recently a Canadian town was wiped out by a train accident. Notice that it wasn't a purposeful attack, just a combination of human errors.

In the 50s, communist forces in Korea captured a recoilless rifle. They dragged it up into a cave on the side of a mountain. Their first shot fried part of the gun crew. Observers on the other side watched as they turned it around, thinking it was backwards. The second firing brought the whole side of the mountain down on them. The moral of the story is that military weapons are dangerous to play with.


decorous cowbell • October 7, 2015 5:49 AM

For me agency is at the heart of the issue. A world in which driverless cars are the norm is a world that has gone from mobile, independent, active people to a global population that is merely held by the hand and driven around. TLAs are doubtlessly becoming excited by the prospect of watching, profiling and data-mining the world's traffic, like they currently do with the internet. A well positioned TLA will be able to stop dead the traffic of an entire nation at will, as long as they control the infrastructure or have legal recourse over the companies that do. Enemy states will certainly be developing attacks against this infrastructure. Hackers will eventually find extremely serious vulnerabilities (perhaps making several million cars drive off the road simultaneously). Contractors will rub their hands with glee: as sensationalistic headlines make the news, they will peddle snake-oil solutions to protect "you and your family" from the evils of road hacking. Meanwhile, we suckers (the ones who pay for the cars, the ones who get driven around by them and the ones whose data gets pimped on a global scale) will be caught in the middle ("OMG, the new e-car 2.0 is SOOO cool, it allows me to share my favorite downloaded songs with my contact list via Facebook!").

paul • October 7, 2015 8:47 AM

@Clive Robinson

I was not questioning the notion that mounting an effective gun on a drone was theoretically possible, but rather the notion that it was an efficient use of terrorist resources. Military or paramilitary organizations with lots of resources and development time could no doubt do it, but (thus far) that's not the threat. Perhaps when commonly-available drones have payloads of 50 kg and flight endurance of an hour or more. And (as I'm sure you know) counter-mass systems don't actually eliminate recoil; they just spread it out in time, reducing shock loads (which may or may not be good for a particular structure, depending on how it's designed).

I think with autonomous cars there's an assumption (perhaps mistaken) that a car's guidance package will be more resistant to crash damage and non-precision gunfire than a human driver, so that barriers that would stop a human might not stop a self-driving vehicle.

Gweihir • October 7, 2015 4:40 PM

This is obvious nonsense. Terrorists are not smart. Smart people have a ton of attack vectors for devastating attacks that cannot be prevented. Yet these almost never happen, because smart people can also see that terrorism generally only strengthens the enemy. (Bin Laden being the one exception; his success in making the US significantly less free is staggering...) Self-driving cars are not a risk as long as it is not too easy to have them drive without passengers, and as long as it is not too easy to hide who caused an attack.

Of course, as soon as these things become easy, there is a really big _different_ problem: theft of autonomous cars! As that would kill the whole idea unless kept rare, manufacturers will make sure only smart, capable individuals can do this to an autonomous car. But those people can do a lot of other things already, so this is just one more tool in their box, and by far not the best one.

So this is an obvious non-issue, except for all the people that benefit hugely from creating fear.

Clive Robinson • October 7, 2015 6:28 PM

@ Paul,

I was not questioning the notion that mounting an effective gun on a drone was theoretically possible, but rather the notion that it was an efficient use of terrorist resources.

Very little is "an efficient use of terrorist resources"; the primary reason for this is that "defence is rarely a consideration for terrorists". That is, they are "hit and run" within others' territory rather than "stand and fight" within their own territory organisations. Perversely, the exception is what is going on in the Middle East currently, which is almost medieval in nature.

Thus in general terrorists use readily available light infantry weapons that were designed around fifty to eighty years ago, which they obtain effectively as "surplus" or "second hand". The only weapons they use that are more modern than this are those they have captured in some way, or have been given (along with training) by a third-party intelligence agency, usually of a superpower fighting a proxy war against another superpower.

Thus what is of concern is not what terrorists might "innovate" themselves, but what others may innovate and which gets supplied one way or another to terrorists.

I'm aware of current development of very lightweight high-velocity rifles for lightweight UAV platforms being carried out in the more peaceful parts of the Middle East. Some are for civilian use and others not. One civilian use is to replace helicopters and "sure shot" technicians for ranching / ranging stock / wildlife management, predator / vermin control, and for veterinary and scientific purposes. In essence, a semiautomatic dart gun for stock, and high-velocity small-calibre wad-cutter etc. rounds for prey and vermin control.

It does not take any stretch of the imagination to see how such systems would additionally be of use for both IC and LE activities. I'm also aware of interest in such systems being developed for military activities: small drones giving forward infantry overhead cover and advanced-position lookout. If you can get into the right "exhibitions" then you can see the interest such systems are getting as concept designs.

Thus once in production it is a matter of time before they find their way into regular military units, which also means that at some point, just like shoulder-launched surface-to-air missiles, some paramilitary or terrorist organisations will obtain them; such is the way of the world.

name.withheld.for.obvious.reasons • October 7, 2015 10:12 PM

One thought: what if, as a driverless vehicle passes, a "jamming" device overrides control of the vehicle (preferably at a stop sign), and the vehicle is "jail-broken" and added to a car-bot network?

A newly minted car-bot network could be used to "jail-break" more vehicles...that would be one interesting hack.

albert • October 8, 2015 11:21 AM

Autonomous vehicles are a BAD idea. (great for insurance companies and lawyers, but then, what isn't?)
.
Auto companies will greatly increase their prices, and continue their well-documented shitty engineering. System security seems to be unattainable for them. They can't even get Engine Control Systems to work right. Nowadays, ECS programming is kindergarten stuff.
.
Hackers will have a field day.
.
The 'terrorism' threat will NOT increase. There are myriad ways to terrorize a nation (some discussed here) that one can pursue from the comfort of one's living room, and electrons are cheap.
.
What really sticks in my craw is the notion that the freedom to operate vehicles is an inalienable, God-given right to all Americans. It is not. It is a privilege, granted by States to folks who can demonstrate some proficiency (in theory) behind the wheel. We could reduce the 20,000 auto accident deaths per year (US) by simply increasing the testing standards. We can't have old folks with 20/20 ('corrected') vision, but 2 second reaction times, on the roads. We need draconian jail sentences for drunk drivers, and better treatment for alcoholics. It's a big public health problem.
.
Just because something CAN be done, doesn't mean it SHOULD be done. This sort of thing happens when geeks have access to unlimited funds (Google). They think technology can solve everything, when it only creates more problems.
.
The fact that the commenters here instantly responded with most of the potential problems with autonomous vehicles speaks volumes. Apparently, the promoters and developers of such technology are either ignoring them, or are too stupid to consider them.
. .. . .. _ _ _

n8 • October 8, 2015 3:00 PM

If Google comes out with a self-driving car, it will probably come with a disclaimer that the company takes no legal responsibility for anything.

greg • October 10, 2015 7:53 AM

What gives with you guys? Liability? Who is liable when ABS brakes fail? Or when, say, gas tanks explode? Or a tire blows out? Or cruise control fails to disengage? Or a steering linkage breaks?

All these things have happened more than once, and liability is a pretty well-defined concept. Claiming that "with a computer..." changes this liability is just plain ignorant. Car makers have been liable for faults in their cars, and in their cars' software, for a long time.

As for hacking: yeah, it's like the internet never works because of all these terrorists. Oh wait. You are all way overstating the threat.

Think about it: how are driverless cars different from right now? A laser pointer? Humans do better, really? A fake bag made to look like a kid, a hacked local OS that gives incorrect directions, hacked GPS, etc., etc.

There really isn't an increase in the possible threat vectors with driverless cars. Those threats all apply to new cars *right now*. And yet the world turns.
