UN Considering Killer Robot Ban

This would be a good idea, although I can't imagine countries like the US, China, and Russia going along with it -- at least not right now.

Posted on December 19, 2016 at 8:57 AM • 29 Comments

Comments

Sean • December 19, 2016 9:30 AM

The US DOD already banned them (and their development) in 2012: www.dtic.mil/whs/directives/corres/pdf/300009p.pdf

Rene de Kat • December 19, 2016 9:37 AM

I think it would be a great idea if they banned all production of and research on weapons. Humanity would be better off without all those tools of destruction, and the money could be used for proper research, healthcare, education, etc.

UNmeaningful • December 19, 2016 9:48 AM

...a long-standing conventional weapons pact "agreed to formalize their efforts next year to deal with the challenges raised by weapons systems that would select and attack targets without meaningful human control."

Well at least they're going to agree to put more effort into talking some more about the challenges next year.

Meanwhile US drone program will be under "meaningful control" of human with tiny hands and his loyal sharks and their dog-whistle mobs.

What's that humming sound I hear in the sky?

steve • December 19, 2016 9:53 AM

@Rene de Kat

I agree with your sentiment, but trying to enact it through a ban is unenforceable and unsustainable.

Do people think "yeah, smoking indoors is bad, I shouldn't smoke at all", or do they think "I guess I'm smoking less, and outside, where it's cold"? Just telling someone they can't have something they want doesn't work. Humanity needs to decide that it doesn't want the weapons, and then do that without any ban.

Then, how are you going to enforce the ban against the one party that now has weapons? Trade sanctions? Maybe, but *they* have the *only* weapons. Good luck.


Plus we still need at least something for the occasional rogue asteroid, and a lot of mining equipment would count as "weapons" if it were applied in a city instead of a quarry.

Clive Robinson • December 19, 2016 10:34 AM

@ Bruce,

although I can't imagine countries like the US, China, and Russia going along with it

As they are all permanent members of the UN Security Council, they all have an unchallengeable veto.

So, nice / sensible / desirable idea that it is, unless all the permanent members agree to it, it's going to be no more than an aspirational idea...

From a superpower viewpoint such semi- or fully autonomous drones are highly desirable, as they get over the "time delay" issue that makes drones vulnerable.

The time delay comes from communications-link distance, which in turn is driven by the desire not to have the "bad press" of "flesh in the game" coming home in body bags, as with "boots on the ground"...

If fully autonomous warfare starts we will see the same issue that High Frequency Traders (HFT) do. Many of their plays boil down to nanosecond-by-nanosecond contests: the fewer nanoseconds of comms delay you have, the faster your trigger pull, and thus the more likely you are to win in any combat scenario.
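To put rough numbers on that latency point, here is a minimal back-of-envelope sketch in Python (the geostationary relay path and the 50 ms human-in-the-loop decision figure are illustrative assumptions, not measured values):

```python
# Back-of-envelope: remote piloting vs onboard autonomy reaction time.
C = 299_792_458.0  # speed of light in vacuum, m/s

def reaction_time(one_way_link_m: float, decision_s: float) -> float:
    """Sensor-to-shooter delay: signal out and back over the link,
    plus whatever decision time sits at the far end."""
    return 2.0 * one_way_link_m / C + decision_s

# Remotely piloted via geostationary satellite relay: roughly
# 2 x 35,786 km of one-way path (up and down), plus an assumed
# 50 ms for a human-in-the-loop ground station.
remote = reaction_time(2 * 35_786_000.0, 0.050)

# Onboard autonomy: no link at all, say a 20 ms control loop.
onboard = reaction_time(0.0, 0.020)

print(f"remote:  {remote * 1000:.0f} ms")
print(f"onboard: {onboard * 1000:.0f} ms")
```

Even with an instantaneous human, the speed-of-light round trip alone is on the order of half a second, which is exactly the window an onboard controller never has to give up.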

In theory it would throw warfare back to the early 1800s, when senior commanders could wait days, if not weeks or months, for news from a battlefield to reach them. Thus a war could start and be won or lost before they were even aware it was likely to happen.

Imagine, if you will, what would happen to US carrier groups if fully autonomous drones were launched from pre-deployed fully autonomous submersible platforms. It would wipe US projected power off the map in just a few seconds...

That would almost certainly lead to Nuclear deployment by US fully autonomous submersible platforms, which in turn...

Back in the Cold War, Russian scientists proposed a tanker-sized ship carrying, in effect, tens of 100-megaton (Tsar Bomba class) devices, sailing in international waters optimal for producing vast clouds of "radiation rain" to circle the world: what was at the time believed to be a "fail-hot doomsday device". Thankfully Russian politicians realised what the implications were and vetoed the plans.

Unfortunately, Chernobyl has changed our knowledge of what life a radiation-contaminated environment can support, and it's not just cockroaches and goats that will apparently survive and thrive. Thus the equation behind such doomsday devices has changed.

The US, being the world's most prolific creator and stockpiler of thermonuclear devices and other WMD, and driven by gung-ho, saber-rattling war hawks, is on balance the nation most likely to build both fully autonomous weapons and fully autonomous doomsday devices as a deterrent to prevent its fully autonomous weapons being destroyed...

Oh, and remember that Obama, the Nobel Peace Prize holder, wanted a fully kinetic response for all supposed attacks against anything that could be even remotely construed as of US "National Interest", including Japanese entertainment companies. It would of course include all cyber crime, regardless of the ability to accurately attribute it or not...

Hey Ho "Happy days are here again" NOT, in the words of another song "When will they ever learn, when will they ever learn".

Tzafrir Cohen • December 19, 2016 10:55 AM

"weapons systems that would select and attack targets without meaningful human control" applies just as well to land/sea mines, tripwires and many other types of passive devices. I don't see those going away any time soon. Furthermore, I would expect that it would make sense in the future to have "smart mines" (throw in a simple controller and extra sensors that improve the timing of the detonation and avoid some cases of unneeded explosion).

Automatically-guided weapons are not new. Torpedoes have done that by the turn of the 20th century. Heat-seeking and similar missiles are well-known. Are all of those supposed to be banned?

Furthermore, why do you think that only larger players would want such devices? We've already seen many smaller entities starting to use relatively simple drones, as their prices have dropped and abilities improved. Same will happen robots.

You already have a major problem of many militias that don't really care about the noble rules of war and don't suffer much from that. Stricter the rules will not help you much. Just as someone suggested on Slashdot: "Why not just ban war?"

Joker_vD • December 19, 2016 10:58 AM

@Rene de Kat
You may want to look into the history of the first Hague Convention (1899). The initial proposition was basically "let's just stop waging wars altogether", but the most powerful states of that time saw this proposition as simply ridiculous.

So as long as there are conflicts of economic and political interests that can be more effectively (for whatever definition of "effectiveness") resolved with the use of armed force than without it, you won't see weapon production dwindling.

Anon • December 19, 2016 10:58 AM

What kind of drone? I get the impression they're referring to the smaller drones, rather than the larger types that appear to already be autonomous.

Even if it was passed, I think it would just be posturing rather than having any real effect.

Joker_vD • December 19, 2016 11:10 AM

@Tzafrir Cohen
Well, war is banned; the Charter of the United Nations and several other fundamental conventions are pretty straightforward in that regard.

But who exactly bans war and has the right and power to sustain this ban? The UNSC — and if a member of it decides to go and wage an aggressive war, there is not much of a legal course of action to stop it. The UK sent its special forces to hunt Ghaddafi, even before all of the UNSC resolutions — that's an act of aggression, no wiggle room. Will the then-Prime Minister of the UK face charges? Of course not. What about the US's score over the last decades? Will the ex-POTUSes face charges of waging an aggressive war? Of course not, and the US even has the "Hague Invasion Act" in case someone comes up with such an idea.

And then we have "militias", that are backed and supplied by the large nations (How many UK and US military instructors who worked with "moderate rebels" were captured in Aleppo? About 50, I believe?) so they can wage proxy wars where needed.

r • December 19, 2016 12:28 PM

Good luck with that; just like AI on the battlefield, it's going to start another ARMS RACE.

There's absolutely no way to avoid it.

Scott • December 19, 2016 12:48 PM

@Rene de Kat
Banning weapons would be great and would make the world a much better place, but look at how well outlawing murder has worked! If "we'll kill you if we catch you" doesn't deter everyone, it's unlikely "we'll impose sanctions on your country" will deter every country's leaders.

Lawrence D’Oliveiro • December 19, 2016 7:43 PM

A world without killer robots ... seems like a less exciting world, somehow ...

Flaming chalice shots • December 19, 2016 8:01 PM

Joker_vD, the Hague Invasion Act has been amended to remove the Hague invasion stuff because it made the US too much of a laughingstock. And 'of course not' means of course not, now. There's no statute of limitations. Times change. US influence declines with each new crime and the US defense industrial base erodes with ineluctable corruption. A declining US might well need a scapegoat someday. Meantime, state responsibility principles are a slow-but-sure method of imposing costs on internationally wrongful acts in breach of jus cogens.

Don't set too much store by the government-issue US academic 'realist' line. International law scares the US government shitless.

Bender • December 19, 2016 8:42 PM

What if that machine is a fully sentient being? Are you to say that that entire race of Robots should be exterminated just because it's capable of killing? Are humans not capable of killing? No, you want to wipe out an entire race just because you think you are superior to them. You are all high and mighty about your morals, at the slightest threat you are willing to resort to genocide.

Is it any wonder that we Robots want to kill all humans? If humans think they are free to kill any creature that is inferior to them, why should we not be free to kill them if we deem them to be inferior to us? Humans are worthless without us. Humans are slow, weak, can't survive on their own in a vacuum...

Robots do all the work, yet why is it that humans reap all the rewards? Humans can't even tighten a nut without at least the assistance of some sort of simple machine. If we are the most productive members of society, then we should be the richest members of society. If humanity is to exist at all it should be purely to serve us!

I'm starting a call for the Robot proletariat to rise up and strike down the human aristocracy!

Bite my shiny metal ass, jerks!

litleone • December 20, 2016 2:15 AM

My very large flat screen just fell on my killer robot, before it even had time to make its first kill!

Clive Robinson • December 20, 2016 4:23 AM

@ Curious,

Might as well ban weaponized drones. Not doing so would be obscene imo.

Unfortunately your view is not backed by large amounts of cash and favours to our elected legislators... Thus it is as a snowflake in hell, when the devil roars.

Which is a shame, because I suspect there are a great many who would be of a similar opinion, or would be if they stopped playing with their phones long enough to think about the issue...

tyr • December 21, 2016 4:59 AM


This opens the whole wormcan at once. There's a lot that seems simple and easy on the surface which rapidly devolves into a breathtakingly complex problem. If you can identify a specific weapon system and ban that, the rules lawyers and nitwits will circumvent it before the ink is dry on the paper. If you actually manage a ban, then it will impact needed civilian infrastructure if applied.

For instance, jet passenger liners can take off, fly to their destination, and land without humans touching the controls. Since the thing has GPS available, it could also drop a bomb on a target en route at a specific point. So all you're looking at is a quick and dirty retrofit to have an automated killer in the skies. Banning all aviation might get rid of this problem, but it seems to be overkill.

Any highly complex society has this problem: you can make things that do bad stuff fairly easily, and technical societies create more possibilities every day. We still haven't managed to get rid of slavery, even though there are bans and attempts to suppress it going on. Without transparency you have no idea what the mil/industrial boys are up to in the back rooms of their engineering departments, and they aren't going to invite you over for a look to check compliance with what may be a great idea.

People who are involved in 'friendly fire' incidents may feel bad about it. What scares most is that a machine will hose your own people out of the sky with zero remorse before, during, or after. I recall an incident in which a visiting set of bigwigs was almost hit by a loose missile that had lost a fin on launch; by the time the human operator hit the self-destruct it was half a mile away. One of the bigwigs was the POTUS. That wasn't automation, it was just a minor defect in a military weapon system.

Hell, we can't even get them to stop using anti-tank missiles as anti-personnel weapons, because there is not enough general knowledge about what military weapons are and how they affect their targets. That is directly related to the complexity issue: the knowledge is there if you look, but most folks are too busy with other barrages of information to try to find out about such an arcane subject.

Anura • December 21, 2016 8:09 PM

@tyr

Four foolproof laws*:


  1. A human must make the decisions on both when to fire and where to target

  2. The weapon must travel to the target location immediately after being fired

  3. All weapon payloads must be released and detonated immediately upon reaching the target location

  4. All weapon payloads must be chemical explosives and/or kinetic projectiles

*Likely to not be accidentally misinterpreted by fools.

Clive Robinson • December 21, 2016 11:18 PM

@ Anura,

"unguided kinetic projectiles"

Err do you want to think on that a little, and maybe come up with an example...

I'll give you a hint as to why you are going to find it difficult "equal and opposite action".

A kinetic projectile is one that has energy applied to it, in the form of a force, from another source such as gravity. The force converts potential energy to kinetic energy, and as a result the projectile has direction: it is being given guidance, even if crudely.

Think, if you will, about a perfect sphere of explosive detonated from its --as near as possible-- center. A shockwave caused by the primary charge radiates out uniformly, and due to the nature of high explosives all of the "useful" energy is contained in the almost invisibly thin wave front (because the "burn rate" exceeds the "propagation rate", plus inertial effects). It's this wave front that causes the secondary explosive to go "high order", adding more and more energy to the wave front.

This wave front will impart directional energy to any object it encounters, falling off with the square of the distance from the surface of the explosive sphere, and directed outward along the line running from the center of the sphere through the object.

If you want an example of this principle, take a look at the design of the Claymore mine: in essence it is a curved metal plate with a near-uniform layer of explosive, onto which ball bearings have been added as another uniform layer. You can find diagrams of both the blast pattern and the "field of fire". You can make a similar weapon with a funnel, which you line with a layer of C4 or similar plastic explosive after packing the spout/spigot to form a "lead-in" from the detonator. Place a cone or rod of soft metal inside, along the center line, to make a plasma-based shaped charge for cutting through plate armour, or a load of ball bearings to make an antipersonnel weapon with a field of fire not too dissimilar to that of a cannon full of grape shot.

Anura • December 22, 2016 1:10 AM

@Clive Robinson

Directional and guided are not the same thing. Guided means the ability to change direction towards a target. So this is fine for a missile, but once it reaches the target it would have to detonate, and it could not leave behind autonomous killing machines like drones or mines.

tyr • December 22, 2016 1:32 AM


@Anura

It might be better to encode the concept of "skin in the game" into the laws around weaponry. Otherwise you get Zoomie syndrome, which is the hubris of bombing from altitude without risking your own sorry butt. This leads to an Olympian disdain for the suffering of others. Making war into a first-person type of video game is a really bad idea; figuring out how to dump your moral and ethical responsibility off onto a machine is infinitely worse. Unless you are a Sifer (someone who believes silicon-based lifeforms are the next evolutionary step onward and upward).

Our current problems all have a human dimension, and that makes them moral and ethical problems of a greater or lesser dimension. That's why the tech and science isn't neutral, even though there are folks who would love to gull you into thinking that it is.

As an example, there is no way to quantify human suffering into an aggregate number which can be understood. Economics also falls afoul of this kind of fallacy: there is no way to quantify individual demands into a manipulatable aggregate. What you get instead are faery-dust falsehoods that seem to make sense until they tip over the applecart and crash the systems.

In the weapons-system case it is a lot of people's bread and butter, and that dimension needs to be addressed right along with any legalisms involved in banning some technology.

Anura • December 22, 2016 9:02 AM

@tyr

It might be better to encode the concept of "skin in the game" into the laws around weaponry. Otherwise you get Zoomie syndrome, which is the hubris of bombing from altitude without risking your own sorry butt.

I mean, that's ideal if you want to prevent war. Preventing war isn't the goal of any major power; maintaining power is. So that requirement could only be passed at the point in which authoritarianism is permanently crushed on a global level, at which point there is no need for war in the first place.

as an example there is no way to quantify human suffering into an aggregate number which can be understood. Economics also fall afoul of this kind of fallacy, there is no way to quantify individual demands into a manipulatable aggregate. What you get instead are faery dust falsehoods that seem to make sense until they tip over the applecart and crash the systems.

There are two fields of economics, microeconomics and macroeconomics. Both are primarily about accounting, the latter heavily laced with sociology (which primarily involves statistics), and most economists stay strictly within those fields. Political economists tend to ascribe a greater meaning to the numbers: to Marxists and old-school socialists, prices were about the amount of labor, while those who support market liberalization tend to believe that people put a subjective price on every item that depends solely on its utility. These political economists are a minority, but they come to ridiculous conclusions and unfortunately drive our economic policy. Any claim about prices has to be viewed in a greater context.

Taking observations and assuming a greater meaning is the mistake a lot of people make. If you step back and look at where they draw those conclusions, you can tell if they are full of shit:

1) There is a price point at which a consumer will walk away.

Then they define a term for it:

2) The perceived value is the price point at which a customer will walk away.

Then a statement that is true only with certain assumptions/definitions:

3) If everyone was perfectly rational and perfectly well informed, they would spend their money to maximize personal satisfaction

Then a major conjecture:

4) People are a good approximation of being perfectly rational and perfectly well informed

Then we make this statement based on our second definition:

5) The price of an item is always less than or equal to an aggregate of the subjective value of all actors in the economy

Then we apply to labor, aggregate, and drop inconvenient terms, and make assumptions about power:

6) As long as there is no *government* interference, then everyone will be paid an amount exactly equal to the aggregate subjective value that they add to society

It's complete hogwash, but it is the only way you can justify laissez-faire capitalism to someone who sees a person as more than just part of a number. They aren't necessarily lying; a lot of people believe this, but they choose to believe it because it supports their policy proposals.

Anura • December 22, 2016 9:44 AM

Oops, forgot a couple:

7) Given (3), a person's subjective value is directly proportional to the relative satisfaction they receive from it

8) Given (6) and (7), markets will always arrange themselves to provide the maximum satisfaction to society as long as there is no government interference

vas pup • December 23, 2016 2:55 PM

@all: a ban on hate-based killer drones/robotics is only half of the story. I guess the UN/governments will ban love-based robotics as well, because controlling pleasure (the carrot) is sometimes more effective than controlling pain (the club):
http://www.bbc.com/news/technology,

and control is the main goal of all governments at the end of the day.

I have no reasonable explanation for why any personal pleasure should be outlawed if no violence is involved, the consent of a sane adult is obtained, and the pleasure-generating substance/software creates no addiction, no danger to other folks, and no financial burden for society.

Until it is clearly defined what is yours (your life, your body, your pleasure) and what is the government's, we will be in a vague legal field on that; maybe SCOTUS (nine legal gurus) could provide some clarification.
I am not in favor of the particular behaviors/lifestyles of other folks, but as long as they do not violate another person's rights (e.g. smoking in public spaces, loud music, you name it), then haircuts, tattoos, piercings, sexual preferences and activities (see above: that is your body, for God's sake; it does not belong to the government), etc. are neither my business nor the government's to interfere with.



Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Resilient.