Weapons shouldn’t be able to decide by themselves to end a life – Hawking, Musk, Wozniak sign letter requesting a ban on autonomous weapons and military AI

One of the cornerstone events in Frank Herbert’s fictional Dune Universe is the Butlerian Jihad – an empire-wide crusade against thinking machines and AI of any kind.

Jihad, Butlerian: (see also Great Revolt) — the crusade against computers, thinking machines, and conscious robots begun in 201 B.G. and concluded in 108 B.G. Its chief commandment remains in the O.C. Bible as “Thou shalt not make a machine in the likeness of a human mind.”

A militant group calling themselves the Titans used humanity’s over-reliance on technology to gain dominion over the entire human race. They transplanted their brains into mechanical bodies, becoming immortal and nearly unstoppable, and enslaved humankind. But after granting too much power over their computerized empire to the AI Omnius, they were overthrown by it. The rogue program saw no value in human life, and the deaths it caused drove humanity to rise up in revolt and, after the final victory, ban AIs and computers forever.

A photo from the ‘Campaign to Stop Killer Robots’ which called for a pre-emptive ban on lethal robot weapons in 2013.
Image via observer.com

The tale has all the makings of a great story – a hero you feel for, humanity as the underdog, and overbearing robot overlords. And, according to many researchers, programmers and tech experts, it may have something even more important, something every good story needs.

It may have a kernel of truth

Elon Musk and Stephen Hawking have both previously warned of the dangers of advanced AI. Musk said that AI is “potentially more dangerous than nukes,” while Hawking was far more optimistic, merely saying that AI is “our biggest existential threat.”

The two have added their names to a long list of scientific and technological heavyweights who have signed an open letter to be presented at the International Joint Conferences on Artificial Intelligence (IJCAI) in Buenos Aires tomorrow. Noam Chomsky, the Woz, and dozens of other AI and robotics researchers have also signed the letter, calling on the world’s governments to ban the development of “offensive autonomous weapons” to prevent a “military AI arms race.”

Most of the letter addresses the issue of today’s “dumb” robots, vehicles and munitions being turned into smart autonomous weapons. Cruise missiles and remotely piloted drones are acceptable, the letter says, because they cannot make the choice to destroy or kill by themselves, as “humans make all targeting decisions.”

So where do we draw the line?

The letter voices the concern of many scientists that weaponizing AI is a slippery slope that could very well lead to our extinction. The development of fully autonomous weapons that can fight and kill without human intervention should be nipped in the bud, scientists agree. And the letter warns us that once the first AI is weaponized, many more will follow:

“The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow,” the letter reads.

Later, the letter draws a strong parallel between autonomous weapons and chemical/biological warfare:

“Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons — and do not want others to tarnish their field by doing so, potentially creating a major public backlash against AI that curtails its future societal benefits.”

The letter is being presented at IJCAI by the Future of Life Institute. It isn’t entirely clear who the letter is addressed to, other than the academics and researchers who will be attending the conferences. Perhaps it’s just intended to generally raise awareness of the issue, so that we don’t turn a blind eye to any autonomous weapons research being carried out by major military powers.

The main issue with AI in general, and autonomous weapons in particular, is that they are transformational, game-changing technologies. Once we create an advanced AI, or a weapons system that can decide for itself whom to attack, there’s no turning back. We couldn’t put gunpowder or nuclear weapons back in the box once they were out, and autonomous weaponry would be no different.

There will always be an Ix and a Tleilax.

To tie the Dune parallel in a neat little bow: in the fictional universe, the planets Ix and Tleilax design and produce technology that was outlawed by the Butlerian Jihad but is tolerated by the Empire – a kind of technological “gray area.”

And the same issue stands with the letter. The history of global technology regulation warns us that making this kind of statement is much easier than achieving what it asks for. What exactly do we ban, and how do we make sure the ban sticks? The thousands of scientists who have signed the letter to ban military use of AI may have inadvertently argued for restrictions on their own ability to share software with international collaborators or develop future products.

As Patrick Lin, director of the Ethics & Emerging Sciences Group at California Polytechnic State University, told io9.com:

“Any AI research could be co-opted into the service of war, from autonomous cars to smarter chat-bots… It’s a short hop from innocent research to weaponization.”

