Hawking, Musk and Wozniak call for ban on offensive AI weapons

Scientists fear autonomous killer quadcopters are just years away and have called for new rules to prevent their proliferation.

Hundreds of robotics and artificial intelligence researchers have called for a ban on autonomous weapons development.

In an open letter, the researchers issued a dire warning that a "global arms race is virtually inevitable". The letter calls for a "ban on offensive autonomous weapons", fearing that armed quadcopters designed to search for and eliminate people are just years away.

Among the signatories to the letter from the Future of Life Institute are several high-profile thinkers, such as Stephen Hawking and Elon Musk, both of whom have previously said AI is an existential threat to humankind. Other names include Apple co-founder Steve Wozniak, Demis Hassabis, CEO of Google-owned DeepMind, Noam Chomsky, Skype co-founder Jaan Tallinn, and Microsoft Research managing director Eric Horvitz.

The group concedes there are arguments that replacing human soldiers with machines is beneficial, since it reduces the likelihood of casualties. On the other hand, it also lowers the threshold for going into battle.


"The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow," the letter reads.

A concern raised in the letter is that AI weapons, unlike nuclear weapons, can be built with off-the-shelf equipment. As a result, they would be cheap and easy to mass produce, and could easily slip onto the black market and end up in the hands of terrorists, dictators, and warlords.

However, the letter doesn't call for an outright ban on AI on the battlefield - which it says can actually make life safer for civilians - only AI weapons designed to kill people.

As for the type of regulation the group wants to see in place, the letter points to international treaties that prohibit the development of chemical and biological weapons, as well as the 1995 ban on blinding laser weapons.

"Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons - and do not want others to tarnish their field by doing so, potentially creating a major public backlash against AI that curtails its future societal benefits," it states.

The call for a ban on autonomous offensive weapons stands well apart from more general warnings over the future of AI, which could see robots outsmart and out-evolve humans.

Hawking earlier this year told the BBC that "full artificial intelligence could spell the end of the human race", noting that it would have the capacity to evolve at a much faster rate than humans and ultimately supersede them.

Musk, an early investor in AI technology, has previously said it may be worth applying some regulatory oversight to AI, either at a national or international level.

One person who isn't concerned about machines outsmarting humans is Linux creator Linus Torvalds.

"I just don't see the thing to be fearful of," he said in a recent interview on Slashdot. "We'll get AI, and it will almost certainly be through something very much like recurrent neural networks. And the thing is, since that kind of AI will need training, it won't be 'reliable' in the traditional computer sense. It's not the old rule-based Prolog days, when people thought they'd *understand* what the actual decisions were in an AI."
