Tesla's Elon Musk and Mustafa Suleyman of Google's DeepMind are among a group of company founders calling for a ban on the development and use of autonomous weapons -- otherwise known as "killer robots".
In total, 116 founders of AI and robotics companies across 26 countries signed an open letter to the United Nations (UN) urging it to stop the arms race underway for killer robots.
"Once developed, lethal autonomous weapons will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways," the letter states.
"We do not have long to act. Once this Pandora's box is opened, it will be hard to close."
The UN's Review Conference of the Convention on Certain Conventional Weapons unanimously agreed to begin formal discussions on the threat posed by autonomous weapons such as drones, tanks, and automated machine guns. The group was scheduled to meet on August 21, but the meeting has been postponed to November.
The letter warns that this arms race threatens to usher in the "third revolution in warfare", after gunpowder and nuclear arms. As such, it argues, autonomous weapons should be added to the list of weapons prohibited under the UN's Convention on Certain Conventional Weapons, which already covers blinding laser weapons.
"Nearly every technology can be used for good and bad, and artificial intelligence is no different. It can help tackle many of the pressing problems facing society today: inequality and poverty, the challenges posed by climate change and the ongoing global financial crisis," said Toby Walsh, Scientia professor of artificial intelligence at the University of New South Wales, who is one of the key organisers of the letter.
"However, the same technology can also be used in autonomous weapons to industrialise war. We need to make decisions today choosing which of these futures we want."
Ryan Gariepy, founder and CTO of Clearpath Robotics, who was the first to sign the open letter, said that unlike other potential manifestations of AI that "still remain in the realm of science fiction", autonomous weaponry is on the cusp of development.
The letter, launching at the opening of the International Joint Conference on Artificial Intelligence in Melbourne on Monday, is not the first to encourage the UN to act on the threat of autonomous weapons, but it is claimed to be the first time so many AI and robotics companies have taken a joint stance on the issue.
A 2015 report, released by Human Rights Watch and Harvard Law School, detailed how a lack of regulation could cause human deaths without accountability. The report, Mind the Gap: The Lack of Accountability for Killer Robots, states that under current laws, programmers, manufacturers, and military personnel would all escape liability for deaths caused on the field by fully autonomous weaponry.
It also suggests that there is not likely to be any legal framework that would clearly state where responsibility lies in the production and deployment of such weapons -- and therefore no retribution or restitution when errors occur.
Another open letter, signed in 2015 by Musk, physicist Stephen Hawking, Apple co-founder Steve Wozniak, and thousands of researchers, called for a ban on the development of autonomous weapons, warning that armed quadcopters designed to search for and eliminate people could become a reality within years.
A concern raised in the 2015 letter was that unlike nuclear weapons, autonomous weapons can be built with off-the-shelf equipment. As a result, they'll be cheap and easy to mass produce, and could be traded on the black market, possibly ending up in the hands of terrorists, dictators, and warlords.