
UN urged to ban fully autonomous drones, weapons before they exist

Are so-called "killer robots" on the horizon, and what should be done about the potential future of autonomous weaponry?
Written by Charlie Osborne, Contributing Writer
An Israeli drone operator. Credit: Charlie Osborne
The United Nations is under pressure to ban fully autonomous weapons before they are developed, following a new report which details how a lack of regulation could result in human deaths without accountability.

In a new report released by Human Rights Watch and Harvard Law School, the groups argue that so-called "killer robots," fully autonomous weapons able to inflict harm without operators, should be banned before they come into existence.

At the moment, drones and autonomous vehicles -- ranging from sensor-laden scouts to consumer hobby drones and self-driving cars -- are being developed at a rapid pace. Companies including Amazon are harnessing the technology for delivery purposes, Google is experimenting with a fully self-driving car, and Parrot offers a range of hobby drones to consumers.

Considering the state of technology only a few decades ago, the prospect of these machines being taken a step further for military use is not outside the realm of possibility. While regulators are exploring different avenues for the regulation of consumer drones and unmanned aerial vehicles (UAVs), the report argues that rather than letting regulation play catch-up with technology, lawmakers should set rules in place before such technology arrives.

See also: FAA to impose restrictions on commercial drone use

As reported by The Guardian, the report says that under current laws, programmers, manufacturers and military personnel would all escape liability for deaths caused in the field by fully autonomous weaponry. The report, titled "Mind the Gap: The Lack of Accountability for Killer Robots," also suggests that no legal framework is likely to emerge which would clearly state where responsibility lies in the production and deployment of such weapons -- and therefore no retribution or restitution when errors occur.

"Fully autonomous weapons do not yet exist, but technology is moving in their direction, and precursors are already in use or development," the report argues. "For example, many countries use weapons defense systems -- such as the Israeli Iron Dome and the US Phalanx and C-RAM -- that are programmed to respond automatically to threats from incoming munitions. In addition, prototypes exist for planes that could autonomously fly on intercontinental missions (UK Taranis) or take off and land on an aircraft carrier (US X-47B)."

The controversial factor in autonomous weaponry is the lack of meaningful human control in selecting and engaging targets. By ceding control to a machine, there is the possibility of civilians being targeted instead of military targets, a potential arms race to develop more sophisticated and dangerous weaponry, and "proliferation to armed forces with little regard for the law," the report suggests.

"Existing mechanisms for legal accountability are ill-suited and inadequate to address the unlawful harms fully autonomous weapons might cause," the groups argue. "These weapons have the potential to commit criminal acts -- unlawful acts that would constitute a crime if done with intent -- for which no one could be held responsible. A fully autonomous weapon itself could not be found accountable for criminal acts that it might commit because it would lack intentionality."

Drones and automated weaponry currently used by governments are defended on the grounds that a human operator is always behind the decision to pull the trigger or not. A person can therefore be held accountable in cases of war crimes and misuse. However, researchers from Human Rights Watch and Harvard Law School believe military personnel and operators could "not be assigned direct responsibility" for the actions of a fully autonomous weapon, except in rare situations where intent to misuse such weapons can be proved. The report states:

"An alternative approach would be to hold a commander or a programmer liable for negligence if, for example, the unlawful acts brought about by robots were reasonably foreseeable, even if not intended. Such civil liability can be a useful tool for providing compensation for victims and provides a degree of deterrence and some sense of justice for those harmed. It imposes lesser penalties than criminal law, however, and thus does not achieve the same level of social condemnation associated with punishment of a crime."

The report continues:

"The lack of meaningful human control places fully autonomous weapons in an ambiguous and troubling position. On the one hand, while traditional weapons are tools in the hands of human beings, fully autonomous weapons, once deployed, would make their own determinations about the use of lethal force.
They would thus challenge long-standing notions of the role of arms in armed conflict, and for some legal analyses, they would be more akin to a human soldier than to an inanimate weapon.
On the other hand, fully autonomous weapons would fall far short of being human."

Human Rights Watch and Harvard Law School recommend that the "development, production and use" of fully autonomous weapons be prohibited through a legally binding international instrument, and that national laws be adopted to prevent such weaponry from being developed domestically.

The report has been released ahead of a meeting of international officials at the UN in Geneva later this month, which will include a discussion on the regulation of emerging military technology.
