
Robots need rights, and kill switches too, warn politicians

Robots could be given 'electronic person' status, says report, which argues a legal framework is urgently needed for robots and self-driving cars.
Written by Steve Ranger, Global News Director

Should robots be classed as 'electronic persons'?

Image: iStock

The European Union needs to set rules for the use of robots to settle issues around ethics, safety and security, its legal affairs committee has said, a move that could see robots gain legal status as 'electronic persons'.

MEPs said that EU-wide rules are needed to exploit the economic potential of robotics and artificial intelligence, and to guarantee safety and security. The EU needs to take the lead or risk seeing the rules around robots being set by others, the committee's report warned.

The MEPs are calling for a new European agency for robotics, as well as a code of ethical conduct that would establish who is accountable for the social, environmental and human health impacts of robotics and ensure that robots operate in accordance with legal, safety and ethical standards.

They said the code should recommend that robot designers include "kill" switches, so that robots can be turned off in emergencies.

The group said rules are most urgently needed for self-driving cars, and proposed an obligatory insurance scheme and a fund to ensure that victims are fully compensated in accidents caused by driverless cars.

In the longer term, MEPs say, the EU should also consider creating a specific legal status of "electronic persons" for the most sophisticated autonomous robots, so as to clarify responsibility in cases of damage.

Europe also needs to look at the impact that increased use of robotics will have on society, including new employment models and whether the current tax and social systems remain viable.

While robots are increasingly becoming a part of everyday life -- some 1.7 million robots already exist worldwide -- their use is still not properly regulated, the report argues.

Europe should be ready for the rise of self-learning robots, said report author Mady Delvaux. One option could be to give robots a limited "e-personality", something like a "corporate personality", the legal status that enables firms to sue or be sued.

"What we need now is to create a legal framework for the robots that are currently on the market or will become available over the next 10 to 15 years," she said.

The report also raises concerns about allowing vulnerable people to become emotionally attached to their care robots.

"We always have to remind people that robots are not human and will never be. Although they might appear to show empathy, they cannot feel it. We do not want robots like they have in Japan, which look like people. We proposed a charter setting out that robots should not make people emotionally dependent on them. You can be dependent on them for physical tasks, but you should never think that a robot loves you or feels your sadness," Delvaux said.

These aren't the only ethical issues being discussed around the use of robots. The United Nations is discussing whether to allow the use of artificial intelligence on the battlefield, in the form of 'killer robots'. And, of course, a set of laws for robots has already been formulated: Asimov's three laws, which state:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
