
Bad robot? Who gets the blame when a brain-controlled machine does wrong?

Scientists worry about a future that includes brainjacking - and about who should be responsible if brain-machine interfaces go wrong.
Written by Liam Tung, Contributing Writer

Advances in biotech and brain-machine interface computing, like the technology Elon Musk's Neuralink is working on, may one day make it common for our brains to be melded with computers. But how will we prevent an attacker from hacking such a device to make us hurt others or ourselves?

Another question is responsibility. As scientists from the Geneva-based Wyss Center for Bio and Neuroengineering put it, "Who is responsible if a brain-controlled robot drops a baby?"

While brain-machine interface (BMI) computing is currently used to help people with paralysis communicate, the scientists are concerned about the day when able-bodied people enhance themselves with BMIs.

They have published a paper in the journal Science entitled "Help, hope, and hype: Ethical dimensions of neuroprosthetics", which calls for new guidelines to ensure brain-machine interactions are safe.

"We don't want to overstate the risks nor build false hope for those who could benefit from neurotechnology. Our aim is to ensure that appropriate legislation keeps pace with this rapidly progressing field," said Professor John Donoghue, a co-author of the paper and director of the Geneva-based centre.

As with technologies like autonomous vehicles and artificial intelligence, BMI and a future of semi-autonomous robots raises ethical questions around accountability, responsibility, privacy and security.

There should, they argue, be a requirement for semi-autonomous robots to have a kill switch that can override a bug in the BMI. If a semi-autonomous robot lacked a reliable control or override mechanism, a person might be considered negligent for using it to pick up a baby, though not for less risky activities.

They're also concerned about the security of the brain's output data, which should be protected by encryption, and the possibility of "brainjacking" or the malicious manipulation of brain implants. Someone in a position of power or who holds valuable intellectual property could be at a higher risk of attacks targeting brain readouts.

Laurie Pycroft, a doctoral candidate at the University of Oxford's Nuffield Department of Surgical Sciences, raised concerns about brainjacking in a recent paper looking at the potential for attackers to drain brain-implant batteries, or to use the implants to induce pain, tweak emotions, and cause tissue damage.

Fellow Wyss Center researcher Professor Niels Birbaumer points out that the data produced by BMIs used by people with paralysis contains sensitive neuronal information, such as responses to personal questions.

"Strict data protection must be applied to all people involved, this includes protecting the personal information asked in questions as well as the protection of neuronal data to ensure the device functions correctly."
