
Mind-reading technology: The security and privacy threats ahead

Brain computer interface technology is developing fast. But just because we can read data from others' minds, should we?
Written by Jo Best, Contributor

Since the dawn of humanity, the only way for us to share our thoughts has been to take some kind of physical action: to speak, to move, to type out an ill-considered tweet.

Brain computer interfaces (BCIs), while still in their infancy, could offer a new way to share our thoughts and feelings directly from our minds through (and maybe with) computers. But before we go any further with this new generation of mind-reading technology, do we understand the impact it will have? And should we be worried?

Depending on who you listen to, the ethical challenges of BCIs are unprecedented, or they're just a repeat of the risks brought about by each previous generation of technology. Because BCIs have seen such limited real-world use so far, there's little practical experience to show which view is closer to the truth.

The future of privacy

It's clear that some ethical challenges that affect earlier technologies will carry across to BCIs, with privacy being the most obvious.

We already know it's annoying to have a username and password hacked, and worrying when your bank account details are stolen. But BCIs could mean that eventually it's your emotional responses that are stolen and shared by hackers, with all the embarrassments and horrors that go with that.

BCIs offer access to the most personal of personal data: inevitably they'll be targeted by hackers and would-be blackmailers; equally clearly, security systems will attempt to keep data from BCIs as locked down as possible. And we already know that the defenders don't win every time.

By the time BCIs reach the consumer world, something like privacy settings -- the kind of controls that let browser or app users turn services on or off according to their own personal privacy threshold -- might also be deployed around BCIs.
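
To make that idea concrete, here is a minimal sketch of what such opt-in controls might look like, written in Python. Everything here is hypothetical -- the class, the channel names, and the deny-by-default policy are illustrative assumptions, since no consumer BCI permission API exists today.

```python
# Purely hypothetical sketch of consumer BCI privacy settings, modelled on
# today's browser/app permission toggles. No such API exists; names are
# illustrative only.
from dataclasses import dataclass, field

@dataclass
class BCIPrivacySettings:
    """Per-channel consent toggles, defaulting to deny (opt-in, not opt-out)."""
    channels: dict[str, bool] = field(default_factory=lambda: {
        "motor_intent": False,       # e.g. cursor control, the core assistive use
        "emotional_state": False,    # the kind of data hackers might target
        "raw_signal_sharing": False, # sending unprocessed readings to third parties
    })

    def allow(self, channel: str) -> None:
        # Grant access to a single named data channel.
        if channel not in self.channels:
            raise KeyError(f"Unknown data channel: {channel}")
        self.channels[channel] = True

    def is_permitted(self, channel: str) -> bool:
        # Anything not explicitly granted is refused.
        return self.channels.get(channel, False)

settings = BCIPrivacySettings()
settings.allow("motor_intent")
assert settings.is_permitted("motor_intent")
assert not settings.is_permitted("emotional_state")
```

The design choice worth noting is deny-by-default: given the sensitivity of the data, any channel the user hasn't explicitly switched on stays off.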

One reason for some optimism: there will also be our own internal privacy processes to supplement security, says Rajesh Rao, professor at the University of Washington's Paul G. Allen School of Computer Science & Engineering.

"There's going to be multiple protective layers of security, as well as your brain's own mechanisms for security -- we have mechanisms for not revealing everything we're feeling through language right now. Once you have these types of technologies, the brain would have its own defensive mechanisms which could come into play," he told ZDNet.

The military mind

Another big issue: like generations of new technology before it, from the internet to GPS, some of the funding behind BCI projects has come from the military.

As well as helping soldiers paralysed by injuries in battle regain the abilities they've lost, it seems likely that the military's interest in BCIs will lead to the development of systems designed to augment humans' capabilities. For a soldier, that might mean the chance to damp down fear in the face of an enemy, or patch in a remote team to help out in the field -- even connect to an AI to advise on battle tactics. In battle, having better tech than the enemy is seen as an advantage and a military priority.

There are also concerns that military involvement in BCIs could lead to brain computer interfaces being used as interrogation devices, potentially being used to intrude on the thoughts of enemy combatants captured in battle.

The one percent get smarter

If the use of BCIs in the military is controversial, the use of the technology in the civilian world is similarly problematic.

Is it fair for a BCI-equipped person with access to external computing power and memory to compete for a new job against a standard-issue person? And given the steep cost of BCIs, will they just create a new way for the privileged few to beat down the 99 percent?

These technologies are likely to throw up a whole new set of social justice issues around who gets access to devices that can allow them to learn faster or have better memories.

"You have a new set of problems in terms of haves and have nots," says Rao.

This is far from the only issue this technology could create. While most current-generation BCIs can read thoughts but not send information back into the brain, future generations may well be able to both send and receive data.


The effect of having computer systems wirelessly or directly transmit data to the brain isn't known, but related technologies such as deep brain stimulation -- where electrical impulses are sent into brain tissue to regulate unwanted movement in medical conditions such as dystonias and Parkinson's disease -- may cause personality changes in users (though the strength of the link is still a matter of debate). 

And even if BCIs did cause personality changes, would that really be a good enough reason to withhold them from someone who needs one -- a person with paraplegia who requires an assistive device, for example?

As one research paper in the journal BMC Medical Ethics puts it: "the debate is not so much over whether BCI will cause identity changes, but over whether those changes in personal identity are a problem that should impact technological development or access to BCI".

Whether regular long-term use of BCIs will ultimately affect users' moods or personalities isn't known, but it's hard to imagine that technology that plugs the brain into an AI or an internet-scale repository of data won't ultimately have an effect on personhood.

Historically, the bounds of a person were marked by their skin. But where does 'me' start with a brain that's linked up to an artificial intelligence programme, and where do 'I' end when my thoughts are linked to vast swathes of processing power?

It's not just a philosophical question, it's a legal one too. In a world where our brains may be directly connected to an AI, what happens if I break the law, or just make a bad decision that leaves me in hospital or in debt?

The corporate brain drain

Another legal front that will open up around BCI tech could pit employee against employer.

There are already legal protections governing how physical and intellectual property are handled when an employee works for, and then leaves, a company. But what if a company doesn't want the skills and knowledge a worker built up during their employment to walk out in their head when they leave the building?

Dr S Matthew Liao, professor of bioethics at New York University, points out that it's common for a company to ask for a laptop or phone back when you leave a job. But what if you had an implant in your brain that recorded data?

"The question is now, do they own that data, and can they ask for it back? Can they ask for it back -- every time you leave work, can they erase it and put it back in the next morning?"


Bosses and workers may also find themselves at odds in other ways with BCIs. In a world where companies can monitor what staff do on their work computers or put cameras across the office in the name of maximum efficiency, what might future employers do with the contents of their BCIs? Would they be tempted to tap into the readings from a BCI to see just how much time a worker really spends working? Or just to work out who keeps stealing all the pens out of the stationery cupboard?

"As these technologies get more and more pervasive and invasive, we might need to read to rethink our rights in the workplace," says Liao. "Do we have a right to mental privacy?"

Privacy may be the most obvious ethical concern around BCIs, and for good reason: we want our thoughts to remain private, not just for our own benefit, but for others' as well.

Who hasn't told a lie to spare someone's feelings, or thought cheerfully about doing someone harm, safe in the knowledge they have no intention of ever doing so? Who wouldn't be horrified if they knew every single thought that their partner, child, parent, teacher, boss, or friend thought?

"If we were all able to see each other's thoughts, it would be really bad - there wouldn't be any society left," said Liao.

If BCIs are to spread, perhaps the most important part of using 'mind-reading' systems is to know when to leave others' thoughts well alone.
