Part of a ZDNet Special Feature: Building the Bionic Brain

Mind-reading systems: Seven ways brain computer interfaces are already changing the world

From helping people regain their independence to updating Facebook faster, here are some of the ways brain computer interfaces are being trialled and implemented today.


Neuralink may be one of the best-known brain computer interface companies in the world right now, but organisations from big tech firms to tiny startups, along with neuroscience researchers, are working on projects aimed at linking human minds to computers.

Here are some of the experimental and real-world applications they've come up with so far.

Fighting paralysis

When Ian Burkhart had a diving accident aged 19, he lost the use of his hands and legs. Now, thanks to a BCI, he's able to perform seven different movements with his hand, and even play Guitar Hero.

Burkhart has had an electrode array implanted in his motor cortex – and a separate 'sleeve' that can pass on neural signals to his arms to tell them to move – for over five years, making him one of the longest-standing users of invasive BCIs in the world.

Battelle, the organisation behind the BCI system, had begun work on using the technology to overcome spinal injuries several years earlier; an estimated 5.5 million Americans are living with paralysis.

"The aim of the programme was to develop neural-bypass technology that can connect the brain to the hand or the limb that controls, and bypassing the injury to the spinal cord," says Gaurav Sharma, senior research scientist at Battelle. 

Now, when Burkhart thinks about moving his hand, the electrical impulses in the motor cortex of his brain -- the area that controls conscious muscle movements -- are passed directly to the muscles in his hand, leapfrogging the severed nerves in his spine, which are incapable of passing on the messages from the brain themselves due to his injury. 
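The core idea of a neural bypass — decode an intended movement from cortical activity, then trigger the matching muscle stimulation — can be sketched in a few lines. This is purely illustrative: the movement names, template decoder and channel counts below are assumptions for the sake of the example, not Battelle's actual system.

```python
import numpy as np

# Hypothetical movement classes; Burkhart's system supports seven.
MOVEMENTS = {0: "open_hand", 1: "close_hand", 2: "pinch"}

def decode_intent(firing_rates, templates):
    """Pick the movement whose cortical activity template best matches
    the current window of recorded firing rates (dot-product score)."""
    scores = templates @ firing_rates
    return MOVEMENTS[int(np.argmax(scores))]

def stimulate(movement):
    """Stand-in for driving the electrode sleeve on the forearm."""
    return f"stimulating muscles for: {movement}"

# Toy templates over 8 recorded channels, one row per movement.
templates = np.eye(3, 8)
window = np.array([0.1, 0.9, 0.0, 0.2, 0.1, 0.0, 0.3, 0.1])
print(stimulate(decode_intent(window, templates)))  # → stimulating muscles for: close_hand
```

Real decoders are trained machine-learning models running on hundreds of channels, but the structure — classify the neural window, map the class to a stimulation pattern — is the same.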

Since using the BCI, Burkhart has moved out of his parents' house and now lives alone; the amount of time he needs help from a caregiver has also been cut from 12 hours a day to four.

Burkhart's motor abilities have also improved even when he's not using the BCI. "Over the last five years he's been using the system, his ability to manipulate objects on his own without using the system has improved remarkably... He will tell you he's more coordinated when he's using his own hand to do things -- he can open a doorknob, which he wasn't able to do earlier, and he can very easily manipulate his phone," says Sharma.

Updating your socials

Social networks thrive on data, and what better source of fresh, personalised data than the human brain? Facebook has already been working on technology to allow people to type just by thinking. The uses of the technology for the company are obvious: what better way to sidestep the onerous process of, er, picking up your smartphone to post a status update, when you could just think it and have it appear on your feed instead? 

SEE: Facebook's 'mind-reading' tech startup deal could completely change how we control computers

There may be an upside for humanity nonetheless: as well as Facebook staffers, the researchers working on the project at the University of California, San Francisco are aiming to use the technology to help people with brain damage recover their ability to speak. The technology underpinning Facebook's effort is high-density electrocorticography (ECoG). To train the system, subjects are asked questions while under ECoG monitoring, and their neural signals are then matched to the particular words and phrases they produce. The first fruits of Facebook's research have been revealed: a working system able to recognise a small set of words and phrases from signals in the brain's speech centre.
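At its simplest, matching neural signals to words is a classification problem: record activity while a subject produces each word, then label new recordings by their nearest training pattern. The sketch below uses a toy nearest-centroid classifier on synthetic data — the vocabulary, feature size and method are illustrative assumptions, not the UCSF team's model.

```python
import numpy as np

VOCAB = ["yes", "no", "cold", "hot"]   # toy vocabulary for illustration

def train_centroids(features, labels):
    """Average the feature vectors recorded for each word."""
    return {w: features[labels == i].mean(axis=0) for i, w in enumerate(VOCAB)}

def decode_word(centroids, x):
    """Pick the word whose training centroid is closest to a new recording."""
    return min(centroids, key=lambda w: np.linalg.norm(x - centroids[w]))

rng = np.random.default_rng(2)
true_patterns = rng.standard_normal((len(VOCAB), 16))     # one neural pattern per word
labels = np.repeat(np.arange(len(VOCAB)), 20)             # 20 trials per word
features = true_patterns[labels] + 0.2 * rng.standard_normal((len(labels), 16))

centroids = train_centroids(features, labels)
probe = true_patterns[2] + 0.2 * rng.standard_normal(16)  # noisy new "cold" recording
print(decode_word(centroids, probe))  # → cold
```

The published systems use far richer features and deep networks, but the train-then-match loop is the same shape.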

Creating music therapy

A collaboration between researchers -- including neuroscientists, biomedical engineers, and musicians -- has been looking at the potential for BCIs to be used with music. They are working on a system that could analyse a person's emotional state using their neural signals, and then automatically develop an appropriate piece of music. For example, if you're feeling down, the system's algorithms could write you a piece of music to help lift your mood. 
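One common way to formalise "emotional state" in this kind of system is a valence/arousal pair, which can then be mapped to musical parameters such as tempo and mode. The mapping below is a minimal sketch under that assumption — the thresholds, ranges and the idea of nudging valence upward are illustrative, not the researchers' published algorithm.

```python
def music_parameters(valence, arousal):
    """Map an estimated emotional state (each value in [-1, 1]) to simple
    musical parameters, nudging the listener toward a more positive state."""
    target_valence = min(1.0, valence + 0.5)   # aim slightly happier than now
    tempo = int(70 + 50 * (arousal + 1) / 2)   # map arousal onto 70-120 BPM
    mode = "major" if target_valence >= 0 else "minor"
    return {"tempo_bpm": tempo, "mode": mode}

# A low-mood, low-arousal listener gets a slow piece, still in a minor mode
# because the target valence remains slightly negative.
print(music_parameters(valence=-0.6, arousal=-0.2))  # → {'tempo_bpm': 90, 'mode': 'minor'}
```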

The system has been tested on healthy volunteers, as well as on one individual with the neurodegenerative condition Huntington's disease, which causes depression and low mood.

"Part of the reason someone might have a music therapy session is because they have trouble understanding their own emotions or expressing their own emotions, so the idea is to use music and the skills of the therapist, and potentially this device is better in helping them understand their emotions," says Ian Daly, lecturer at the University of Essex's School of Computer Science and Electronic Engineering.

Telepathic Tetris

Using BCIs for gaming is one thing, but using them for collaborative gaming? Yep, that's possible too. Arguably more a brain-computer-brain interface than a brain computer one, recent research published by the University of Washington allowed three people to play a Tetris-type game by networking their brains. 

The game was the culmination of years of work on machine learning to decode someone's intended movements from an EEG. "There was a question that came up which was, if the signal that you extract is being sent to a robotic device or cursor on a screen, what if you could send that signal directly to a person's brain?" says Rajesh Rao, professor at the University of Washington's Paul G. Allen School of Computer Science & Engineering and a co-director of the university's Center for Neurotechnology.


The game, called BrainNet, requires a collaborative effort between three players to rotate onscreen blocks until they fit into differently shaped gaps below. Two players could see both the gap and block but not rotate the block, while one could rotate the block but not see the gap. To instruct another person to rotate a block, a player would concentrate on the word 'yes' or 'no' on a screen, each linked to an LED flashing at a different rate. EEG caps would read the brain activity corresponding to the particular light, and pass it on to the other player's cap, generating a flash in their field of vision for 'yes' and no flash for 'no', using a technology called transcranial magnetic stimulation. By working together, the teams were able to succeed 80% of the time.
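The "concentrate on a flashing light" step works because staring at an LED flickering at a fixed rate evokes EEG activity at that same frequency (a steady-state visual evoked potential), so the decoder only has to compare spectral power at the two candidate frequencies. The sketch below demonstrates the idea on a synthetic signal; the sampling rate and the 17 Hz/15 Hz flicker frequencies are illustrative assumptions.

```python
import numpy as np

FS = 250                      # sampling rate in Hz (illustrative)
YES_HZ, NO_HZ = 17.0, 15.0    # illustrative flicker frequencies for 'yes'/'no'

def band_power(signal, fs, target_hz, bandwidth=0.5):
    """Power of the signal in a narrow band around target_hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    return spectrum[np.abs(freqs - target_hz) <= bandwidth].sum()

def decode_choice(signal, fs=FS):
    """Return 'yes' if the 'yes' flicker frequency dominates the EEG epoch."""
    yes = band_power(signal, fs, YES_HZ)
    no = band_power(signal, fs, NO_HZ)
    return "yes" if yes > no else "no"

# Synthetic one-second EEG epoch: a 17 Hz evoked response buried in noise.
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
epoch = np.sin(2 * np.pi * YES_HZ * t) + 0.5 * rng.standard_normal(FS)
print(decode_choice(epoch))  # → yes
```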

The project was inspired by previous work on brain computer interfaces aimed at helping people with paralysis operate prosthetics; the researchers wanted to see if that work could be taken to the next level by connecting the brains of more than one person. "Potentially one could have a proof of concept demonstration of computer-assisted telepathy, or brain-to-brain communication. We asked the question, can we go beyond two people to a network of people, reading from and writing into the brain?" Rao says.

Health and safety gone mind

Neurable's technology is designed to measure emotion, interpret intent and allow people to control their environment using their thoughts. One of Neurable's focuses is virtual reality, for uses including training up staff. By training workers in a simulated environment and measuring their emotional response, employers can gauge their performance and adapt the training as necessary.

"The training space is very interesting. A lot of it's done in virtual reality, a lot of companies are exploring how to make their training more efficient and more successful, and also safer -- if you're going to do a dangerous task on an oil rig or a power line, training people in a virtual space is much better to start with, rather than start with the situation and hope they don't get hurt," Jamie Alders, VP of product at Neurable, told ZDNet. 

Overcoming repetitive strain injury

For most office workers, a desk job means using a computer for hours on end every day -- and using a computer for hours every day can mean repetitive strain injury (RSI). Could BCIs offer a more ergonomic way of using technology?

CTRL-labs hopes so: it uses sensing bracelets that detect electromyography (EMG) activity to pick up intended movements in the hands and relay them to external systems. The bracelet picks up the neural signal sent from the brain to the muscles and then uses that signal to control a device. One area where this technology could be useful is in VR and AR, where users currently have to make do with controllers covered in buttons they can't see.
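The basic EMG-processing pipeline is well established: rectify the raw signal, smooth it into an amplitude envelope, and treat a threshold crossing as an intended movement. The sketch below runs that pipeline on synthetic data — the window size and threshold are illustrative assumptions, and CTRL-labs' actual decoding is far more sophisticated.

```python
import numpy as np

def emg_envelope(samples, window=50):
    """Rectify the raw EMG and smooth it with a moving average."""
    rectified = np.abs(samples)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def detect_intent(samples, threshold=0.3):
    """True if the smoothed muscle activity crosses the threshold anywhere."""
    return bool((emg_envelope(samples) > threshold).any())

rng = np.random.default_rng(1)
rest = 0.05 * rng.standard_normal(1000)            # baseline sensor noise
burst = rest.copy()
burst[400:600] += 0.8 * rng.standard_normal(200)   # simulated muscle activation

print(detect_intent(rest))   # → False
print(detect_intent(burst))  # → True
```

Going from "a movement happened" to "which finger, and how far" is a classification and regression problem on top of this envelope, which is where the machine learning comes in.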

It's a vision that's appealed to one of the biggest names in tech: Facebook acquired the company in September.

SEE: Neural implants: Why connecting your brain to a computer will create a huge headache for everyone

Facebook and CTRL-labs' vision for gesture-controlled tech might have a knock-on effect on users' health: by using your hands freely in space to control your hardware, rather than gripping another piece of hardware, there should be fewer of the aches and pains that go alongside using technology.

"Ergonomics is not the most pressing health issue in the world, but millions of people have RSI and that actually affects them on a daily basis and they have to stop working and typing… I think this technology will go the other way; we will adapt to what the person wants to do and it has the possibility to be a much more comfortable and natural way to interact with technology," says Adam Berenzweig, head of R&D at CTRL-labs

Helping people with locked-in syndrome communicate

People with locked-in syndrome are entirely mentally aware, but can move none, or almost none, of their muscles. They can't speak or write; their ability to communicate with the outside world is limited to perhaps moving an eyelid or a single finger when asked a question. 

BCIs are opening up new options for people with locked-in syndrome to communicate more fully, letting them use their brain signals to choose letters in order to write messages, send emails and respond to questions.

A variety of methods have been used to pick up the brain signals of people with locked-in syndrome, including NIRS optodes, which pick up metabolic activity in the brain, and intracortical local field potentials, which read electrical activity.