Gartner's newly released Hype Cycle for emerging technologies suggests it will take more than 10 years before brain-computer interfaces (BCI) go mainstream, but recent developments could shift the adoption curve.
As reported on sister site CNET (see gallery on SmartPlanet), engineers at the University of Illinois demonstrated a tattoo-like "device platform" with electronic components for medical diagnostics, communications and human-machine interfacing. It's essentially a patch so thin and durable that it can be mounted to skin much like a temporary tattoo.
The device represents a paradigm shift in brain-computer interfacing because, until now, most EEG (electroencephalography) systems have required clunky headsets and plenty of accompanying electronics. This holds true both for systems in R&D labs and for commercially available ones, such as those from Emotiv and S.M.A.R.T. BrainGames.
BCIs in academia hold a lot of promise for those with impaired or limited brain or motor control. For instance, researchers at the University of Maryland recently created a "brain cap" that reads a user's neural activity to control computers, robotic prosthetic limbs and motorized wheelchairs.
The BCI developed at the University of Illinois is no different in this respect, but it also opens up a slew of previously unimaginable possibilities in the field of brain-machine interfaces well beyond biomedical applications, said UC San Diego professor Todd Coleman. He explained in a news release:
The brain-machine interface paradigm is very exciting, and I think it need not be limited to thinking about prosthetics or people with some type of motor deficit. I think taking the lens of the human and computer interacting, and if you could evolve a very nice coupling that is remarkably natural and almost ubiquitous, I think there are applications that we haven’t even imagined. That is what really fascinates me -- really the coupling between the biological system and the computer system.
Coleman and his researchers helped design the wearable device's circuitry and signal processing, and used it to enable a person to control a computer game with the muscles in his throat by speaking the commands. They're now exploring what other capabilities could be achieved, such as enhancing a group's ability to work as a team by acquiring all of the members' neural signals simultaneously and coupling them to a computer.
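To make the throat-control demo concrete, here is a minimal, purely illustrative sketch of how muscle-activity (EMG) readings might be mapped to game commands. The channel layout, threshold value, and command names are all invented for illustration; the actual UIUC/UCSD pipeline is not described in that detail here.

```python
# Hypothetical sketch: decoding game commands from muscle-activity (EMG)
# sample windows. All numbers and names below are made up for illustration.

def rms(window):
    """Root-mean-square amplitude of one EMG sample window."""
    return (sum(x * x for x in window) / len(window)) ** 0.5

# Invented mapping: which hypothetical throat channel fires -> game command
COMMANDS = {0: "left", 1: "right"}
THRESHOLD = 0.5  # activation level above resting noise (assumed)

def decode(channels):
    """Pick the command for the most active channel, if any crosses the threshold."""
    levels = [rms(w) for w in channels]
    best = max(range(len(levels)), key=lambda i: levels[i])
    return COMMANDS[best] if levels[best] > THRESHOLD else None

# A burst of activity on channel 1 against a quiet channel 0:
print(decode([[0.05, -0.04, 0.06], [0.9, -0.8, 0.85]]))  # -> right
```

The point of the sketch is only that the interface reduces to signal processing: measure activity per channel, compare against a calibrated threshold, and emit the command associated with the strongest source.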
One of the advantages of using EEG-powered brain-computer interfaces is that they allow for non-invasive interfacing, meaning you do not need to make an implant directly into the brain. But there are drawbacks to a system primarily built on pattern recognition. ExtremeTech's Sebastian Anthony lays it out well:
The problem with EEGs (and with any “mind reading” interface) is that we can’t actually understand the brain. We can see various neurons firing and measure the electrical signals and waves that they produce, but we don’t know what they mean. For these systems to work, then, the computer controllers must be trained to recognize the electrical state of the brain when it’s trying to brake or shift gears. In other words, basic cryptanalysis/linguistic analysis is used: if a tribal pygmy always greets you with a smile, a wave, and an unknown grunt, you can assume that the grunt will mean “hello.” It’s a brute force way of understanding a system, but if you only know the most basic rules of how the system works, that’s all you can do.
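The training Anthony describes can be reduced to a toy example. The sketch below uses synthetic data and a nearest-centroid classifier, which is far simpler than a real BCI pipeline (those typically extract band-power features and use more sophisticated models); every value and label here is invented for illustration.

```python
import random

# Illustrative only: "train" by averaging labeled feature vectors per
# mental state, then classify new windows by nearest average. The data
# is synthetic -- no real EEG features are modeled here.

random.seed(42)

def synthetic_eeg(mean, n=50):
    """Generate n fake feature vectors clustered around a mean pattern."""
    return [[random.gauss(m, 0.3) for m in mean] for _ in range(n)]

# Pretend these were recorded while the user repeatedly imagined
# "brake" vs. "shift gears" -- the labeled calibration phase.
training = {
    "brake": synthetic_eeg([1.0, 0.2, 0.8, 0.1]),
    "shift": synthetic_eeg([0.1, 0.9, 0.2, 1.0]),
}

# "Training" is just memorizing the average electrical state per label.
centroids = {
    label: [sum(col) / len(col) for col in zip(*vectors)]
    for label, vectors in training.items()
}

def classify(features):
    """Assign a new feature window to the nearest learned brain state."""
    def dist(label):
        return sum((f - m) ** 2 for f, m in zip(features, centroids[label]))
    return min(centroids, key=dist)

print(classify([1.1, 0.3, 0.7, 0.2]))  # -> brake
```

This is exactly the "grunt means hello" logic: the system never understands what the signal means, it only learns which patterns co-occur with which labeled intent.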
Brute it may be, but BCI technology is continually improving in both reliability and functionality. One day, controlling most aspects of your environment with thought alone may not be as far-fetched as it seems right now. Coleman said:
If you think about the advances that are being made in artificial hips and rehabilitation and the fact that people are living longer, it is no longer the case that your body is giving up before your mind. It’s going to be increasingly the case that we need to think about fixing minds along with fixing bodies.