Facebook shared plans to develop a non-invasive brain-speech interface: a sensor-based system that would enable people to communicate without speaking. In a nutshell, you think, and you communicate silently.
Regina Dugan, head of Facebook's Building 8 research lab, acknowledged that these "silent speech interfaces" are "a few years away." Building 8 is taking on two projects:
Type with your brain. Facebook wants to develop a silent speech interface that provides the speed and flexibility of voice with the privacy of text. The goal is a system that types 100 words per minute straight from the speech center of your brain. The approach would decode only words that you've already decided to share.
The project will rely on non-invasive sensors that can be shipped at scale. The sensors would need to measure brain activity without signal distortion, and no imaging sensor available today can do that. Facebook has a team of more than 60 scientists, integrators and engineers working on the project.
"How do I get all that information out of my brain and into the world? What are my choices? Speech is essentially a compression algorithm and a lossy one at that," said Dugan.
Hear with your skin. The second project revolves around building hardware and software to allow you to deliver language through your skin, which is a network of nerves that sends information to your brain.
This effort aims to be a more high-tech version of Braille. Facebook is hoping to replicate what the cochlea in your ear does--take sound and separate it into frequencies that are transmitted to the brain--except via your skin.
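To make the cochlea analogy concrete, here is a minimal sketch of frequency separation in Python. It is purely illustrative and not Facebook's method; the two tones, sample rate, and duration are arbitrary choices. A Fourier transform splits a composite signal into its component frequencies, roughly what the cochlea does mechanically before sending per-frequency nerve impulses to the brain.

```python
import numpy as np

sample_rate = 8000                      # samples per second (arbitrary)
duration = 1.0                          # seconds
t = np.arange(0, duration, 1 / sample_rate)

# A "sound" made of two pure tones: 440 Hz and a quieter 1000 Hz.
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 1000 * t)

# Decompose the signal into frequency components (the cochlea-like step).
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / sample_rate)

# The two strongest components recover the original tones.
top_two = sorted(freqs[np.argsort(spectrum)[-2:]].tolist())
print(top_two)  # → [440.0, 1000.0]
```

A skin-based interface would then map each frequency band to a distinct tactile actuator rather than to an auditory nerve fiber.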
CNET profiled Building 8 and Facebook's moonshot projects. Two key takeaways from the CNET profile:
"If I'm doing my job well, we should deliver things people didn't know to ask for," Dugan -- who previously headed Darpa, the Defense Department's famed tech arm -- tells me Monday from a working space on Facebook's campus. "There's the risk of failure. But that's precisely the price you pay for the honor of working on something new."
The technology could be game-changing for something like augmented reality glasses, suggests Dugan. Simple "yes" and "no" buttons in front of your eyes could be helpful in a number of situations. For instance, answering "yes" to the question "Do you want to see in the dark?" might activate a night-vision mode. All you'd have to do is think of moving a cursor to the "yes" button, and visualize clicking it.
In her keynote, Dugan said that Facebook's research is a bit scary but could have broad implications for how humans communicate. For starters, language barriers would fade away. What's unclear is whether such a system could take brain activity and semantic information and make sense of it on the fly. Dugan noted that language and semantics are just compressed thoughts.
The other huge wild card here is privacy. Facebook already knows a ton about you, and reading brain activity may be a bridge too far. Nevertheless, Dugan's group has an intriguing research effort underway, and she acknowledged the stakes plainly: "This matters. If we fail this is going to suck."