Researchers at the Massachusetts Institute of Technology have created an emotionally intelligent device called EQ-Radio that can gauge a person's emotions via wireless signals. The technology could be built into smart homes and gadgets, ultimately enabling a bevy of business uses.
MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) has published a paper outlining EQ-Radio, a tool that can detect emotions based on changes in breathing and heart rhythms. Researchers claim that EQ-Radio is 87-percent accurate at detecting whether a person is excited, happy, angry, or sad.
In a paper, MIT researchers noted:
Emotion recognition is an emerging field that has attracted much interest from both the industry and the research community. It is motivated by a simple vision: Can we build machines that sense our emotions? If we can, such machines would enable smart homes that react to our moods and adjust the lighting or music accordingly. Movie makers would have better tools to evaluate user experience. Advertisers would learn customer reaction immediately. Computers would automatically detect symptoms of depression, anxiety, and bipolar disorder, allowing early response to such conditions. More broadly, machines would no longer be limited to explicit commands, and could interact with people in a manner more similar to how we interact with each other.
The big win is that EQ-Radio doesn't need on-body sensors to deliver its results. MIT professor Dina Katabi sees the system being used in entertainment, consumer behavior, and healthcare. In a blog post, MIT noted that "smart homes could use information about your mood to adjust the heating or suggest that you get some fresh air."
Here's how the system works: wireless signals are reflected off people's bodies, and from those reflections EQ-Radio measures heartbeats as accurately as an ECG monitor. It then measures minute variations in the length of each individual beat, which carry the emotional signal. In the future, machines could use such measurements to monitor and diagnose depression and anxiety. The paper describes the system architecture in more detail:
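The beat-length variations EQ-Radio measures can be summarized with standard heart-rate-variability statistics. Below is a minimal sketch, assuming beat timestamps have already been extracted from the RF reflections; the function and feature names are hypothetical, not taken from the paper:

```python
import math
from statistics import mean, stdev

def heartbeat_features(beat_times):
    """Summarize beat-length variability from beat timestamps (in seconds).

    Illustrative only: the actual EQ-Radio features follow the
    affective-computing literature and are computed per heartbeat segment.
    """
    # Inter-beat intervals: the length of each individual beat
    ibis = [b - a for a, b in zip(beat_times, beat_times[1:])]
    # Successive differences: beat-to-beat variation
    diffs = [b - a for a, b in zip(ibis, ibis[1:])]
    return {
        "mean_ibi": mean(ibis),                       # average beat length
        "sdnn": stdev(ibis),                          # overall variability
        "rmssd": math.sqrt(mean(d * d for d in diffs)),  # short-term variability
    }

# Example: beats roughly 0.8 s apart with small jitter
beats = [0.0, 0.80, 1.62, 2.40, 3.21, 4.01]
feats = heartbeat_features(beats)
```

Features like these, computed per beat rather than over long windows, are what make the "minute variations" in beat length usable for classification.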
We have built EQ-Radio into a full-fledged emotion recognition system. EQ-Radio's system architecture has three components: The first component is an FMCW radio that transmits RF signals and receives their reflections. The radio leverages the approach to zoom in on human reflections and ignore reflections from other objects in the scene. Next, the resulting RF signal is passed to the beat extraction algorithm. The algorithm returns a series of signal segments that correspond to the individual heartbeats. Finally, the heartbeats - along with the captured breathing patterns from RF reflections - are passed to an emotion classification sub-system as if they were extracted from an ECG monitor. The emotion classification sub-system computes heartbeat-based and respiration-based features recommended in the literature and uses an SVM classifier to differentiate among various emotional states.
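The pipeline's final stage classifies emotions from heartbeat- and respiration-based features. The paper uses an SVM; the sketch below substitutes a dependency-free nearest-centroid classifier to show the shape of that stage, with fabricated feature vectors (mean inter-beat interval in seconds, breaths per minute):

```python
import math

def train_centroids(samples):
    """samples: list of (feature_vector, emotion_label) pairs."""
    sums, counts = {}, {}
    for vec, label in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    # One centroid (mean feature vector) per emotion label
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def classify(centroids, vec):
    # Assign the label whose centroid is nearest in Euclidean distance
    def dist(lbl):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(centroids[lbl], vec)))
    return min(centroids, key=dist)

# Toy training data (fabricated): faster beats for "excited", slower for "sad"
train = [
    ([0.60, 18.0], "excited"), ([0.62, 17.5], "excited"),
    ([0.90, 12.0], "sad"),     ([0.88, 12.5], "sad"),
]
model = train_centroids(train)
label = classify(model, [0.61, 17.8])
```

An SVM earns its place in the real system by drawing a maximum-margin boundary in a much higher-dimensional feature space; the nearest-centroid stand-in here only illustrates the features-in, emotion-out contract of the classification sub-system.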
In the paper, the MIT researchers noted that EQ-Radio is more accurate than Microsoft's Emotion API, which relies on facial expressions. EQ-Radio detected each emotion better than the Emotion API, which outperformed it only on neutral expressions.