
Microsoft looks to brain for user interface design

Written by Dan Farber

Microsoft has filed a patent for filtering EEG data so that it can be used to improve the design of human-computer interfaces. Brain signals from sensors attached to the scalp are interpreted by software to classify the brain states associated with user interaction. Potential uses include comparing workload types across a variety of user interfaces and real-time adaptation of user interfaces to users' cognitive states.

United States Patent Application 20070185697
Kind Code: A1
Inventors: Tan, Desney S., et al.
Published: August 9, 2007

Using electroencephalograph signals for task classification and activity recognition
Abstract
A method for classifying brain states in electroencephalograph (EEG) signals comprising building a classifier model and classifying brain states using the classifier model is described. Brain states are determined. Labeled EEG data is collected and divided into overlapping time windows. The time dimension is removed from each time window. Features are generated by computing the base features; combining the base features to form a larger feature set; pruning the large feature set; and further pruning the feature set for a particular machine learning technique. Brain states in unlabeled EEG data are classified using the classifier model by dividing the unlabeled EEG data into overlapping time windows and removing the time dimension from each time window. Features required by the classifier model are generated. Artifacts in the labeled and unlabeled EEG data comprise cognitive artifacts and non-cognitive artifacts.
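
In outline, the method the abstract describes reduces to a conventional supervised-learning pipeline: window the signal, collapse each window to a fixed-length feature vector, train a classifier, then apply the same transformation to new data. The sketch below, in Python with NumPy and scikit-learn, is a minimal illustration rather than the patented method itself: the window length, overlap, choice of classifier and per-channel summary statistics are all assumptions, and the patent's feature-combination and pruning steps are only noted in comments.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    WINDOW = 256   # samples per window (illustrative; not from the patent)
    STEP = 128     # 50% overlap between consecutive windows

    def windows(eeg, size=WINDOW, step=STEP):
        # Divide a (samples x channels) EEG array into overlapping time windows.
        return [eeg[i:i + size] for i in range(0, len(eeg) - size + 1, step)]

    def features(window):
        # "Remove the time dimension": summarize each channel over the window.
        # The patent combines base features into a larger set and then prunes
        # it; per-channel mean and standard deviation stand in for that here.
        return np.concatenate([window.mean(axis=0), window.std(axis=0)])

    def featurize(eeg):
        return np.array([features(w) for w in windows(eeg)])

    def build_classifier(labeled_eeg, window_labels):
        # Build the classifier model from labeled EEG, with one brain-state
        # label per overlapping window.
        return LogisticRegression(max_iter=1000).fit(
            featurize(labeled_eeg), window_labels)

    def classify(model, unlabeled_eeg):
        # Classify brain states in unlabeled EEG using the same windowing
        # and the same features the model was trained on.
        return model.predict(featurize(unlabeled_eeg))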

Microsoft, of course, is not alone in tapping into brain signals to adapt user interfaces. Emotiv, for example, has developed a toolkit that identifies facial expressions such as blinks, smiles, winks, horizontal eye movements, clenches and eyebrow movements. Players wear a headset with sensors that map signals from the brain surface back to their original source in the cortex, according to the company. The data could be used to animate avatars more realistically, offer more implicit feedback and detect a player's emotional state, which could influence the direction of game play. Emotiv has also developed a method for manipulating virtual objects using mind control. So far, however, the technology has not been integrated into any games.
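
To make the avatar idea concrete: once a toolkit reports a detected expression, wiring it to game-side animation is ordinary event handling. The sketch below is purely illustrative and does not use Emotiv's actual API; the expression names, the mapping table and the avatar interface are all hypothetical.

    from typing import Dict

    class Avatar:
        # Hypothetical game-side avatar; a real engine would drive a rig here.
        def play(self, animation: str) -> None:
            print(f"avatar plays: {animation}")

    # Hypothetical mapping from detected expressions to avatar animations;
    # these names are illustrative, not Emotiv's.
    EXPRESSION_TO_ANIMATION: Dict[str, str] = {
        "blink": "blink",
        "smile": "smile",
        "wink": "wink_left",
        "clench": "grimace",
        "eyebrow": "raise_brows",
    }

    def on_expression(avatar: Avatar, expression: str) -> None:
        # Drive the avatar from whatever expression the headset toolkit reports.
        animation = EXPRESSION_TO_ANIMATION.get(expression)
        if animation:
            avatar.play(animation)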
