VR headset may be the best concussion test in the world

Concussion is still diagnosed with rudimentary question-and-answer tests. FDA clearance for a new device could change that.
Written by Greg Nichols, Contributing Writer

The concussion controversy has forced the NFL to take a hard look at itself, right?
Sure. In headlines, maybe. In reality, regular season and Super Bowl viewership are through the roof. Profits are soaring and no one with a financial stake in the league seems particularly nervous.

(Lest I idly cast stones, I recognize I'm part of the problem. I'm a football fan and I still watch. Worse, I'm loyal to the Chargers, and as anyone who follows the NFL will know, my home team's greatest hero, Junior Seau, recently met a tragic end that was almost certainly precipitated by chronic traumatic encephalopathy brought about by repeated blows to the head.)

But there has been a bit of progress. Concussions are now understood to be serious injuries, and diagnosis is more commonplace. Diagnosis also seems to be getting more scientific, which matters because concussion is still typically identified with rudimentary question-and-answer tests.

Enter SyncThink, a company that makes eye-tracking technology. SyncThink just received FDA clearance for a virtual reality device called the Eye-Sync, which can diagnose concussion based on irregular eye movement.

The device is "an integrated head-mounted eye tracking device for rapid, reliable recording, viewing and analyzing of eye movement impairment through the use of virtual reality."

Abnormal eye movement is one of the most common deficits after a concussion, and Eye-Sync can assess it in under 60 seconds. That's huge news, and not just for football players. Emergency room doctors and EMTs frequently miss concussion diagnoses in the melee following an accident.
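The article doesn't describe Eye-Sync's internals, but the general idea behind eye-tracking concussion screens is simple to sketch: have the subject visually follow a moving dot, record gaze samples, and measure how far the gaze strays from the target. The snippet below is a minimal illustration of that principle, not SyncThink's actual algorithm; the function name, the circular target path, and the error threshold are all assumptions made up for the example.

```python
import math

def pursuit_error(target_points, gaze_points):
    """Mean Euclidean distance between target and gaze samples.

    A crude proxy for smooth-pursuit impairment: a healthy gaze
    tracks a moving dot closely, so a larger mean error suggests
    a possible deficit. (Illustrative only -- not SyncThink's method.)
    """
    assert len(target_points) == len(gaze_points)
    total = 0.0
    for (tx, ty), (gx, gy) in zip(target_points, gaze_points):
        total += math.hypot(gx - tx, gy - ty)
    return total / len(target_points)

# Target dot moving along a circle, as in a typical pursuit task.
target = [(math.cos(t / 10), math.sin(t / 10)) for t in range(60)]

# Simulated gaze traces: one tight, one erratic (hypothetical data).
steady = [(x + 0.01, y) for x, y in target]
erratic = [(x + 0.3 * ((i % 5) - 2), y) for i, (x, y) in enumerate(target)]

print(pursuit_error(target, steady))   # small error: close tracking
print(pursuit_error(target, erratic))  # larger error: poor tracking
```

In a real device the gaze samples would come from infrared eye-tracking cameras inside the headset, and the scoring would be compared against a normative database rather than a fixed cutoff.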

SyncThink currently has 10 granted patents and a normative database that includes more than 10,000 individuals, as well as more than 40 peer-reviewed research articles characterizing the impact of concussion on visual attention.
