
Optic flow guides us in the right direction

The way we perceive the visual motion of objects as we move is a phenomenon called 'optic flow.' Now, Brown University researchers have discovered that optic flow helps guide our walking. They used Brown's Virtual Environment Navigation Lab, or VENLab, one of the largest virtual reality labs in the U.S., to conduct trials on 40 subjects. All of them had to walk to a virtual doorway, but half of the subjects had optic flow, or a steady stream of visual information, available to them, while the other half did not. The results were spectacular: with optic flow, the subjects adapted to their virtual environment about seven times faster than the others. This discovery could soon be applied in robotics to develop machines with more accurate guidance capabilities.
Written by Roland Piquepaille

Finding a virtual door

As you can see in the image above, "the target doorway was presented to test subjects through a prism that effectively shifted its location to one side. Subjects who maintained optic flow were able to adapt to the shift and find their way to the door far better than subjects without access to optic flow." (Credit: Brown University)

These experiments were conducted at the VENLab under the direction of William Warren, chairman of Brown's Department of Cognitive and Linguistic Sciences. But the project was led by Hugo Bruggeman, a postdoctoral research fellow in the Warren lab.

Here are some details about the experiments. "The team created a virtual display that simulates a prism, bending light so that the visual scene shifts to one side. The target -- in this case, a virtual doorway toward which subjects were told to walk -- appeared to be farther to the right than it actually was. A total of 40 subjects ran through about 40 trials each, with everyone trying to walk through the virtual doorway while wearing the simulated prism. Half those subjects had optic flow, or a steady stream of visual information, available to them. The other half did not."

And how did the two different groups react? "The researchers found that, on average, all 40 subjects missed the doorway by about five feet on the first few steps. But after a while, subjects adapted and were able to walk straight toward the doorway. Then the simulated prism was removed, and subjects were again asked to walk to the virtual doorway. Surprisingly, they all missed their mark on the opposite side because their brains and bodies had adapted to the prism. After a few tries, subjects quickly readjusted again and were able to walk straight to the doorway."
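That five-foot initial miss squares with simple geometry: holding a heading displaced by the 10-degree prism shift over a walk of several meters produces a lateral error of roughly five feet. Here is a quick sanity check in Python (the 8.6 m walking distance is my assumption for illustration, not a figure from the study):

```python
import math

prism_shift_deg = 10.0   # displacement reported in the experiment
walk_distance_m = 8.6    # assumed walk length (the VENLab floor is 12 m x 12 m)

# Lateral offset accumulated by walking the full distance at a shifted heading.
lateral_error_m = walk_distance_m * math.tan(math.radians(prism_shift_deg))
lateral_error_ft = lateral_error_m / 0.3048
print(f"lateral miss ~= {lateral_error_m:.2f} m ({lateral_error_ft:.1f} ft)")
```

With these numbers the miss comes out at about 1.5 meters, close to the five feet the researchers observed on the first few steps.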

Here is a comment by Bruggeman about his findings. He "said the kicker came when they compared data from subjects who had optic flow available during the trials with data from those who did not. When subjects had optic flow, they took a straight path toward the doorway and made it, on average, in just three tries. When optic flow was eliminated, and subjects had only a lone target to aim for, it took an average of 20 tries before they walked straight to the target."

This research appears as the cover story of the December 4, 2007, issue of the journal Current Biology, under the title "Optic flow drives human visuo-locomotor adaptation." Here is a link to the abstract.

A slightly different version of this abstract has been published online by the Journal of Vision under the title "Optic flow serves as a teaching signal for visual-locomotor adaptation" (Volume 7, Number 9, Abstract 152, Page 152a, June 30, 2007). Here is a link to this document.

Here are some selected excerpts from this paper. "Humans rely on two strategies to walk to a goal: (1) Optic flow strategy: null the visual angle between the heading specified by optic flow and the visual direction of the goal; (2) Egocentric direction strategy: null the angle between the locomotor axis and the egocentric direction of the goal. Optic flow dominates in environments with sufficient visual surface structure. In the 1960's, Held proposed that optic flow might also drive prism adaptation during walking."
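The two strategies amount to nulling different error angles, which can be sketched as simple proportional control laws. This is a minimal illustration, not the authors' model; the function names and the gain are mine:

```python
def egocentric_steer(heading_deg, goal_egocentric_deg, gain=1.0):
    """Egocentric-direction strategy: turn to null the angle between
    the locomotor axis (current heading) and the egocentric direction
    of the goal."""
    return gain * (goal_egocentric_deg - heading_deg)

def optic_flow_steer(flow_heading_deg, goal_visual_deg, gain=1.0):
    """Optic-flow strategy: turn to null the visual angle between the
    heading specified by optic flow and the visual direction of the goal."""
    return gain * (goal_visual_deg - flow_heading_deg)

# Under a 10-degree rightward prism the goal's seen direction is shifted,
# but optic flow still specifies where the walker is actually heading,
# leaving a usable error signal to steer by:
turn = optic_flow_steer(flow_heading_deg=0.0, goal_visual_deg=10.0)
```

In an environment with rich surface texture, the flow-specified heading is well defined and the first strategy dominates; with only a lone target visible, the walker is left with the egocentric strategy.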

So how did the researchers test this hypothesis? "By adapting participants to displays in which the optic flow pattern was displaced from the walking direction by 10 deg to the right, and then testing them with normal flow. Participants walked to a target in an immersive virtual environment (the 12 m x 12 m VENLab) while wearing a head-mounted display and head tracker. Two worlds were used in both adaptation and test phases: (a) a lone target line (minimal optic flow), or (b) textured surfaces and posts (rich optic flow). This created a 2x2 design with four groups of participants."

And what kind of results did they obtain? "In the Line world, results show gradual adaptation, with gradually reduced heading error and straighter paths over 36 trials. In the Textured world there was immediate behavioral adaptation in the first few trials. [...] The results indicate that optic flow serves as the teaching signal that adapts the visual-motor mapping between egocentric direction and walking direction. Optic flow thus dominates both the guidance of walking and visual-locomotor adaptation in visually structured environments."
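One way to read the three-trial versus twenty-trial difference is as error-driven learning with different effective learning rates. The toy model below is my own sketch, not the paper's: on each trial the walker corrects an internal heading offset by a fraction of the remaining error, and rich optic flow is assumed to make that error signal more salient, i.e. a higher rate.

```python
def trials_to_adapt(learning_rate, prism_deg=10.0, tolerance_deg=1.0, max_trials=100):
    """Count trials until the internal offset cancels the prism shift
    to within the tolerance. Purely illustrative, not the authors' model."""
    offset = 0.0
    for trial in range(1, max_trials + 1):
        error = prism_deg - offset        # residual heading error this trial
        if abs(error) <= tolerance_deg:
            return trial                  # walked straight to the doorway
        offset += learning_rate * error   # correct a fraction of the error
    return max_trials

fast = trials_to_adapt(learning_rate=0.7)    # rich optic flow (assumed rate)
slow = trials_to_adapt(learning_rate=0.115)  # lone target (assumed rate)
print(f"rich flow: {fast} trials, lone target: {slow} trials")
```

With these assumed rates the model happens to reproduce the reported averages of about 3 and 20 trials, but the point is qualitative: a stronger teaching signal converges in far fewer trials.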

Finally, if you want to learn more about the VENLab at Brown University, here is a link to a very informative article from Nature Neuroscience, "Virtual reality in behavioral neuroscience and beyond" (Volume 5, Issue 11s, Pages 1089-1092, November 2002).

Sources: Brown University news release, November 15, 2007; and various websites
