September 2018, Volume 18, Issue 10 | Open Access
Vision Sciences Society Annual Meeting Abstract
Optimal integration of heading specified by optic flow and target egocentric direction
Author Affiliations
  • Wei Sun
    Department of Electronic Engineering, Shanghai Jiao Tong University, PRC
  • Zhenyu Zhu
    NYU-ECNU Institute of Brain and Cognitive Science, New York University Shanghai, PRC
  • Jing Chen
    NYU-ECNU Institute of Brain and Cognitive Science, New York University Shanghai, PRC
  • Guangtao Zhai
    Department of Electronic Engineering, Shanghai Jiao Tong University, PRC
  • Michael Landy
    Department of Psychology and Center for Neural Science, New York University, New York, US
  • Li Li
    NYU-ECNU Institute of Brain and Cognitive Science, New York University Shanghai, PRC
Journal of Vision, September 2018, Vol. 18, 1044. https://doi.org/10.1167/18.10.1044
© ARVO (1962-2015); The Authors (2016-present)

      ×
  • Supplements
Abstract

Background. Previous research suggests that both heading specified by optic flow and target egocentric direction are used to control walking toward a goal. We examined whether these two cues are combined optimally in this process.

Methods. In the walking task, participants (n = 12) wore a head-mounted display (Oculus DK2, FOV: 100°) and walked toward a line target (width: 0.5°; retinal size did not increase with approach) placed 8 m away in (1) an empty virtual environment that provided only target egocentric direction, or (2) the same environment with a textured ground and ceiling that also provided dense optic flow. Heading in the virtual environment was displaced ±15° from the participant's physical walking direction (i.e., straight ahead). In the perceptual task, participants passively viewed displays of the textured ground-and-ceiling environment with heading specified by optic flow displaced ±15° from straight ahead, and judged perceived heading with a mouse-controlled probe at the end of each 1-s simulated self-motion display.

Results. In the empty environment, heading error remained at 15° throughout the walk toward the target. In the textured environment, heading error dropped quickly and was reduced to 7° on average by the end of each trial. In the perceptual task, with a 15° heading offset, perceived heading from optic flow showed a center bias of 3.7°. The mean heading error and SD observed in the perceptual task (optic-flow cue) and in the walking task with the empty environment (target egocentric-direction cue) successfully predicted heading error in the walking task with the textured environment (cue conflict), assuming optimal cue integration. This supports the claim that heading specified by optic flow and target egocentric direction are optimally combined for goal-oriented locomotion control.

Meeting abstract presented at VSS 2018
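
For readers unfamiliar with the optimal (maximum-likelihood) cue-integration model invoked in the Results, the sketch below illustrates how a cue-conflict prediction is computed from each single cue's mean and SD: each cue is weighted inversely to its variance. This is an illustration only, not the authors' analysis code, and the standard deviations used in the example are hypothetical placeholders rather than measured values.

import numpy as np

def optimal_integration(mu_flow, sigma_flow, mu_dir, sigma_dir):
    """Maximum-likelihood combination of two heading cues.

    Each cue's weight is inversely proportional to its variance:
        w_flow = sigma_dir**2 / (sigma_flow**2 + sigma_dir**2)
        w_dir  = 1 - w_flow
    Returns the predicted combined mean and SD.
    """
    w_flow = sigma_dir**2 / (sigma_flow**2 + sigma_dir**2)
    w_dir = 1.0 - w_flow
    mu_combined = w_flow * mu_flow + w_dir * mu_dir
    # The optimal combined estimate has lower variance than either cue alone.
    sigma_combined = np.sqrt((sigma_flow**2 * sigma_dir**2) /
                             (sigma_flow**2 + sigma_dir**2))
    return mu_combined, sigma_combined

# Example: single-cue means taken from the abstract (3.7° flow bias, 15° error
# with target direction alone); the SDs here are hypothetical placeholders.
mu, sd = optimal_integration(mu_flow=3.7, sigma_flow=2.0,
                             mu_dir=15.0, sigma_dir=5.0)
print(f"Predicted combined heading error: {mu:.1f} deg (SD {sd:.1f} deg)")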
