Vision Sciences Society Annual Meeting Abstract | August 2014
Angular, speed and density tuning of flow parsing
Author Affiliations
  • Diederick C Niehorster
    Department of Psychology, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
  • Li Li
    Department of Psychology, The University of Hong Kong, Pokfulam, Hong Kong SAR, China
Journal of Vision August 2014, Vol.14, 487. doi:https://doi.org/10.1167/14.10.487
Abstract

Recent studies have suggested that the visual system subtracts the optic flow experienced during self-motion from the retinal motion of the environment to recover scene-relative object motion, a phenomenon called "flow parsing". The psychophysical characteristics of this process, however, remain unclear. Here, by measuring the gain with which flow parsing is performed, we examined how flow parsing is affected by the angle between the object motion and the background flow at the object's location (Experiment 1), the self-motion or object motion speed (Experiments 2 and 3), and the density of the elements in the background flow (Experiment 3). In each 0.5-s trial, the display (83°H × 83°V, 60 Hz) simulated forward self-motion at 0.5–5 m/s toward a frontal plane covered with 10–5000 white random dots placed at 2 m. A red probe dot moved leftward or rightward at 1–10 deg/s on the frontal plane. A component toward the focus of expansion (FOE) was added to the probe's horizontal retinal motion under the control of an adaptive staircase to determine when the probe was perceived to move horizontally. The results show that flow parsing was strongly affected by each of the factors we varied. Specifically, flow parsing gain decreased exponentially as the object motion direction deviated from the direction of the background flow at its retinal location. Surprisingly, flow parsing gain also decreased exponentially as the simulated self-motion speed increased. Flow parsing gain increased linearly with the object motion speed and logarithmically with the density of the background flow. We conclude that while increasing the object motion speed and the number of elements in the scene aids the perception of scene-relative object motion during self-motion, performance is best at normal walking speeds and when the object moves in the same direction as the background flow at its retinal location.
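
The abstract does not spell out how the flow parsing gain is computed from the staircase-nulled component. The sketch below illustrates one plausible reading, in which the gain is the nulled FOE-ward retinal speed divided by the speed of the background flow at the probe's retinal location. The function names, the gain definition, and the example numbers are illustrative assumptions, not the authors' procedure.

    import math

    def background_flow_speed(trans_speed, plane_dist, ecc_deg):
        # Radial optic flow speed (deg/s) at ecc_deg degrees from the FOE for
        # forward self-motion at trans_speed m/s toward a frontoparallel plane
        # at plane_dist m: dtheta/dt = (T / Z) * sin(theta) * cos(theta).
        ecc = math.radians(ecc_deg)
        return math.degrees((trans_speed / plane_dist) * math.sin(ecc) * math.cos(ecc))

    def flow_parsing_gain(nulled_deg_s, trans_speed, plane_dist, ecc_deg):
        # Assumed gain definition: the nulled FOE-ward component relative to the
        # local background flow speed; a gain of 1.0 would mean complete
        # subtraction of the self-motion flow from the probe's retinal motion.
        return nulled_deg_s / background_flow_speed(trans_speed, plane_dist, ecc_deg)

    # Hypothetical example: 1.5 m/s self-motion toward the 2 m plane, probe at
    # 10 deg eccentricity, staircase converging on a 1.2 deg/s nulled component.
    print(round(flow_parsing_gain(1.2, 1.5, 2.0, 10.0), 2))  # ~0.16
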

Meeting abstract presented at VSS 2014
