October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract
Object motion and flow variance across optical contexts
Author Affiliations & Notes
  • Jan Jaap R. van Assen
    NTT Communication Science Laboratories
  • Takahiro Kawabe
    NTT Communication Science Laboratories
  • Shin'ya Nishida
    NTT Communication Science Laboratories
    Graduate School of Informatics, Kyoto University
  • Footnotes
    Acknowledgements  Japan Society for the Promotion of Science (JSPS KAKENHI Grant Number JP15H05915)
Journal of Vision October 2020, Vol.20, 458. https://doi.org/10.1167/jov.20.11.458

One main goal of visual motion computation is to estimate the trajectory of objects moving in the scene from retinal optical flow. This is a tough computational problem under real-world conditions because retinal optical flow changes drastically with the optical material properties of the moving object. Specular and diffuse reflections, as well as refractions at object surfaces, can produce complex patterns of optical flow that do not correspond to the object's motion. In addition, these complex flow patterns vary with object shape and surrounding illumination. In this study, we investigated how constant perceived object motion remains across various optical contexts, and whether observers compensate for these other causal sources of retinal motion. To measure how much perceived object motion is independent of, or dependent on, these factors, we asked twelve naïve observers to compare the rotational speeds of two objects (T: test and M: match). M had matte shading with a texture, a "knot" shape, and a "forest" illumination map, while T was drawn from combinations of ten optical properties (e.g., matte, glossy, translucent), three shapes (knot, cubic, blobby), and three illumination maps (sunny, cloudy, indoor). Each object rotated about its vertical axis, at 0.5 rotations/s for T and at a variable speed for M. Each object was presented for 500 ms, and the PSE was estimated with a 2-IFC staircase method. We find illusory differences in perceived rotational speed across material, illumination, and shape conditions (e.g., transparent materials appear to rotate faster). Low-level models using horizontal optical flow and optical-flow gradients explain 56% of the variance in perceived speeds, but the magnitude of misperception predicted by the optic flow is much larger than the observed effect. These results suggest that the human visual system only partially compensates for the effects of optical context on object motion.
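The 2-IFC staircase procedure described above can be sketched as follows. This is a minimal illustration, not the authors' actual implementation: the step sizes, reversal count, and the simulated observer are all hypothetical, and the real experiment would present rendered stimuli rather than call a response function.

```python
# Hypothetical sketch of a 2-IFC staircase for estimating the PSE of
# rotational speed: the match speed is stepped up or down depending on
# the observer's response, the step size is halved at each reversal,
# and the PSE is taken as the mean of the reversal speeds.
# All parameter values are illustrative, not from the study.

def run_staircase(respond, start_speed=0.5, step=0.05, n_reversals=8):
    """respond(match_speed) -> True if the match was judged faster than the test."""
    speed = start_speed
    last_dir = None
    reversals = []
    while len(reversals) < n_reversals:
        match_faster = respond(speed)
        direction = -1 if match_faster else +1  # judged faster -> decrease match
        if last_dir is not None and direction != last_dir:
            reversals.append(speed)             # response flipped: a reversal
            step = max(step / 2, 0.005)         # halve step, with a floor
        last_dir = direction
        speed += direction * step
    return sum(reversals) / len(reversals)      # PSE estimate (rotations/s)

# Simulated noiseless observer who perceives the test as rotating at 0.6 rot/s:
pse = run_staircase(lambda m: m > 0.6)
```

With this simulated observer the staircase converges near 0.6 rotations/s; a perceived speed above the test's physical 0.5 rotations/s is the kind of illusory speed-up the abstract reports for, e.g., transparent materials.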

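The analysis step, regressing perceived speed on low-level flow statistics and reporting explained variance, could look roughly like the sketch below. The data here are synthetic and the predictor names are illustrative assumptions; the study's actual flow statistics were computed from rendered stimuli.

```python
import numpy as np

# Hypothetical sketch: regress matched (perceived) rotation speeds on
# low-level flow statistics (mean horizontal flow and a flow-gradient
# measure) and compute the explained variance R^2.
# The data below are synthetic, generated only to make the sketch runnable.

rng = np.random.default_rng(0)
n_conditions = 90                                  # e.g. 10 materials x 3 shapes x 3 maps
h_flow = rng.uniform(0.2, 1.0, n_conditions)       # mean horizontal flow speed
flow_grad = rng.uniform(0.0, 0.5, n_conditions)    # mean flow-gradient magnitude
perceived = 0.6 * h_flow + 0.3 * flow_grad + rng.normal(0.0, 0.1, n_conditions)

# Ordinary least squares with an intercept term.
X = np.column_stack([h_flow, flow_grad, np.ones(n_conditions)])
beta, *_ = np.linalg.lstsq(X, perceived, rcond=None)
pred = X @ beta
r2 = 1.0 - np.sum((perceived - pred) ** 2) / np.sum((perceived - perceived.mean()) ** 2)
```

An R² of 0.56 from such a fit would mean the flow statistics account for 56% of the variance in perceived speeds, matching the figure reported in the abstract, while leaving the remaining variance to factors the low-level model does not capture.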
