September 2015
Volume 15, Issue 12
Vision Sciences Society Annual Meeting Abstract
Keep on rolling: Visual search asymmetries in 3D scenes with motion-defined targets
Author Affiliations
  • Matthew Cain
    U.S. Army Natick Soldier Research, Development, & Engineering Center; Visual Attention Lab, Brigham & Women's Hospital
  • Emilie Josephs
    Visual Attention Lab, Brigham & Women's Hospital
  • Jeremy Wolfe
    Visual Attention Lab, Brigham & Women's Hospital; Departments of Radiology & Ophthalmology, Harvard Medical School
Journal of Vision September 2015, Vol.15, 1365. doi:10.1167/15.12.1365
Many simple feature searches are asymmetric; that is, finding a target defined by feature value A among distractors with value B is more efficient than finding B among A. In motion, for example, finding moving targets among stationary distractors is more efficient than finding stationary among moving (but see Rosenholtz, 2001). Most previous work involves simple motions in the 2D plane, including manipulations of speed (Ivry & Cohen, 1992), rotation (Thornton & Gilden, 2001), expansion and contraction (Takeuchi, 1997), and randomness (Horowitz et al., 2007). Here, we extend this work to environments with depth, using objects rotating around different axes in 3D environments. Observers searched for targets that were "rolling" about a horizontal axis among distractors "spinning" about a vertical axis, or vice versa. Objects appeared to rest on a slanted plane and did not translate along this surface. Set sizes were 4, 8, and 12. Search for rolling targets among spinning distractors was markedly more efficient than search for spinning among rolling (RT × set size slopes: 12 vs. 36 msec/item). Half of observers had target-present slopes < 10 msec/item, suggesting that "rolling" may behave like a fundamental feature such as color or orientation. More broadly, these results suggest that more features that guide attention may be waiting to be discovered as we move beyond simple stimuli in the frontal plane.

References
Horowitz et al. (2007). Visual search for type of motion is based on simple motion primitives. Perception, 36, 1624-1634.
Ivry & Cohen (1992). Asymmetry in visual search for targets defined by differences in movement speed. JEP:HPP, 18, 1045-1057.
Rosenholtz (2001). Search asymmetries? What search asymmetries? Perception and Psychophysics, 63, 476-489.
Takeuchi (1997). Visual search of expansion and contraction. Vision Research, 37(15), 2083-2090.
Thornton & Gilden (2001). Attentional limitations in the sensing of motion direction. Cognitive Psychology, 43, 23-52.
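The RT × set size slope reported above is the slope of a linear fit of mean response time against set size. A minimal sketch of that computation, using hypothetical mean RTs chosen only to illustrate slopes near the reported 12 and 36 msec/item (not the study's actual data):

```python
import numpy as np

# Hypothetical per-condition mean RTs (ms) at the study's three set
# sizes; values are illustrative, not the experiment's actual data.
set_sizes = np.array([4, 8, 12])
rt_rolling_target = np.array([550, 598, 646])   # rolling among spinning
rt_spinning_target = np.array([600, 744, 888])  # spinning among rolling

def search_slope(set_sizes, mean_rts):
    """Slope of the best-fit line of RT on set size, in ms/item."""
    slope, _intercept = np.polyfit(set_sizes, mean_rts, 1)
    return slope

print(search_slope(set_sizes, rt_rolling_target))   # ~12 ms/item
print(search_slope(set_sizes, rt_spinning_target))  # ~36 ms/item
```

Shallow slopes (< 10 msec/item on target-present trials, as about half the observers showed) are the conventional signature of efficient, feature-guided search.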

Meeting abstract presented at VSS 2015
