Vision Sciences Society Annual Meeting Abstract  |   August 2010
Aurally aided visual search in depth using ‘virtual’ crowds of people
Author Affiliations
  • Jason S. Chan
    Trinity College Dublin, School of Psychology and Institute of Neuroscience
  • Corrina Maguinness
    Trinity College Dublin, School of Psychology and Institute of Neuroscience
  • Simon Dobbyn
    Trinity College Dublin, Department of Computer Science
  • Paul McDonald
    Trinity College Dublin, Department of Mechanical Engineering
  • Henry J. Rice
    Trinity College Dublin, Department of Mechanical Engineering
  • Carol O'Sullivan
    Trinity College Dublin, Department of Computer Science
  • Fiona N. Newell
    Trinity College Dublin, School of Psychology and Institute of Neuroscience
Journal of Vision August 2010, Vol. 10, 886. https://doi.org/10.1167/10.7.886
Abstract

It is well known that a sound can improve visual target detection when both stimuli are presented from the same location along the horizontal plane (Perrott, Cisneros, McKinley, & D'Angelo, 1996; Spence & Driver, 1996). However, in those studies the auditory and visual stimuli were always congruent along the depth plane. In previous experiments, we demonstrated that it is not enough for an auditory stimulus to be congruent with a visual target along the horizontal plane; it must be congruent in depth as well. However, congruency along the depth plane may not be crucial in virtual reality (VR), where visual distance perception is known to suffer from a compression of space, whereby objects appear closer to the observer than they are intended to be. In the present experiment we displayed virtual scenes of people, and the participants' task was to locate a target individual in the visual scene. Congruent or incongruent virtual voice information, containing distance and direction cues, was paired with the target. We found that response times were facilitated by a congruent sound, and that participants were significantly worse when the sound was incongruent with the visual target along either the horizontal or the depth plane. Ongoing experiments are also investigating the effects of moving audio-visual stimuli on target detection in virtual scenes. Our findings suggest that a sound can have a significant influence on locating visual targets presented in depth in virtual displays, with implications both for understanding crossmodal influences on spatial attention and for the design of realistic virtual environments.

Chan, J. S., Maguinness, C., Dobbyn, S., McDonald, P., Rice, H. J., O'Sullivan, C., & Newell, F. N. (2010). Aurally aided visual search in depth using ‘virtual’ crowds of people [Abstract]. Journal of Vision, 10(7):886, 886a, http://www.journalofvision.org/content/10/7/886, doi:10.1167/10.7.886.
Footnotes
 Science Foundation Ireland.