October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract  |   October 2020
The effect of auditory cues on visual learning in multisensory perceptual training in virtual reality
Author Affiliations & Notes
  • Catherine A. Fromm
    Rochester Institute of Technology
  • Kelsey E. Murphy
    Rochester Institute of Technology
  • Melissa J. Polonenko
    University of Rochester
  • Ross K. Maddox
    University of Rochester
  • Krystel R. Huxlin
    University of Rochester
  • Gabriel J. Diaz
    Rochester Institute of Technology
  • Footnotes
    Acknowledgements  Unyte Foundation Pipeline to Pilot
Journal of Vision October 2020, Vol.20, 867. doi:https://doi.org/10.1167/jov.20.11.867
      Catherine A. Fromm, Kelsey E. Murphy, Melissa J. Polonenko, Ross K. Maddox, Krystel R. Huxlin, Gabriel J. Diaz; The effect of auditory cues on visual learning in multisensory perceptual training in virtual reality. Journal of Vision 2020;20(11):867. https://doi.org/10.1167/jov.20.11.867.

      © ARVO (1962-2015); The Authors (2016-present)

Visual training improves performance in both visually intact and visually impaired participants, making it a useful rehabilitation tool. An open question in rehabilitation is whether invoking multisensory integration can increase training efficacy. Four visually intact subjects (three female) completed a 10-day training experiment in which they repeatedly performed a four-way direction-discrimination task. In this task, a gaze-contingent, 5°-diameter visual global-motion stimulus was presented at 10° azimuth/elevation in virtual reality (VR), using the HTC Vive Pro Eye head-mounted display with integrated eye tracking. The stimulus could move along one of four oblique directions. Two subjects trained with visual stimulation only (V group). The other two trained with an accompanying pulsed white-noise auditory cue (AV group) that moved in VR along the horizontal component of the visual motion, rendered with the SteamAudio 3D audio spatializer. Visual task difficulty was manipulated with a staircase that varied the range of directions in which dots could move around the principal direction of motion in the visual stimulus. The staircase level was determined only by correct judgment of the vertical component of motion. Direction range thresholds (DRTs) were computed daily. The mean (± standard deviation) slope of DRT on the overall judgment across the 10 days was 2.1 ± 2.3° for the AV group and 1.9 ± 0.9° for the V group (n = 2 each). Visual-only pre- and post-tests showed an average change in DRT of -3.5 ± 0.7° for the AV group and 33.5 ± 19.1° for the V group. This suggests that AV subjects learned to rely on the auditory cue for overall task performance and failed to improve on the purely visual component: once the auditory cues were removed, performance in the AV group dropped to pre-training levels.
Thus, adding even an informative auditory cue to a visual task may impair rather than enhance visual learning under certain conditions.
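The adaptive staircase described above, in which the dot direction range widens as performance improves and narrows after errors, can be sketched as a minimal simulation. This is not the authors' code: the 3-down/1-up rule, the step size, the simulated observer, and the threshold estimate are all illustrative assumptions, since the abstract does not report these parameters.

```python
import random

def run_direction_range_staircase(n_trials=200, start_range=0.0,
                                  step=10.0, max_range=355.0):
    """Simulate a staircase on dot direction range (degrees).

    A wider direction range (dots scattered farther around the principal
    motion direction) makes the task harder. Here the range widens after
    three consecutive correct judgments and narrows after each error,
    a hypothetical 3-down/1-up rule chosen for illustration only.
    """
    direction_range = start_range
    consecutive_correct = 0
    history = []
    for _ in range(n_trials):
        # Hypothetical observer: accuracy falls as the range widens.
        p_correct = max(0.5, 1.0 - direction_range / (2 * max_range))
        correct = random.random() < p_correct
        if correct:
            consecutive_correct += 1
            if consecutive_correct == 3:  # make the task harder
                direction_range = min(max_range, direction_range + step)
                consecutive_correct = 0
        else:
            consecutive_correct = 0       # make the task easier
            direction_range = max(0.0, direction_range - step)
        history.append(direction_range)
    # Crude threshold estimate: mean range over the final 50 trials.
    return sum(history[-50:]) / 50
```

In the experiment, only the vertical component of the direction judgment would drive the `correct` variable; the simulated observer here simply stands in for a participant whose accuracy decreases as the range widens.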

