Vision Sciences Society Annual Meeting Abstract | September 2016 | Open Access
Individual Variability in Real-Time Multisensory Integration
Author Affiliations
  • Benjamin Rowland
    Department of Neurobiology and Anatomy, Wake Forest School of Medicine
  • John Vaughan
    Department of Neurobiology and Anatomy, Wake Forest School of Medicine
  • Barry Stein
    Department of Neurobiology and Anatomy, Wake Forest School of Medicine
Journal of Vision September 2016, Vol. 16(12), 154. https://doi.org/10.1167/16.12.154
Abstract

The brain's synthesis of information across the visual and auditory modalities confers substantial benefits in detecting, localizing, and identifying salient environmental events. When encountering brief and discrete cross-modal cues, the brain appears to use all of the available sensory information, and the ensuing multisensory detection and localization decisions conform to statistically optimal models of multisensory integration. The aim of the present study was to determine whether and how this optimality extends to the processing of continuous, patterned stimuli needed to inform an ongoing behavioral task. Human participants (n=20) manually tracked the fluctuations of patterned visual (an expanding/contracting annulus) or auditory (an FM tone) stimuli in the presence of significant signal contamination. Cues were tested both individually and in combination. The frequency of oscillation in one or both modalities subtly shifted on some trials, requiring participants to detect the change(s) and shift their behavioral plan. In addition to tracking baseline performance, we analyzed the probability and rapidity with which subjects detected changes in the underlying pattern, began their shift to the new pattern, and re-established stable tracking. These performance metrics were compared across the unisensory and multisensory test conditions. The largest multisensory enhancements were observed in change detection and in the speed of shifting to the new pattern. For most (but not all) subjects, the enhancements approximated the predictions of the statistically optimal model. Thus, the statistical optimality of multisensory processing appears to apply not only to static cues but also to continuous-time contexts.

Meeting abstract presented at VSS 2016
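
For reference, the "statistically optimal" benchmark invoked in the abstract is conventionally the maximum-likelihood (inverse-variance-weighted) cue-combination rule. The abstract does not specify the exact model that was fit, so the sketch below is illustrative only; the Gaussian-noise assumption, the variable names, and the example numbers are hypothetical.

    def mle_combined_estimate(visual_est, visual_var, auditory_est, auditory_var):
        """Inverse-variance-weighted combination of two noisy unisensory estimates.

        Under independent Gaussian noise this weighting is the maximum-likelihood
        estimate of the underlying signal value, and its predicted variance is
        lower than that of either cue alone.
        """
        w_visual = (1.0 / visual_var) / (1.0 / visual_var + 1.0 / auditory_var)
        w_auditory = 1.0 - w_visual
        combined_est = w_visual * visual_est + w_auditory * auditory_est
        combined_var = 1.0 / (1.0 / visual_var + 1.0 / auditory_var)
        return combined_est, combined_var

    # Hypothetical example: visual and auditory estimates of the same oscillation
    # frequency (Hz); the combined estimate is pulled toward the more reliable cue.
    est, var = mle_combined_estimate(visual_est=1.10, visual_var=0.04,
                                     auditory_est=1.00, auditory_var=0.09)
    print(est, var)

In the study described above, the analogous comparison is between observed multisensory performance (change-detection probability and the latency of shifting to the new pattern) and a benchmark derived from the two unisensory conditions in this manner.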
