September 2017
Volume 17, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract | August 2017
Audio-visual Interactions in Multistable Perception: Evidence from No-report Paradigms
Author Affiliations
  • Wolfgang Einhäuser
    Physics of Cognition Group, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
  • Sabine Thomassen
    Cognitive Systems Lab, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
  • Philipp Methfessel
    Physics of Cognition Group, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
    Cognitive Systems Lab, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
  • Alexandra Bendixen
    Cognitive Systems Lab, Institute of Physics, Chemnitz University of Technology, Chemnitz, Germany
Journal of Vision August 2017, Vol.17, 1215. doi:https://doi.org/10.1167/17.10.1215
Abstract

Multistable phenomena are observed for almost any sensory modality, yet studies on multisensory interactions in multistability are surprisingly rare. Typically, such studies address the consistency of statistical properties and individual biases of multistability across modalities, or one modality is used to distract attention from the other. Very few studies attempt to measure multistability in two modalities simultaneously. This might be a consequence of the difficulty of querying two modalities simultaneously without response interference or dual-task costs. We overcome this issue by using no-report paradigms for the visual modality. Specifically, we continuously read off the perceptually dominant direction of motion from the slow phase of the optokinetic nystagmus. We present three studies that use this no-report approach to probe audio-visual effects. The first study demonstrates that newly learnt audio-visual associations bias subsequent binocular rivalry: a grating previously coupled with a concurrently presented tone dominates over a grating previously coupled with a different tone. The second study uses binocular rivalry to tag the dominant percept in auditory multistability: two distinct tone sequences are presented simultaneously; a grating whose motion transients co-occur with the tone perceived in the foreground dominates over a grating whose motion transients co-occur with the competing background tone. In the third study, a dynamic visual stimulus that is alternately perceived as a plaid or as two gratings is presented simultaneously with a multistable auditory stimulus. Despite some evidence for intra-observer consistency between the two modalities, we find little influence of the reported auditory percept on the concurrently measured visual percept. Hence, situations in which auditory perception affects visual rivalry can readily be created, through stimulus design or through learnt associations, but configurations exist in which the cross-modal influence is minimal. Our results make a case for no-report paradigms as a useful tool to study truly multimodal effects in multistability.
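
The abstract describes reading off the dominant direction of perceived motion from the slow phase of the optokinetic nystagmus (OKN). As an illustration only, and not the authors' analysis pipeline, the following Python sketch shows one simple way such a readout could work: fast phases are discarded with a velocity threshold, and the sign of the median slow-phase velocity in each time window labels the momentarily dominant direction. The function name, sampling rate, velocity threshold, and window length are assumptions made for the example.

    # Illustrative sketch (not the authors' pipeline): labelling the dominant
    # motion direction from the slow phase of the optokinetic nystagmus (OKN).
    import numpy as np

    def okn_dominant_direction(eye_x, fs=1000.0, fast_phase_thresh=50.0, win_s=0.5):
        """Label each time point with a putative dominant direction (+1, -1, or 0).

        eye_x             : horizontal eye position in degrees (1-D array)
        fs                : sampling rate in Hz (assumed value)
        fast_phase_thresh : velocity (deg/s) above which samples are treated as
                            fast phases and excluded (assumed value)
        win_s             : sliding-window length in seconds (assumed value)
        """
        velocity = np.gradient(eye_x) * fs              # eye velocity in deg/s
        slow = np.abs(velocity) < fast_phase_thresh     # keep slow-phase samples only
        win = int(win_s * fs)
        labels = np.zeros(len(eye_x), dtype=int)
        for i in range(0, len(eye_x), win):
            seg = velocity[i:i + win][slow[i:i + win]]
            if seg.size:                                # sign of median slow-phase velocity
                labels[i:i + win] = int(np.sign(np.median(seg)))
        return labels

    # Example with synthetic data: sawtooth-like OKN following rightward motion.
    t = np.arange(0, 2, 1 / 1000.0)
    eye_x = (5 * t) % 2 - 1                             # slow rightward drift with periodic resets
    print(np.unique(okn_dominant_direction(eye_x)))     # prints [1]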

Meeting abstract presented at VSS 2017
