Joo Huang TAN, Po-Jang HSIEH; Context dependent crossmodal associations between visual spatial frequencies and auditory amplitude modulation rates. Journal of Vision 2017;17(10):193. doi: 10.1167/17.10.193.
Our percept of the world is a multisensory one. To successfully navigate our surroundings, we must integrate information originating from different modalities. Here, we delve into the relationship between two low-level sensory features. Specifically, we investigated the highly specific perceptual matches that exist between auditory amplitude modulation (AM) rate and visual spatial frequency. We conducted a series of perceptual matching tasks between visual spatial frequencies and auditory AM rates. Participants were asked to adjust the AM rate of a sound stimulus to match the spatial frequency of a grating stimulus displayed on screen. Each participant was presented with only one specific visual spatial frequency for the experiment. The initial AM rate of the sound stimulus at the start of each trial was manipulated as an independent variable across subjects. We demonstrate that the perceptual associations made between specific pairs of visual spatial frequencies and auditory amplitude modulation rates are highly conserved across individuals within a specific experimental context. However, these associations are also easily influenced by auditory context. This work demonstrates that sensory context can exert strong influences on crossmodal associations.
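A minimal sketch of how an amplitude-modulated auditory stimulus of the kind described above could be synthesized, using NumPy. All parameter values (carrier frequency, modulation depth, duration, sample rate) are illustrative assumptions; the abstract does not specify the stimulus settings used in the study.

```python
import numpy as np

def am_tone(carrier_hz=500.0, am_rate_hz=8.0, duration_s=1.0,
            sample_rate=44100, depth=1.0):
    """Generate a sinusoidally amplitude-modulated tone.

    All default values are hypothetical placeholders, not the
    study's actual stimulus parameters.
    """
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    # The modulator oscillates between (1 - depth) and (1 + depth);
    # dividing by (1 + depth) keeps the envelope within [0, 1] so the
    # output stays in the audio range [-1, 1].
    envelope = (1 + depth * np.sin(2 * np.pi * am_rate_hz * t)) / (1 + depth)
    return envelope * carrier

# In an adjustment trial, only am_rate_hz would change as the
# participant tunes the sound toward a perceptual match.
signal = am_tone(am_rate_hz=8.0)
```

In a matching task, the `am_rate_hz` parameter would be bound to the participant's response keys and the initial value varied across subjects, as the abstract describes.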
Meeting abstract presented at VSS 2017