Petra Vetter, Karin Petrini, Lukasz Piwek, Fraser Smith, Vijay Solanki, Matthew Bennett, Frank Pollick, Lars Muckli; Decoding emotional valence of sounds in early visual cortex. Journal of Vision 2016;16(12):472. doi: 10.1167/16.12.472.
Using fMRI decoding techniques, we recently demonstrated that early visual cortex contains content-specific information from sounds in the absence of visual stimulation (Vetter, Smith & Muckli, Current Biology, 2014). Here we studied whether the emotional valence of sounds can be decoded in early visual cortex during emotionally ambiguous visual stimulation. Participants viewed video clips in which two point-light walkers interacted with each other either emotionally neutrally (having a normal conversation) or emotionally negatively (having an argument). Videos were paired either congruently or incongruently with low-pass filtered soundtracks of these interactions. Participants' task was to judge the overall emotion of the interaction. The emotionally ambiguous condition consisted of the neutral visual stimulus, which could be interpreted as either a negative or a neutral interaction depending on the soundtrack. The emotionally unambiguous condition consisted of the negative visual stimulus, which was judged as negative independently of the soundtrack (as confirmed behaviourally). Functional MRI data were recorded while participants viewed and judged the interactions. Activity patterns from early visual cortex (identified with individual retinotopic mapping) were entered into a multivariate pattern classification analysis. When the visual stimulus was neutral, and thus emotionally ambiguous, the emotional valence of sounds could be decoded significantly above chance in V1. However, when the visual stimulus was negative, and thus emotionally unambiguous, the emotional valence of sounds could not be decoded in early visual cortex. Furthermore, the emotional valence of the visual stimulus was decoded in both early visual and auditory cortex, independent of the soundtrack. These results suggest that the emotional valence of sounds is contained in early visual cortex activity when visual information is emotionally ambiguous, but not when it is emotionally unambiguous.
Feedback from audition may thus help the visual system resolve ambiguities when interpreting a visual scene, and may therefore serve a function in perception.
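The multivariate pattern classification described above can be illustrated with a minimal, self-contained sketch. This is not the authors' analysis pipeline: the data are synthetic stand-ins for voxel activity patterns, and the classifier is a simple nearest-centroid rule with leave-one-run-out cross-validation (the authors' abstract does not specify their classifier or cross-validation scheme). All names, the number of runs and voxels, and the effect size are illustrative assumptions.

```python
import random

random.seed(0)

def make_pattern(mean, n_voxels=20, noise=0.5):
    # Synthetic "voxel pattern": a condition-specific mean plus Gaussian noise.
    # Real MVPA would use beta estimates or BOLD amplitudes per voxel.
    return [mean + random.gauss(0, noise) for _ in range(n_voxels)]

# Hypothetical dataset: 8 scanner runs, one trial per sound condition per run.
# Labels: 0 = neutral soundtrack, 1 = negative soundtrack.
runs = [[(make_pattern(0.0), 0), (make_pattern(0.4), 1)] for _ in range(8)]

def centroid(patterns):
    # Mean pattern across trials, voxel by voxel.
    n = len(patterns)
    return [sum(p[i] for p in patterns) / n for i in range(len(patterns[0]))]

def sq_dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Leave-one-run-out cross-validation: train centroids on 7 runs,
# classify the held-out run's trials by nearest centroid.
correct = total = 0
for held_out in range(len(runs)):
    train = [t for r, run in enumerate(runs) if r != held_out for t in run]
    cents = {lab: centroid([p for p, l in train if l == lab]) for lab in (0, 1)}
    for pattern, label in runs[held_out]:
        pred = min(cents, key=lambda lab: sq_dist(pattern, cents[lab]))
        correct += (pred == label)
        total += 1

accuracy = correct / total
print(f"decoding accuracy: {accuracy:.2f} (chance = 0.50)")
```

In the actual study, above-chance cross-validated accuracy in V1 (assessed against the 50% chance level across participants) is what licenses the claim that the activity patterns carry information about the sounds' emotional valence.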
Meeting abstract presented at VSS 2016