Vision Sciences Society Annual Meeting Abstract  |   September 2016
Decoding emotional valence of sounds in early visual cortex
Author Affiliations
  • Petra Vetter
    Dept. of Psychology & Center for Neural Science, New York University
  • Karin Petrini
    Dept. of Psychology, University of Bath
  • Lukasz Piwek
    Bristol Business School, University of the West of England
  • Fraser Smith
    Dept. of Psychology, University of East Anglia
  • Vijay Solanki
    School of Critical Studies, University of Glasgow
  • Matthew Bennett
    Institute of Neuroscience and Psychology, University of Glasgow
  • Frank Pollick
    Institute of Neuroscience and Psychology, University of Glasgow
  • Lars Muckli
    Institute of Neuroscience and Psychology, University of Glasgow
Journal of Vision September 2016, Vol.16, 472. doi:https://doi.org/10.1167/16.12.472
      © ARVO (1962-2015); The Authors (2016-present)
Abstract

Using fMRI decoding techniques, we recently demonstrated that early visual cortex contains content-specific information from sounds in the absence of visual stimulation (Vetter, Smith & Muckli, Current Biology, 2014). Here we investigated whether the emotional valence of sounds can be decoded from early visual cortex during emotionally ambiguous visual stimulation. Participants viewed video clips in which two point-light walkers interacted with each other either neutrally (having a normal conversation) or negatively (having an argument). Videos were paired either congruently or incongruently with low-pass filtered soundtracks of these interactions. Participants' task was to judge the overall emotion of the interaction. The emotionally ambiguous condition used the neutral visual stimulus, which could be interpreted as either a negative or a neutral interaction depending on the soundtrack. The emotionally unambiguous condition used the negative visual stimulus, which was judged as negative regardless of the soundtrack (as confirmed behaviourally). Functional MRI data were recorded while participants viewed and judged the interactions. Activity patterns from early visual cortex (identified with individual retinotopic mapping) were fed into a multivariate pattern classification analysis. When the visual stimulus was neutral, and thus emotionally ambiguous, the emotional valence of the sounds could be decoded significantly above chance in V1. However, when the visual stimulus was negative, and thus emotionally unambiguous, the emotional valence of the sounds could not be decoded in early visual cortex. Furthermore, the emotional valence of the visual stimulus could be decoded in both early visual and auditory cortex, independently of the soundtrack. These results suggest that the emotional valence of sounds is represented in early visual cortex activity when visual information is emotionally ambiguous, but not when it is unambiguous. Thus, auditory feedback may help the visual system resolve ambiguities when interpreting a visual scene, and may therefore serve a function in perception.
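The multivariate pattern classification step described above can be sketched in code. The following is a generic illustration only, using synthetic data and an assumed linear-SVM, cross-validated setup (a common MVPA choice); the voxel counts, effect size, and classifier settings are hypothetical and are not the authors' actual analysis pipeline.

```python
# Illustrative MVPA sketch: decode the emotional valence of sounds
# (neutral vs. negative) from simulated early-visual-cortex activity
# patterns. All numbers below are assumptions for demonstration.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_voxels = 80, 200                 # hypothetical trial/voxel counts
labels = np.repeat([0, 1], n_trials // 2)    # 0 = neutral sound, 1 = negative sound

# Synthetic V1 patterns: Gaussian noise plus a small condition-specific
# signal, mimicking content-specific information fed back to visual cortex.
signal = 0.3 * rng.normal(size=n_voxels)
patterns = rng.normal(size=(n_trials, n_voxels))
patterns[labels == 1] += signal

# Cross-validated decoding accuracy; chance level is 50% for two classes.
clf = SVC(kernel="linear")
scores = cross_val_score(clf, patterns, labels, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```

In the actual study, each trial's pattern would come from retinotopically mapped early visual cortex, and decoding significantly above the 50% chance level (here, in the ambiguous condition only) is what licenses the claim that valence information is present in the activity.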

Meeting abstract presented at VSS 2016
