Journal of Vision, October 2020, Volume 20, Issue 11 | Open Access
Vision Sciences Society Annual Meeting Abstract
Distracted by affective pictures: Neural mechanisms revealed by multivariate pattern analysis
Author Affiliations
  • Ke Bo
    J Crayton Pruitt Family Department of Biomedical Engineering, University of Florida
  • Nathan Petro
    Department of Psychology, University of Nebraska at Lincoln
  • Changhao Xiong
    J Crayton Pruitt Family Department of Biomedical Engineering, University of Florida
  • Andreas Keil
    Department of Psychology and the NIMH Center for Emotion and Attention, University of Florida
  • Mingzhou Ding
    J Crayton Pruitt Family Department of Biomedical Engineering, University of Florida
Journal of Vision, October 2020, Vol. 20, 528. https://doi.org/10.1167/jov.20.11.528

Citation: Ke Bo, Nathan Petro, Changhao Xiong, Andreas Keil, Mingzhou Ding; Distracted by affective pictures: Neural mechanisms revealed by multivariate pattern analysis. Journal of Vision 2020;20(11):528. https://doi.org/10.1167/jov.20.11.528.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Affective pictures are highly potent distractors. In this study we examined the impact of picture valence on task-relevant visual processing and the underlying neural mechanisms. Simultaneous EEG-fMRI was recorded while participants detected instances of coherent motion in a random dot kinematogram (RDK) overlaid on IAPS pictures (positive = erotic couples, neutral = people at work, negative = bodily mutilations). The RDK and the IAPS pictures flickered on and off at different frequencies, evoking two independent steady-state visual evoked potentials (ssVEPs). Applying support vector machines to BOLD responses in ventral visual cortex and MT cortex, we found the following. First, decoding accuracy for both positive-vs-neutral and negative-vs-neutral distractors is above chance level in ventral visual cortex, at 62.6% and 59.4% respectively; positive-vs-neutral decoding accuracy is marginally higher than negative-vs-neutral decoding accuracy (p=0.08). Second, across subjects, decoding accuracy for negative-vs-neutral distractors is negatively correlated with the number of correctly identified instances of coherent motion (p=0.01): the higher the decoding accuracy, the fewer the correctly identified instances of coherent motion. Decoding accuracy for positive-vs-neutral distractors, however, is not associated with behavioral performance (p=0.9). Third, in MT cortex, decoding accuracy for positive-vs-neutral and negative-vs-neutral distractors is also above chance level, at 71.2% and 64.5% respectively, with positive-vs-neutral decoding accuracy significantly higher than negative-vs-neutral decoding accuracy (p=0.0004). Fourth, neither the positive-vs-neutral nor the negative-vs-neutral decoding accuracy in MT cortex predicts behavioral performance (p>0.05). In summary, these results demonstrate that (1) although positive distractors are better represented than negative distractors in both ventral visual cortex and MT cortex, it is the negative distractors that have a stronger influence on behavior, and (2) although MT cortex is the neural substrate underlying task-relevant visual processing, it is the ventral visual cortex where the processing of negative distractors adversely impacts behavior.
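
The decoding pipeline summarized above (support vector machines applied to trial-wise BOLD patterns within an ROI, with the resulting per-subject accuracies then related to behavior across subjects) can be sketched roughly as follows. This is a minimal illustration built on scikit-learn with synthetic data; the linear kernel, the cross-validation scheme, and all variable names are assumptions made for exposition, not the authors' actual pipeline.

```python
# Sketch of ROI-based MVPA valence decoding: a linear SVM classifies
# trial-wise BOLD patterns (voxels within an ROI such as ventral visual
# cortex or MT) as affective vs. neutral. Synthetic data stand in for the
# real trial-by-voxel matrices; accuracy on random data stays near chance.
import numpy as np
from scipy.stats import pearsonr
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def decode_valence(bold_patterns, labels, n_splits=5):
    """Cross-validated decoding accuracy for one subject and one ROI.

    bold_patterns : (n_trials, n_voxels) array of ROI BOLD responses
    labels        : (n_trials,) array, e.g., 1 = negative, 0 = neutral
    """
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    cv = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=0)
    return cross_val_score(clf, bold_patterns, labels, cv=cv).mean()


# Illustration with synthetic data (placeholders for real trial data).
rng = np.random.default_rng(0)
n_subjects, n_trials, n_voxels = 20, 60, 200

accuracies, hits = [], []
for _ in range(n_subjects):
    X = rng.normal(size=(n_trials, n_voxels))   # ROI voxel patterns
    y = rng.integers(0, 2, size=n_trials)       # negative vs. neutral labels
    accuracies.append(decode_valence(X, y))
    hits.append(rng.integers(20, 40))           # detected coherent-motion targets

# Across-subject brain-behavior test: does higher negative-vs-neutral
# decoding accuracy go with fewer correctly detected coherent-motion events?
r, p = pearsonr(accuracies, hits)
print(f"mean accuracy = {np.mean(accuracies):.3f}, r = {r:.2f}, p = {p:.3f}")
```

The design choice this sketch highlights is that decoding is performed separately per subject and per ROI, so each subject contributes one accuracy value per contrast; the brain-behavior relationship reported in the abstract is then an across-subject correlation between those accuracies and the count of correctly detected coherent-motion events.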
