October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract
Decoding visual spatial attention control
Author Affiliations & Notes
  • Sreenivasan Meyyappan
    J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, FL
  • Abhijit Rajan
    J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, FL
  • Jesse Bengson
    Department of Psychology, Sonoma State University, Rohnert Park, CA
  • George Mangun
    Center for Mind and Brain, University of California, Davis, CA
  • Mingzhou Ding
    J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, FL
  • Footnotes
    Acknowledgements: NIH grant MH117991
Journal of Vision October 2020, Vol. 20, 156. https://doi.org/10.1167/jov.20.11.156
Citation: Sreenivasan Meyyappan, Abhijit Rajan, Jesse Bengson, George Mangun, Mingzhou Ding; Decoding visual spatial attention control. Journal of Vision 2020;20(11):156. https://doi.org/10.1167/jov.20.11.156.

© ARVO (1962-2015); The Authors (2016-present)

Deploying anticipatory visual spatial attention in advance of stimulus onset enhances the processing of task-relevant stimuli and suppresses distraction. In this study, we used machine learning techniques to investigate the neural representations of attention control signals in visual cortex in two fMRI datasets, one recorded at the University of Florida (n=13) and the other at the University of California, Davis (n=18). In both datasets, participants performed a cued visual spatial attention task in which each trial began with a cue instructing the subject to attend either the left or the right visual hemifield. After a random delay, a grating (Gabor patch) was presented in one of the two hemifields, and the subject was asked to discriminate the spatial frequency of gratings appearing in the attended hemifield and to ignore gratings appearing in the unattended hemifield. Estimating cue-evoked fMRI responses trial by trial and applying multi-voxel pattern analysis (MVPA) to multiple ROIs within the visual cortex, we found the following. (1) Accuracy of decoding attend-left versus attend-right was significantly above chance in all visual-cortex ROIs (average accuracy = 65%). (2) Decoding accuracy was highly correlated across visual ROIs, with 80% of the variance explained by the first principal component. (3) Subjects with higher decoding accuracy performed better on the task, as indexed by lower inverse efficiency (response time divided by response accuracy). These results, consistent across the two datasets, suggest that (1) attention control signals are present in both higher-order (e.g., intraparietal sulcus) and lower-order visual areas (e.g., primary visual cortex), and (2) the distinctness of the neural representations of attention control is an individual trait that explains individual differences in task performance.
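To illustrate the analysis logic (this is a minimal sketch, not the authors' pipeline), the code below decodes attend-left versus attend-right from synthetic single-trial voxel patterns with a leave-one-out nearest-centroid classifier standing in for the MVPA classifier, and computes the inverse-efficiency score used to index behavior. The trial count, voxel count, effect size, and behavioral numbers are all made-up assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 50                 # hypothetical trials per subject, voxels per ROI

# Simulated cue-evoked response patterns: half "attend left" (0), half
# "attend right" (1), with a small class-dependent shift in 10 voxels.
labels = np.repeat([0, 1], n_trials // 2)
patterns = rng.standard_normal((n_trials, n_voxels))
patterns[labels == 1, :10] += 0.8           # assumed attention signal

def loo_nearest_centroid(X, y):
    """Leave-one-out nearest-centroid decoding accuracy (chance = 0.5)."""
    correct = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i       # hold out trial i
        c0 = X[mask & (y == 0)].mean(axis=0)
        c1 = X[mask & (y == 1)].mean(axis=0)
        pred = int(np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0))
        correct += pred == y[i]
    return correct / len(y)

acc = loo_nearest_centroid(patterns, labels)
print(f"decoding accuracy: {acc:.2f}")

# Inverse efficiency = mean response time / response accuracy (lower is better).
rt, behav_acc = 0.62, 0.91                  # hypothetical seconds, proportion correct
inverse_efficiency = rt / behav_acc
print(f"inverse efficiency: {inverse_efficiency:.3f} s")
```

With the synthetic signal above, decoding lands well above the 50% chance level, mirroring the qualitative pattern reported in the abstract; the cross-ROI correlation and brain-behavior analyses would operate on such per-subject accuracy and inverse-efficiency values.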

