September 2024, Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
From Preparatory Attention to Stimulus Selection: Neural Mechanisms Revealed by Multivariate Analysis of fMRI Data
Author Affiliations
  • Qiang Yang
    Department of Biomedical Engineering, University of Florida
  • Sreenivasan Meyyappan
    Department of Psychology and Center for Mind and Brain, University of California Davis
  • George R. Mangun
    Department of Psychology and Center for Mind and Brain, University of California Davis
  • Mingzhou Ding
    Department of Biomedical Engineering, University of Florida
Journal of Vision September 2024, Vol.24, 275. doi:https://doi.org/10.1167/jov.24.10.275
Abstract

Preparatory attention is often studied using the cueing paradigm. According to the prevailing theory, following an attention-directing cue, top-down signals from frontoparietal attention-control regions propagate to visual cortex and bias sensory neurons, enabling subsequent stimulus selection. Despite years of research, the underlying neural mechanisms remain incompletely understood. We recorded fMRI data from participants performing a cued visual spatial attention task. At the beginning of each trial, participants were asked to covertly deploy attention to one of the two visual fields. After a cue-target interval of random duration, a stimulus appeared at either the attended or the unattended location; participants discriminated stimuli appearing at the attended location and ignored stimuli appearing at the unattended location. Applying multivariate pattern analysis (MVPA) to the fMRI data, we report the following findings: (1) attend-left vs. attend-right can be decoded from cue-evoked neural activity in all visual areas; (2) stimulus-left vs. stimulus-right can be decoded from target-evoked activity in all visual areas; (3) classifiers trained on cue-evoked activity can decode target-evoked activity in all visual areas, and vice versa; and (4) the higher the cross-decoding accuracy, the better the behavioral performance. These results suggest that, during the cue-target interval, top-down control signals establish neural patterns in visual cortex that resemble the patterns later evoked by the stimulus, and that these "attentional templates" enable stimulus selection and improve behavior.
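The cross-decoding analysis in finding (3) can be illustrated with a short sketch: a classifier is trained on cue-evoked patterns (attend-left vs. attend-right) and tested on target-evoked patterns (stimulus-left vs. stimulus-right), and vice versa. The sketch below uses scikit-learn with simulated single-trial patterns for one visual region of interest; the array names, shapes, and the logistic-regression classifier are illustrative assumptions, not the authors' pipeline.

```python
# Minimal cross-decoding sketch (illustrative, not the authors' pipeline).
# Assumes single-trial cue-evoked and target-evoked response patterns have
# already been extracted for one visual ROI; here they are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 300                 # hypothetical sizes

# Simulated trial-by-voxel patterns with attention / stimulus labels
cue_patterns = rng.standard_normal((n_trials, n_voxels))  # cue-evoked responses
cue_labels = rng.integers(0, 2, n_trials)                  # 0 = attend-left, 1 = attend-right
tgt_patterns = rng.standard_normal((n_trials, n_voxels))  # target-evoked responses
tgt_labels = rng.integers(0, 2, n_trials)                  # 0 = stimulus-left, 1 = stimulus-right

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Within-period decoding (findings 1 and 2): cross-validated accuracy
cue_acc = cross_val_score(clf, cue_patterns, cue_labels, cv=5).mean()
tgt_acc = cross_val_score(clf, tgt_patterns, tgt_labels, cv=5).mean()

# Cross-decoding (finding 3): train on one period, test on the other
cue_to_tgt = clf.fit(cue_patterns, cue_labels).score(tgt_patterns, tgt_labels)
tgt_to_cue = clf.fit(tgt_patterns, tgt_labels).score(cue_patterns, cue_labels)

print(f"cue decoding: {cue_acc:.2f}   target decoding: {tgt_acc:.2f}")
print(f"cue->target: {cue_to_tgt:.2f}   target->cue: {tgt_to_cue:.2f}")
```

For a brain-behavior analysis in the spirit of finding (4), the per-participant cross-decoding accuracy computed this way would then be correlated with a behavioral measure (e.g., discrimination accuracy or reaction time) across participants.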
