September 2019
Volume 19, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Feature-specific preparatory signals across the visual hierarchy
Author Affiliations & Notes
  • Taosheng Liu
    Department of Psychology, Michigan State University
    Neuroscience Program, Michigan State University
  • Mengyuan Gong
    Department of Psychology, Michigan State University
Journal of Vision September 2019, Vol. 19, 45c. doi: https://doi.org/10.1167/19.10.45c
© ARVO (1962-2015); The Authors (2016-present)
Abstract

Selective attention facilitates the perceptual processing of features and objects, and observers can often prepare for a specific feature before the sensory stimulus arrives. Whether overlapping or separate mechanisms of feature-based attention underlie preparatory activity and stimulus-evoked activity remains unclear. In an fMRI experiment, we used a feature-cueing paradigm to assess how preparatory attention leads to selective processing of subsequent stimuli. Each trial began with a colored cue instructing participants to attend to one of two motion directions (45° or 135°). After a delay period, two superimposed moving-dot fields (along the 45° and 135° directions) were briefly presented. Participants performed a threshold direction-discrimination task on the cued direction (e.g., was the attended direction clockwise or counterclockwise relative to 45°?). We examined multi-voxel neural patterns as an index of the neural representation of the attended feature, training a classifier to decode the attended direction in both the cue and stimulus periods. Classification accuracy was above chance across the visual hierarchy in both periods. Relative to the cue period, accuracy improved further in the stimulus period in early visual areas, but not in frontoparietal areas. The equivalent strength of attentional signals between the cue and stimulus periods in frontoparietal areas is consistent with these areas maintaining signals for top-down control of feature-based attention. Furthermore, a classifier trained on the cue period did not generalize to the stimulus period, suggesting that different coding mechanisms underlie preparatory attention and feature selection.
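The decoding logic described above (within-period classification plus cross-period generalization) can be sketched as follows. This is an illustrative toy, not the authors' analysis: the voxel patterns, trial counts, and classifier choice are all synthetic assumptions, and a sign flip in the stimulus-period signal is injected purely to produce the qualitative pattern the abstract reports (above-chance within-period decoding, poor cue-to-stimulus generalization).

```python
# Toy sketch of multi-voxel pattern decoding (NOT the authors' data or code).
# Labels code the attended direction: 0 = 45 deg, 1 = 135 deg.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 100

# Synthetic direction-dependent signal embedded in noise. The stimulus-period
# patterns use a flipped weight vector to mimic a different coding scheme.
labels = rng.integers(0, 2, n_trials)
weights = rng.standard_normal(n_voxels)
signal = np.outer(labels - 0.5, weights)
cue_patterns = signal + 0.5 * rng.standard_normal((n_trials, n_voxels))
stim_patterns = -signal + 0.5 * rng.standard_normal((n_trials, n_voxels))

clf = LogisticRegression(max_iter=1000)

# Within-period decoding: cross-validated accuracy for each period separately.
cue_acc = cross_val_score(clf, cue_patterns, labels, cv=5).mean()
stim_acc = cross_val_score(clf, stim_patterns, labels, cv=5).mean()

# Cross-period generalization: train on cue-period patterns, test on
# stimulus-period patterns from the same trials.
clf.fit(cue_patterns, labels)
xgen_acc = clf.score(stim_patterns, labels)

print(f"cue: {cue_acc:.2f}, stimulus: {stim_acc:.2f}, cue->stimulus: {xgen_acc:.2f}")
```

In this toy, both within-period accuracies are well above chance (0.5) while the cue-to-stimulus transfer score is not, mirroring the dissociation the abstract describes.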
