Ming W. H. Fang, Taosheng Liu; The profile of attentional modulation to visual features. Journal of Vision 2019;19(13):13. doi: https://doi.org/10.1167/19.13.13.
Although it is well established that feature-based attention (FBA) can enhance an attended feature, how it modulates unattended features remains less clear. Previous studies have generally supported either a graded profile, as predicted by the feature-similarity gain model, or a nonmonotonic profile, as predicted by the surround suppression model. To reconcile these different views, we systematically measured the attentional profile in three basic feature dimensions—orientation, motion direction, and spatial frequency. In three experiments, we instructed participants to detect a coherent feature signal against noise under attentional or neutral conditions. Our results support a nonmonotonic hybrid model of attentional modulation, consisting of feature-similarity gain and surround suppression, for orientation and motion direction. For spatial frequency, we found a similar nonmonotonic profile for frequencies higher than the attended frequency, but a lack of attentional modulation for frequencies lower than the attended frequency. The current findings can reconcile the discrepancies in the literature and suggest the hybrid model as a new framework for attentional modulation in feature space. In addition, a computational model incorporating known properties of spatial frequency channels and attentional modulations at the neural level reproduced the asymmetric attentional modulation, thus revealing a connection between surround suppression and the basic neural architecture of the early visual system.
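To illustrate the shape of a nonmonotonic hybrid profile, a minimal sketch follows: feature-similarity gain is modeled as a narrow Gaussian enhancement centered on the attended feature, and surround suppression as a broader Gaussian dip, combined as a difference of Gaussians over feature distance. This is not the authors' computational model; the functional form and all parameter values here are hypothetical choices for illustration only.

```python
import math

def hybrid_profile(delta, gain=1.0, sigma_center=15.0,
                   suppress=0.6, sigma_surround=35.0, baseline=1.0):
    """Hypothetical attentional modulation as a function of feature
    distance `delta` (e.g., degrees of orientation) from the attended
    value. Narrow Gaussian enhancement (feature-similarity gain) minus
    a broader Gaussian (surround suppression), plus a neutral baseline.
    All parameter values are illustrative, not fitted to data."""
    center = gain * math.exp(-delta**2 / (2 * sigma_center**2))
    surround = suppress * math.exp(-delta**2 / (2 * sigma_surround**2))
    return baseline + center - surround

# Enhancement at the attended feature, suppression at intermediate
# feature distances, and recovery toward baseline for very dissimilar
# features -- the qualitative nonmonotonic signature described above.
for d in (0.0, 30.0, 90.0):
    print(d, hybrid_profile(d))
```

With these illustrative parameters, modulation exceeds baseline at the attended feature (`delta = 0`), falls below baseline at intermediate distances, and returns toward baseline at large distances, reproducing the qualitative surround-suppression shape.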