Abstract
Attention to a specific feature (e.g., the color red) has been shown to increase neural population responses tuned to the attended feature value while decreasing responses tuned to other feature values. However, similarity between features varies greatly across feature space, and it is not known how feature-based attention operates when target and distractor features are similar. We recorded EEG from 16 participants while they selectively attended to one of two colored dot arrays to detect brief intervals of coherent motion. Distractor dots were either similar to the target color (60° apart in CIELAB color space) or distinct from it (180° apart), and performance was matched across conditions using a thresholding procedure. We used steady-state visual evoked potentials (SSVEPs) to track how feature-based attention modulates neural responses to targets and distractors. First, we measured the signal-to-noise ratio (SNR) of the SSVEPs and found a reliable attentional modulation for targets when distractors were distinct (p = .007) but not when they were similar (p = .200). Next, we fit an inverted encoding model to the SSVEP responses. Attention increased the fidelity of the reconstructed target responses when distractors were distinct (p = .027, one-tailed) but not when distractors were similar (p = .180), consistent with the SNR measures. However, when the distractor was similar to the target color, the center of the reconstructed color responses shifted (p = .003) such that target reconstructions were biased away from the distractor color. No such bias was observed when distractors were distinct from the target (p = .649).
This suggests that target–distractor similarity shapes feature-based attentional modulations: when targets are sufficiently different from distractors, attention increases their gain and fidelity; however, when targets are more similar to distractors, attention increases the distinctiveness of targets by biasing neural population responses away from the distractor color.