Abstract
Attention can improve the accuracy of visual discrimination; however, little is known about how attention affects the precision of visual encoding and the implications for subsequent visual working memory (VWM). Four randomly oriented Gabors were displayed briefly. Encoded representations of a target Gabor were assessed through report of a feature – here, orientation – by choosing from a 20-orientation palette, one every 9° (−90° to 90°, where 90° ≡ −90°). In a spatial attention condition (Chu, Dosher, & Lu, VSS 2010), the target Gabor was marked with a simultaneous report cue and attention was either validly or invalidly pre-cued. We report two new VWM conditions with a report cue 900 ms after the four-item display. One VWM condition included an attention pre-cue; a standard VWM condition did not. Performance in all three conditions showed considerable variability in the probability of reporting the correct, or very close, target orientation(s), yet remarkable stability in the spread about the correct orientation. The probability of reporting the target increased with contrast, decreased with external noise, increased with valid pre-cuing, and decreased with delay. There was little evidence for intrusions from non-target locations. The surprising similarity across conditions of the spread about the correct orientation, together with many other detailed features of the complex data pattern, was well accounted for by a multi-alternative perceptual template model of the observer. Twenty templates respond to the target stimulus with gains determined by the degree of match and the contrast, and with noise variance determined by both internal and external noise. The reported orientation is selected by a max rule. The stable clustering, or precision, of reported orientations reflects the tuning of the templates, which are estimated to be closely similar to the matched template.
The data and the corresponding model suggest a common mechanism for immediate perception and visual working memory.
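The max-rule template model described above can be illustrated with a minimal simulation sketch. This is not the authors' fitted implementation: the Gaussian tuning function, the specific tuning bandwidth, and the additive combination of internal and external noise variances are illustrative assumptions; only the structure – 20 templates spaced every 9°, gains scaled by tuning match and contrast, noisy responses, and a max rule for report – follows the abstract.

```python
import numpy as np

def circ_diff(a, b, period=180.0):
    """Smallest signed difference between two orientations (degrees)."""
    return (a - b + period / 2) % period - period / 2

def simulate_report(target_ori, contrast, sigma_int, sigma_ext,
                    tuning_sd=15.0, n_templates=20, rng=None):
    """One trial of a max-rule template model (illustrative sketch).

    tuning_sd is an assumed Gaussian tuning bandwidth, not a fitted value.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Template preferred orientations: every 9 deg from -90 to +81
    prefs = np.arange(n_templates) * (180.0 / n_templates) - 90.0
    # Gain: contrast scaled by an (assumed) Gaussian tuning match to the target
    match = np.exp(-0.5 * (circ_diff(prefs, target_ori) / tuning_sd) ** 2)
    gain = contrast * match
    # Response variance combines internal and external noise (assumed additive)
    noise_sd = np.sqrt(sigma_int ** 2 + sigma_ext ** 2)
    responses = gain + rng.normal(0.0, noise_sd, n_templates)
    # Max rule: report the preferred orientation of the strongest template
    return prefs[np.argmax(responses)]
```

Under this sketch, raising the contrast or lowering the noise increases the probability that the reported orientation is the target's, while the spread of reports around the target is governed mainly by the template tuning – qualitatively matching the stability of report precision across conditions described in the abstract.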
Supported by National Institute of Mental Health Grant # R01MH81018 and by National Eye Institute Grant # EY-17491.