Kelly Shen, Martin Paré; Neural basis of object memory during visual search. Journal of Vision 2010;10(7):1294. doi: https://doi.org/10.1167/10.7.1294.
Current models of selective attention and visual search incorporate two processes believed to be crucial in searching for an item in a visual scene: the selection of locations to be attended and the temporary prevention of re-selecting previously attended locations. In natural situations, the deployment of visual attention is accomplished by sequences of gaze fixations, and the active suppression of recently visited locations can be examined by analyzing the distribution of gaze fixations as a function of time and location. We trained four monkeys to perform a visual search task in which they could freely search for a target stimulus defined by a unique conjunction of features. Monkeys made multiple fixations on distracters before foveating the target (mean: 3.1; range: 1-14), and their probability of foveating the target with a single fixation was only 0.25. Performance in this difficult task was nevertheless generally efficient, as monkeys rarely re-fixated previously inspected stimuli. The probability of a re-fixation increased with time and approximated chance levels after 5-6 fixations, suggesting that foveated information is retained across fixations but is completely degraded within about 1000 ms of fixation. To investigate the neural mechanisms underlying this behavior, we recorded the activity of superior colliculus (SC) neurons while two animals performed the task. SC sensory-motor activity was sufficient to guide this behavior: activity associated with previously fixated stimuli was significantly lower than that associated with stimuli not yet fixated. More than two-thirds of neurons retained these differences up to 100 ms following fixation. These results suggest a neural mechanism for suppressing the re-fixation of stimuli temporarily maintained in memory. These findings demonstrate how neural representations on the visual salience map are dynamically updated from fixation to fixation, thus facilitating visual search.
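The behavioral analysis described above — estimating re-fixation probability as a function of how many fixations have intervened since a stimulus was last visited — can be sketched in code. The snippet below is a minimal illustration, not the authors' actual analysis pipeline: it assumes a hypothetical input format in which each trial is a sequence of fixated stimulus IDs, and it tabulates, for each lag, how often a fixation returned to a stimulus last inspected that many fixations earlier.

```python
from collections import defaultdict

def refixation_prob_by_lag(trials):
    """Estimate, for each lag k (number of fixations since a stimulus was
    last visited), the probability that a fixation returns to it.

    `trials` is a list of per-trial sequences of fixated stimulus IDs
    (a hypothetical data format, assumed here for illustration).
    """
    returns = defaultdict(int)   # lag -> number of re-fixations at that lag
    chances = defaultdict(int)   # lag -> number of opportunities at that lag
    for seq in trials:
        last_seen = {}           # stimulus ID -> index of its most recent fixation
        for i, stim in enumerate(seq):
            # Every previously fixated stimulus offers a re-fixation
            # opportunity at lag (i - index of its last fixation).
            for prev, j in last_seen.items():
                chances[i - j] += 1
                if prev == stim:
                    returns[i - j] += 1
            last_seen[stim] = i
    return {k: returns[k] / chances[k] for k in sorted(chances)}
```

Under the memory account suggested by the abstract, this curve would stay well below chance at short lags and climb toward chance by roughly 5-6 fixations, as the trace of earlier fixations degrades.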