Abstract
During natural vision, scene perception depends on accurate targeting of attention, anticipation of the physical consequences of motor actions, and the ability to continuously integrate visual inputs with stored representations. For example, when an eye movement is impending, the visual system anticipates where the target will be next, and attention is updated to this new location. Recently, two different types of perisaccadic spatial attention shifts were discovered. One study shows that attention lingers after the saccade at the (now irrelevant) retinotopic position; that is, the focus of attention shifts with the eyes and is updated back to its original spatial position only after the eyes have landed (Golomb et al., 2008, J Neurosci.; Golomb et al., 2010, J Vis.). Another study shows that shortly before saccade onset, spatial attention is remapped to a position opposite to the saccade direction, thus anticipating the eye movement (Rolfs et al., 2011, Nat Neurosci.). We recently proposed a model of perisaccadic perception based on predictive remapping and corollary discharge signals that explains several phenomena of vision (Ziesche & Hamker, 2011, J Neurosci.; Ziesche & Hamker, 2014, Front. Comput. Neurosci.). The model allows stimulus position to be simulated across eye movements using a discrete eye position signal and a corollary discharge. We extended the model with an additional feedback loop and a tonic spatial attention signal and show that the observations of Golomb et al. and Rolfs et al. are not contradictory but emerge from the model dynamics: the former is explained by the proprioceptive eye position signal, the latter by the corollary discharge signal. Interestingly, both eye-related signals are core ingredients of the model and are required to explain data from mislocalization and displacement detection experiments. Thus, our model provides a comprehensive framework for discussing multiple experimental observations that occur around saccades.
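To make the timing of the two attention shifts concrete, the following minimal Python sketch illustrates the conceptual distinction only; it is not the published neuro-computational model, and the function name, latencies (cd_lead, pep_lag) and coordinate bookkeeping are illustrative assumptions. A fast corollary discharge remaps attention opposite to the saccade direction shortly before onset, while the slower proprioceptive eye position signal leaves a lingering retinotopic trace until it updates after landing.

import numpy as np

def attention_loci(t, world_target, eye_pre, saccade_vec,
                   t_onset=0.0, t_land=0.05, cd_lead=0.03, pep_lag=0.10):
    """Retinotopic loci carrying attention at time t (s, relative to saccade onset).

    All parameters are illustrative assumptions, not fitted model values.
    world_target : attended location in world coordinates (deg)
    eye_pre      : eye position before the saccade (deg)
    saccade_vec  : saccade vector (deg)
    cd_lead      : assumed lead time of the corollary discharge before onset
    pep_lag      : assumed delay of the proprioceptive eye position update
    """
    world_target, eye_pre, saccade_vec = map(np.asarray,
                                             (world_target, eye_pre, saccade_vec))
    r_old = world_target - eye_pre        # retinotopic locus during fixation
    r_new = r_old - saccade_vec           # locus of the same world point after the saccade

    if t < t_onset - cd_lead:             # steady fixation: attention at its current locus
        return {"current": r_old}
    if t < t_land:                        # corollary discharge available: predictive,
        return {"current": r_old,         # presaccadic shift opposite to the saccade
                "remapped (CD)": r_new}   # (cf. Rolfs et al., 2011)
    if t < t_land + pep_lag:                  # eyes have landed, proprioceptive signal not
        return {"retinotopic trace": r_old,   # yet updated: attention lingers retinotopically
                "spatiotopic": r_new}         # (cf. Golomb et al., 2008)
    return {"spatiotopic": r_new}         # proprioceptive update complete: re-anchored

# Example: attend (10, 0) deg while fixating the origin, then saccade 8 deg rightward.
for t in (-0.10, -0.01, 0.08, 0.30):
    print(t, attention_loci(t, world_target=[10.0, 0.0],
                            eye_pre=[0.0, 0.0], saccade_vec=[8.0, 0.0]))

In the actual model these loci presumably correspond to graded population activity in retinotopic maps modulated by the two eye-related signals rather than discrete labels; the sketch captures only the timing logic by which the corollary discharge accounts for the presaccadic shift and the proprioceptive eye position signal for the postsaccadic retinotopic lingering.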
Meeting abstract presented at VSS 2015