Abstract
Saccadic eye movements are widely used to investigate underlying decision processes, and numerous quantitative models have been proposed to account for changes in saccade reaction time (SRT) distributions, often in conjunction with neurophysiological data. Although SRTs are typically short (often ranging from 100 to 400 ms) and conventionally regarded as a consequence of the duration of the decision-making process, previous findings showed that reinforcement contingencies modulate SRT distributions (Madelain et al., 2007), raising the possibility of voluntary control of SRTs. Because some minimal perception must be necessary for any voluntary response, we asked whether it is possible to accurately perceive such short reaction times. We first collected baseline SRTs in three subjects tracking a stepping visual target. For each subject we computed four SRT classes using the quartiles of the individual baseline SRT distribution (individual example of SRT class intervals: 80-182 ms; 183-212 ms; 213-237 ms; 238-400 ms). Subjects were then trained to discriminate their SRTs: after each saccade they had to classify the saccade latency in a four-alternative forced-choice (4AFC) task and received feedback indicating the correct answer. Results indicate that, after intensive training, subjects could classify their SRTs with up to 42% correct responses overall, well above the 25% chance level. Moreover, for each of the four latency classes the probability of a correct response was systematically higher (p < 0.01) than that of any other response. Likewise, for each of the four discrimination responses the probability of the SRT falling in the correct quartile was systematically higher (p < 0.01) than that of falling in any of the three other classes. Altogether these results are the first to indicate that human subjects are able to discriminate their own saccade latencies, albeit imperfectly. The precision and extent of this ability remain to be experimentally probed.
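A minimal sketch of the quartile-based class construction described above, assuming baseline latencies are available as an array of values in milliseconds; the function names and simulated data are illustrative, not the authors' analysis code:

```python
import numpy as np

def srt_class_bounds(baseline_srts):
    """Quartile cut points of an individual baseline SRT distribution (ms),
    defining four equally probable latency classes."""
    q1, q2, q3 = np.percentile(baseline_srts, [25, 50, 75])
    return [q1, q2, q3]

def classify_srt(srt, bounds):
    """Return the class index (0-3) of a single saccade latency,
    i.e. the quartile of the baseline distribution it falls in."""
    return int(np.searchsorted(bounds, srt))

# Hypothetical baseline sample, clipped to the 80-400 ms range quoted above.
rng = np.random.default_rng(0)
baseline = rng.normal(210, 40, 500).clip(80, 400)

bounds = srt_class_bounds(baseline)
print(bounds, classify_srt(195, bounds))
# With quartiles near 182/212/237 ms, a 195 ms saccade falls in class 1;
# guessing among the four classes yields the 25% chance level.
```

By construction each class contains one quarter of the baseline distribution, which is what makes 25% the appropriate chance level for the 4AFC discrimination responses.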
Meeting abstract presented at VSS 2016