Abstract
When we look at the world around us, we experience a continuous flow of information reaching our eyes. The most intuitive interpretation of this experience of continuity is that the visual system processes information continuously. In striking contrast with this intuition, a growing body of evidence suggests that we apprehend the world through discrete processing epochs (VanRullen, Reddy & Koch, 2005; 2006; Ward, 2003). However, the exact nature of the underlying oscillations has so far been inferred only indirectly, through modeling. Here, we investigated the nature of these oscillations using a classification image approach. We asked five subjects to determine whether a stimulus, presented for 200 ms, was a face or a house. On every trial, the signal, either a face or a house, dissolved sinusoidally into a white Gaussian noise field (phases relative to stimulus onset: 0, π/6, π/3, π/2, 2π/3, and 5π/6; frequencies: 5, 10, 15, and 20 Hz). Performance was maintained at 75% correct by adjusting the signal-to-noise ratio with QUEST (Watson & Pelli, 1983). We found that performance was modulated by both the frequency and the phase of the oscillation relative to stimulus onset (peak-to-trough differences ranging from 11.9% to 18.7% across frequencies, and from 5.9% to 19.0% across phases), further supporting the hypothesis of discrete processing epochs. We reconstructed, for every participant, the optimal stimulus in the least-mean-square sense by performing multiple regressions of performance on the stimulus oscillations. The resulting classification images typically contained two oscillations: one at 5 Hz and another between 15 and 20 Hz. We believe we have revealed fossilized oscillations from different stages of visual processing; the faster oscillation could be the elusive “perceptual moments” and the slower one could reflect attention-related oscillations (see also VanRullen, Carlson, & Cavanagh, 2007).
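To make the stimulus design concrete, the sinusoidal dissolve can be sketched as below. This is a minimal illustration under stated assumptions, not the authors' actual stimulus code: the frame rate, the additive blending rule, and all names (make_movie, snr, rng) are ours, and the overall amplitude stands in for the QUEST-controlled signal-to-noise ratio.

```python
import numpy as np

def make_movie(signal, rng, freq_hz, phase, snr, duration_s=0.2, fps=60):
    """Sketch of a signal dissolving sinusoidally into Gaussian noise.

    signal  : 2-D array (the face or house image)
    freq_hz : modulation frequency (5, 10, 15, or 20 Hz in the study)
    phase   : phase relative to stimulus onset (0 to 5*pi/6 in the study)
    snr     : overall signal amplitude, assumed here to be set by QUEST
    """
    n_frames = int(round(duration_s * fps))
    t = np.arange(n_frames) / fps
    # Signal amplitude oscillates between 0 and 1 at the given frequency/phase.
    amp = 0.5 * (1.0 + np.sin(2.0 * np.pi * freq_hz * t + phase))
    # Each frame adds a fresh white Gaussian noise field to the scaled signal.
    frames = [snr * a * signal + rng.standard_normal(signal.shape) for a in amp]
    return np.stack(frames)

# Example: a 10 Hz dissolve starting at phase pi/3 (face_img is hypothetical).
# movie = make_movie(face_img, np.random.default_rng(0), 10, np.pi / 3, snr=0.8)
```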
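The regression step can likewise be sketched. Assuming the predictors are the frame-wise signal amplitudes on each trial and the dependent variable is response accuracy, ordinary least squares yields a temporal weighting profile, i.e., the least-mean-square-optimal stimulus whose spectral peaks would reveal the 5 Hz and 15-20 Hz components; the function name and the exact design matrix are assumptions, not the authors' analysis code.

```python
import numpy as np

def classification_image(amplitudes, correct):
    """Least-squares sketch of the classification-image step.

    amplitudes : (n_trials, n_frames) array of frame-wise signal amplitudes
    correct    : (n_trials,) array, 1 for a correct response, 0 otherwise
    Returns frame-wise weights of the LMS-optimal temporal profile.
    """
    X = np.column_stack([np.ones(len(correct)), amplitudes])  # intercept + predictors
    beta, *_ = np.linalg.lstsq(X, np.asarray(correct, dtype=float), rcond=None)
    return beta[1:]  # drop the intercept; weights over time since stimulus onset

# The amplitude spectrum of the weights exposes the dominant oscillations:
# spectrum = np.abs(np.fft.rfft(classification_image(A, c)))
```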