Abstract
Our perception of the world appears deceptively continuous. In reality, our brains take "snapshots" of the surrounding environment 5 to 15 times a second, creating "perceptual cycles". At the neuronal level, this is evidenced by fluctuations in the excitability of the cortex as a function of the local field potential phase: spikes are more likely to occur at a given phase of the cycle. These fluctuations have also been documented in human observers: for example, subjects are more likely to detect a threshold-level target at a given EEG phase, and less likely at the opposite phase. Yet most studies report EEG phase modulations peaking 100-200 ms before stimulus onset. Presumably, the phase that matters for perception should be the one present in the brain during information processing. Why is the phase modulation usually not detectable after stimulus onset? First, we use simulations to show that the target-evoked ERP (together with signal filtering) results in an apparent shift of the peak phase modulation back in time, even before stimulus presentation. Second, we present a novel method, the "white-noise paradigm", which can be used to uncover the true latency of phase modulation effects, free of any contamination by the ERP. The impulse response functions of twenty subjects recorded during a first session were used to reconstruct (rather than record) the brain activity to (new) white-noise sequences presented in a second session. The background oscillatory phase around targets embedded within these sequences could then be reliably estimated, without any influence of the target-evoked response. Using these reconstructed signals, we found that the fronto-occipital ~6 Hz oscillatory phase at about 70 ms after target onset is the one that matters for perception. These results confirm the causal influence of phase on perception, at the time the stimulus is actually processed by the brain.
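The core of the white-noise paradigm described above can be sketched in a few lines: convolve a subject's previously recorded impulse response function with a new white-noise stimulus sequence to reconstruct (rather than record) the brain signal, then read out the band-limited oscillatory phase at a chosen lag after target onset. The sketch below is illustrative only; the sampling rate, filter band, IRF shape, and all function names are assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch of the "white-noise paradigm" reconstruction step.
# All names, rates, and filter settings here are illustrative assumptions.
import numpy as np

FS = 160  # assumed sampling rate (Hz)
RNG = np.random.default_rng(0)

def reconstruct_eeg(irf, noise):
    """Linear reconstruction: EEG ~ white-noise sequence convolved with IRF."""
    return np.convolve(noise, irf, mode="full")[: len(noise)]

def analytic_signal(x):
    """FFT-based analytic signal (equivalent to a Hilbert transform)."""
    n = len(x)
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    h[1 : (n + 1) // 2] = 2
    if n % 2 == 0:
        h[n // 2] = 1
    return np.fft.ifft(spec * h)

def phase_at(x, t_idx, f_lo=4.0, f_hi=8.0):
    """Band-limit x around ~6 Hz, then read the instantaneous phase at t_idx."""
    freqs = np.fft.fftfreq(len(x), d=1.0 / FS)
    spec = np.fft.fft(x)
    spec[(np.abs(freqs) < f_lo) | (np.abs(freqs) > f_hi)] = 0
    band = np.real(np.fft.ifft(spec))
    return np.angle(analytic_signal(band))[t_idx]

# Toy example: a damped ~6 Hz impulse response and 2 s of white noise.
t = np.arange(FS) / FS
irf = np.exp(-t * 8) * np.sin(2 * np.pi * 6 * t)
noise = RNG.standard_normal(2 * FS)

eeg = reconstruct_eeg(irf, noise)
target_idx = FS                    # target presented at t = 1 s
lag = int(0.07 * FS)               # ~70 ms after target onset
phi = phase_at(eeg, target_idx + lag)
print(f"reconstructed phase ~70 ms post-target: {phi:+.2f} rad")
```

Because the reconstructed signal is built purely from the pre-measured impulse response and the stimulus sequence, a phase estimate taken after target onset cannot be contaminated by the target-evoked response, which is the point of the paradigm.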
Meeting abstract presented at VSS 2017