Abstract
People can adapt to a fixed temporal lag between audiovisual events, resulting in changes in perceived simultaneity (Fujisaki et al., 2004, Nat Neurosci). Our aim was to investigate whether one modality recalibrates the other in this adaptation. We used a temporal bisection task in which three consecutive events were presented and participants judged whether the middle event was closer in time to the first or to the last one. Visual events were flashed white discs and auditory events were 2 kHz pure tones. To reduce the saliency of the tones, white auditory noise was presented in the background throughout each trial. Before the bisection task, participants were presented with 100 audiovisual events in which the visual event lagged the auditory event by 100 ms. We contrasted four conditions: two unimodal and two multimodal. The unimodal conditions consisted of three purely visual or three purely auditory events. In the multimodal conditions, the first and third events were either both auditory or both visual, and the middle event was audiovisual with the same lag as in the adaptation phase. Points of indifference (PI, the ratio of the physical duration of the first interval to that of the second at which the two intervals are perceived as equal in duration) were calculated separately for the four conditions. Across participants, PIs in the unimodal conditions were typically less than 1 and differed between vision and audition. We reasoned that if vision recalibrates audition, the PI for the multimodal auditory condition should shift towards the unimodal visual PI, and conversely if audition recalibrates vision. We found shifts in PI for both multimodal conditions relative to the corresponding unimodal conditions, suggesting partial recalibration of both modalities. This result is consistent with our setup, in which the two modalities had comparable sensitivities in the bisection task. In summary, the temporal bisection task appears to be a valuable tool for assessing recalibration to multisensory lags.
Meeting abstract presented at VSS 2016
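
The abstract does not specify how the PIs were extracted from the bisection responses. A minimal sketch of one common approach, assuming a cumulative-Gaussian psychometric function fitted to the proportion of "middle event closer to the last one" responses as a function of the log interval ratio; the function names and example data below are illustrative assumptions, not the authors' analysis:

    # Hypothetical sketch: estimating a point of indifference (PI) from
    # bisection responses via a cumulative-Gaussian psychometric fit.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def psychometric(log_ratio, mu, sigma):
        """P('middle event closer to the last one') as a function of
        log(I1/I2), where I1 and I2 are the first and second intervals."""
        return norm.cdf(log_ratio, loc=mu, scale=sigma)

    def estimate_pi(log_ratios, p_closer_to_last):
        """Fit the psychometric function and return the PI: the ratio
        I1/I2 at which the two intervals are perceived as equal (P = 0.5)."""
        (mu, sigma), _ = curve_fit(psychometric, log_ratios, p_closer_to_last,
                                   p0=[0.0, 0.3])
        return np.exp(mu)

    # Made-up example data: ratios of first over second interval durations
    # and the proportion of "closer to the last" judgments at each ratio.
    ratios = np.array([0.60, 0.75, 0.90, 1.00, 1.15, 1.30])
    p_last = np.array([0.10, 0.25, 0.45, 0.60, 0.80, 0.95])
    pi = estimate_pi(np.log(ratios), p_last)
    print(f"Estimated PI (I1/I2 at perceptual equality): {pi:.2f}")

Under this reading, the PI is the interval ratio at which the fitted function crosses 0.5, and a shift of a multimodal PI towards the corresponding unimodal PI of the other modality would index the direction of recalibration.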