Abstract
The perception of duration can be biased by the physical properties of a sensory stimulus. For example, visual stimuli with higher temporal frequency are perceived as longer (Kanai et al., 2006). Objects of different temporal frequencies often appear simultaneously in the environment, providing conflicting information about duration. Does the brain keep separate duration representations for each object, or does it form a single representation? If a single representation is formed, how? One possibility is Bayesian cue integration (Ahrens & Sahani, 2011); another is reading out the total neural energy required to encode all the stimuli (Eagleman & Pariyadath, 2009, 2012). Human participants estimated the duration of Gabor patterns drifting at 1 Hz and 6 Hz (denoted L for low frequency, H for high frequency, and LH when the two were presented simultaneously). In Experiment 1, participants compared the duration of LH against H. Psychometric functions revealed no bias between them. This suggests that observers might have overweighted the dominant frequency channel (every stimulus included an H), or might have kept separate duration representations for each frequency channel and used only the H channel for judgments. In Experiment 2, LH was always presented first, followed by LH, H, or L. The duration of H was perceived as longer than that of LH, consistent with a Bayesian cue integration model. Relative to LH, judgments of H and L differed significantly, ruling out the model of separate duration representations. For the majority of participants, the precision of judging LH was better than that of judging H or L. Experiment 3 used a static Gabor pattern (S) as the standard stimulus and yielded a compatible result. These data suggest that observers weight duration information from multiple stimuli to form a single estimate; however, the distribution of stimuli in the experimental context can influence the weights.
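The Bayesian cue integration account referenced above can be sketched as inverse-variance weighting of the two frequency channels: each channel contributes a duration estimate weighted by its reliability, and the combined estimate is predicted to be more precise than either cue alone (matching the finding that LH judgments were more precise than H or L for most participants). The numbers below are purely illustrative, not data from the study.

```python
# Minimal sketch of reliability-weighted (Bayesian) cue integration,
# in the spirit of Ahrens & Sahani (2011). All values are hypothetical.

def integrate_cues(mu_l, sigma_l, mu_h, sigma_h):
    """Combine two Gaussian duration cues by inverse-variance weighting."""
    w_l = 1.0 / sigma_l**2          # reliability of the low-frequency cue
    w_h = 1.0 / sigma_h**2          # reliability of the high-frequency cue
    mu = (w_l * mu_l + w_h * mu_h) / (w_l + w_h)   # weighted mean estimate
    sigma = (1.0 / (w_l + w_h)) ** 0.5             # combined standard deviation
    return mu, sigma

# Hypothetical subjective durations (in seconds) for the 1 Hz (L)
# and 6 Hz (H) gratings, with the H channel assumed more reliable.
mu, sigma = integrate_cues(mu_l=0.9, sigma_l=0.15, mu_h=1.1, sigma_h=0.10)
```

The combined estimate falls between the two single-cue estimates (closer to the more reliable one), and its standard deviation is smaller than that of either cue, which is the signature precision benefit the model predicts for the LH stimulus.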
Meeting abstract presented at VSS 2013