Abstract
To construct percepts reliably from visual signals, the visual system must exploit the statistical relationships between those signals and properties of the world. We propose that the visual system actively monitors and refines the process by which signals are converted into percepts. In particular, the visual system should begin to use arbitrary signals as perceptual “cues” when it is exposed to novel correlations between those signals and the world. We tested this proposition using the perceived rotation direction of a rotating Necker cube covered with dots. For this stimulus, perceived direction based on structure-from-motion alone is bistable and therefore might easily be biased by newly recruited cues. On “training” trials, we added stereo and occlusion cues to disambiguate the rotation direction. Critically, the rotation of the cube was also artificially correlated with one of three novel training signals: position in the visual field, translational movement of the cube, or the frequency of an auditory tone. On “probe” trials, the stereo and occlusion cues were eliminated, so the direction of rotation was ambiguous unless subjects made use of the newly trained cue. We found that position and translation became effective at biasing perceived rotation on probe trials, but the auditory cue did not. The effects persisted into the next day. An additional training session with the opposite correlation reduced or reversed the previously learned bias, although the reversed bias was weaker than the original. These results suggest that there are processes that actively seek to learn the statistical contingencies of the world.