Abstract
Expectations broadly influence our experience of the world. However, the process by which they are acquired and then shape our sensory experiences is not well understood. Here, we set out to understand whether expectations of simple stimulus features can be developed implicitly through fast statistical learning, and if so, how they are combined with visual signals to modulate perception. On each trial, human participants were presented with either a low contrast random dot kinematogram moving coherently in a single direction, or a blank screen. They were tested on their ability to report the direction (estimation) and the presence (detection) of the motion stimuli. Participants were exposed to a bimodal distribution of motion directions in which two directions, 64° apart, were presented on more trials than other directions. After a few minutes of task performance, participants perceived stimuli to be moving in directions that were more similar to the most frequently presented directions than they actually were. Further, on trials where no stimulus was presented, but where participants reported seeing a stimulus, they were strongly biased to report motion in these two directions. No such effect was observed on trials where they did not report seeing a stimulus. Modelling of participants' behaviour showed that their estimation biases could not be well explained by a simple response bias or by more complex response strategies. On the other hand, the results were well accounted for by a model that assumed that participants solved the task using a Bayesian strategy, combining a learned prior of the stimulus statistics (the expectation) with their sensory evidence (the actual stimulus) in a probabilistically optimal way. Our results demonstrate that stimulus expectations are rapidly learned and can powerfully influence perception of simple visual features, both in the form of perceptual biases and hallucinations.
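The attraction toward the two frequent directions is what a Bayesian observer predicts: multiplying a bimodal prior with a Gaussian likelihood pulls the posterior estimate toward the nearer prior mode. The sketch below illustrates this qualitatively; the mode locations (148° and 212°, i.e. 64° apart, as in the task), prior widths, and sensory noise level are illustrative assumptions, not the paper's fitted parameters, and a small-angle (non-circular) approximation is used for simplicity.

```python
import numpy as np

def posterior_estimate(measured_dir, prior_modes=(148.0, 212.0),
                       prior_sd=10.0, sensory_sd=20.0):
    """Toy Bayesian observer: combine a bimodal prior over motion
    direction with a Gaussian likelihood around a noisy measurement.
    Directions in degrees; parameters are illustrative assumptions."""
    dirs = np.linspace(0.0, 360.0, 3601)
    # Bimodal prior: two Gaussian bumps 64 deg apart (as in the task).
    prior = sum(np.exp(-0.5 * ((dirs - m) / prior_sd) ** 2)
                for m in prior_modes)
    # Likelihood of the internal measurement under each true direction.
    likelihood = np.exp(-0.5 * ((dirs - measured_dir) / sensory_sd) ** 2)
    # Posterior ~ prior x likelihood (normalised); report its mean.
    post = prior * likelihood
    post /= post.sum()
    return float(np.sum(dirs * post))

# A stimulus measured between the two modes is pulled toward the
# nearer one: the estimate for 170 deg falls below 170, toward 148.
print(posterior_estimate(170.0))
```

With a sufficiently peaked prior, the same machinery also produces stimulus-like reports from noise alone, consistent with the hallucination-like detections described above.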