Abstract
Perceptual decisions can be regarded as the result of a Bayesian inference process that combines prior beliefs with sensory observations to form posterior beliefs about hidden states of the world. Depending on whether sensory observations are passively sampled or actively generated, the prior beliefs guiding this inference can be informed by probabilistic sensory cues or by knowledge about action-outcome relationships. However, it remains unclear exactly how goal-directed actions affect belief updating and subsequent perceptual decision-making. Here, we compared the outcomes of inference for perceptual judgments with those for goal-directed actions during a probabilistic reversal learning task in which we manipulated either the uncertainty of sensory observations (Experiment 1) or the volatility of the environment (Experiment 2). Participants were asked either to infer a hidden state from computer-sampled observations or to generate specific observations determined by a hidden state, while keeping track of sudden reversals in that state. Critically, participants received the same amount of evidence for the current hidden state under each instruction. Results indicate that active inference may slow reversal learning by reducing responsiveness to conflicting evidence. In addition, using Bayesian computational modelling, we investigated trial-by-trial belief trajectories and response models, aiming to disentangle perception and learning from decision noise and response bias, and to study how goal-directed actions may shape the way we perceive and form beliefs about the world in noisy and volatile environments.
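A minimal sketch of the trial-by-trial belief update implied by this framing, with illustrative notation not taken from the study: assuming a binary hidden state $s_t$, observations $o_{1:t}$, and a reversal probability $v$ between trials, the posterior belief on each trial can be written as
\[
p(s_t \mid o_{1:t}) \;\propto\; p(o_t \mid s_t) \sum_{s_{t-1}} p(s_t \mid s_{t-1})\, p(s_{t-1} \mid o_{1:t-1}),
\qquad
p(s_t \mid s_{t-1}) =
\begin{cases}
1 - v & \text{if } s_t = s_{t-1},\\
v & \text{if } s_t \neq s_{t-1}.
\end{cases}
\]
Under this sketch, increasing the uncertainty of sensory observations (Experiment 1) corresponds to a flatter likelihood $p(o_t \mid s_t)$, whereas increasing the volatility of the environment (Experiment 2) corresponds to a larger reversal probability $v$.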