Abstract
Bayesian accounts of perception hold that observers should take prior experience into account when making sensory judgements. In everyday experience, the sizes and weights of objects are strongly positively correlated. When objects are lifted, visual information about size can be combined with haptic feedback about weight, and a naive application of Bayes' rule predicts that the perceived weight of larger objects should be exaggerated and that of smaller objects underestimated. Instead, it is the smaller of two objects of equal weight that is perceived as heavier, a striking and counterintuitive phenomenon termed the Size-Weight Illusion (SWI). Here we provide a new normative explanation of the SWI based on principles of efficient coding, which dictate that stimulus properties should be encoded with a fidelity that depends on how frequently those properties are encountered in the natural environment. Recent work has shown how efficient coding induces seemingly "anti-Bayesian" biases in individual visual features, including planar orientation, but the theory has not previously been applied to multidimensional or cross-modal stimuli. We show that the precision with which human observers estimate object weight varies as a function of both mass and volume, in a manner consistent with the joint distribution of those properties observed among everyday objects. We further show that participants' biases in weight perception (the SWI) are quantitatively predicted by Bayesian estimation when the gradient of discriminability induced by efficient encoding is taken into account. Results from the related Material-Weight Illusion (MWI), in which objects apparently made of less dense material are judged as heavier, can also be explained by the same principles, with surface material mediating the expected relationship between volume and mass.
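As context for the efficient coding argument, the following is a minimal sketch of a common single-dimension formalization from the prior literature; the notation here is introduced for illustration and is not taken from this abstract. Writing $\theta$ for a stimulus value, $p(\theta)$ for its frequency of occurrence in the environment, $J(\theta)$ for the Fisher information of its neural encoding, and $\delta(\theta)$ for the resulting discrimination threshold, limited coding resources are allocated so that

$$\sqrt{J(\theta)} \;\propto\; p(\theta), \qquad \delta(\theta) \;\propto\; \frac{1}{\sqrt{J(\theta)}} \;\propto\; \frac{1}{p(\theta)}.$$

Under this scheme, frequently encountered stimulus values are encoded with high fidelity and discriminated finely, while rare values are encoded coarsely; the present work extends this logic from a single feature to the joint distribution of object mass and volume.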