Abstract
In our everyday experience, the sizes and weights of objects we encounter are strongly correlated. When objects are lifted, visual information about size can be combined with haptic feedback about weight, and a naive application of Bayes' rule predicts that the perceived weight of larger objects should be exaggerated and that of smaller objects underestimated. Instead, it is the smaller of two objects of equal weight that is perceived as heavier, a phenomenon termed the Size-Weight Illusion (SWI). Here we provide a normative explanation of the SWI based on principles of efficient coding, which dictate that stimulus properties should be encoded with a fidelity that depends on how frequently those properties are encountered in the natural environment. We show that the precision with which human observers estimate object weight varies as a function of both mass and volume, in a manner consistent with the estimated joint distribution of those properties among everyday objects. We further show that participants' seemingly "anti-Bayesian" biases (the SWI) are predicted by Bayesian estimation once the gradient of discriminability induced by efficient encoding is taken into account. The related Material-Weight Illusion (MWI) can be accounted for by the same principles, with surface material providing a visual cue to object density. The model framework can be applied to predict perceptual bias and variability for any sensory properties that are correlated in the environment.
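To make the abstract's central mechanism concrete, the following is a minimal, hypothetical Python sketch of an efficient-coding Bayesian observer in the spirit of Wei & Stocker (2015), the general framework the abstract invokes. It is not the paper's fitted model of mass and volume: the one-dimensional stimulus axis, the lognormal prior, the noise level, and all numerical values are illustrative assumptions. The sketch shows how encoding a stimulus through the cumulative of its environmental prior (so discriminability tracks stimulus frequency) makes a fully Bayesian decoder produce estimates that are repelled away from the prior peak, i.e. a repulsive, seemingly "anti-Bayesian" bias.

```python
import numpy as np

# Illustrative sketch (not the paper's fitted model): an efficient-coding
# Bayesian observer in the style of Wei & Stocker (2015). The stimulus
# variable, prior shape, and noise level are all assumed for this demo.

grid = np.linspace(0.1, 10.0, 2000)   # hypothetical stimulus axis (a.u.)
dx = grid[1] - grid[0]

# Assumed environmental prior: lognormal, median at 3 (arbitrary choice).
prior = np.exp(-(np.log(grid) - np.log(3.0)) ** 2 / (2 * 0.4 ** 2)) / grid
prior /= prior.sum() * dx

# Efficient coding: encode the stimulus via the prior CDF, so that coding
# precision (discriminability) is highest where stimuli are most frequent.
cdf = np.cumsum(prior) * dx

def mean_percept(stimulus, noise=0.05, n_trials=2000, seed=0):
    """Average posterior-mean estimate, given constant Gaussian noise in
    the efficiently encoded (prior-CDF) space."""
    rng = np.random.default_rng(seed)
    m = np.interp(stimulus, grid, cdf) + noise * rng.standard_normal(n_trials)
    # Likelihood over the stimulus axis: Gaussian in the encoded space,
    # hence asymmetric in stimulus space wherever the prior is non-uniform,
    # with a long tail pointing away from the prior peak.
    like = np.exp(-(cdf[None, :] - m[:, None]) ** 2 / (2 * noise ** 2))
    post = like * prior[None, :]
    post /= post.sum(axis=1, keepdims=True) * dx
    return float((post * grid[None, :]).sum(axis=1).mean() * dx)

for s in (1.5, 3.0, 6.0):
    print(f"true = {s:3.1f}   mean percept = {mean_percept(s):4.2f}")
# Estimates on either side of the prior peak are repelled away from it:
# a repulsive, "anti-Bayesian" bias produced by fully Bayesian decoding.
```

Under these assumptions, stimuli below the prior peak are underestimated and stimuli above it are overestimated, because the efficient code stretches the likelihood toward rarely encountered stimulus values. The paper applies this logic in two dimensions, to the joint distribution of mass and volume, to account for the SWI.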
Publisher
Cold Spring Harbor Laboratory