Abstract
Perceiving objects moving toward us is a vital survival skill. Surprisingly, when judging 3D motion, humans report that an object on a collision course with the head will miss them (Harris & Dean, 2003, JEP:HPP, 29, 869–881). Here we propose that this bias is a consequence of the brain combining non-redundant information about lateral motion (Vx) and motion in depth (Vz) in a manner that reflects differential sensitivity to the information. We show that measured biases in 3D motion perception are accounted for by a model that incorporates estimates of observers' higher sensitivity to lateral motion than to motion in depth. First, we estimate relative sensitivity to the component motions by recording thresholds for detecting an increment in displacement when movement is lateral or in depth. Then we show that this model provides a good account of observers' behavior by recording their estimates of 3D motion trajectories for a range of trajectory angles to the left and right of the head. Finally, we show that observers' bias is decreased when external noise reduces sensitivity to lateral motion, as predicted by the model. These results provide novel evidence that, when combining orthogonal sources of information about the environment, the brain cannot help but take into account the reliability with which each source is encoded, even at the cost of perceptual bias.
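The proposed mechanism can be illustrated with a minimal sketch. This is an assumption-laden toy model, not the authors' fitted model: it simply scales each velocity component by a relative sensitivity before computing the trajectory angle, so that higher sensitivity to lateral motion (here `sens_x > sens_z`, values chosen arbitrarily) inflates the perceived angle and a near-collision trajectory is judged to pass wide of the head.

```python
import math

def perceived_angle_deg(vx, vz, sens_x=1.0, sens_z=0.5):
    """Toy model: perceived 3D trajectory angle (degrees from straight ahead).

    vx: lateral velocity component; vz: component in depth (toward the head).
    sens_x, sens_z: hypothetical relative sensitivities. Setting sens_x > sens_z
    encodes the higher sensitivity to lateral motion described in the abstract.
    """
    return math.degrees(math.atan2(sens_x * vx, sens_z * vz))

# A trajectory aimed nearly at the head (small lateral component):
true_angle = math.degrees(math.atan2(0.1, 1.0))      # ~5.7 deg
biased_angle = perceived_angle_deg(0.1, 1.0)          # larger than true_angle

# The model's prediction for the noise experiment: degrading lateral
# sensitivity (lower sens_x) shrinks the bias.
noisy_angle = perceived_angle_deg(0.1, 1.0, sens_x=0.6)

print(true_angle, biased_angle, noisy_angle)
```

In this sketch the overestimated angle corresponds to perceiving the object as on a "miss" trajectory, and reducing `sens_x` moves the estimate back toward the true angle, mirroring the reported effect of external noise.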
Max Planck Society. BBSRC UK