We considered that these perceptual errors might arise as a natural consequence of estimating motion direction given sensory noise and the geometry of 3D viewing. We characterized the retinal motion signals produced by objects moving along arbitrary trajectories through three dimensions and developed a Bayesian model of perceptual inference. The model predicted a number of known errors, including a lateral bias in the perception of motion trajectories and a dependency of this bias on stimulus contrast and viewing distance. It also predicted previously unknown errors, including a dependency of perceptual bias on eccentricity and a surprising tendency to misreport approaching motion as receding, and vice versa.
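The logic behind these predictions can be illustrated with a toy linear-Gaussian version of such an observer model. The sketch below is a minimal illustration, not the paper's actual implementation: the function names (retinal_velocity_matrix, posterior_mean_velocity) and the parameter values (interocular distance, noise and prior widths, object speed) are illustrative assumptions. For an object at a given position, the two eyes' retinal angular velocities are linear in the object's world velocity (vx, vz), so combining noisy retinal measurements with a zero-mean "slow motion" prior gives a closed-form posterior mean.

```python
import numpy as np

def retinal_velocity_matrix(x, z, ipd=0.064):
    """Linear map A taking world velocity (vx, vz), in m/s, to the
    angular velocities (rad/s) of the object's image in the two eyes,
    for an object at lateral position x and distance z (meters).
    Each row follows from differentiating the azimuth atan((x - e)/z)
    of an eye located at lateral position e."""
    rows = []
    for eye_x in (-ipd / 2.0, ipd / 2.0):
        dx = x - eye_x
        denom = dx ** 2 + z ** 2
        rows.append([z / denom, -dx / denom])
    return np.array(rows)

def posterior_mean_velocity(A, m, sigma_n, sigma_p):
    """Posterior mean of (vx, vz) given retinal measurements m, under
    Gaussian retinal noise (std sigma_n, rad/s) and a zero-mean
    Gaussian slow-motion prior (std sigma_p, m/s). The model is
    linear-Gaussian, so the estimate has a closed form."""
    precision = A.T @ A / sigma_n ** 2 + np.eye(2) / sigma_p ** 2
    return np.linalg.solve(precision, A.T @ m / sigma_n ** 2)

# Object on the midline, 1 m away, moving 45 deg between lateral (vx)
# and depth (vz) motion. Feeding in the noise-free retinal signal
# yields the observer's average (expected) estimate.
x, z = 0.0, 1.0
v_true = np.array([0.1, 0.1])          # (vx, vz) in m/s
A = retinal_velocity_matrix(x, z)
m_expected = A @ v_true                # noise-free retinal signal

for sigma_n in (0.001, 0.01):          # low noise ~ high contrast, and vice versa
    v_hat = posterior_mean_velocity(A, m_expected, sigma_n, sigma_p=0.5)
    true_deg = np.degrees(np.arctan2(v_true[1], v_true[0]))
    est_deg = np.degrees(np.arctan2(v_hat[1], v_hat[0]))
    print(f"sigma_n={sigma_n}: true trajectory {true_deg:.1f} deg, "
          f"estimated {est_deg:.1f} deg")
```

In this toy version, the depth component vz moves the two retinal images in opposite directions by only about ipd/(2z^2) per unit speed, far weaker than the lateral signal (about 1/z), so the slow-motion prior shrinks the depth estimate more than the lateral one and the estimated trajectory rotates toward lateral. The shrinkage, and hence the bias, grows as retinal noise increases (lower contrast) or as z increases (greater viewing distance); eccentricity enters through x ≠ 0. Plausibly, the same shrinkage would also explain the approaching/receding confusions: when the expected depth estimate is small, trial-to-trial noise can flip its sign.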
We then used standard 3D displays as well as a virtual reality (VR) headset to test these predictions in naturalistic settings, and established that people make the predicted errors.
In sum, we developed a quantitative model of 3D motion perception and provided a parsimonious account of a range of systematic perceptual errors in naturalistic environments.