Abstract
When performing a purely visual task such as contrast detection, any strategy based on nonvisual factors leads to suboptimal performance. We recently found such suboptimal behavior while probing the mouse's visual capabilities in a two-alternative forced-choice experiment (Busse et al., Journal of Neuroscience, 2011). Daily sessions of hundreds of trials yielded high-quality psychometric curves for contrast detection in mice. Although the stimulus could appear with equal probability on the left or right of the display, mice rarely acted as if the stimulus location were random: their choices depended on estimates of reward value and on "superstition" factors such as recent failures and rewards. This behavior was captured by a simple generalized linear model with superstition terms that depend on the outcome of the previous trial. These terms often carried more weight in the subsequent decision than a 20% contrast grating. Is this superstitious behavior limited to mice? We performed similar experiments in humans (rewarding them with a sound rather than a drop of water) and found, to our surprise, that very similar superstitious effects shape human psychometric curves. Human observers give particularly strong negative weights to past failures, and they often extend their superstitious weighting of nonvisual factors to events two or three trials in the past. Once again, a generalized linear model was very effective in predicting all these effects. This superstitious behavior likely affects many published psychometric curves: it masquerades as noise that limits sensitivity. Knowing about it and accounting for it with a simple model is useful for several reasons: (1) it allows much cleaner estimates of psychophysical sensitivity; (2) it can be used to defeat superstition by devising appropriate stimulus presentation schedules; (3) it enables the study of the superstition influences themselves, both in psychophysics and in neural recordings.
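The abstract does not spell out the model's exact form, but a generalized linear model with previous-trial outcome terms can be sketched as follows. This is an illustrative simulation only: the contrast levels, the "true" weights, and the win/fail coding of trial history are all hypothetical, chosen simply so that a past failure outweighs a low-contrast stimulus, as described above.

```python
import numpy as np

# Illustrative sketch: all weights, contrast levels, and the win/fail
# coding below are hypothetical, not taken from the published study.
rng = np.random.default_rng(0)
n_trials = 20000

# Signed contrast: negative = stimulus on the left, positive = right.
contrast = rng.choice([-0.5, -0.2, -0.05, 0.05, 0.2, 0.5], size=n_trials)

# "True" weights used to simulate an observer: visual evidence plus
# superstition terms tied to the previous trial's outcome.
bias, w_contrast, w_win, w_fail = 0.1, 8.0, 0.6, -1.0

choices = np.zeros(n_trials)    # -1 = chose left, +1 = chose right
prev_win = np.zeros(n_trials)   # previous choice if rewarded, else 0
prev_fail = np.zeros(n_trials)  # previous choice if it failed, else 0

for t in range(n_trials):
    z = (bias + w_contrast * contrast[t]
         + w_win * prev_win[t] + w_fail * prev_fail[t])
    p_right = 1.0 / (1.0 + np.exp(-z))
    choices[t] = 1.0 if rng.random() < p_right else -1.0
    rewarded = choices[t] == np.sign(contrast[t])
    if t + 1 < n_trials:
        prev_win[t + 1] = choices[t] if rewarded else 0.0
        prev_fail[t + 1] = 0.0 if rewarded else choices[t]

# Fit the GLM by logistic regression (Newton/IRLS iterations).
X = np.column_stack([np.ones(n_trials), contrast, prev_win, prev_fail])
y = (choices + 1) / 2           # 1 if the observer chose right
w = np.zeros(4)
for _ in range(10):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    H = X.T @ (X * (p * (1 - p))[:, None])
    w += np.linalg.solve(H, X.T @ (y - p))

print("recovered [bias, contrast, win, fail] weights:", np.round(w, 2))
```

With enough trials the fitted weights recover the simulated ones, and in this toy observer the failure term has the larger magnitude of the two history terms, echoing the strong negative weighting of past failures reported above. Extending the design matrix with columns for trials t-2 and t-3 would capture the longer-lasting history effects seen in humans.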
Meeting abstract presented at VSS 2012