Abstract
We perceive a coherent visual world, despite the fact that visual processing is largely parallel and modular. There are, however, cases in which this binding breaks down, causing asynchronous perception of an object's features (e.g., color and motion) or conflicting judgments of its properties (e.g., motion and position). Despite these examples of misbinding, it remains unclear whether different properties of an object are coded independently. Here we used contingent motion aftereffects (MAEs) (Arnold et al., Current Biology, 2001) to psychophysically measure processing differences both between and within features of the same object. Subjects adapted to a rotating grating whose direction of rotation (alternating clockwise and counterclockwise) was paired synchronously or asynchronously with changes in spatial frequency (alternating between low and high). Following adaptation, subjects judged either the perceived MAE or the perceived position shift of a test grating. If the attributes of an object are processed with the same time course, then direction reversals paired synchronously with the spatial frequency changes should produce the strongest spatial-frequency-contingent aftereffect. For judgments of the MAE, however, we found that a physical asynchrony had to be introduced between the attributes to maximize the contingent MAE: the motion reversals had to lag the spatial frequency changes by approximately 90 ms. In contrast, for judgments of the test grating's position (i.e., its phase), no such asynchrony was observed. The distinct time courses for judgments of motion and position, following precisely the same physical adaptation, suggest that they are coded by separate neural populations.