Abstract
We have recently suggested that the optic flow arising from observer movement is parsed from the retinal image (e.g. Rushton & Warren, 2005, Current Biology, 15, R542–R543). We propose that this flow-parsing process is involved in segmenting the visual scene into moving and scene-stationary objects during observer movement. Here, we present a strong test of this hypothesis and demonstrate that flow parsing depends on global optic flow processing.
Observers monocularly fixated the centre of limited-lifetime radial flow displays consistent with forward movement through a cloud of dots. Dots were presented in the upper hemifield, the lower hemifield, or both. Simultaneously, a horizontally moving probe was presented at one of two distances (2 or 4 cm) above or below fixation. Observers were then asked to indicate the perceived probe trajectory using an adjustable on-screen paddle. Flow parsing predicts that the radial flow will be parsed from the retinal image and that this process will affect the perceived probe trajectory. Specifically, parsing should add a perceived downward component when the probe is above fixation (and an upward component when it is below), and the magnitude of this component should increase with distance from fixation. Furthermore, if flow parsing relies upon global visual processing, then the perceived trajectory should be consistent with these predictions regardless of whether flow is present in the same hemifield as the probe. The results confirm these predictions. By analogy with previous findings on 'phantom motion after-effects' (Snowden & Milne, 1997, Current Biology, 7, 717–722), in which a motion after-effect was seen to transfer between adapted and non-adapted regions of the visual field, we term the latter result phantom flow parsing.
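One way to express the flow-parsing prediction is as a simple vector subtraction; the notation below is illustrative rather than the authors' own formalism, with $\mathbf{v}_{\text{perceived}}$ the perceived probe velocity, $\mathbf{v}_{\text{retinal}}$ the retinal probe velocity, and $\hat{\mathbf{f}}(\mathbf{x})$ the component of flow at probe location $\mathbf{x}$ attributed to self-movement:

$$\mathbf{v}_{\text{perceived}} \approx \mathbf{v}_{\text{retinal}} - \hat{\mathbf{f}}(\mathbf{x})$$

For forward movement the radial flow at a point above fixation is directed upward and outward, so subtracting it adds a downward component to the perceived probe trajectory (and an upward component below fixation); and because radial flow speed increases with eccentricity, the added component should grow with the probe's distance from fixation, as predicted above.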
These results contribute to a growing body of evidence for a purely visual mechanism, reliant upon optic flow processing, that is involved in the perception of scene-relative object movement during self-movement.