Abstract
Purpose. When an observer moves through the three-dimensional world, a characteristic field of velocity vectors is generated on the retina. Although many theoretical, psychophysical, and physiological studies have demonstrated the use of such optic flowfields for a number of navigational tasks under laboratory conditions, surprisingly little is known about the actual motion signal distribution under natural operating conditions. We therefore set out to study what motion information is available to the visual system in the real world.
Methods. A panoramic imaging device was mounted on a stepping-motor-driven gantry and moved along accurately defined three-dimensional paths in outdoor environments. The captured image sequences were processed by a biologically inspired motion detector network, which allowed us to analyse the distribution of motion signals generated by various types of locomotion.
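The abstract does not specify the detector model used in the network; the following is a minimal sketch assuming a correlation-type (Reichardt) elementary motion detector, a common biologically inspired choice. The function name and the parameters tau and dx are illustrative, not taken from the paper.

```python
import numpy as np

def reichardt_emd(frames, tau=2, dx=1):
    """Correlation-type (Reichardt) elementary motion detector sketch.

    frames : array of shape (T, H, W), grayscale image sequence
    tau    : delay in frames applied to one input arm (assumed value)
    dx     : horizontal separation of the two inputs in pixels (assumed value)

    Returns a (T - tau, H, W - dx) map of signed horizontal motion signals.
    """
    f = frames.astype(float)
    delayed = f[:-tau]           # delayed arm of each correlator
    current = f[tau:]            # undelayed arm
    # two input channels separated by dx pixels
    left_now, right_now = current[:, :, :-dx], current[:, :, dx:]
    left_del, right_del = delayed[:, :, :-dx], delayed[:, :, dx:]
    # opponent subtraction of the two mirror-symmetric half-detectors:
    # positive output signals rightward motion, negative leftward
    return left_del * right_now - right_del * left_now
```

Applied at every image location of a panoramic sequence (and for several detector orientations), such a unit yields motion signal maps of the kind analysed in this study.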
Results. We found that motion signals are sparsely distributed in space and that local directions can be ambiguous and noisy. Spatial pooling or temporal integration can help to retrieve reliable estimates of local motion vectors from such motion signal maps, in which local direction and strength vary considerably. On the other hand, the overall structure of the flowfield, with distinct centres of expansion and contraction, is obvious even in sparse and noisy motion signal maps, and a simple algorithm can be used to retrieve the direction of heading rather accurately, demonstrating the richness of information contained in the panoramic field of view.
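The "simple algorithm" for heading retrieval is not specified in the abstract; the sketch below is one illustrative possibility rather than the authors' method. It scores candidate focus-of-expansion positions by how well the measured (unit-normalised) motion directions agree with a radial expansion template, assuming a flat image plane rather than a panoramic projection. All names and parameters are hypothetical.

```python
import numpy as np

def estimate_heading(points, vectors, candidates):
    """Template-matching sketch for heading retrieval from a sparse flow map.

    points     : (N, 2) image positions at which motion signals are available
    vectors    : (N, 2) measured local motion vectors (possibly noisy)
    candidates : (M, 2) candidate focus-of-expansion (heading) locations

    Returns the candidate whose radial expansion template best matches the
    measured motion directions; signal strength is deliberately ignored.
    """
    v = vectors / (np.linalg.norm(vectors, axis=1, keepdims=True) + 1e-9)
    best_score, best_c = -np.inf, None
    for c in candidates:
        radial = points - c                              # expansion directions from candidate
        r = radial / (np.linalg.norm(radial, axis=1, keepdims=True) + 1e-9)
        score = np.sum(r * v)                            # summed cosine agreement with template
        if score > best_score:
            best_score, best_c = score, c
    return best_c
```

Because the score pools direction agreement over all available signals, such a scheme can tolerate sparse and locally noisy motion signal maps, which is the property the results emphasise.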
Conclusions. Our approach is a first step towards assessing the role of behavioural, environmental, and computational constraints in natural optic flow processing. Although the local motion information in natural flowfields tends to be sparse and noisy, the extended pattern of motion signals is a rich source of information about observer locomotion.