Abstract
Psychophysical studies have demonstrated reduced discrimination ability for obliquely oriented patterns compared to horizontal and vertical ones in the central area of the visual field. This so-called oblique effect is reversed at larger eccentricities, where oblique patterns are discriminated and detected better. What is the origin of this space-variant oblique effect? It has been proposed that it can be understood through the second-order statistics of natural images, as quantified by the power spectrum.
To investigate the role of higher-order statistics and task behavior, we obtain image sequences by simulating an agent navigating through simulated wooded environments. Unsupervised learning algorithms are used to learn a sparse code of the images, but in contrast to previous approaches, the learning is carried out separately for different regions of the visual field and for different gaze allocations during walking. The learned receptive fields show a preference for horizontal and vertical orientations at the center and oblique orientations in the periphery, larger areas with increasing eccentricity, and steeper radial tuning curves. Furthermore, it is shown that the distributions depend significantly on where gaze is directed within the visual field during walking. Finally, by simulating walking through environments with severely reduced higher-order statistics, it is shown that second-order dependencies are not sufficient to produce the receptive field anisotropies. The analysis was repeated on a dataset obtained from human subjects walking through a wooded environment and yielded comparable results.
We conclude that the properties of model receptive fields depend not only on the statistics of visual scenes in the environment, but also on the statistics imposed on the stimulus by the imaging geometry and on the statistics of the interaction with the environment during natural task execution. Taking all these determinants together, receptive fields can be learned that explain the oblique effect.