Abstract
Perceptual analysis of local visual signals is typically modelled via static circuits resembling cortical neurons. It is now evident that these circuits are impacted by the wider context within which they are embedded, such as the highly structured nature of environmental signals; however, existing measurements are restricted to constrained viewing conditions under which human observers cannot display the full repertoire of natural vision. We overcome this limitation by performing quantitative measurements of visual discrimination in virtual reality. Participants operated within a virtual room containing different boxes at random locations. This environment was designed to present numerous and varied edges, together with a method for quantifying whether each edge is generated by 'image-based' or 'object-based' elements. For example, a high-contrast edge within the wallpaper pattern is image-based but not object-based; conversely, the transition between a box and its background may carry no image-based contrast, but would still mark an object-based edge. We locally perturbed edge regions and required observers to discriminate their orientation ('sensory task'). At the same time, observers engaged in a 'memory task' that probed their spatial representation of the room layout. We manipulated the availability/reliability of shadow information and studied its impact on the cognitive processes supporting the two tasks. Our results demonstrate an interplay between these processes, despite their operating on completely different spatio-temporal scales. When shadow information is unavailable or unreliable, observers shift weight towards image-based cues and away from object-based cues when performing the sensory task. This dynamic re-allocation of cue information happens slowly, on the scale of minutes, and is driven by global environmental changes; however, it percolates down to local processes that analyse visual signals on the sub-second scale, providing a compelling demonstration of the integrated nature of sensory processing during natural behaviour.