Vision Sciences Society Annual Meeting Abstract | September 2016
Neurodynamics of visual and auditory scene size representations
Author Affiliations
  • Santani Teng
    CSAIL, Massachusetts Institute of Technology
  • Radoslaw Cichy
    CSAIL, Massachusetts Institute of Technology
  • Dimitrios Pantazis
    McGovern Institute for Brain Research, Massachusetts Institute of Technology
  • Verena Sommer
    CSAIL, Massachusetts Institute of Technology
  • Aude Oliva
    CSAIL, Massachusetts Institute of Technology
Journal of Vision September 2016, Vol. 16, 571.
      Santani Teng, Radoslaw Cichy, Dimitrios Pantazis, Verena Sommer, Aude Oliva; Neurodynamics of visual and auditory scene size representations. Journal of Vision 2016;16(12):571.


      © ARVO (1962-2015); The Authors (2016-present)


Perceiving the geometry of space is a core ability of most animals, mediating spatial cognition between lower-level perceptual processing and navigation-related processing. Information about spatial layout, i.e., the boundaries and size of an environment, is multisensory: the perceived extent of an indoor volume can come from the visual perception of boundaries or be indexed by auditory reverberations. Although the cortical loci of visual spatial layout perception are well described, the dynamics of human spatial cognition in vision and audition remain elusive, as the neuronal markers indexing spatial processing are unknown. Here, we report the electrophysiological signatures of spatial layout perception in the visual and auditory modalities. We conducted two experiments. First, in the visual domain, we recorded magnetoencephalography (MEG) in 15 healthy participants who viewed 48 images of scenes differing in size and other factors. Second, in the auditory domain, we recorded MEG in 14 participants who heard 9 reverberant sounds differing in the size of the space they indexed and the sound sources they contained. We analyzed both experiments using multivariate pattern classification and representational similarity analysis. For vision, we identified a marker of scene size perception at ~250 ms, independent of low-level image and semantic properties (i.e., luminance, contrast, clutter, semantic category), thus indexing neural representations robust to changes in viewing conditions as encountered in real-world settings. For audition, we identified a signature of scene size with a more extended temporal profile, peaking at ~385 ms, and likewise robust to variation in sound sources. These results constitute the first descriptions of an electrophysiological signal for spatial scene processing in humans in both the visual and auditory domains.
They elucidate the temporal dynamics with which the human brain extracts spatial information from the environment, and open the door to further investigation of the timing of space perception.
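The abstract does not include analysis code, but the time-resolved multivariate pattern classification it describes can be illustrated with a minimal sketch: train and test a classifier independently at each timepoint of sensor-level MEG data, yielding a decoding-accuracy time course whose peaks (e.g., ~250 ms for visual scene size) mark when a condition becomes linearly discriminable. The function name, the nearest-centroid classifier, the data shapes, and the synthetic example below are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np


def timewise_decoding_accuracy(X, y, n_folds=5, seed=None):
    """Cross-validated nearest-centroid decoding at each timepoint.

    X : (n_trials, n_sensors, n_times) array of trial data
    y : (n_trials,) array of 0/1 condition labels
        (e.g., small vs. large scene; labels are a hypothetical example)
    Returns a (n_times,) array of decoding accuracies.
    """
    rng = np.random.default_rng(seed)
    n_trials, _, n_times = X.shape
    folds = np.array_split(rng.permutation(n_trials), n_folds)
    acc = np.zeros(n_times)
    for t in range(n_times):
        Xt = X[:, :, t]  # sensor pattern at this timepoint
        correct = 0
        for test_idx in folds:
            train = np.ones(n_trials, dtype=bool)
            train[test_idx] = False
            # Class centroids from training trials only
            centroids = np.stack(
                [Xt[train & (y == c)].mean(axis=0) for c in (0, 1)]
            )
            # Assign each test trial to the nearest centroid
            d = np.linalg.norm(
                Xt[test_idx, None, :] - centroids[None, :, :], axis=2
            )
            correct += np.sum(d.argmin(axis=1) == y[test_idx])
        acc[t] = correct / n_trials
    return acc


if __name__ == "__main__":
    # Synthetic demo: a condition difference emerges from timepoint 30 on,
    # so accuracy should rise from chance (~0.5) to near 1.0 there.
    rng = np.random.default_rng(0)
    n_trials, n_sensors, n_times = 40, 10, 60
    y = np.array([0, 1] * (n_trials // 2))
    X = rng.standard_normal((n_trials, n_sensors, n_times))
    X[y == 1, :, 30:] += 2.0
    acc = timewise_decoding_accuracy(X, y, seed=0)
    print(acc[:20].mean(), acc[40:].mean())
```

In practice, studies of this kind typically use stronger classifiers (e.g., regularized linear SVMs) and pseudo-trial averaging, but the per-timepoint train/test structure is the same.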

Meeting abstract presented at VSS 2016

