Vision Sciences Society Annual Meeting Abstract  |   June 2006
Ultrarapid extraction of configural information from biologically salient visual stimuli: Magnetoencephalographic evidence
Author Affiliations
  • Hanneke K. Meeren
    Cognitive and Affective Neuroscience Laboratory, Tilburg University, Tilburg, The Netherlands, and F.C. Donders Centre for Cognitive Neuroimaging, Nijmegen, The Netherlands
  • Nouchine Hadjikhani
    MGH/MIT/HMS Athinoula A. Martinos Center for Biomedical Imaging, Charlestown, MA, USA
  • Seppo P. Ahlfors
    MGH/MIT/HMS Athinoula A. Martinos Center for Biomedical Imaging, Charlestown, MA, USA
  • Matti S. Hämäläinen
    MGH/MIT/HMS Athinoula A. Martinos Center for Biomedical Imaging, Charlestown, MA, USA
  • Beatrice de Gelder
    Cognitive and Affective Neuroscience Laboratory, Tilburg University, Tilburg, The Netherlands, and F.C. Donders Centre for Cognitive Neuroimaging, Nijmegen, The Netherlands, and MGH/MIT/HMS Athinoula A. Martinos Center for Biomedical Imaging, Charlestown, MA, USA
Journal of Vision June 2006, Vol. 6, 430. doi:10.1167/6.6.430
Abstract

We used magnetoencephalography (MEG) to investigate early visual processing of biologically meaningful stimuli. Healthy subjects made an orientation judgment while viewing nine stimulus conditions: photographs of faces, bodies, and houses presented in either their canonical (upright) or inverted (upside-down) orientation, or as Fourier phase-randomized scrambled versions that controlled for low-level visual attributes. MEG data were acquired with a 306-channel Neuromag VectorView system and co-registered with high-resolution T1-weighted structural MR images. The neuronal generators underlying the evoked responses were estimated with an anatomically constrained, noise-normalized minimum norm estimate. Time courses of the source estimates were extracted from several anatomically defined regions of interest (ROIs), among them the calcarine sulcus, the site of primary visual cortex (V1). The estimated source currents in V1 showed a prominent early component, starting around 50–60 ms and peaking around 95 ms. This component was attenuated for meaningful stimuli compared to scrambled images. In addition, there was an early “inversion effect” for biologically salient stimuli (faces and bodies), with larger responses to inverted than to upright stimuli, whereas no such effect was found for houses. In particular, responses to upright faces were strongly suppressed and began to diverge significantly from responses to inverted faces as early as 63 ms after stimulus onset. These findings show that the extraction of configural information from biological stimuli starts at very early stages of the visual processing stream, before information enters the object recognition system of the ventral visual pathway.
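For reference, the noise-normalized minimum norm estimate mentioned above has the following standard form. This is a sketch of the general method, not the authors' exact implementation; the symbols (lead field A, source covariance R, noise covariance C, regularization parameter λ) are our notation, not taken from the abstract.

```latex
% Distributed minimum norm estimate (MNE): source currents j(t) are recovered
% from the measured MEG signals b(t) via the linear inverse operator W, with
% the source space anatomically constrained to the cortical surface.
\[
\hat{\mathbf{j}}(t) = \mathbf{W}\,\mathbf{b}(t),
\qquad
\mathbf{W} = \mathbf{R}\mathbf{A}^{\top}
\left(\mathbf{A}\mathbf{R}\mathbf{A}^{\top} + \lambda^{2}\mathbf{C}\right)^{-1}
\]
% Noise normalization (dSPM-style): each source estimate is divided by its
% predicted noise standard deviation, yielding a statistical activity map.
\[
z_{i}(t) = \frac{\hat{j}_{i}(t)}{\sqrt{\left[\mathbf{W}\mathbf{C}\mathbf{W}^{\top}\right]_{ii}}}
\]
```

The normalization step is what makes estimates comparable across cortical locations with different sensitivities, which matters when contrasting ROI time courses such as the V1 responses reported here.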

Meeren, H. K., Hadjikhani, N., Ahlfors, S. P., Hämäläinen, M. S., & de Gelder, B. (2006). Ultrarapid extraction of configural information from biologically salient visual stimuli: Magnetoencephalographic evidence [Abstract]. Journal of Vision, 6(6):430, 430a, http://journalofvision.org/6/6/430/, doi:10.1167/6.6.430.
Footnotes
 This research was financially supported by the Netherlands Organization for Scientific Research.