Vision Sciences Society Annual Meeting Abstract  |   September 2005
Rendering visual representations from oscillatory brain activity
Author Affiliations
  • Marie L. Smith
    University of Glasgow, Scotland
  • Frederic Gosselin
    University of Montreal, Canada
  • Philippe G. Schyns
    University of Glasgow, Scotland
Journal of Vision September 2005, Vol.5, 903. doi:https://doi.org/10.1167/5.8.903
Abstract

The subjectively seamless nature of visual experience would intuitively suggest that the underlying representations of the visual world evolve continuously. There is, however, a controversial alternative: that these visual representations are in fact discrete, built up in the brain over a number of discrete processing epochs. To investigate this assertion, we extended a new method, based on Bubbles (Gosselin & Schyns, 2001; Smith, Gosselin & Schyns, 2004), to relate EEG oscillatory activity (low-frequency theta band, 4–8 Hz) to the time course of visual stimulus information processing. In a first experiment, naïve observers categorized sparsely sampled pictures of faces, by gender in one session and as expressive or not in a second. Using estimates of the information driving behavioral responses (accuracy, reaction times), we derived the sensitivity of low-frequency EEG oscillations to facial features as observers resolved each task. We show that theta (4–8 Hz) oscillations support discrete information-processing epochs, corresponding to a modulated sensitivity of the brain to specific facial features. We reveal the integration of these features over several epochs to forge specific visual representations for different face categorizations. Later epochs not only represent more facial features, they also integrate information across hemifields (i.e., bilaterally rather than contralaterally). In a second experiment, we instructed naïve observers to categorize, by expression (fear, disgust, anger, or surprise), sparsely presented images of expressive faces sampled over a range of spatial frequency bands. Applying this methodology, we again found evidence of discrete processing epochs. The technique also enables tracking, over time, of the brain's sensitivity to specific facial features, providing more direct evidence of “information picking” strategies.
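The abstract builds on the Bubbles technique (Gosselin & Schyns, 2001), in which observers view faces through randomly placed Gaussian apertures, and the image regions that drive a per-trial signal (behavioral accuracy, or here band-limited EEG power) are recovered by reverse correlation. The Python sketch below illustrates that general logic only; the image size, number and width of bubbles, and the simulated accuracy signal are illustrative assumptions, not the authors' actual parameters or analysis pipeline.

```python
# Illustrative sketch of Bubbles-style sparse sampling and a classification-image
# analysis. Parameters (image size, bubble count, sigma) are assumptions.
import numpy as np

def bubble_mask(shape, n_bubbles=20, sigma=10.0, rng=None):
    """Random mask of Gaussian apertures ('bubbles') revealing parts of an image."""
    rng = np.random.default_rng() if rng is None else rng
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    mask = np.zeros(shape)
    for _ in range(n_bubbles):
        cy, cx = rng.integers(0, h), rng.integers(0, w)
        mask += np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(mask, 0.0, 1.0)

def classification_image(masks, signal):
    """Correlate each pixel's exposure with a per-trial signal (e.g. accuracy,
    or theta-band EEG power in a given time window) to estimate which image
    regions drive that signal."""
    masks = np.asarray(masks, dtype=float)      # trials x h x w
    signal = np.asarray(signal, dtype=float)    # trials
    z = (signal - signal.mean()) / (signal.std() + 1e-12)
    return np.tensordot(z, masks, axes=(0, 0)) / len(signal)

# Toy usage: 500 simulated trials on a 128x128 image.
rng = np.random.default_rng(0)
masks, correct = [], []
for _ in range(500):
    m = bubble_mask((128, 128), rng=rng)
    masks.append(m)
    # Pretend performance depends on how well a hypothetical 'eye region' was revealed.
    correct.append(float(m[40:60, 30:60].mean() > 0.3))
ci = classification_image(masks, correct)       # high values mark diagnostic regions
```

Computing such a classification image separately for each time window of oscillatory power is, in spirit, how sensitivity to specific facial features can be tracked over time.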

Smith, M. L., Gosselin, F., & Schyns, P. G. (2005). Rendering visual representations from oscillatory brain activity [Abstract]. Journal of Vision, 5(8):903, 903a, http://journalofvision.org/5/8/903/, doi:10.1167/5.8.903.
Footnotes
 This research was partly supported by ESRC grant R000239646.