September 2015
Volume 15, Issue 12
Vision Sciences Society Annual Meeting Abstract
Words jump-start vision: a label advantage in object recognition
Author Affiliations
  • Bastien Boutonnet
    Department of Psychology, University of Wisconsin, Madison
  • Gary Lupyan
    Department of Psychology, University of Wisconsin, Madison
Journal of Vision September 2015, Vol.15, 11. doi:
Making sense of visual input and its structure depends largely on the interplay between bottom-up signals and top-down influences from higher-level processes. Often neglected is the fact that humans live in a world additionally structured by language, in which people use language to shape each other’s behaviour in flexible ways. Could language play a key role in visual processing? Effects of language on perception are traditionally assumed to be “high-level”: while language clearly influences reasoning and decision-making, it is assumed not to influence low-level visual processes. Here, in opposition to this common view, we test the prediction that words provide top-down guidance at the earliest stages of visual processing. We tested whether visual processing of images of familiar animals and artefacts was enhanced after hearing their name (e.g., “dog”) compared to hearing an equally familiar and unambiguous nonverbal sound (e.g., a dog bark). We predicted that words would deploy more effective categorical templates, allowing enhanced visual recognition. By recording EEG, we were able to distinguish whether this “label advantage” stemmed from changes to early visual processing or to later semantic decision-making. The results show that hearing a label affects visual processes within 100 ms of image presentation, and that this modulation is category-sensitive. ERPs show that the P1 was larger when people were cued by labels than when they were cued by equally informative nonverbal cues. More importantly, this enhancement predicted behavioural responses occurring almost 500 ms later. Hearing labels modulated single-trial P1 activity such that it distinguished between target and non-target images, showing, for the first time, that words rapidly guide early visual processing. Crucially, while cue–picture congruence modulated the N4 (known to index semantic integration), cue type did not, confirming that both cue types were equally informative and that the label advantage results from modulations of perceptual processes.

Meeting abstract presented at VSS 2015

