Vision Sciences Society Annual Meeting Abstract | August 2009
See an object, hear an object file: Object correspondence transcends sensory modality
Author Affiliations
  • Kerry Jordan
    Department of Psychology, Utah State University
  • Kait Clark
    Center for Cognitive Neuroscience, Duke University
  • Stephen Mitroff
    Center for Cognitive Neuroscience, Duke University, and Department of Psychology and Neuroscience, Duke University
Journal of Vision August 2009, Vol. 9, 724. doi:10.1167/9.8.724
Abstract

An important task of perceptual processing is to parse incoming information from the external world into distinct units and to subsequently keep track of those units over time as the same, persisting internal representations. Within the realm of visual perception, this maintenance of persisting object representations has been theorized to be mediated by “object files”: episodic representations that store (and update) information about objects' properties and track objects over time and motion via spatiotemporal information (e.g., Kahneman et al., 1992). Although object files are typically conceptualized as visual representations, here we demonstrate that object-file correspondence can be computed across sensory modalities. We employed a novel version of the object-reviewing paradigm: Line-drawn pictures (e.g., a phone and a dog) were briefly presented within two distinct objects in a preview display. Once the pictures disappeared, the objects moved (to decouple objecthood from location), and then a sound (e.g., a dog bark) occurred. The sound was localized to the left or right of the display, corresponding to the end locations of the two objects. Participants were instructed to indicate whether the sound matched either preview picture or whether it was novel (e.g., a dog bark would “match” if either preview picture was a dog). Participants were significantly faster to respond when the sound occurred at the location of the object that had originally contained the associated picture than when it occurred at the location of the other object. This significant response time benefit provides the first evidence that visual and auditory information work in tandem to underlie object-file correspondence. An object file can be initially formed with visual input and later accessed with corresponding auditory information. Object files may thus operate at a highly abstract level of perceptual processing that is not tied to specific modalities.

Jordan, K., Clark, K., & Mitroff, S. (2009). See an object, hear an object file: Object correspondence transcends sensory modality [Abstract]. Journal of Vision, 9(8):724, 724a, http://journalofvision.org/9/8/724/, doi:10.1167/9.8.724.
Footnotes
This research was supported by grant R03 MH080849-01 to Stephen Mitroff.