Vision Sciences Society Annual Meeting Abstract | September 2015
Feature correlation guidance in category visual search
Author Affiliations
  • Rachel Wu
    Brain and Cognitive Sciences, University of Rochester
  • Zoe Pruitt
    Brain and Cognitive Sciences, University of Rochester
  • Megan Runkle
    Brain and Cognitive Sciences, University of Rochester
  • Kristen Meyer
    Brain and Cognitive Sciences, University of Rochester
  • Gaia Scerif
    Dept of Experimental Psychology, University of Oxford
  • Richard Aslin
    Brain and Cognitive Sciences, University of Rochester
Journal of Vision September 2015, Vol. 15, 926. https://doi.org/10.1167/15.12.926
Citation: Rachel Wu, Zoe Pruitt, Megan Runkle, Kristen Meyer, Gaia Scerif, Richard Aslin; Feature correlation guidance in category visual search. Journal of Vision 2015;15(12):926. https://doi.org/10.1167/15.12.926.
Abstract

Compared to objects with uncorrelated features (e.g., jelly beans come in many colors), objects with correlated features (e.g., bananas tend to be yellow) enable more robust object and category representations (e.g., Austerweil & Griffiths, 2011; Wu et al., 2011; Younger & Cohen, 1986). It is unknown whether these more robust representations affect attentional templates (i.e., the working memory representations that guide visual search). Adults completed four visual search tasks (a 2 × 2 design) in which targets were defined either as a single exemplar (a specific alien) or as a category (any alien), with features that were correlated (e.g., circle belly shape, circle back spikes) or uncorrelated (e.g., circle belly shape, triangle back spikes). We measured behavioral responses and the N2pc component, an event-related potential (ERP) marker of target selection. Behavioral responses were better for correlated than for uncorrelated items in both exemplar and category search. Although the N2pc amplitude was larger for exemplar search than for category search, the amplitude differed with feature correlation only during category search: the N2pc was present for category search with correlated features but absent for category search with uncorrelated features. Our ERP results demonstrate that correlated (but not uncorrelated) features of novel categories provide a robust category representation that can guide visual search.

Meeting abstract presented at VSS 2015
