Rachel Wu, Zoe Pruitt, Megan Runkle, Kristen Meyer, Gaia Scerif, Richard Aslin; Feature correlation guidance in category visual search. Journal of Vision 2015;15(12):926. doi: 10.1167/15.12.926.
Compared to objects with uncorrelated features (e.g., jelly beans come in many colors), objects with correlated features (e.g., bananas tend to be yellow) enable more robust object and category representations (e.g., Austerweil & Griffiths, 2011; Wu et al., 2011; Younger & Cohen, 1986). It is unknown whether these more robust representations affect attentional templates (i.e., the working memory representations that guide visual search). Adults participated in four visual search tasks (2x2 design) in which targets were defined either as one item (a specific alien) or as a category (any alien), with features that were either correlated (e.g., circle belly shape, circle back spikes) or uncorrelated (e.g., circle belly shape, triangle back spikes). We measured behavioral responses and the N2pc component, an event-related potential (ERP) marker of target selection. Behavioral responses were better for correlated items than for uncorrelated items in both exemplar and category search. Although the N2pc amplitude was larger for exemplar search than for category search, the amplitude differed by feature correlation only in category search: the N2pc was present for category search with correlated features but absent for category search with uncorrelated features. Our ERP results demonstrate that correlated (but not uncorrelated) features of novel categories provide a category representation robust enough to guide visual search.
Meeting abstract presented at VSS 2015