December 2022
Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract
Functionally Related Objects Capture Attention and Improve Search Guidance
Author Affiliations
  • Steven Ford
    University of Central Florida
  • Gregory Zelinsky
    Stony Brook University
  • Joseph Schmidt
    University of Central Florida
Journal of Vision December 2022, Vol.22, 4259. doi:
Steven Ford, Gregory Zelinsky, Joseph Schmidt; Functionally Related Objects Capture Attention and Improve Search Guidance. Journal of Vision 2022;22(14):4259.

© ARVO (1962-2015); The Authors (2016-present)

Consistency between objects and scene locations improves search performance (Draschkow & Vo, 2017). Functionally related objects (e.g., a hammer above a nail) represent a form of object consistency that may lead to perceptual grouping and attentional capture (Green & Hummel, 2004; 2006). We tested this hypothesis using a search task with eye-tracking and event-related potential (ERP) measures. Participants were cued with two objects that were either functionally related or unrelated. After a brief retention interval, during which ERPs were assessed, participants searched for one of the two objects among three unrelated distractors. If related items capture attention more than unrelated items, the cue should elicit a larger N2pc, consistent with a stronger shift of spatial attention (Luck, 2012). Additionally, if functionally related items are perceptually grouped, we should see an increased N2pc (Mazza & Caramazza, 2012; Marini & Marzi, 2016) and reduced contralateral delay activity (CDA), indicating a lower visual working memory (VWM) load, consistent with perceptual grouping (Diaz et al., 2021). Finally, we predicted improved search guidance, indexed by a greater percentage of initial search saccades directed at the target. Related objects produced a larger cue-related N2pc, consistent with perceptual grouping and indicating a stronger shift of spatial attention (i.e., greater attentional capture) relative to unrelated objects. However, there were no significant differences in CDA, suggesting that the items in related and unrelated pairs were similarly represented in VWM. Additionally, related objects produced stronger search guidance and improved search performance across several measures. Our findings are largely consistent with prior reports suggesting that related objects are perceptually grouped and capture attention (Green & Hummel, 2004; 2006). These results suggest that although multiple-target search tends to be more difficult than single-target search (Cain et al., 2011; Menneer et al., 2007), the unique coding of functionally related objects improves performance.

