December 2022
Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract | December 2022
Evidence of separate learning system contributions in categorical visual search
Author Affiliations & Notes
  • Corey Bohil
    University of Central Florida
  • Ashley Phelps
    University of Central Florida
  • Mark Neider
    University of Central Florida
  • Joseph Schmidt
    University of Central Florida
  • Footnotes
    Acknowledgements  Research was supported by NIH National Eye Institute grant R15EY029511
Journal of Vision December 2022, Vol.22, 4229. doi:https://doi.org/10.1167/jov.22.14.4229
      Corey Bohil, Ashley Phelps, Mark Neider, Joseph Schmidt; Evidence of separate learning system contributions in categorical visual search. Journal of Vision 2022;22(14):4229. https://doi.org/10.1167/jov.22.14.4229.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

A longstanding approach to studying visual search is to initiate trials by displaying a pictorial cue that matches the search target. In recent years, however, many researchers have used categorical cues, which arguably convey greater ecological validity (Schmidt & Zelinsky, 2009). Categorical search opens the door to many questions guided by research on category learning and the mental representation of categories. For example, much classification research has focused on the contributions of separate explicit- and implicit-rule learning systems in the brain (Ashby & Maddox, 2011). In the current study, we compared visual search performance using categorical cues after participants learned four categories defined by either an explicit conjunction (verbalizable) or implicit information-integration (nonverbalizable) rule. Participants classified sine wave gratings varying in spatial frequency and orientation, and eye-tracking indicated which items (target and distractors) were fixated during search. Decision-bound models were fit to individual participants' classification data to determine which participants used the correct rule type during learning and whether explicit and implicit rules led to differences during search. Results showed no accuracy difference between rule types at the end of classification training, although search accuracy was higher after explicit-rule training. Implicit-rule training led to faster responding, both during speeded classification and during the search phase. Additionally, target dwell times during search were longer in the explicit-rule condition than in the implicit-rule condition. Importantly, no search differences were found when the identical categorical stimuli were tested using pictorial cues, suggesting that the effects are specific to applying a category rule. These data suggest that applying an explicit verbalizable classification rule takes longer than applying an implicit nonverbalizable rule, in both category learning and search. These results align with theoretical accounts holding that explicit verbalizable rule use involves conscious reasoning, whereas implicit nonverbalizable rule use involves more automatic associative learning.
