August 2014, Volume 14, Issue 10
Vision Sciences Society Annual Meeting Abstract
Searching through the hierarchy: A behavioral and computational approach to understanding categorical search
Author Affiliations
  • Justin Maxfield
    Department of Psychology, Stony Brook University
  • Chen-Ping Yu
    Department of Computer Science, Stony Brook University
  • Gregory Zelinsky
    Department of Psychology, Stony Brook University
Journal of Vision August 2014, Vol. 14, 940.
How is the hierarchical structure of categories expressed in a categorical search task, and can we quantify the effects of this structure using a computational model of visual similarity? Trials consisted of a text cue designating a target category at the superordinate, basic, or subordinate level, followed by a 6-item target-present/absent search array. Targets and distractors were images selected from ImageNet, and distractors on target-present trials differed from the target at the superordinate level. We found that search guidance, measured as the percentage of trials on which the target was the first object fixated, was strongest at the subordinate level (29.9%), weaker at the basic level (25.4%), and weaker still at the superordinate level (20%), F(2, 48) = 15.26, p < .001. Target verification, measured as the time between target fixation and the present/absent search decision, showed the standard basic-level advantage; basic-level verification (978 ms) was faster than subordinate (1173 ms) or superordinate (1267 ms) level verification. We modeled these data using dense-hierarchical-SIFT and pyramid-of-HOG features and Chi-squared statistics computed over hundreds of images per subordinate/basic/superordinate category to obtain similarity estimates between each category and the specific exemplars used as targets in the behavioral search experiment. Analysis of these similarity distances mirrored the behavioral guidance pattern; targets were most similar to the subordinate categories, followed by the basic and then the superordinate categories, F(2, 286) = 58.09, p < .001. This suggests that categorical guidance can be well described by Chi-squared similarity distance to a category. Moreover, a PCA analysis of the features revealed that the basic level required the fewest principal components, corresponding to the advantage seen during target verification.
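The core similarity measure described above is a Chi-squared distance between feature histograms. A minimal sketch of that statistic follows; this is an illustrative implementation in NumPy, not the authors' actual pipeline (the dense-hierarchical-SIFT and pyramid-of-HOG feature extraction over ImageNet images is omitted, and the function name is hypothetical):

```python
import numpy as np

def chi2_distance(h1, h2, eps=1e-10):
    """Chi-squared distance between two feature histograms.

    Smaller values mean the histograms (and hence the images or
    categories they summarize) are more similar. `eps` guards
    against division by zero in empty bins.
    """
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))
```

Under this scheme, a target exemplar's similarity to a category could be estimated as the mean Chi-squared distance between the exemplar's histogram and the histograms of the category's member images.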
In conclusion, by adopting features from computer vision we show how a similarity-based exemplar model can be extended to real-world objects, and used to explain guidance and verification in categorical search.
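The PCA result above rests on counting how many principal components a category's feature set needs to retain most of its variance. A minimal sketch of that count, assuming a rows-as-samples feature matrix and an arbitrary 90% variance threshold (both assumptions; the abstract does not specify the criterion used):

```python
import numpy as np

def n_components_for_variance(X, var_threshold=0.9):
    """Number of principal components needed to retain var_threshold
    of the total variance of the rows of X (samples x features)."""
    Xc = X - X.mean(axis=0)
    # Singular values of the centered data give per-component variances
    s = np.linalg.svd(Xc, compute_uv=False)
    var = s ** 2
    ratios = np.cumsum(var) / var.sum()
    # First index at which cumulative variance reaches the threshold
    return int(np.searchsorted(ratios, var_threshold) + 1)
```

On this reading, the basic-level advantage would appear as basic-level categories reaching the variance threshold with fewer components than subordinate or superordinate categories.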

Meeting abstract presented at VSS 2014
