Jun Saiki; Stimulus-driven mechanism of search asymmetry revealed by classification image analysis of singleton search. Journal of Vision 2006;6(6):530. doi: https://doi.org/10.1167/6.6.530.
In previous work (Saiki, VSS05), the classification image (CI) technique applied to search asymmetry between O and Q revealed that the same vertical-bar feature is used in both search tasks, and that there is strong stimulus-dependent nonlinearity, since noise added to the Q stimulus alone led to errors. Analyses of the CIs showed that the bar feature in the Q stimulus had larger negative modulation in the O-target condition, probably because multiple Q stimuli are present in the search display. Although these findings reject the hypothesis that different target-defining features are used, the asymmetry in feature strength may reflect top-down control of feature tuning, because the target stimulus was predefined. To address this issue, I used a singleton search task. Three observers viewed displays with 1 target and 3 distractors, either O (1.9 degree diameter ring) or Q (O plus 0.9 degree vertical bar), embedded in white Gaussian noise, each located at 3.8 degree eccentricity. O and Q targets were randomly mixed across trials, so target identity was unpredictable. Observers first localized the target, and then judged its identity. CIs were constructed from trials with correct target identification but incorrect localization. The pattern of results was the same as with target-defined search, suggesting that the asymmetry was not due to top-down control of feature tuning. Model-based analyses revealed that the CIs can be fully accounted for by nonlinear signal transduction and multiplicative noise proportional to the output pooled across search items. Taken together, the search asymmetry between O and Q may reflect a purely stimulus-driven mechanism.
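The CI construction described above follows the standard reverse-correlation logic: average the external noise fields from the trials selected by the behavioral criterion (here, correct identification with incorrect localization). A minimal sketch with simulated data, assuming hypothetical trial counts, patch size, and outcome rates not taken from the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 500 trials, each with a 32 x 32 noise field
# (the actual stimulus and noise parameters differ in the experiment).
n_trials, h, w = 500, 32, 32

# White Gaussian noise field added to each display, one per trial.
noise = rng.normal(0.0, 1.0, size=(n_trials, h, w))

# Hypothetical per-trial outcomes: whether the target was correctly
# identified, and whether its localization was incorrect.
correct_id = rng.random(n_trials) < 0.8
wrong_loc = rng.random(n_trials) < 0.3

# Select the trials the abstract uses: correct identification AND
# incorrect localization.
mask = correct_id & wrong_loc

# Classification image: mean noise field over the selected trials.
ci = noise[mask].mean(axis=0)
print(ci.shape)  # (32, 32)
```

In a real analysis the noise would be taken from the distractor location the observer erroneously chose, so that the CI reveals which noise features drove the localization error.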
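The model-based account (nonlinear signal transduction plus multiplicative noise proportional to the pooled output) can be sketched as a max-rule observer. Everything below is an illustrative assumption: the power-function nonlinearity, the input values, and the noise constant `k` are placeholders, not the fitted model from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

def transduce(x, p=2.0):
    """Hypothetical accelerating nonlinearity (assumed power function)."""
    return np.abs(x) ** p

# Hypothetical feature responses for 4 search items (1 target, 3 distractors).
inputs = np.array([1.2, 0.5, 0.5, 0.5])
outputs = transduce(inputs)

# Multiplicative noise: its standard deviation scales with the output
# pooled (summed) across all search items, as the abstract describes.
pooled = outputs.sum()
k = 0.1  # hypothetical noise-scaling constant
noisy = outputs + rng.normal(0.0, k * pooled, size=outputs.shape)

# Decision rule: localize the target at the item with the largest
# noisy internal response.
choice = int(np.argmax(noisy))
```

Because the noise scales with the summed transduced output, displays containing multiple high-output items (e.g. several Qs) are noisier overall, which is one way such a model could produce asymmetric error patterns without any top-down feature weighting.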