Abstract
Biological motion is an important signal for our visual system to process efficiently and may even serve as a means of “life detection” (Troje & Westhoff, 2006). Using a visual search paradigm, we explored factors that influence attentional guidance by biological motion. The search targets were point-light biological motion animations depicting full-body movements of recognizable actions, or scrambled versions of the same stimuli, which contained the same local but not the same global motion information. Targets were presented in a circular array among three, five, or seven distracters. In Experiment 1, subjects decided whether the target was on the left or the right side of the screen. In Experiment 2, target-absent trials were also included, and subjects decided whether or not the target was present. In all conditions, reaction times increased with the number of distracters, indicating that both local and global information make pre-attentive contributions to visual search. In Experiment 1, search for the biological motion stimuli was more efficient than search for the scrambled stimuli, but we did not observe this asymmetry in Experiment 2 (cf. Wang et al., 2010). Reaction times were notably longer in Experiment 1. It is possible that local motion signals can guide attention efficiently when the task is easier, whereas global information (i.e., the Gestalt of the point-light walker) guides attention more efficiently than local information alone when the task is more difficult. Alternatively, global biological motion information may contribute significantly to attentional guidance in tasks that require locating a target, but not in tasks that only require detecting it. We will test these hypotheses, as well as the generalizability of the results to other stimuli. For now, we conclude that global biological motion information can contribute to the guidance of attention in visual search above and beyond local motion information.
We thank Josh Solomon for helpful discussions and Sophie Buon, Angela Chan, Chris Gauthier, and Arthur Vigil for help with data collection.