Abstract
Over recent years, there has been increasing evidence that substantial information can be extracted from a mere glance at a natural scene (Rousselet et al., 2005; Fei-Fei et al., 2007; Greene & Oliva, 2009). Free-response data have indicated that while low-level sensory detail predominates at very short presentation times, accurate semantic information can frequently be extracted by 100 msec (Fei-Fei et al., 2007). Experiments requiring categorization of scenes as members of a target category have meanwhile indicated that basic- or superordinate-level semantic categorization can be achieved with presentations of 12-60 msec (Rousselet et al., 2005; Greene & Oliva, 2009). Within the emotion literature, it has been argued that we are evolutionarily prepared to rapidly extract information about the presence of potential threat (Ohman & Mineka, 2001). The extent to which image emotionality impacts the rapid semantic categorization of natural images has not, however, been studied. Here, we tested the hypothesis that ‘prepared’ stimuli (i.e. animate, highly arousing, negative stimuli) would show an advantage in semantic categorization at brief presentation durations. Participants were asked to categorize positive, negative and neutral images as ‘people’, ‘animals’, ‘objects’, ‘vehicles’, ‘food’ or ‘scenes’ and to rate them for arousal (emotional intensity). Each image was presented for 17, 33 or 100 msec prior to backward masking. A texture-synthesis algorithm was used to create masks from the test-set images (Portilla & Simoncelli, 2000; Greene & Oliva, 2009). Categorization performance was above chance by 33 msec presentation time and at ceiling by 100 msec. Analysis of the 33 msec data using logistic regression revealed a three-way interaction between emotional valence, arousal, and animacy: images that were emotionally valenced (positive or negative), highly arousing, and animate were more likely to be categorized accurately.
Our data hence suggest that not only negative but also positive high-arousal animate stimuli receive prioritized perceptual processing, facilitating semantic categorization at short presentation durations.
Meeting abstract presented at VSS 2013