Abstract
A previous study using the Bubbles technique (Dupuis-Roy et al., 2009) showed that the eyes, the eyebrows, and the mouth were the most potent features for face-gender discrimination (see also Brown & Perrett, 1993; Russell, 2003, 2005). Intriguingly, the results also revealed a large positive correlation between use of the mouth region and rapid correct responses. Given the highly discriminative color information in this region, we hypothesized that the extraction of color and luminance cues may follow different time courses. Here, we tested this possibility by sampling the chromatic and achromatic face cues independently with spatial and temporal Bubbles (see Gosselin & Schyns, 2001; Blais et al., 2009). Ninety participants (45 men) completed 900 trials of a face-gender discrimination task with briefly presented (200 ms) sampled faces. To create a stimulus, we first isolated the S and V channels of the HSV color space for 300 color pictures of frontal-view faces (average interpupil distance of 1.03 deg of visual angle) and adjusted the S channel so that every color was isoluminant (±5 cd/m²); we then sampled the S and V channels independently through space and time with 3D Gaussian windows. The group classification image computed on response accuracy revealed that in the first 60 ms participants used the color information in the right eye-eyebrow and mouth regions, and that they relied mostly on the luminance information in the eye-eyebrow regions thereafter (>60 ms). Further classification images were computed for each gender-stimulus category. The results indicate that chromatic information in the mouth region led to systematic categorization errors. An analysis of the chromatic information available in this facial area suggests that this is not an artifact of our face database but rather reflects a perceptual bias. Altogether, these results help disentangle the relative contributions of chromatic and luminance information in face-gender discrimination.
Meeting abstract presented at VSS 2013
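
The stimulus-construction and analysis steps described above (independent spatiotemporal Bubbles sampling of the S and V channels, and accuracy-weighted classification images) could be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' code: the function names, bubble counts, Gaussian widths, frame count, and the neutral fallback for masked regions are assumptions, and the isoluminance calibration (±5 cd/m²) is omitted.

```python
# Illustrative sketch of spatiotemporal Bubbles sampling applied independently
# to the S (chromatic) and V (achromatic) channels of an HSV face image,
# plus an accuracy-weighted classification image. Parameter values are assumed.

import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb

def gaussian_bubbles(shape, n_bubbles, sigma_xy, sigma_t, rng):
    """Sum of randomly placed 3D Gaussian windows, clipped to [0, 1]."""
    T, H, W = shape
    t = np.arange(T)[:, None, None]
    y = np.arange(H)[None, :, None]
    x = np.arange(W)[None, None, :]
    mask = np.zeros(shape)
    for _ in range(n_bubbles):
        ct, cy, cx = rng.uniform(0, T), rng.uniform(0, H), rng.uniform(0, W)
        mask += np.exp(-(((t - ct) / sigma_t) ** 2
                         + ((y - cy) / sigma_xy) ** 2
                         + ((x - cx) / sigma_xy) ** 2) / 2)
    return np.clip(mask, 0.0, 1.0)

def sample_face(rgb_face, n_frames=12, n_bubbles=30,
                sigma_xy=8.0, sigma_t=2.0, seed=0):
    """Reveal S and V information through independent spatiotemporal bubble
    masks; masked regions fall back to a neutral background (S = 0,
    V = mean luminance). rgb_face is assumed to be float RGB in [0, 1]."""
    rng = np.random.default_rng(seed)
    hsv = rgb_to_hsv(rgb_face)                       # shape (H, W, 3)
    H_ch, S_ch, V_ch = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    shape = (n_frames,) + S_ch.shape
    s_mask = gaussian_bubbles(shape, n_bubbles, sigma_xy, sigma_t, rng)
    v_mask = gaussian_bubbles(shape, n_bubbles, sigma_xy, sigma_t, rng)
    frames = np.empty(shape + (3,))
    for f in range(n_frames):
        s = S_ch * s_mask[f]                                   # chromatic cues
        v = V_ch * v_mask[f] + V_ch.mean() * (1 - v_mask[f])   # achromatic cues
        frames[f] = hsv_to_rgb(np.stack([H_ch, s, v], axis=-1))
    return frames, s_mask, v_mask

def classification_image(masks, accuracies):
    """Weighted sum of per-trial bubble masks, weights = mean-centered
    accuracy; masks has shape (n_trials, T, H, W)."""
    acc = np.asarray(accuracies, dtype=float)
    return np.tensordot(acc - acc.mean(), np.asarray(masks), axes=1)
```

In such a scheme, the S-channel and V-channel masks accumulated over trials would each be passed to classification_image together with the per-trial accuracies, yielding separate spatiotemporal maps of where and when chromatic versus luminance information drove correct responses.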