Abstract
In the current study, participants judged whether a hat was centered on a head; the expression of the face was irrelevant to the task. Numerous studies (e.g., Eastwood, Smilek, & Merikle, 2003; Fenske & Eastwood, 2003; Valdes, Rutledge, Miles, & Olah, 2006) have found that people are less able to ignore sad faces than happy faces. It was therefore hypothesized that participants would take longer to judge the location of the hat when the face was sad. There were five expressions: happy, sad, embarrassed, neutral, and angry. The faces were composites of actual male and female faces. The embarrassed expression was excluded from all analyses because the face was asymmetrical. Overall accuracy was high (96%). Mean response latencies for correct responses were submitted to a 3 × 4 (location × expression) ANOVA. Only the main effect of expression was significant, F(3, 36) = 13.94, p < .05. Post hoc analyses indicated that judgments about sad faces were significantly slower (730 ms) than judgments in the other conditions. A replication is planned with gendered faces, because some emotions are more likely to be associated with particular genders. For example, people are more likely to view women as sad and men as angry (e.g., Condry & Condry, 1976; see Hyde, 2007, for a review of gender as a stimulus). It is hypothesized that judgments about women's sad faces will be slower than judgments about men's sad faces because of the automatic activation of this gender stereotype.