Abstract
Facial emotion plays an important role in nonverbal human communication. There are many studies of how facial emotion is estimated in the brain using frontal facial images. However, we mostly observe non-frontal faces, and it has remained unclear how accurately we can estimate the facial emotion of side faces. To clarify the influence of face direction and presentation time on the accuracy of human facial-emotion estimation, we conducted an experiment using 3D images of human faces. We used a 3D scanner to acquire 3D images of human faces showing several kinds of expressions. By rotating the 3D models horizontally or vertically, we created visual stimuli of side faces and tilted faces. In the experiment, participants chose an expression label after observing a stimulus image. The presentation durations were 100 ms and 2000 ms. Results showed that the estimation accuracy for the facial expression labeled "happy" did not depend on the horizontal rotation angle of side faces, whereas the recognition accuracy for the other facial expressions decreased as the rotation angle from the front increased. In particular, "anger" responses increased with increasing horizontal rotation angle of the face. In addition, the estimation accuracy of facial emotion as a function of the vertical rotation angle of tilted faces depended on the kind of emotion. Moreover, the estimation accuracy of the "happy" emotion did not depend on presentation time, while for the other emotions the accuracy at the short presentation time (100 ms) was lower than at the long presentation time (2000 ms). These results suggest that independent estimation mechanisms exist for each facial emotion in our visual system.
Meeting abstract presented at VSS 2018