Open Access
Article | July 2016
Perception of stereoscopic direct gaze: The effects of interaxial distance and emotional facial expressions
Journal of Vision July 2016, Vol.16, 5. doi:https://doi.org/10.1167/16.9.5
      Jussi Hakala, Jari Kätsyri, Tapio Takala, Jukka Häkkinen; Perception of stereoscopic direct gaze: The effects of interaxial distance and emotional facial expressions. Journal of Vision 2016;16(9):5. https://doi.org/10.1167/16.9.5.



      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Gaze perception has received considerable research attention due to its importance in social interaction. The majority of recent studies have utilized monoscopic pictorial gaze stimuli. However, a monoscopic direct gaze differs from a live or stereoscopic gaze. In the monoscopic condition, both eyes of the observer receive a direct gaze, whereas in live and stereoscopic conditions, only one eye receives a direct gaze. In the present study, we examined the implications of the difference between monoscopic and stereoscopic direct gaze. Moreover, because research has shown that stereoscopy affects the emotions elicited by facial expressions, and facial expressions affect the range of directions where an observer perceives mutual gaze—the cone of gaze—we studied the interaction effect of stereoscopy and facial expressions on gaze perception. Forty observers viewed stereoscopic images wherein one eye of the observer received a direct gaze while the other eye received a horizontally averted gaze at five different angles corresponding to five interaxial distances between the cameras in stimulus acquisition. In addition to monoscopic and stereoscopic conditions, the stimuli included neutral, angry, and happy facial expressions. The observers judged the gaze direction and mutual gaze of four lookers. Our results show that the mean of the directions received by the left and right eyes approximated the perceived gaze direction in the stereoscopic semidirect gaze condition. The probability of perceiving mutual gaze in the stereoscopic condition was substantially lower compared with monoscopic direct gaze. Furthermore, stereoscopic semidirect gaze significantly widened the cone of gaze for happy facial expressions.

Introduction
Gaze is an important social cue for primates. Humans use gaze to evaluate the level and target of attention, as well as mental state, and to predict the intentions and actions of others (Baron-Cohen, 1995; Langton, Watt, & Bruce, 2000). Several studies have confirmed a joint-attention effect of gaze, where the direction of a looker's gaze increases the likelihood of stimulus detection in the direction of the gaze (for a comprehensive review, see Frischen, Bayliss, & Tipper, 2007). Because gaze can convey information about the attention, mental state, and intentions of others, gaze perception is a significant component in the development of social cognition and theory of mind (e.g., Itier & Batty, 2009). The perception of mutual gaze (also known as “eye contact” and “dyadic gaze”) is a special case of perceived gaze direction where the observer and the looker look into each other's eyes. Although gaze direction and mutual-gaze perception are extensively studied topics, binocular perception of gaze direction appears neglected in the literature, particularly in conjunction with emotional facial expressions. The identification of potential sources of bias in gaze perception, such as stimulus presentation parameters, facial expressions of the looker, or their combined effects, is essential for the design of generalizable experiments and the modeling of both gaze perception and emotions. 
Humans estimate gaze direction from several cues, of which iris–sclera configuration and head turn are the most prominent (e.g., Kluttz, Mayes, West, & Kerby, 2009). Humans have a uniquely white sclera compared with other primates (Kobayashi & Kohshima, 1997), with the exception of rare variations such as are found in some gorilla species (Mayhew & Gómez, 2015). Moreover, the shape of the human eye has evolved so that more sclera is exposed compared with other primates (Kobayashi & Kohshima, 2001). The light-colored sclera provides a high luminance contrast with the relatively dark iris, which allows humans to determine the direction of a looker's gaze accurately based on the iris–sclera configuration. When a looker gazes at objects in the environment (i.e., triadic gaze), an observer can discriminate the direction of gaze with an acuity of about 1° at a 100-cm viewing distance (Symons, Lee, Cedrone, & Nishimura, 2004). 
In addition to iris–sclera configuration, head turn also modulates gaze-direction perception. Wollaston (1824; see also Todorović, 2006) discussed the effect of head turn on perceived gaze direction in portraits. In his famous demonstration, he superimposed the same drawing of left-turned eyes on two drawings of a face, one with a head turned to the left and the other to the right. The drawing where the head turn was congruent with the eye turn appeared to look to the left, while the drawing with the head turned to the right and eyes to the left appeared to look directly at the viewer. The Wollaston effect illustrates the capability of the visual system to combine cues from head turn and eye-region information in gaze-direction perception (Otsuka, Mareschal, & Clifford, 2015; Todorović, 2006). However, studies have revealed that head turn may also elicit a bias in gaze-direction judgments. In an experiment with a live looker stimulus, Gibson and Pick (1963) showed that a 30° head turn elicits a bias in peak mutual-gaze direction to the same direction as the head turn, which indicates a bias in perceived gaze direction toward the opposite direction (e.g., a gaze direction of 3° from a head turned 30° in the same direction was perceived as direct, indicating a bias of −3°). Cline (1967) also found a similar opposite-direction bias of judged gaze direction for a 30° head turn of a live looker. Anstis, Mayhew, and Morley (1969) reported a comparable bias with pictorial stimuli and a systematic overestimation. Vine (1971) suspected that the overestimation only takes place at greater gaze angles, and that the overall perceived gaze direction is nonlinear. Argyle and Cook (1976) echoed Vine's argument and hypothesized that at small angles gaze direction is underestimated, and at greater angles, overestimated. The hypothesis was tested by Masame (1990), who confirmed that observers underestimated gaze direction at angles within ±3.7° and overestimated it at greater angles. 
Mutual gaze appears to attract gaze-direction judgments toward the observer, whereas a gaze averted beyond a threshold begins to repel the judgments away from the observer. 
Gamer and Hecht (2007) coined the term cone of gaze as a measure of the range of a looker's gaze directions for which the observer perceives mutual gaze. In their results, the diameter of the cone for live stimuli at a 1-m viewing distance was 8.12°, which is only slightly wider than the angle between the underestimation thresholds ±3.7° at a 2-m viewing distance measured by Masame (1990). Viewing distance has an effect on the width of the cone of gaze; at 5 m, the cone of gaze narrowed to 3.90° (Gamer & Hecht, 2007). Nevertheless, the cone is substantially wider than the 1° acuity found in triadic gaze. Martin and Jones (1982) tested mutual-gaze detection in settings with reduced lighting and increased viewing distance, and found the discriminability of mutual gaze reduced. They proposed that there is a greater penalty for ignoring mutual gaze compared with the penalty for a false positive detection, which explains the excessive width of the cone of gaze as well as its widening under reduced discriminability conditions. Recent research has confirmed that human gaze perception has a prior for gaze direction that biases perception toward mutual gaze (Mareschal, Calder, & Clifford, 2013). 
Emotion studies have found that mutual gaze makes the detection of the approach-oriented emotions joy and anger faster and increases their perceived intensity, whereas averted gaze speeds up the detection of the avoidance-oriented emotions fear and sadness and intensifies them (Adams & Kleck, 2003, 2005). Conversely, the findings of Ewbank, Jennings, and Calder (2009) show that angry facial expressions widen the cone of gaze; they did not, however, find a tendency toward averted gaze in fearful expressions. Likewise, research has shown that happy and angry facial expressions increase the probability of perceiving mutual gaze (Lobmaier, Tiddeman, & Perrett, 2008) and perceiving that the other person is attending to self (Lobmaier & Perrett, 2011). Fearful facial expressions did not modulate the perception of mutual gaze or attention to self. As the effect was substantially stronger for happy facial expressions, the authors proposed that a self-referential positivity bias affects mutual gaze judgments (Lobmaier & Perrett, 2011); compared with other expressions, people judge happy expressions to be more likely directed toward themselves. 
Scholars have utilized several types of stimuli during the past few decades of gaze-direction research. Early studies utilized live stimuli, whereas the majority of research conducted during the 21st century has utilized pictorial stimuli. Pictorial stimuli have included photographs and computer-generated portraits in monoscopic and, in a few cases, stereoscopic conditions. In a study comparing gaze-direction perception between live and pictorial conditions, Anstis et al. (1969) found no clear differences between the conditions, with the assumption that the response is linear. However, a comparison of the sigmoid-shaped curves in their figure 3 leads us to speculate that a nonlinear approach could have uncovered a larger constant overestimation shift in the pictorial condition compared with the live condition. Symons et al. (2004) found higher acuity in a live condition compared with a pictorial condition in triadic-gaze judgments. 
In the domain of monoscopic portraits, the Mona Lisa effect (e.g., Brewster, 1883) describes a phenomenon where the direct gaze of a portrait follows the observer regardless of viewing angle. The effect was analyzed from a geometrical viewpoint by Todorović (2006), who explains that the combination of head turn and gaze direction in two-dimensional portraits is independent of viewing angle. In addition to the Mona Lisa effect, monoscopic stimuli exhibit a previously unexamined unnatural phenomenon: Both eyes of the observer receive a direct gaze, as if the looker were simultaneously looking at both eyes of the observer. In reality, a looker is able to fixate only on one spatial location of the observer's face at a time. When looking at facial stimuli, the initial fixations land on the center of the face, after which the majority of fixations land on salient features of the face (Bindemann, Scheepers, & Burton, 2009). Research has identified the eyes, mouth, and nose as the most salient facial features (e.g., Laidlaw, Risko, & Kingstone, 2012; Yarbus, 1967), but proportionally, people spend the overwhelming majority of time looking at the left and right eye of a face that is presented in approximately life-size at a 1-m distance (Henderson, Williams, & Falk, 2005). If the angular size of the facial stimuli is small, the fixations cluster around the center of the face instead of the specific facial features (M. Xu, Ren, & Wang, 2015). However, in social interaction between two individuals, interpersonal distance is usually approximately 1 m or less (Baxter, 1970; Worchel, 1986), which makes it probable that the fixations in such interactions land on specific facial features. When a looker fixates on either eye of the observer, that eye receives a direct gaze and the other eye receives a gaze averted by the interpupillary distance of the observer to the nasal side in live and orthostereoscopic conditions. 
The implications of this difference between monoscopic and three-dimensional direct gaze have not yet been investigated, and for example, it is uncertain whether such a subtle difference affects perceived gaze direction or the perception of mutual gaze. 
A few studies have examined the difference in gaze-direction perception between monoscopic and stereoscopic conditions. Imai, Sekiguchi, Inami, Kawakami, and Tachi (2006) compared judgment errors in live, monoscopic, and stereoscopic conditions for horizontally and vertically averted gazes. Vertically averted gaze was judged most accurately in the live condition, followed by stereoscopic, and least accurately in the monoscopic condition. Presentation condition had no effect on the accuracy of the judgments of horizontally averted gaze. Gamer and Hecht (2007) compared computer-generated monoscopic and stereoscopic stimuli with live stimuli and concluded that there were no differences between the widths of the cone of gaze, except that at a far viewing distance (5 m), the live-condition cone of gaze was narrower than in the pictorial conditions. The authors speculated that the difference originated from image resolution or differences in the looker attributes. More recently, West (2015) studied gaze-direction perception in monoscopic and stereoscopic conditions, with results indicating that perceived directions of gaze between monoscopic and stereoscopic conditions do not differ. However, the stimuli used in that experiment were acquired with a stereoscopic camera setup where both cameras were offset from the midline, one to the left and one to the right, and the looker fixated on the point midway between the cameras. This is analogous to a real-life situation where the looker fixates on the bridge of the observer's nose, and consequently, we call this type of stereoscopic direct gaze the bridge-of-nose gaze. As already mentioned, when looking at a face at a relatively short distance, a looker mainly fixates on the left or the right eye of the observer, not on the bridge of the nose. 
Thus, when viewed binocularly, each eye of the observer receives a different oculocentric gaze direction: One eye receives a direct gaze and the other eye receives a slightly averted gaze. We name this type of direct gaze semidirect gaze to distinguish it from the bridge-of-nose gaze. In the visual system, oculocentric directions received by the two eyes transform into a single egocentric direction perceived in the cyclopean view. An observer perceives the cyclopean view to originate from a location between the two eyes, sometimes called the cyclopean eye or the egocenter. The egocentric perceived directions of objects follow the Wells–Hering laws (Ono & Mapp, 1995). In particular, Wells–Hering Law 3c states that the perceived cyclopean direction is the mean of the physical oculocentric directions if the retinal images are successfully fused. The law yields a perceived visual direction that is veridical with respect to the physical environment. Similarly, such averaging over the oculocentric gaze directions would yield a veridical perceived cyclopean direction, and appears to hold for the stereoscopic bridge-of-nose gaze (West, 2015). However, little is known about egocentric perceived gaze direction in the semidirect gaze condition. 
The inspiration for the present study stems from our informal observations of stereoscopic facial photographs acquired for a previous study (Hakala, Kätsyri, & Häkkinen, 2015). In our stereoscopic photography setup, we varied the distance between the cameras—the interaxial distance (IAD)—which is the camera equivalent of the interpupillary distance (IPD). The mean IPD is 63 mm for adults (Dodgson, 2004). We observed that mutual gaze was lost as we increased the IAD beyond natural IPDs, even though one eye of the observer received a direct gaze. In the present study, we verify and quantify this informal observation experimentally. We compare perceived gaze direction and mutual gaze in the stereoscopic semidirect gaze condition with direct and averted monoscopic gaze conditions. Our two hypotheses are the following: (1) Perceived gaze direction in a semidirect gaze condition equals the mean of the left- and right-eye stimulus gaze directions. (2) Mutual-gaze discrimination is based on perceived gaze direction and thus on the mean of the left- and right-eye stimulus gaze directions in the stereoscopic semidirect gaze condition. 
The basis for Hypothesis 1 lies in the fact that it results in a veridical gaze direction in natural viewing conditions, comparable with Wells–Hering Law 3c for visual direction and earlier findings with bridge-of-the-nose gaze (West, 2015). Hypothesis 2 is very similar to Hypothesis 1, but we want to avoid assuming that the perception of mutual gaze is only dependent on the perceived gaze direction and unaffected by the fact that one eye of the observer receives a direct gaze. From Hypothesis 2 it follows that the probability of perceiving mutual gaze in the semidirect gaze condition equals the probability of perceiving mutual gaze from a monoscopic gaze averted by half the IAD. Consequently, as the mean of the left- and right-eye stimulus gaze directions exceeds a specific threshold value, the probability of mutual gaze decreases below chance level, even though one eye of the observer continues to receive a direct gaze. In addition to these hypotheses, we explore whether stereoscopic semidirect gaze interacts with emotional facial expressions. Our earlier findings indicate that emotional facial expressions have the potential to elicit stronger emotions in the observer when presented stereoscopically compared with the mean of the emotions elicited by individually presented monoscopic left and right images (Hakala et al., 2015). Because stereoscopy affects the emotions elicited by facial expressions, and emotional facial expressions affect gaze perception, emotional facial expressions potentially interact with the stereoscopy condition also in gaze-direction judgment and mutual-gaze discrimination. 
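Hypothesis 1 amounts to simple averaging of the two oculocentric gaze directions. A minimal numeric sketch of that prediction (the 1-m viewing distance and 63-mm IPD are illustrative values from the text above; the function names are ours):

```python
import math

def oculocentric_angle_deg(offset_m: float, distance_m: float) -> float:
    """Gaze angle (degrees) relative to one eye of the observer, given
    that eye's lateral offset from the point the looker fixates."""
    return math.degrees(math.atan2(offset_m, distance_m))

def predicted_cyclopean_deg(alpha_left: float, alpha_right: float) -> float:
    """Hypothesis 1: the perceived (cyclopean) gaze direction is the
    mean of the two oculocentric directions, cf. Wells-Hering Law 3c."""
    return (alpha_left + alpha_right) / 2.0

# Semidirect gaze: the looker fixates the observer's right eye at 1 m,
# so that eye receives a direct gaze (0 deg) while the left eye, offset
# by the 63-mm IPD, receives a gaze averted by about 3.6 deg.
alpha_re = 0.0
alpha_le = oculocentric_angle_deg(0.063, 1.0)
print(round(predicted_cyclopean_deg(alpha_le, alpha_re), 2))  # ≈ 1.8 deg
```

Under this averaging account, a semidirect gaze should thus be perceived as slightly averted even though one eye receives a fully direct gaze.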
Methods
Stimuli and apparatus
Stereoscopic and monoscopic photographs of two female (F1, F2) and two male (M1, M2) Finnish professional actors served as looker stimuli. Two actors had gray-green eyes, and the other two had hazel and blue-gray eyes. To acquire the stimuli, we captured photographs with two Canon 5D Mark II (Canon Inc., Tokyo, Japan) cameras equipped with a 50-mm f/1.4 USM lens attached in a parallel configuration to a beam-splitter stereo rig. The distance from the right-camera focal point to the eyes of the actor was 80 cm. The actors were instructed to look straight at the right-side (their left) camera lens at all times, so that the right-side camera always received a direct gaze. The right-side camera was positioned directly in front of the actor's face and the left camera was positioned from 15 to 115 mm left of the midline at 25-mm intervals while maintaining parallel optical axes between the cameras. Thus, the resulting stereoscopic IADs were 15, 40, 65, 90, and 115 mm. The left and right photographs of the stereoscopic pair were used as monoscopic stimuli; the left-camera photographs depicted gazes averted by the IAD with a congruent head turn, and the right-camera photographs depicted direct gazes with a straight head. The actors wore a rubber cap to cover their hair and other external facial features that might otherwise introduce variability into the stimuli and draw attention away from internal facial features (Gronenschild, Smeets, Vuurman, van Boxtel, & Jolles, 2009). We photographed the actors expressing angry, happy, and neutral emotions. To standardize the facial expressions across the actors and different shots of the same actor, the recording sessions were supervised by one of the authors (JK), who is a certified Facial Action Coding System (Ekman, Friesen, & Hager, 2002) coder. 
The target facial configurations were AU4+5+7+24 (anger; activation of Brow Lowerer, Upper Lid Raiser, Lid Tightener, and Lip Pressor actions) and AU6+12 (happiness; activation of Cheek Raiser and Lip Corner Puller actions). The photographs were cropped to a square aspect ratio and shifted so that the midpoint between the actor's pupils in the left and right photographs was in the middle of the image. Thus, the eyes of the actors had zero stereoscopic disparity and were perceived at the display plane in the resulting stereoscopic image. The photographs were scaled to ensure that the distance between the pupils in the right-side photograph on the display screen approximately matched the actor's IPD. Figure 1 shows samples of the final stimuli. 
Figure 1
 
Samples of the stereoscopic stimuli. Top row: Actor F1 exhibiting happy facial expression at IADs of 15, 65, and 115 mm. Bottom row: Actor M1 exhibiting neutral, angry, and happy facial expressions at a 65-mm IAD. Stereoscopic pairs are laid out for parallel free viewing.
The display area surrounding the stimulus photograph and the wall behind the display were gray and had a luminance of 15 cd/m2. The display device was a 24-in. autostereoscopic SL2400 display (Tridelity AG, St. Georgen, Germany) positioned 80 cm in front of the observer. Viewing distance was controlled by a chin rest, which also held the observer in the optimal viewing zone for the autostereoscopic display. The stimulus conditions of each observer included one male and one female looker. Observers were randomly assigned to one of the four different looker combinations (M1–F1, M1–F2, M2–F1, and M2–F2) such that an equal number of male and female observers saw each combination. The stimuli covered three stereoscopy conditions: stereoscopic semidirect, monoscopic left, and monoscopic right. We defined the gaze direction of the stimuli as the mean of the actual looker gaze angles presented to the left and the right eye of the observer. Thus, with the chosen IADs, the mean stimulus gaze directions αS were 1.1°, 2.9°, 4.7°, 6.5°, and 8.3° for monoscopic left stimuli; 0.5°, 1.4°, 2.3°, 3.2°, and 4.1° for stereoscopic semidirect stimuli; and 0° for monoscopic right stimuli. In all conditions, head turn was congruent with gaze direction. Figure 2 illustrates the monoscopic and stereoscopic oculocentric gaze directions. The three stereoscopy conditions, five IADs, three facial expressions, and two lookers resulted in 90 trials per observer. Half of the observers viewed the original images, and the other half viewed the images mirrored about the vertical axis. Table 1 summarizes the stimulus conditions. 
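As a sanity check on the stimulus geometry, the mean stimulus gaze directions αS can be approximated from the IADs and the nominal 80-cm camera distance; the values below agree with the reported directions to within about 0.1° (the variable names and rounding are ours):

```python
import math

IADS_MM = [15, 40, 65, 90, 115]   # interaxial distances used in the study
DISTANCE_MM = 800                  # nominal camera-to-eyes distance

def mono_left_deg(iad_mm: float) -> float:
    """Monoscopic left condition: gaze averted by the full IAD."""
    return math.degrees(math.atan2(iad_mm, DISTANCE_MM))

# Stereoscopic semidirect: one eye gets a direct gaze (0 deg), the other
# the monoscopic-left angle, so the mean direction is half of it.
mono = [round(mono_left_deg(d), 1) for d in IADS_MM]
stereo = [round(mono_left_deg(d) / 2, 1) for d in IADS_MM]
print(mono)    # close to the reported 1.1, 2.9, 4.7, 6.5, 8.3 deg
print(stereo)  # close to the reported 0.5, 1.4, 2.3, 3.2, 4.1 deg
```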
Figure 2
 
Examples of the effect of varying IAD on the oculocentric gaze directions in the monoscopic (top row) and stereoscopic semidirect (bottom row) conditions for an observer with a 65-mm IPD. The oculocentric gaze direction α received by the left eye (LE) and right eye (RE) of the observer are presented side by side. The overall stimulus gaze direction αS is the mean of the directions received by the eyes of the observer: (αLE + αRE)/2. In the monoscopic right condition, both eyes of the observer received the same direct gaze (αS = 0). In the monoscopic left condition, both eyes of the observer received the same averted gaze (αS = αLE = αRE). In the stereoscopic condition, the left eye of the observer received an averted gaze and the right eye received a direct gaze—αS = (αLE + αRE)/2. Head turn was congruent with gaze direction in all stimuli.
Table 1
 
Summary of the stimulus conditions. Notes: In the monoscopic conditions, both eyes of the observers received either the same left-camera image (L) or the same right-camera image (R), whereas in the stereoscopic condition the eyes received different images (L+R). The stereoscopic condition gaze and head direction are the mean of the L and R images (see Figure 2).
Observers
Forty observers took part in the experiment: 20 men and 20 women, with a mean age of 28 years (SD = 9 years). The observers had normal or only slightly impaired visual acuity as tested with the Lea Numbers test (Lea-Test Ltd., Helsinki, Finland); visual acuity of four of the observers was below 20/20 (1.0 on the decimal scale) but above 20/40 (0.5). All observers measured phoria of less than 10 prism diopters horizontal and less than 2 prism diopters vertical in the Maddox Wing near phoria test (Clement Clarke Ltd., London, UK). All observers exhibited normal or only slightly impaired stereoacuity; four observers failed the TNO stereo test (Laméris Ootech BV, Utrecht, the Netherlands) but performed well on the RANDOT stereo test (Stereo Optical Company Inc., Chicago, IL). The mean IPD at an 80-cm convergence distance was 60.4 mm (SD = 2.7 mm). 
Procedure
After the vision tests and two practice trials, the experiment proceeded with the 90 trials in a fully interleaved and randomized order. Trial time was unrestricted so that the observers could proceed at their own pace; the mean trial duration was 26.7 s. The observers completed two tasks: mutual-gaze discrimination and gaze-direction judgment. The discrimination task was a two-alternative forced-choice task to evaluate whether or not the observer was "able to achieve eye contact with the looker." The observers entered their responses on a tablet computer. The direction-judgment task utilized a physical slider fixed to the table 27 cm in front of the observer's viewing position, as illustrated in Figure 3. At the beginning of the experiment, the slider knob was positioned in the middle of the slider (0°). The observer was instructed to slide the knob to the line of sight of the looker. A computer vision system recorded the slider value before the observer proceeded to the next trial. Before the next trial commenced, the observer returned the slider knob to the middle position. In addition to the two tasks, self-assessed emotional arousal and valence were measured and are reported elsewhere (Hakala et al., 2015). The present study adhered to the tenets of the Declaration of Helsinki and the ethical principles established by the Finnish Advisory Board on Research Integrity (http://www.tenk.fi/en/). 
Figure 3
 
The experimental setup. The observer judged the stimulus gaze direction with a slider, and the judged gaze direction αJ was automatically recorded.
Analysis
Gaze direction
We utilized a linear mixed-effects model to analyze the judged gaze-direction data. We chose this model instead of the more traditional analysis of variance because the design of the experiment was not fully balanced. The fixed effects in the model were the mean stimulus gaze direction αS, stereoscopy condition, facial expression, and all interactions between them. The raw data encompassed three viewing conditions: monoscopic left, monoscopic right, and stereoscopic semidirect. Because the mean stimulus gaze direction αS was included as a predictor in our model, the monoscopic right condition could not be included in the analysis: The gaze direction in this condition was always zero for all IADs, so its inclusion would have resulted in rank deficiency. Instead of conducting a separate analysis for the data from the monoscopic right condition, we assigned those data evenly to the monoscopic left and stereoscopic conditions. This procedure was motivated by the fact that both the monoscopic left and stereoscopic semidirect conditions approach the monoscopic right condition as αS approaches zero. Consequently, the analysis covered two viewing conditions: monoscopic and stereoscopic semidirect. For the monoscopic condition, αS was the gaze direction of the looker relative to the camera. For the stereoscopic semidirect condition, αS was the mean of the left- and right-eye stimulus gaze directions (in practice, αS was half of the monoscopic left stimulus gaze direction, because the other eye always received a direct gaze in the stereoscopic semidirect condition). Preliminary analysis uncovered that the judged gaze direction grew nonlinearly with αS, as also indicated in the literature (Argyle & Cook, 1976; Masame, 1990). The nonlinear component was included in the model as a cubic term αS³ to capture both the possible underestimation at small angles and the overestimation at greater angles. 
Both lookers and observers had random intercepts and slopes in the model (compare Judd, Westfall, & Kenny, 2012).  
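For concreteness, the fixed-effects part of this model can be sketched as a prediction function. This is a sketch, not the fitting code: the coefficient values are those reported in the Results section, the baseline is the neutral monoscopic condition, and the random observer and looker effects are omitted:

```python
# Fixed-effects coefficients as reported in the Results (baseline:
# neutral expression, monoscopic condition); random effects omitted.
COEF = {
    "slope": 1.16,      # main-effect slope for alpha_s
    "cubic": 0.0056,    # coefficient of the cubic term alpha_s**3
    "d_stereo": -0.29,  # slope offset for the stereoscopic condition
    "d_expr": {"neutral": 0.0, "angry": -0.13, "happy": -0.20},
}

def predicted_judgment(alpha_s, stereo=False, expression="neutral", b=COEF):
    """Predicted judged gaze direction (degrees) for a mean stimulus
    gaze direction alpha_s, combining the slope offsets of the
    stereoscopy and expression interactions."""
    slope = b["slope"] + (b["d_stereo"] if stereo else 0.0) + b["d_expr"][expression]
    return slope * alpha_s + b["cubic"] * alpha_s ** 3

# Example: neutral monoscopic gaze averted by 4.7 deg (65-mm IAD).
print(round(predicted_judgment(4.7), 2))  # ≈ 6.03
```

The flatter slopes for the stereoscopic and emotional conditions reported in the Results enter here simply as negative offsets added to the baseline slope.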
Mutual gaze
To assess how gaze direction affects mutual-gaze perception, we built a nonlinear mixed-effects model with a Gaussian function of the stimulus gaze direction αS. Although separating the "left" and "right" responses that comprise the nonmutual-gaze responses in our experiment design would have allowed us to calculate the crossover points (Mareschal, Calder, Dadds, & Clifford, 2013) using two logistic fits, our model is an acceptable approximation and fits the data well. Utilizing the approach to measuring explained variation outlined by R. Xu (2003), we obtained an R² value of 0.58 for our complete model. Standard deviation σ and peak height h of the Gaussian were fitted for the fixed effects, while the mean and the lower asymptote of the Gaussian were fixed to zero. Stereoscopy i and facial expression j were the fixed effects, and h had random intercepts for observers and lookers. For observer k, looker l, and observation m, the model becomes

yijklm = (hj + hk + hl) · exp(−αS² / (2σij²)) + ϵijklm,

where αS is the stimulus gaze direction and ϵ is the error term. 
As in the gaze-direction analysis, we divided the data from the monoscopic right condition between the monoscopic left and stereoscopic semidirect conditions. As the mean stimulus gaze-direction angle αS decreases, both the monoscopic left and stereoscopic semidirect conditions approach the monoscopic right condition, converging when αS reaches zero. To account for this, we reassigned the data from the monoscopic right condition randomly to the monoscopic left and stereoscopic semidirect conditions while ensuring that the mean reported eye contact was approximately equal in both groups separately for the three facial-expression conditions. Following the same rationale, the peak-height parameter h (i.e., the height of the Gaussian function at zero gaze angle) depended on the facial expression but not on the stereoscopy condition. 
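The Gaussian response model can be sketched as follows. The cone-width helper (full width of the gaze directions at a 50% response threshold) is our addition for illustration, and the parameter values in the example are illustrative rather than fitted values from this study:

```python
import math

def p_mutual(alpha_s: float, h: float, sigma: float) -> float:
    """Gaussian model of mutual-gaze probability: peak height h at a
    direct gaze (alpha_s = 0), mean and lower asymptote fixed at 0."""
    return h * math.exp(-alpha_s ** 2 / (2 * sigma ** 2))

def cone_width_deg(h: float, sigma: float, threshold: float = 0.5) -> float:
    """Full width (degrees) of the cone of gaze: the span of directions
    for which the predicted mutual-gaze probability exceeds the
    threshold (0 if even the peak falls below the threshold)."""
    if h <= threshold:
        return 0.0
    return 2 * sigma * math.sqrt(2 * math.log(h / threshold))

# Illustrative parameters: a 95% peak and a 3-deg standard deviation
# give a cone roughly 6.8 deg wide at the 50% threshold.
print(round(cone_width_deg(h=0.95, sigma=3.0), 1))
```

Inverting the Gaussian in this way also shows why a taller peak h or a larger σ both widen the cone of gaze.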
Results
Gaze direction
Figure 4 illustrates the relationship between the mean gaze-direction judgments of the observers and the IAD. One observer responded 0° in all experimental conditions, and thus his data were omitted from the analysis. In the initial analysis, we compared the judged gaze direction in the stereoscopic condition to the mean of the judged gaze direction in the monoscopic conditions at different IADs. For all facial expressions, the judged gaze direction in the stereoscopic semidirect condition was significantly smaller than the mean of the individually judged left and right monoscopic gaze directions at IADs of 65, 90, and 115 mm, ps < 0.01. For the 15- and 40-mm IADs, the differences were nonsignificant. To facilitate comparison of the stereoscopy conditions, we used the mean stimulus gaze direction αS instead of the IAD in the linear mixed-effects model (Figure 5). The mean stimulus gaze direction αS is the mean of the actual gaze directions presented to the left and right eyes of the observer (see Figure 2 and Analysis). The main effects of αS (β = 1.16, SE = 0.16) and αS² (β = 0.0056, SE = 0.0005) were significant, F(1, 11.6) = 33.2, p < 0.001, and F(1, 3415.4) = 105.7, p < 0.001, respectively. The mean stimulus gaze direction αS also had significant two-way interactions with facial expression, F(2, 3415.4) = 17.3, p < 0.001, and stereoscopy condition, F(1, 3420.2) = 41.0, p < 0.001. Compared with the neutral facial expression, the judged gaze direction had a flatter slope for facial expressions that were angry, β = −0.13, SE = 0.04, t(3419) = −3.19, p = 0.001, and happy, β = −0.20, SE = 0.04, t(3417) = −5.46, p < 0.001. The happy facial expression also had a significantly flatter slope than the angry facial expression, β = −0.07, SE = 0.04, t(3417) = −2.07, p = 0.039. 
Likewise, the stereoscopic semidirect condition yielded a significantly flatter judged-direction slope compared with the monoscopic condition, β = −0.29, SE = 0.06, t(3418) = −4.55, p < 0.001. The two-way interaction between facial expression and stereoscopy condition and the three-way interaction between αS, facial expression, and stereoscopy condition were nonsignificant.  
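A linear mixed-effects analysis of this kind can be approximated in Python with statsmodels. The sketch below fits observer random intercepts and slopes only, on synthetic data; the looker random effects, quadratic term, and expression/stereoscopy interactions of the full model are omitted, and all names and values are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Synthetic data shaped like the experiment: judged gaze direction as a
# function of mean stimulus gaze direction, with per-observer random slopes
n_obs, n_trials = 20, 60
observer = np.repeat(np.arange(n_obs), n_trials)
alpha_s = np.tile(np.repeat([0.5, 1.4, 2.3, 3.2, 4.1], n_trials // 5), n_obs)
slope = 1.2 + 0.1 * rng.standard_normal(n_obs)  # random slope per observer
judged = slope[observer] * alpha_s + 0.5 * rng.standard_normal(alpha_s.size)
df = pd.DataFrame({"observer": observer, "alpha_s": alpha_s, "judged": judged})

# Random intercept and slope for observers via re_formula
fit = smf.mixedlm("judged ~ alpha_s", df, groups="observer",
                  re_formula="~alpha_s").fit()
```

The fitted fixed-effect slope (`fit.fe_params["alpha_s"]`) recovers the mean generating slope.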
Figure 4
 
Judged gaze direction for different IADs and facial expressions in monoscopic left, monoscopic right, and stereoscopic semidirect conditions with 95% confidence intervals. Gray lines illustrate the mean of the judged gaze directions for the monoscopic left and right conditions.
Figure 5
 
Judged gaze direction for different mean stimulus gaze directions and facial expressions in monoscopic and stereoscopic semidirect gaze conditions with 95% confidence intervals. Lines illustrate the model fit.
Figure 6 shows the bias in gaze-direction judgment (i.e., judged gaze direction minus mean stimulus gaze direction). Monoscopic gaze direction was overestimated for the neutral facial expression at 4.7° and larger angles, and for angry and happy facial expressions, monoscopic gaze direction was overestimated at 6.5° and larger angles. Monoscopic gaze direction was underestimated only for the happy facial expression at 1.1° stimulus gaze direction. Stereoscopic semidirect gaze-direction judgments fell below the stimulus gaze-direction angle for all facial expressions at the 2.3° gaze direction (65-mm IAD). Furthermore, in the stereoscopic conditions with the happy facial expression, judged gaze direction was below the stimulus gaze direction at all gaze directions above 0.5° (IADs above 15 mm). 
Figure 6
 
Gaze-direction judgment bias (judged gaze direction minus mean stimulus gaze direction) for different facial expressions in monoscopic and stereoscopic semidirect gaze conditions with 95% confidence intervals. Stimulus gaze direction αS is the mean of the gaze directions received by the left and right eyes of the observer. Lines illustrate the model fit.
Mutual gaze
Figure 7 shows the mutual-gaze positive-response proportion means across the observers plotted as a function of IAD. Three observers reported loss of mutual gaze on only one to three trials, and thus their data were insufficient for model fitting and were omitted from the analysis. Furthermore, the data of one observer appeared random and were also discarded. As with the gaze-direction measurements, we used the mean stimulus gaze direction αS in the analysis. We constructed a nonlinear mixed-effects model as described under Analysis. Figure 8 illustrates the mutual-gaze response proportions and the model fit as a function of αS. Expression had a significant effect on peak height h, F(2, 3160) = 11.5, p < 0.001, and spread σ, F(2, 3160) = 22.5, p < 0.001. Compared with the neutral expression (h = 0.86), peak height h was higher for expressions that were angry, β = 0.056, SE = 0.020, t(3160) = 2.77, p = 0.004, and happy, β = 0.092, SE = 0.019, t(3160) = 4.76, p < 0.001. The effect of facial expression on σ is explained by the significant interaction between facial expression and stereoscopy condition, F(2, 3160) = 15.2, p < 0.001. Happy facial expressions in the stereoscopic semidirect gaze condition had a significantly larger σ compared with expressions that were neutral, β = 0.83, SE = 0.23, t(3160) = 3.55, p < 0.001, and angry, β = 0.91, SE = 0.23, t(3160) = 4.02, p < 0.001. To further investigate the difference between the stereoscopy conditions, we used the delta method (Weisberg, 2014) to obtain the 50% probability thresholds for mutual gaze in different conditions (Table 2). Neutral and angry one-sided αS thresholds were all below 3.00°, whereas the thresholds for stereoscopic semidirect and monoscopic happy facial expressions were 3.21° and 4.07°, respectively. The happy facial expression elicited the only statistically significant difference between the stereoscopic semidirect and monoscopic conditions. 
The differences between the facial expressions were nonsignificant within the monoscopic condition. 
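The delta-method threshold computation can be sketched as follows. For the Gaussian model, the one-sided 50% threshold solves h · exp(−α²/(2σ²)) = 0.5, giving α50 = σ√(2 ln 2h); the parameter values and covariance matrix below are illustrative, not the paper's fitted estimates.

```python
import numpy as np

def mutual_gaze_threshold(h, sigma):
    """One-sided 50% threshold (cone-of-gaze radius) of the Gaussian model
    p(alpha) = h * exp(-alpha**2 / (2 * sigma**2)); requires h > 0.5."""
    return sigma * np.sqrt(2.0 * np.log(2.0 * h))

def threshold_se(h, sigma, cov):
    """Delta-method standard error: sqrt(grad' @ cov @ grad), where cov is
    the 2x2 covariance matrix of the fitted (h, sigma)."""
    root = np.sqrt(2.0 * np.log(2.0 * h))
    grad = np.array([sigma / (h * root), root])  # d(threshold)/dh, d/dsigma
    return float(np.sqrt(grad @ cov @ grad))

# Illustrative values only (not the fitted estimates from the paper)
h, sigma = 0.95, 2.2
cov = np.array([[4e-4, 0.0], [0.0, 9e-2]])
alpha_50 = mutual_gaze_threshold(h, sigma)  # about 2.49 deg
se_50 = threshold_se(h, sigma, cov)
```

At the returned threshold, the model probability of a mutual-gaze response is exactly 0.5.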
Figure 7
 
Proportion of mutual-gaze responses for different IADs and facial expressions in the stereoscopic semidirect, monoscopic left, and monoscopic right conditions with 95% confidence intervals.
Figure 8
 
Proportion of mutual-gaze responses (points) and probability of mutual gaze from the model fit (lines) for different stimulus gaze directions and facial expressions in the stereoscopic semidirect and monoscopic conditions with 95% confidence intervals. Stimulus gaze direction αS is the mean of the gaze directions received by the left and right eyes of the observer.
Table 2
 
Mutual-gaze 50% response thresholds and differences (standard errors) between the stereoscopic semidirect and monoscopic conditions obtained with the delta method. Notes: The threshold is one-sided (i.e., the radius of the cone of gaze). ***p < 0.001.
Discussion
Our first hypothesis was that the judged direction of the stereoscopic semidirect gaze is the mean of the left and right stimulus gaze directions, which would yield veridical directions under natural viewing conditions. The results partially support this hypothesis, as the judgments deviated only slightly from our prediction. However, the stereoscopic semidirect gaze-direction judgments fell below our hypothesis for all facial expressions at the 65-mm IAD, which corresponds to a 2.3° mean stimulus gaze direction. Furthermore, for the happy facial expression, the judged gaze direction was below the hypothesis at all IADs above 15 mm. Monoscopic gaze direction was overestimated in most conditions. Our second hypothesis was that mutual-gaze discrimination is based on the mean of the left and right stimulus gaze directions, which we tested by calculating the cone of gaze in all conditions. The cone of gaze did not differ significantly between the stereoscopy conditions, supporting our hypothesis. The probability of perceiving mutual gaze in the stereoscopic semidirect gaze condition corresponded to the probability of perceiving mutual gaze from a monoscopic gaze averted by the angle corresponding to half the IAD. Thus, for example, the probability of perceiving mutual gaze in the stereoscopic semidirect condition at a 65-mm IAD (mean stimulus gaze direction = 2.3°) was the same as that from a monoscopic gaze averted by 2.3°. The probability of perceiving mutual gaze fell below chance level at IADs of 90 mm and larger for neutral and angry expressions, and at a 115-mm IAD for happy facial expressions. That is, mutual gaze was lost even though one eye of the observer constantly received a direct gaze. Moreover, we confirmed that stereoscopic semidirect gaze interacts with emotional facial expressions. 
The interaction between the stereoscopy condition and facial expression was significant in the gaze-discrimination analysis: Stereoscopic semidirect gaze widened the cone of gaze for the happy facial expression. 
The stereoscopic gaze-direction judgments fell below the stimulus gaze directions at the 65-mm IAD (mean stimulus gaze direction = 2.3°). The underestimation of the stereoscopic semidirect gaze direction at the 65-mm IAD could be explained by its closeness to the natural IPD. When a looker fixates on one eye of the observer in a live situation, the observer's nonfixated eye receives a gaze averted to the nasal side by the IPD. Thus, the 65-mm IAD in the present study most closely approximated the natural condition. For the happy facial expression, the judged stereoscopic gaze direction was below the stimulus gaze direction for all IADs above 15 mm, which could indicate a tendency to underestimate the gaze direction of happy faces under stereoscopic semidirect gaze. However, these results must be interpreted with caution, as the interaction between stereoscopy condition and facial expression was nonsignificant in the gaze-direction analysis. 
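The relation between IAD and the oculocentric angles can be made concrete with a little trigonometry. The viewing (and acquisition) distance is not stated in this excerpt; the sketch below assumes 0.8 m, which approximately reproduces the reported angles (e.g., a 65-mm IAD gives a mean stimulus gaze direction of about 2.3°).

```python
import math

def oculocentric_angles(iad_m, distance_m):
    """Gaze angles received by the observer's eyes under stereoscopic
    semidirect gaze: the looker fixates one camera, so one eye receives a
    direct gaze (0 deg) and the other a gaze averted by the angle that the
    interaxial distance subtends at the looker."""
    averted = math.degrees(math.atan(iad_m / distance_m))
    return 0.0, averted  # (fixated eye, nonfixated eye)

# Assumed 0.8-m distance; the mean stimulus gaze direction alpha_S is the
# mean of the two oculocentric angles.
for iad_mm in (15, 40, 65, 90, 115):
    direct, averted = oculocentric_angles(iad_mm / 1000.0, 0.8)
    alpha_s = (direct + averted) / 2.0
```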
The overestimation of monoscopic gaze direction corroborates earlier findings. Gaze direction was overestimated in the monoscopic condition at directions of 4.7° and above, a phenomenon that has been reported previously with live and pictorial stimuli (Anstis et al., 1969; Cline, 1967; Masame, 1990). Contrary to earlier results (Masame, 1990), monoscopic gaze direction was not generally underestimated at small angles, but only for the happy facial expression at the smallest tested averted gaze direction (1.1°). The increasing bias in judgments of monoscopic gaze direction found in the present study is comparable with earlier findings (Anstis et al., 1969). Head turn was congruent with gaze direction in all stimulus conditions of the present study, and furthermore, the head-turn angles were relatively small (maximum 8.3°) compared with those in earlier studies, which could explain why we found no bias of judged gaze direction toward the side opposite the head turn. As a comparison, Masame (1990) found a head-turn bias at a 15° head-turn angle but not at 5°. In addition to the novel stereoscopic semidirect gaze results, our results extend the examination of judgment bias to emotional facial expressions; a happy facial expression reduced the overestimation of gaze direction compared with neutral or angry facial expressions in the monoscopic viewing condition. 
Although studies have examined the effect of facial expressions on the perception of mutual gaze, to our knowledge the present study is the first to report results on the effects of facial expressions on perceived gaze-direction judgments. The happy, angry, and neutral facial expressions differed significantly with regard to the judged gaze direction. Both happy and angry facial expressions received smaller judged gaze directions compared with the neutral facial expression. Furthermore, the happy facial expression received smaller judgments than the angry facial expression. These findings show that the effect of facial expression on gaze direction is congruent with its effect on mutual gaze; angry (Ewbank et al., 2009; Lobmaier et al., 2008) and happy (Lobmaier et al., 2008) facial expressions have been found to strengthen the perception of mutual gaze. 
We modeled the mutual-gaze probability as a Gaussian function with the mean fixed at zero and varying peak height and spread. The peak height indicates the probability of experiencing mutual gaze from a direct monoscopic gaze. The peak heights differed significantly between the facial expressions; happy facial expressions had the highest peak height (0.95), followed by the angry (0.91) and lastly the neutral (0.86) facial expressions. The spread of the function was also significantly larger for the happy facial expression. These results corroborate earlier findings (Lobmaier et al., 2008). However, when we examined the cone-of-gaze width, the monoscopic cone was only 0.6° wider for the happy facial expression than for the neutral expression, a difference that was not statistically significant. This difference is somewhat smaller than in the results of Lobmaier et al. (2008), whose figure 2 suggests a substantial widening of the cone of gaze for the happy facial expression. Likewise, our results show no difference between the cone-of-gaze widths for neutral and angry facial expressions, unlike the study by Ewbank et al. (2009), which measured a small difference. Furthermore, their results show no difference in peak height between neutral and angry facial expressions. These minor discrepancies could originate from differences in the experimental setups and in the statistical power of the analysis methods. 
Whereas earlier studies on gaze perception from stereoscopic stimuli have examined the accuracy of gaze-direction perception (Imai et al., 2006; West, 2015) and cone of gaze (Gamer & Hecht, 2007) from averted gazes, our results pertain specifically to the stereoscopic semidirect gaze. Gamer and Hecht (2007) measured the cone of gaze of monoscopic, stereoscopic, and live stimuli with neutral facial expressions and found no substantial effect of stimulus type at a 1-m viewing distance. Their measurement results varied between 7° and 9°; observers reported mutual gaze for gazes that were averted up to 3.5° and 4.5° on either side, depending on the conditions. In the present study, the cone of gaze varied from 6° to 8°. Although the stereoscopic semidirect gaze-direction angle was underestimated and the monoscopic gaze-direction angle overestimated, the cone of gaze in the stereoscopic semidirect gaze condition did not differ from the monoscopic cone of gaze with the neutral and angry expressions. Instead, stereoscopic semidirect gaze widened the cone of gaze only for the happy facial expression. 
The interaction between stereoscopy and facial expression was significant in the gaze-discrimination analysis. The underlying mechanism that widened the cone of gaze for the happy facial expression in the stereoscopic semidirect condition remains unexplained. Research has shown that the approach–avoidance theory alone is inadequate to explain the effects of facial expressions on the width of the cone of gaze. The effect of a happy expression is significantly stronger compared with that of an angry expression, and the avoidance-oriented expressions have no effect on the cone of gaze (Lobmaier et al., 2008). Those authors suggest that a self-referential positivity bias contributes to gaze processing. The results of the present study indicate that a stereoscopic semidirect gaze condition strengthens this self-referential positivity bias. 
Here we propose a possible mechanism responsible for the strengthening. Findings in neuroscience have led to the development of a subcortical fast-track modulation model in face processing (Johnson, 2005; Senju & Johnson, 2009). In the model, face processing—including the processing of gaze direction—is modulated by a subcortical pathway involving the amygdala. The results from an interocular suppression study provide possible evidence for the model; eye contact presented to one eye facilitated the awareness of faces compared with averted gaze (Stein, Senju, Peelen, & Sterzer, 2011). In binocular rivalry, positive emotional faces have been shown to dominate over neutral and negative emotional faces (Alpers & Gerdes, 2007). In a stereoscopic semidirect gaze, one eye of the observer always receives a direct gaze. If we assume that the fast-track modulation utilizes a retinal, instead of a cyclopean, representation, it follows that it receives a direct gaze as input from one eye in the semidirect gaze. Thus, the direct gaze could strengthen the effect of the happy facial expression in the subcortical pathway and consequently the experiencing of mutual gaze. Furthermore, in the transition from retinal to cyclopean representation, the subcortical modulation could result in a heavier weighting of the signal from the eye receiving the direct gaze associated with the happy facial expression. 
Stereoscopically presented facial expressions have been shown to elicit a stronger emotional response at natural IADs compared with the mean of the monoscopic left- and right-eye image responses (Hakala et al., 2015). The results of the present study can partially explain the stronger emotional response. Stereoscopy widened the cone of mutual gaze in the stereoscopic condition for the happy facial expression in the present study, and mutual gaze has been shown to strengthen the emotions elicited by facial expressions (Adams & Kleck, 2005). Thus, the stronger mutual gaze could have caused the strengthening of the emotions elicited by happy facial expressions in the stereoscopic condition. 
In the gaze-direction judgment task, we utilized a slider setup similar to the one used in earlier studies (Anstis et al., 1969; Masame, 1990). Gaze-direction judgment with a slider positioned between the stimulus and the observer could be prone to systematic error. For example, if observers judge the gaze direction at the frontal plane of their eye level instead of at the slider level and position the slider knob at the perpendicular location on the slider rail, the direction is overestimated. To avoid this, we specifically instructed the observers to judge the position where the axis of the looker's gaze intersects the slider rail. However, we cannot rule out the possibility that part of the overestimation is due to a misunderstanding of the instructions. Some observers perceived mutual gaze in all or nearly all trials, so we excluded their data from the analysis. Their cone of gaze was apparently beyond the maximum value in our stimuli, and thus computing a threshold value from their data would have been impossible. These observers might have suffered from social anxiety disorder, a condition that has been shown to substantially widen the cone of gaze (Gamer, Hecht, Seipp, & Hiller, 2011; Hecht, Weiland, & Boyarskaya, 2011; Jun, Mareschal, Clifford, & Dadds, 2013). Moreover, the stimulus iris–sclera configuration differed between the facial expressions, which could have contributed to the main effect of facial expression. However, a study utilizing upright and inverted face stimuli has demonstrated that the effect of facial expression on mutual-gaze perception is likely independent of iris–sclera configuration (Ewbank et al., 2009). In the present study, our focus was on the effects of stimulus conditions. Each participant evaluated each stimulus combination once, but the stimuli presented to each observer included two lookers. 
To account for individual differences of observer and lookers, we included random effects for the observers and lookers in our models, which increased the power of the statistical analyses. Future studies are required to examine the possible individual differences in the perception of stereoscopic semidirect gaze and discover the factors that explain the potential differences. 
To conclude, we have shown that gaze-direction judgment and mutual-gaze discrimination differ between monoscopic and stereoscopic direct-gaze conditions. Whereas earlier studies have focused on the width of the cone of gaze for averted-gaze stimuli, we discovered a substantial difference between monoscopic and stereoscopic settings in the direct-gaze condition. Monoscopic direct-gaze stimuli elicit a perception of mutual gaze with substantially higher probability than stereoscopic stimuli do, because in the case of monoscopic stimuli, both eyes of the observer receive a direct gaze. Together with the well-known Mona Lisa effect, the unnaturally strong mutual gaze perceived from monoscopic images may explain the appeal of monoscopic portraits that exhibit a direct gaze. Moreover, we showed that the interaction of semidirect gaze and a happy facial expression also modulates the perception of mutual gaze. In addition to the differences that we found in the present study, several other factors may differ between pictorial and live settings and influence the probability of perceiving mutual gaze. Future studies with live stimuli are needed to uncover the differences in mutual-gaze perception between live and stereoscopic semidirect-gaze settings, because other differences between the settings may increase or decrease the probability of perceiving mutual gaze. Our findings call into question the generalizability to real-life settings of studies that utilize monoscopic stimuli when the probability of perceiving mutual gaze is of importance, and they highlight the potential of stereoscopic stimuli to mediate gaze with greater fidelity and thus increase ecological validity. 
Acknowledgments
The authors thank Kenta Kusumoto for his technical assistance and contribution in conducting the experiments. This study was supported by a research grant from the Emil Aaltonen Foundation to JK and a grant from the Academy of Finland (project number 265482) to J. Häkkinen. 
Commercial relationships: none. 
Corresponding author: Jussi Hakala. 
Email: jussi.h.hakala@aalto.fi. 
Address: Department of Computer Science, Aalto University, Espoo, Finland. 
References
Adams R. B., Kleck R. E. (2003). Perceived gaze direction and the processing of facial displays of emotion. Psychological Science, 14 (6), 644–647, doi:10.1046/j.0956-7976.2003.psci.
Adams R. B., Kleck R. E. (2005). Effects of direct and averted gaze on the perception of facially communicated emotion. Emotion, 5 (1), 3–11, doi:10.1037/1528-3542.5.1.3.
Alpers G. W., Gerdes A. B. M. (2007). Here is looking at you: Emotional faces predominate in binocular rivalry. Emotion, 7 (3), 495–506, doi:10.1037/1528-3542.7.3.495.
Anstis S. M., Mayhew J. W., Morley T. (1969). The perception of where a face or television “portrait” is looking. The American Journal of Psychology, 82 (4), 474–489, doi:10.2307/1420441.
Argyle M., Cook M. (1976). Gaze and mutual gaze. London: Cambridge University Press.
Baron-Cohen S. (1995). Mindblindness: An essay on autism and theory of mind. Boston: MIT Press.
Baxter J. C. (1970). Interpersonal spacing in natural settings. Sociometry, 33 (4), 444–456.
Bindemann M., Scheepers C., Burton A. M. (2009). Viewpoint and center of gravity affect eye movements to human faces. Journal of Vision, 9 (2): 7, 1–16, doi:10.1167/9.2.7.
Brewster D. (1883). Letters on natural magic. London: Chatto & Windus.
Cline M. G. (1967). The perception of where a person is looking. The American Journal of Psychology, 80 (1), 41–50.
Dodgson N. A. (2004). Variation and extrema of human interpupillary distance. Proceedings of SPIE, 5291, 36–46, doi:10.1117/12.529999.
Ekman P., Friesen W. V., Hager J. C. (2002). Facial Action Coding System (2nd ed.). Salt Lake City, UT: Research Nexus eBook.
Ewbank M. P., Jennings C., Calder A. J. (2009). Why are you angry with me? Facial expressions of threat influence perception of gaze direction. Journal of Vision, 9 (12): 16, 1–7, doi:10.1167/9.12.16. [PubMed] [Article]
Frischen A., Bayliss A. P., Tipper S. P. (2007). Gaze cueing of attention: Visual attention, social cognition, and individual differences. Psychological Bulletin, 133 (4), 694–724, doi:10.1037/0033-2909.133.4.694.
Gamer M., Hecht H. (2007). Are you looking at me? Measuring the cone of gaze. Journal of Experimental Psychology: Human Perception and Performance, 33 (3), 705–715, doi:10.1037/0096-1523.33.3.705.
Gamer M., Hecht H., Seipp N., Hiller W. (2011). Who is looking at me? The cone of gaze widens in social phobia. Cognition and Emotion, 25 (4), 756–764, doi:10.1080/02699931.2010.503117.
Gibson J. J., Pick A. D. (1963). Perception of another person's looking behavior. The American Journal of Psychology, 76 (3), 386–394, doi:10.2307/1419779.
Gronenschild E. H. B. M., Smeets F., Vuurman E. F. P. M., van Boxtel M. P. J., Jolles J. (2009). The use of faces as stimuli in neuroimaging and psychological experiments: A procedure to standardize stimulus features. Behavior Research Methods, 41 (4), 1053–1060, doi:10.3758/BRM.41.4.1053.
Hakala J., Kätsyri J., Häkkinen J. (2015). Stereoscopy amplifies emotions elicited by facial expressions. i-Perception, 6 (6), 1–17, doi:10.1177/2041669515615071.
Hecht H., Weiland R., Boyarskaya E. (2011). The cone of gaze. In 4th international conference on human system interactions (pp. 378–385). Piscataway, NJ: IEEE.
Henderson J. M., Williams C. C., Falk R. J. (2005). Eye movements are functional during face learning. Memory & Cognition, 33 (1), 98–106.
Imai T., Sekiguchi D., Inami M., Kawakami N., Tachi S. (2006). Measuring gaze direction perception capability of humans to design human centered communication systems. Presence, 15 (2), 123–138, doi:10.1162/pres.2006.15.2.123.
Itier R. J., Batty M. (2009). Neural bases of eye and gaze processing: The core of social cognition. Neuroscience and Biobehavioral Reviews, 33 (6), 843–863, doi:10.1016/j.neubiorev.2009.02.004.
Johnson M. H. (2005). Subcortical face processing. Nature Reviews Neuroscience, 6 (10), 766–774, doi:10.1038/nrn1766.
Judd C. M., Westfall J., Kenny D. A. (2012). Treating stimuli as a random factor in social psychology: A new and comprehensive solution to a pervasive but largely ignored problem. Journal of Personality and Social Psychology, 103 (1), 54–69, doi:10.1037/a0028347.
Jun Y. Y., Mareschal I., Clifford C. W. G., Dadds M. R. (2013). Cone of direct gaze as a marker of social anxiety in males. Psychiatry Research, 210 (1), 193–198, doi:10.1016/j.psychres.2013.05.020.
Kluttz N. L., Mayes B. R., West R. W., Kerby D. S. (2009). The effect of head turn on the perception of gaze. Vision Research, 49 (15), 1979–1993, doi:10.1016/j.visres.2009.05.013.
Kobayashi H., Kohshima S. (1997). Unique morphology of the human eye. Nature, 387, 767–768.
Kobayashi H., Kohshima S. (2001). Unique morphology of the human eye and its adaptive meaning: Comparative studies on external morphology of the primate eye. Journal of Human Evolution, 40 (5), 419–435, doi:10.1006/jhev.2001.0468.
Laidlaw K. E. W., Risko E. F., Kingstone A. (2012). A new look at social attention: Orienting to the eyes is not (entirely) under volitional control. Journal of Experimental Psychology: Human Perception and Performance, 38 (5), 1132–1143, doi:10.1037/a0027075.
Langton S. R. H., Watt R. J., Bruce V. (2000). Do the eyes have it? Cues to the direction of social attention. Trends in Cognitive Sciences, 4 (2), 50–59, doi:10.1016/S1364-6613(99)01436-9.
Lobmaier J. S., Perrett D. I. (2011). The world smiles at me: Self-referential positivity bias when interpreting direction of attention. Cognition and Emotion, 25 (2), 334–341, doi:10.1080/02699931003794557.
Lobmaier J. S., Tiddeman B. P., Perrett D. I. (2008). Emotional expression modulates perceived gaze direction. Emotion, 8 (4), 573–577, doi:10.1037/1528-3542.8.4.573.
Mareschal I., Calder A. J., Clifford C. W. G. (2013). Humans have an expectation that gaze is directed toward them. Current Biology, 23 (8), 717–721, doi:10.1016/j.cub.2013.03.030.
Mareschal I., Calder A. J., Dadds M. R., Clifford C. W. G. (2013). Gaze categorization under uncertainty: Psychophysics and modeling. Journal of Vision, 13 (5): 18, 1–10, doi:10.1167/13.5.18. [PubMed] [Article]
Martin W. W., Jones R. F. (1982). The accuracy of eye-gaze judgement: A signal detection approach. British Journal of Social Psychology, 21 (4), 293–299, doi:10.1111/j.2044-8309.1982.tb00551.x.
Figure 1. Samples of the stereoscopic stimuli. Top row: Actor F1 exhibiting a happy facial expression at IADs of 15, 65, and 115 mm. Bottom row: Actor M1 exhibiting neutral, angry, and happy facial expressions at a 65-mm IAD. Stereoscopic pairs are laid out for parallel free viewing.
Figure 2. Examples of the effect of varying IAD on the oculocentric gaze directions in the monoscopic (top row) and stereoscopic semidirect (bottom row) conditions for an observer with a 65-mm IPD. The oculocentric gaze directions α received by the left eye (LE) and right eye (RE) of the observer are presented side by side. The overall stimulus gaze direction αS is the mean of the directions received by the eyes of the observer: αS = (αLE + αRE)/2. In the monoscopic right condition, both eyes of the observer received the same direct gaze (αS = 0). In the monoscopic left condition, both eyes received the same averted gaze (αS = αLE = αRE). In the stereoscopic condition, the left eye received an averted gaze and the right eye received a direct gaze, so that αS = (αLE + αRE)/2. Head turn was congruent with gaze direction in all stimuli.
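The geometry behind the mean stimulus gaze direction can be sketched as follows. This is a minimal illustration, not the authors' analysis code: the looker is assumed to fixate one camera, the looker-to-camera distance of 1 m is hypothetical, and only the three IADs shown in Figure 1 are used.

```python
import math

def averted_angle_deg(iad_mm, distance_mm):
    # Angle between the two camera optical centers as seen from the
    # looker: atan(IAD / viewing distance), in degrees.
    return math.degrees(math.atan2(iad_mm, distance_mm))

def mean_stimulus_gaze(alpha_le_deg, alpha_re_deg):
    # Overall stimulus gaze direction alpha_S: the mean of the
    # oculocentric directions received by the observer's two eyes.
    return (alpha_le_deg + alpha_re_deg) / 2.0

# Stereoscopic semidirect gaze: the right eye receives a direct gaze
# (0 deg) while the left eye receives the averted view.
distance = 1000.0  # hypothetical looker-to-camera distance, mm
for iad in (15, 65, 115):  # IADs shown in Figure 1
    alpha_le = averted_angle_deg(iad, distance)
    print(f"IAD {iad} mm: alpha_S = {mean_stimulus_gaze(alpha_le, 0.0):.2f} deg")
```

At this assumed distance, larger IADs linearly widen the averted angle received by one eye, and αS grows at half that rate because the other eye still receives a direct gaze.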
Figure 3. The experimental setup. The observer judged the stimulus gaze direction with a slider, and the judged gaze direction αJ was automatically recorded.
Figure 4. Judged gaze direction for different IADs and facial expressions in the monoscopic left, monoscopic right, and stereoscopic semidirect conditions, with 95% confidence intervals. Gray lines illustrate the mean of the judged gaze directions for the monoscopic left and right conditions.
Figure 5. Judged gaze direction for different mean stimulus gaze directions and facial expressions in the monoscopic and stereoscopic semidirect gaze conditions, with 95% confidence intervals. Lines illustrate the model fit.
Figure 6. Gaze-direction judgment bias (judged gaze direction minus mean stimulus gaze direction) for different facial expressions in the monoscopic and stereoscopic semidirect gaze conditions, with 95% confidence intervals. Stimulus gaze direction αS is the mean of the gaze directions received by the left and right eyes of the observer. Lines illustrate the model fit.
Figure 7. Proportion of mutual-gaze responses for different IADs and facial expressions in the stereoscopic semidirect, monoscopic left, and monoscopic right conditions, with 95% confidence intervals.
Figure 8. Proportion of mutual-gaze responses (points) and probability of mutual gaze from the model fit (lines) for different stimulus gaze directions and facial expressions in the stereoscopic semidirect and monoscopic conditions, with 95% confidence intervals. Stimulus gaze direction αS is the mean of the gaze directions received by the left and right eyes of the observer.
Table 1. Summary of the stimulus conditions. Notes: In the monoscopic conditions, both eyes of the observer received either the same left-camera image (L) or the same right-camera image (R), whereas in the stereoscopic condition the eyes received different images (L+R). In the stereoscopic condition, the gaze and head directions are the means of those in the L and R images (see Figure 2).
Table 2. Mutual-gaze 50% response thresholds and differences (standard errors) between the stereoscopic semidirect and monoscopic conditions, obtained with the delta method. Notes: The threshold is one-directional (i.e., the radius of the cone of gaze). ***p < 0.001.
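The delta-method computation named in the note can be sketched for a logistic mutual-gaze model, logit(p) = b0 + b1·αS: the 50% threshold is -b0/b1, and its standard error follows from the gradient of that ratio and the coefficient covariance. The coefficients and covariance below are hypothetical illustrations, not values from the study.

```python
import numpy as np

def threshold_and_se(b0, b1, cov):
    # 50% response threshold of logit(p) = b0 + b1 * alpha_S is
    # theta = -b0 / b1; its delta-method variance is
    # grad(theta)' @ cov @ grad(theta).
    theta = -b0 / b1
    grad = np.array([-1.0 / b1, b0 / b1**2])  # d(theta)/db0, d(theta)/db1
    se = float(np.sqrt(grad @ cov @ grad))
    return theta, se

# Hypothetical fit: coefficients and their covariance matrix
cov = np.array([[0.04, 0.00],
                [0.00, 0.01]])
theta, se = threshold_and_se(3.0, -0.5, cov)
print(f"cone-of-gaze radius = {theta:.2f} deg (SE {se:.2f})")
```

The same gradient-based propagation yields standard errors for threshold differences between conditions, as reported in the table.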