Open Access
Article  |   November 2018
Influence of head orientation on perceived gaze direction and eye-region information
Yumiko Otsuka, Colin W. G. Clifford
Journal of Vision November 2018, Vol.18, 15. doi:https://doi.org/10.1167/18.12.15
Abstract

Using synthetic 3D head and eye models, we examined the relationship between perceived gaze direction and the information within the image eye region across changes in head orientation. For each stimulus head and eye orientation, we rendered grayscale images with realistic pigmentation and shading, and two-tone images depicting the regions corresponding to the iris, pupil, or eye-opening. Behavioral experiments using the grayscale images as stimuli showed that perceived gaze direction was more strongly biased opposite to head orientation (repulsive effect) in the far-eye visible condition than in the near-eye visible condition. This trend occurred regardless of whether or not the whole face was visible, suggesting that the repulsive effect arose based on eye-region information. Consistent with this, geometrical analysis of the image eye region using the two-tone images revealed that the relative position of the iris and pupil of the far eye shifted opposite to head orientation more than that of the near eye. In addition, our findings regarding the pattern of the influence of head orientation suggest that estimation of the relative iris/pupil position may be achieved through a process of amodal completion of the whole iris behind the eyelid. Additional geometrical analysis of simulated images revealed situations where a greater repulsive effect for the far eye, as found here, is likely to be observed.

Introduction
The perception of others' gaze direction is based not only on information from the eye region but also on head orientation information. Wollaston (1824) was the first to point out the importance of head orientation in eye gaze perception. He demonstrated that perceived gaze direction from an identical image eye region could be altered by changing the head orientation context. In his demonstration, perceived gaze direction was shifted toward the head orientation, although the eyes themselves were unchanged. In this paper, we call this influence of head orientation that biases the perceived gaze direction towards it an “attractive effect.” Wollaston's stimuli employed two-dimensional (2D) cut-and-paste image manipulations. However, several later studies involving real faces or facial images that actually changed head orientation for a given eye orientation reported that perceived gaze direction was shifted opposite to head orientation (e.g., Anstis, Mayhew, & Morley, 1969; Gamer & Hecht, 2007; Gibson & Pick, 1963; Noll, 1976; Otsuka, Mareschal, Calder, & Clifford, 2014). In this paper, we call this latter effect of head orientation that biases the perceived gaze direction opposite to the head orientation a “repulsive effect.” Note that we use the term “eye orientation” to refer to the orientation of the eyes relative to the observer, not to the model's head.
We recently proposed a dual-route model (see Figure 1a) that provides a quantitative account of the influence of head orientation on gaze perception (Otsuka, Mareschal, Calder, & Clifford, 2014; Otsuka, Mareschal, & Clifford, 2015, 2016). The dual-route model posits that head orientation changes in normal three-dimensional (3D) situations affect perceived gaze direction through two distinct routes, resulting in the simultaneous occurrence of two biases of perceived gaze direction in opposite directions, as discussed above. First, turning the head with eyes fixated at the same point (fixed eye orientation) induces a shift of the relative position of the iris within the eye region, due to changes in the amount of visible sclera, in the direction opposite to the head turn (Anstis et al., 1969). This, in turn, induces a repulsive shift in the perceived gaze direction. In the schematic of the dual-route model, this route is illustrated as the arrow from the head turn to the eye-region information (Figure 1a), suggesting that head orientation acts as an indirect cue for the perceived gaze direction via changes in the eye region of the proximal stimulus (Figure 1b). Second, information about head orientation acts as a direct cue for gaze direction that attracts the perceived gaze direction toward the head orientation (attractive effect). This latter route of the effect of head orientation is illustrated by the direct arrow from the head turn to the perceived gaze direction in the schematic (Figure 1a). 
Figure 1
 
The influence of head orientation on perceived gaze direction. (a) Schematic of the dual-route model (Otsuka et al., 2014). The arrows represent the two distinct routes of head orientation influence. First, head orientation has an indirect effect on perceived gaze direction via changes in the eye region. As the relative position of the iris/pupil for a given eye orientation shifts opposite to head turn, this indirect influence acts to bias perceived gaze direction opposite to head orientation (indirect route). Second, head orientation also has a direct effect on perceived gaze direction, biasing the perceived gaze direction toward the head orientation (direct route). (b) Illustration of the influence of head orientation on the eye-region information (indirect route). Three faces with different head orientations but fixed eye orientation relative to the observer are shown on the left side. The eyes to the right of each face are the enlarged view of the eye region of the left eye of each face (dotted square area). The eyeball above these images shows the actual orientation of the eye behind the eyelid. The eyes on the top and bottom eye region images each appear to be gazing in a direction slightly opposite to the head turn (repulsive effect), even though these eyes have an identical orientation.
One might assume that the occurrence of the direct effect is confined to the case of the Wollaston stimuli, in which the same image of the eye region is artificially inserted into different head orientation contexts. On the contrary, our previous work has shown that the direct effect contributes to perception in most situations, except when the estimation of head orientation is made difficult by occlusion or removal of a large part of the face and head (i.e., eye-region images where only the eyes are visible; Otsuka et al., 2014, 2015). In a normal 3D situation, these two opposing biases occur simultaneously, with a tendency for the repulsive effect occurring through the eye region to be compensated by the attractive effect based on the estimated head orientation. As the two opposing effects tend to cancel each other out, it is difficult to notice the direct effect in normal situations. The operation of the direct effect has been revealed as a reduction in the measured repulsive effect in eye-region images compared to whole face images (Otsuka et al., 2014, 2015). In the Wollaston stimuli, on the other hand, identical eye regions are placed in different head orientation contexts. Such a manipulation eliminates any effect of the indirect cue; only the direct cue is operational, resulting in an overall attractive effect that is noticeable with the naked eye.
While the dual-route model accounts well for the two opposing effects of head orientation on perceived gaze direction identified in previous studies (Otsuka et al., 2014, 2015, 2016), the model itself thus far is agnostic as to how the visual system extracts information from the eye region or head orientation. In order to elucidate how the visual system extracts information from the eyes, a detailed understanding of the information within the eye region of the stimulus and its relationship to perceived gaze direction is essential. 
There are two problems that make understanding eye-region information complicated. First, the two eyes of an individual are not always oriented the same way (i.e., in parallel). For example, when a person fixates on an object in the near distance, the eyes tend to orient slightly toward each other (convergence). Second, as has been pointed out in some of the previous studies on eye gaze perception (e.g., Kluttz, Mayes, West, & Kerby, 2009; West, 2015; West, Salmon, & Sawyer, 2008), human eyes have a discrepancy between the visual and pupillary axes (known as angle kappa). The fovea is typically displaced temporally relative to the optical axis of the eye (by about 5° on average; Hashemi et al., 2010; Park, Oh, & Chuck, 2012). Therefore, even when real persons are binocularly fixating on a point (perfect convergence of the visual axes of the two eyes), their eyes may spuriously appear slightly diverged from that point. This is due to the slight divergence of the pupillary axis from the visual axis, corresponding to the angle kappa.
One way to overcome the first problem in attempting to gain an understanding of how the visual system extracts information from the eye region would be to examine the perceived gaze direction from only one eye (one-eye condition) as well as from both eyes (both-eyes condition). In fact, a few studies have compared perceived gaze direction in the one-eye and both-eyes conditions (Noll, 1976; Symons, Lee, Cedrone, & Nishimura, 2004; West et al., 2008; West, 2010, 2013, 2015). However, as these studies used real human models or photographs as stimuli, it is difficult to know exactly how each eye was oriented during the experiment due to the potential discrepancy between the visual and pupillary axes in human eyes (Park et al., 2012).
Unlike studies using real human faces or photographs, studies using synthetic faces do not suffer from the potential problem of a discrepancy between the visual and pupillary axes found in real human eyes. Many recent studies investigating gaze perception have used synthetic facial images rendered using 3D computer graphics, which enable precise control of the eye orientation in 3D space depicted in 2D images (Balsdon & Clifford, 2017a, 2017b; Florey, Clifford, Dakin, & Mareschal, 2015; Gamer & Hecht, 2007; Nguyen et al., 2018; Palmer & Clifford, 2017).
It should be noted that most previous studies have not reported how the eye region of their stimulus images varied according to changes in head orientation, fixation target angle, and distance. This makes it difficult to understand the precise relationship between the sensory input from the eye region and perceived gaze direction, even in studies using synthetic faces with precise control over the stimulus eye orientation.
The only studies that reported the properties of the stimulus eye region were those that employed schematic eye and face images (Todorović, 2009) or applied a pictorial manipulation to the eyes of synthetic faces (Todorović, 2006). However, the stimuli used in such studies do not involve vergence, or the resultant difference in orientation between the eyes that normally occurs along with a change of gaze direction. Examination of the stimulus eye region in more realistic facial images would help foster an understanding of the way the visual system extracts information from the eyes in natural situations.
The aim of the current study was two-fold. First, through psychophysical experiments, we examined the perception of gaze direction across head orientation changes when only one eye of the model was visible (one-eye condition) and when both eyes were visible (both-eyes condition). Second, through image analysis, we examined how information within each of the eyes in the stimulus images changed along with the changes in eye and head orientation. Comparisons of these data allow us to provide some insights into how the visual system estimates eye orientation from the eye-region in the image. 
It is easy to manipulate the coloring and placement of the facial features of 3D models in computer graphics. This makes it possible to render 2D images that contain only the regions corresponding to the iris, pupil, or eye-opening in a given state of eye/head orientation. Thus, we can avoid the problem of mislocalization of the iris/pupil and eye region. By taking advantage of 3D models of faces and eyes in this way, we analyzed how geometrical information within the eye region in 2D images varied with changes in eye and head orientation. In particular, we analyzed: (a) the relative position of the iris/pupil within the eye-opening; and (b) the circularity of the iris/pupil area in the image. 
As the pupil is a hole through which light enters the eye, localization of the pupil position would provide the most useful estimate for judging another's gaze direction. In fact, Anstis (2018) reported that pupil position rather than iris position determined the perceived motion direction of schematic eyes when the pupil had higher contrast than the iris. However, accurate localization of the pupil in natural facial images can be challenging due to low contrast between iris and pupil, especially when the iris has dark pigmentation. On the other hand, localization of the human iris may be aided by the white sclera that contrasts against the darker iris area (Kobayashi & Kohshima, 2001). Localization of the iris center could help an observer approximate the location of the pupil center, which is located at around the center of the iris (Wyatt, 1995). In normal situations, however, the whole area of the iris is rarely visible, as it is almost always partially occluded by the eyelid except when the eyes are open very wide. Even when the eyes are quite open, the amount of iris occlusion can increase when a person is gazing sideways. Thus, estimation of the center of the iris based solely on the visible part of the eyes could produce errors in estimation. 
Figure 2 is a schematic illustration of the difference in the centroid of the visible part of the iris area (square) and that of an image of the whole iris area (cross). The difference between the two estimates of the geometrical center of the iris area becomes especially clear when the iris is largely occluded. In such conditions, the deviation of the iris from the center of the eye opening is underestimated when only the visible part of the iris is taken into account. Based on these considerations, we analyzed the relative position of the center of the whole iris, the visible part of the iris, and the pupil within the eye opening in our stimulus images. 
Figure 2
 
Schematic illustration of the position of the center of the iris based on only the visible part (square), and on the whole iris (cross). The dotted line represents the contours of the iris occluded by the eyelids. (a) Iris near the center within the eye opening; (b) Iris at the corner of the eye opening.
As a few researchers have noted that the circularity of the iris and pupil can be a potential cue to gaze direction (Balsdon & Clifford, 2017, ECVP; Wollaston, 1824), we also examined how the circularity of the iris/pupil area varies with changes in eye and head orientation. The usefulness of such a cue to direct gaze may be aided by our sensitivity to detect deviation from a perfect circle (Regan & Hamstra, 1992; Zanker & Quenzer, 1999). To examine the possible utility of the iris/pupil circularity cue in eye gaze perception, we quantified the circularity of the iris by measuring the aspect ratio of the iris area in our stimulus images. 
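To a first approximation (our own illustration, assuming orthographic projection of a circular iris and ignoring corneal refraction), an iris rotated by an angle θ away from the line of sight projects to an ellipse with aspect ratio cos θ, so its noncircularity is
\begin{equation*}1 - \cos \theta \approx {{{\theta ^2}} \over 2}\quad (\theta\ {\rm in\ radians}),\end{equation*}
which is about 1.5% at θ = 10° and 3.4% at θ = 15°, values that bracket the 2%–3% detection threshold cited above.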
Psychophysical Experiment 1A (whole face images): Pointer task
Methods
Participants
Twenty naïve observers (10 male and 10 female, mean age = 18.9 years) served as subjects. The sample size was determined based on our previous studies (Otsuka et al., 2014, 2015, 2016). All subjects had normal or corrected-to-normal vision. All experiments adhered to the Declaration of Helsinki guidelines and were approved by the UNSW Human Research Ethics Committee.
Apparatus
Stimuli were shown on a Viewsonic Graphics Series G90f CRT monitor (1,024 × 768 pixels), controlled by a computer using MATLAB (MathWorks, Natick, MA) with the PsychToolbox (Brainard, 1997). At the viewing distance of 57 cm, one pixel subtended 2 arcmin.
Stimuli
Examples of the stimuli are shown in Figure 3a. 2D images of four grayscale synthetic neutral faces (two male and two female) with both eyes open were created in the same way as the stimuli in the normal condition in our previous study (Otsuka et al., 2016). The 3D head models created in FaceGen were imported into Blender 2.70. The original eyes in the faces were replaced with 3D eye models created in Blender. The 3D eye models simulated the anatomical structure of the human eye, including a protruding transparent cornea with the structure corresponding to the iris and pupil located behind the cornea. Unlike the cornea of real human eyes, which acts as a magnifying lens, the material of the cornea in our stimuli was simply transparent. All face images were rendered with a single light source illuminating the face from above the camera using the Blender Render engine in Blender 2.70. Blender Render is a rasterization engine that is not physically based: it geometrically projects objects onto an image plane without simulating advanced optical effects. Although our stimulus images may not have depicted photorealistic shading patterns of faces, we believe that they were sufficiently geometrically accurate and realistic representations of eyes and faces for the current purpose.
Figure 3
 
Stimuli used in Experiments 1A and 1B. (a) Examples of each image type with 0° eye orientation; (b) examples of the other three faces with 0° eye orientation; (c) on-screen pointer.
The orientation of each eye was controlled by changing the angular position of the fixation target, located 40 cm from the face. The two eyes of each face were perfectly converged on the gazing target. Images for the one-eye condition were created by replacing the 3D head model used to create the both-eyes images with appropriate eye-closed versions of the 3D head model created using FaceGen. All images were shown against a medium gray background. When shown on screen, faces were approximately life-sized: In frontal orientation they measured 12.0 cm in width (excluding the ears) and 20.7 cm in height (from the chin to the top of the head) on average. The interpupillary distance was 6.04 cm on average. The faces subtended about 20° × 12° of visual angle, and were viewed from a distance of 57 cm in a dimly lit room. An on-screen pointer (Figure 3c) consisted of a sphere controlled using MATLAB. The pointer subtended 6.5° of visual angle (195 pixels) in diameter, and its angle could vary ±90° in the horizontal plane. 
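As a concrete illustration of the stimulus geometry (a back-of-envelope calculation of ours, not a figure from the original methods), binocular fixation on a target 40 cm straight ahead with a mean interpupillary distance of 6.04 cm requires each eye to rotate inward by
\begin{equation*}\arctan \left( {{{6.04/2} \over {40}}} \right) \approx 4.3^\circ ,\end{equation*}
so even for a 0° target angle the two converged eyes are not quite parallel in the rendered images.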
Procedure
The observers' task was to indicate the perceived gaze direction of faces in the images by adjusting the orientation of the on-screen pointer via a computer mouse. The stimulus sequence for each trial was the same as in our previous study (Otsuka et al., 2016). On each trial, one facial image was shown at around the center of the screen for 500 ms in a raised cosine temporal window. The positional jitter was within ±0.83° of visual angle. The pointer appeared at the center of the screen as soon as the face image disappeared. The pointer remained visible until the observer indicated completion of the adjustment by clicking the mouse button. For each trial, the initial horizontal angle of the pointer was randomly changed in the range of ±90°. The vertical angle of the pointer was fixed at 0°. The intertrial interval was 600 ms, during which the screen was a blank gray. 
Stimuli for each of three image types were shown in separate blocks: both eyes open (both-eyes) images, the model's right eye closed with the anatomical left eye open (LE) images, and the model's left eye closed with the anatomical right eye open (RE) images. All participants completed the both-eyes image block first. This step was to avoid the possibility that repeated judgment of gaze direction from only one eye might interfere with the subsequent processing of gaze direction from both eyes. The order of the LE and RE image blocks was counterbalanced between participants. Each participant completed 324 trials consisting of three blocks of 108 trials. In each block, stimuli were presented in a pseudorandom order with 4 facial identities (2 male and 2 female) × 3 head orientations {−20°, 0°, and 20°} × 9 eye orientations {−20°, −15°, −10°, −5°, 0°, 5°, 10°, 15°, and 20°}. The terms “eye orientation” and “head orientation” refer to the orientation of the eyes and head relative to the observer.
Analysis
A previous study reported that the repulsive effect of head orientation differed depending on whether only the near eye or only the far eye of the model was visible (Noll, 1976). Based on this finding, we sorted individual data from both-eyes, LE, and RE images into both-eyes, near-eye, and far-eye conditions, respectively. Specifically, we assigned data from the LE images with a head orientation of −20° and data from the RE images with head orientation of 20° as the data for the near-eye condition. Likewise, we assigned data from the RE images with head orientation of −20° and data from the LE images with head orientation of 20° as the data for the far-eye condition. For these two conditions, we assigned the average data from the LE and RE images as data with a 0° head orientation. We used the data from both-eyes-open images for the both-eyes condition. 
We analyzed data from each condition in the same manner as the data for the “normal condition” in our previous study (Otsuka et al., 2016). Specifically, we performed multiple regression analyses with explanatory variables of eye orientation and head orientation on the mean adjusted pointer angle across subjects. This procedure was done separately for each of the three conditions. The purpose of the current analysis was to derive the values necessary to compute the weightings of head orientation in the dual-route model for each condition. As the weightings of head orientation in the dual-route model provide a simple measure for comparing how information from the head and eye(s) is used in each condition, we considered that the current analysis involving separate multiple regressions best served our purpose. 
We used the weights of the linear regression to estimate the relative weighting of eye orientation, E, and head orientation, H, in determining perceived direction of gaze in the dual-route model (Otsuka et al., 2014). The perceived direction of gaze, G, was modelled as  
\begin{equation}\tag{1}G = \left( {1 - \beta } \right)E + \beta H.\end{equation}
The relative weighting of the head orientation information, β, was derived separately for each of the three conditions (Figure 4b). Note that the relative weighting of the head orientation information, β, calculated here reflects the combined effect of head orientation as an indirect cue and as a direct cue.  
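For concreteness, a minimal MATLAB sketch of this computation, assuming the head weighting is obtained by normalizing the raw regression coefficients so that the eye and head weights sum to one (the variable names and the normalization step are our own illustration, not code from the original study):

```matlab
% eyeOri, headOri, meanPointer: column vectors over the head x eye cells
% (e.g., 3 head x 9 eye orientations), averaged across subjects.
X = [ones(size(eyeOri)) eyeOri headOri];  % design matrix with intercept
b = X \ meanPointer;                      % least squares: [intercept; e; h]
e = b(2);  h = b(3);                      % raw eye and head coefficients
beta = h / (e + h);                       % head weighting, scaled so that
                                          % eye and head weights sum to one
```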
Figure 4
 
Results from Experiment 1A. Data averaged across participants (n = 20). (a) Averaged adjusted pointer angle as a function of eye orientation for each head orientation in each image condition together with the linear fits; (b) The relative weighting of the head orientation information for each condition in determining perceived direction of gaze as measured by the pointer task. Error bars represent bootstrapped 95% CIs.
We calculated confidence intervals (CIs) of the relative weighting of the head orientation information by bootstrapping (resampling participants with replacement). Bootstrapping was performed with 1000 iterations. In order to determine whether the relative weightings of the head orientation differed among the three conditions at the significance level of p < 0.05 with the Bonferroni correction, we examined whether the 98% CIs of the difference in bootstrapped weightings between the conditions crossed zero. Throughout the paper, we report a significant difference when the 98% CI of the difference between any two groups did not cross zero.
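A minimal MATLAB sketch of this bootstrap comparison (fitBeta is a hypothetical helper standing in for the regression-and-normalization step above, and the data matrices are illustrative, not the study's actual variables):

```matlab
nBoot = 1000;  nSubj = 20;
bootDiff = zeros(nBoot, 1);
for i = 1:nBoot
    idx = randi(nSubj, nSubj, 1);     % resample participants with replacement
    bootDiff(i) = fitBeta(dataCondA(idx, :)) - fitBeta(dataCondB(idx, :));
end
ci98 = prctile(bootDiff, [1 99]);     % 98% CI (Bonferroni-corrected alpha)
isSignificant = ci98(1) > 0 || ci98(2) < 0;   % significant if CI excludes zero
```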
Results and discussion
Figure 4a shows the average adjusted pointer angle in each condition together with the linear fits. The equations obtained from multiple regression analysis on the average data in each of the three conditions were: 
  • Pointer angle (both-eyes condition) = −1.19° + 1.61 × eye – 0.69 × head
  • Pointer angle (near-eye condition) = −0.69° + 1.59 × eye – 0.49 × head
  • Pointer angle (far-eye condition) = −1.05° + 1.51 × eye – 0.76 × head
The percentage of variance explained by the model was greater than 99% across all conditions. 
Figure 4b shows the relative weightings of the head orientation for each condition in determining perceived direction of gaze. The weights are negative across all conditions, suggesting a general repulsive effect across conditions. The weighting in the near-eye condition was significantly less negative compared to the other two conditions (difference from both-eyes condition = 0.31, 98% CI [0.14, 0.64]; difference from far-eye condition = 0.55, 98% CI [0.2, 1.67]). On the other hand, weightings between the both-eyes condition and far-eye condition were not significantly different (difference = −0.24, 98% CI [−0.06, 1.03]). These results show that the repulsive effect of head orientation on perceived gaze direction was smaller in the near-eye condition than in the other two conditions. 
Psychophysical Experiment 1B (whole face images): Categorization task
Some recent studies have reported that judged gaze direction varies depending on the task (Balsdon & Clifford, 2017a). In order to examine whether the pattern of results obtained in the pointer task occurs consistently across tasks, we employed a categorization task to measure the perceived gaze direction from the same stimuli used in Experiment 1A.
Methods
Participants
Twenty-one naïve observers (nine male and 12 female, mean age = 19.1 years) served as subjects. The sample size was determined based on our previous studies (Otsuka et al., 2014, 2015, 2016). Of these, two participants were excluded from the final sample (see Analysis). All subjects had normal or corrected-to-normal vision. All experiments adhered to the Declaration of Helsinki guidelines and were approved by the UNSW Human Research Ethics Committee.
Apparatus, stimuli, and procedure
Apparatus and stimuli were the same as those used in Experiment 1A. The observers' task was to categorize the perceived gaze direction as averted to their left, direct, or averted to their right using key-presses “LeftArrow,” “DownArrow,” and “RightArrow,” respectively. Each subject completed 972 trials consisting of 3 blocks of 324 trials. In each block, stimuli were presented in random order with three repeats. Each face stimulus was presented in the same manner as in the Pointer experiment, except that a blank gray screen followed the disappearance of the face stimulus. As in the Pointer experiment, all participants completed the block with the both-eyes images first. Ten participants performed the RE image block next, followed by the LE block. For the rest of the participants, the order of these last two blocks was reversed. 
Analysis
We analyzed the data from each condition in a similar manner to our previous study (Otsuka et al., 2015). That is, subjects' reports of the direction of gaze as leftwards, direct, or rightwards were recorded as 0, 0.5, and 1, respectively. As in the Pointer experiment, we sorted individual data from the both-eyes images, LE images, and RE images into the both-eyes condition, near-eye condition, and far-eye condition, respectively. For each condition, head orientation, and eye orientation, a proportion rightwards score was calculated as the sum of recorded scores divided by the number of presentations. We fitted a logistic function to each subject's proportion rightwards score as a function of eye orientation for each condition and head orientation separately. Participants were excluded from further analysis if any of the logistic functions fitted to their data were so shallow that the difference in stimulus eye orientation between the 50% and 75% points was greater than the range of eye orientation tested (Balsdon & Clifford, 2017a). Two participants were excluded on this basis. We calculated the mean proportion rightwards score across the final sample of participants for each condition, head orientation, and eye orientation. We then fitted a logistic function of eye orientation to these mean proportion rightwards scores for each condition and head orientation separately (Figure 5a).
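A sketch of the fitting and exclusion check in MATLAB, assuming a two-parameter logistic p = 1/(1 + exp(−(x − μ)/s)); the paper does not state the parameterization, but under this one the 75% point lies s·ln 3 above the 50% point:

```matlab
% eyeOri: tested eye orientations (deg); propRight: proportion rightwards
logisticFn = @(b, x) 1 ./ (1 + exp(-(x - b(1)) ./ b(2)));
bHat = nlinfit(eyeOri, propRight, logisticFn, [0 5]);  % fit [mu, s]
pse = bHat(1);                      % 50% point: subjectively direct gaze
x75 = pse + bHat(2) * log(3);       % 75% point of this logistic
% Exclude if the 50%-to-75% spread exceeds the tested range of eye orientation
excluded = (x75 - pse) > (max(eyeOri) - min(eyeOri));
```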
Figure 5
 
Results from Experiment 1B. Data averaged across participants (n = 19). (a) The logistic fits to the categorization data recoded as the proportion of the rightwards responses for each head orientation and condition. (b) Points of subjectively direct gaze derived from the logistic fitted data. Solid lines are best-fitting linear regression across head rotation for each condition. (c) The relative weighting of the head orientation information for each condition in determining perceived direction of gaze as measured by the categorization task. Error bars represent bootstrapped 95% CIs.
We calculated the 50% point of each fitted logistic psychometric function, corresponding to subjectively direct gaze. On these points, we performed linear regression as a function of the degree of head rotation for each condition (Figure 5b).
We used the slope of the regression line, m, to estimate the relative weighting of eye orientation, E, and head orientation, H, in determining perceived direction of gaze in the dual-route model (Otsuka et al., 2014). The perceived direction of gaze, G, was modelled as a weighted average of eye and head orientation, such that the two weights were constrained to sum to one:
\begin{equation}\tag{2}G = \left( {{1 \over {1 - m}}} \right)E + \left( {{m \over {m - 1}}} \right)H.\end{equation}
 
Weights were derived separately for each of the three conditions (Figure 5c). 
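The weights in Equation 2 follow from Equation 1 in one step (a derivation implied, but not spelled out, in the text): at subjectively direct gaze, G = 0, so
\begin{equation*}\left( {1 - \beta } \right)E + \beta H = 0\quad \Rightarrow \quad E = {{ - \beta } \over {1 - \beta }}H,\end{equation*}
giving a regression slope of m = −β/(1 − β) for the points of subjectively direct gaze as a function of head orientation. Solving for the weights yields β = m/(m − 1) and 1 − β = 1/(1 − m), the two coefficients in Equation 2.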
Results and discussion
Figure 5a shows the logistic fits to the averaged data recorded as the proportion of the rightwards responses for each head orientation and condition. Figure 5b shows the eye orientation corresponding to the subjectively direct gaze estimated by taking the 50% point of each psychometric function. Figure 5c shows the relative weighting of the head orientation for each condition in determining the perceived gaze direction. 
As in Experiment 1A (Pointer task), the weights were negative across all conditions, suggesting a general repulsive effect across conditions. The weighting in the far-eye condition was significantly more negative than the other two conditions (ps < 0.05, difference from both-eyes condition = 0.21, 98% CI [0.05, 0.45]; difference from near-eye condition = 0.24, 98% CI [0.10, 0.43]). On the other hand, the weightings between the both-eyes and near-eye condition did not differ significantly (difference = 0.03, 98% CI [−0.12, 0.08]). These results show that the repulsive effect of head orientation on the perceived gaze direction was greater in the far-eye condition compared to the other two. 
Psychophysical Experiment 2A (eyes-only images): Pointer task
Across both the pointer task (Experiment 1A) and the categorization task (Experiment 1B), we consistently found that the repulsive influence of head orientation was greater in the far-eye than in the near-eye condition. Such a difference may have occurred because the magnitude of the influence of head orientation on the eye region differed between the near eye and the far eye (indirect cue account). That is, the relative position of the iris/pupil of the far eye may have shifted away from the head orientation to a greater extent than that of the near eye. Alternatively, the influence of head orientation as the direct cue, which attracts perceived gaze direction toward the head orientation, could somehow have been stronger in processing the near eye than the far eye, resulting in the greater repulsive effect for the far eye (direct cue account). Although information about head orientation given in the images was constant between the near- and far-eye conditions, it is possible that focusing on one of the eyes may have changed how head orientation was estimated and/or used. For example, Noll (1976) suggested that the amount of head turn might be underestimated when an observer looked at the near eye, as the near side of the model's face is fully visible. He also suggested that the observer might be able to estimate the amount of head turn more accurately when looking at the far eye, as head turn may be more obvious through occlusion of the far side of the face by the nose. If in fact head orientation was estimated differentially, or head orientation information was used differentially, this could have led to a difference in the weighting of the direct cue and thus in the attractive effect. In turn, such a difference could have produced the observed difference between the near- and far-eye conditions even if the indirect cue inducing the repulsive effect of head orientation was constant across the two conditions.
The data from Experiment 1 do not distinguish between these two accounts, as the influence of head orientation in the whole face images involves both the direct and indirect routes of the influence of head orientation (Otsuka et al., 2014; see also Figure 1). In order to address this point, we repeated psychophysical experiments using images that contained only the eye opening area of faces (eyes-only images). As little or no information about head orientation is available in these images, the influence of head orientation is limited to the indirect cue. A greater repulsive influence of head orientation for the far-eye than for the near-eye condition in this experiment would therefore support the indirect cue account. 
Methods
Participants
Twenty naïve observers (10 male and 10 female, mean age = 18.9 years) served as subjects. The sample size was determined based on our previous studies (Otsuka et al., 2014, 2015, 2016). All subjects had normal or corrected-to-normal vision. All experiments adhered to the Declaration of Helsinki guidelines and were approved by the UNSW Human Research Ethics Committee.
Apparatus
Apparatus was the same as that used in Experiment 1, except that stimuli were displayed on a Cambridge Research Systems Display++ LCD monitor (1,920 × 1,080 pixels). At the viewing distance of 57 cm, one pixel subtended 2.16 arcmin. 
Stimuli
Stimulus images were the same as in Experiment 1, save for the fact that they were masked except for the region around each open eye (eyes-only images). Examples of the stimuli are shown in Figure 6.
Figure 6
 
Stimuli used in Experiments 2A and 2B. Examples of eyes-only images with 0° eye orientation.
Procedure and analysis
The procedure and analysis were the same as in Experiment 1A, except that the eyes-only images were used as stimuli rather than the images depicting a whole face. Note that the relative weighting of the head orientation information, β, calculated in this image condition reflects the effect of head orientation only as an indirect cue, because utilization of the direct cue was severely restricted due to the elimination of information about head orientation. 
Results and discussion
Figure 7a shows the average adjusted pointer angle in each condition together with the linear fits. The equations obtained from multiple regression analysis on the average data in each of the three conditions were:
  • Pointer angle (both-eyes condition) = −0.20° + 1.89 × eye – 0.89 × head
  • Pointer angle (near-eye condition) = −1.23° + 1.71 × eye – 0.74 × head
  • Pointer angle (far-eye condition) = −0.42° + 1.67 × eye – 1.06 × head
Figure 7
 
Results from Experiment 2A. Data averaged across participants (n = 20). (a) Averaged adjusted pointer angle as a function of eye orientation for each head orientation in each image condition together with the linear fits; (b) The relative weighting of the head orientation information for each condition in determining perceived direction of gaze as measured by the pointer task. Error bars represent bootstrapped 95% CIs.
The percentage of variance explained by the model was greater than 97% across all conditions. 
Figure 7b shows the relative weighting of the head orientation in each condition. The weights were negative across all conditions, suggesting a general repulsive effect across conditions. The weighting of head orientation in the far-eye condition was significantly more negative than that of the other two conditions (difference from both-eyes condition = 0.84, 98% CI [0.47, 1.49]; difference from near-eye condition = 0.97, 98% CI [0.55, 1.71]). On the other hand, the weightings between the both-eyes and near-eye conditions did not differ significantly (difference = 0.13, 98% CI [−0.33, 0.02]). These results show that the repulsive effect of head orientation on the perceived gaze direction was greater in the far-eye condition compared to the other two conditions.
Psychophysical Experiment 2B (eyes-only images): Categorization task
Methods
Participants
The twenty naïve observers who participated in Experiment 2A also served as subjects in this experiment. As in Experiment 1B, two subjects were excluded from the final sample because the logistic functions fitted to their data were so shallow that the difference in stimulus eye orientation between the 50% and 75% points was greater than the range of eye orientation tested (Balsdon & Clifford, 2017a). 
Apparatus and stimuli
Apparatus and stimuli were the same as those used in Experiment 2A.
Procedure and analysis
The procedure and analysis were the same as in Experiment 1B except that the eyes-only images were used as stimuli rather than the images depicting a whole face. 
Results and discussion
Figure 8a shows the logistic fits to the data recoded as the proportion of rightwards responses for each head orientation and condition. The 50% point of each resulting psychometric function was taken as the eye orientation corresponding to the subjectively direct gaze. On these points, we performed linear regression as a function of the degree of head orientation (Figure 8b). Figure 8c shows the relative weighting of the head orientation in each condition. As in the Pointer experiments, the weights are negative across all conditions, suggesting a general repulsive effect. The negativity in the weightings is most pronounced in the far-eye condition, followed by the near-eye condition, and least pronounced in the both-eyes condition. The weighting in the far-eye condition was significantly more negative compared to the both-eyes condition (difference = 0.54, 98% CI [0.23, 0.97]), while the differences between the other conditions did not reach significance. Although the difference between the near- and far-eye conditions did not reach significance in the current experiment, the trend of greater negativity for the far-eye condition is consistent with Experiment 2A.
Figure 8
 
Results from Experiment 2B. Data averaged across participants (n = 18). (a) Logistic fits to the categorization data recoded as the proportion of the rightwards responses for each head orientation and condition. (b) Points of subjectively direct gaze derived from the fitted data. Solid lines are best-fitting linear regression across head rotation for each condition. (c) The relative weighting of the head orientation information for each condition in determining perceived direction of gaze as measured by the categorization task. Error bars represent bootstrapped 95% CIs.
In order to examine whether the weighting of the direct cue in the dual-route model (Otsuka et al., 2014; see also Figure 1) differed between conditions, we used the weightings of head orientation in Experiment 1 (A, B: whole face images) and Experiment 2 (A, B: eyes-only images) to infer the weighting on the direct cue of head orientation in the dual-route model. The dual-route model computes the perceived direction of gaze, G, as a weighted average of the eye orientation, E, and head orientation, H, such that the two weights are constrained to sum to one, as described in Equation 1. For Experiment 1 (whole face images), we can model the perceived direction of gaze, \(G_{\rm whole\ face}\), as
\begin{equation}\tag{3}{G_{{\rm{whole\ face}}}} = \left( {1 - {\beta _{{\rm{whole\ face}}}}} \right)E + {\beta _{{\rm{whole\ face}}}} \times H\end{equation}
where  
\begin{equation}\tag{4}{\beta _{{\rm{whole\ face}}}} = \beta + \alpha \left( {1 - \beta } \right)\end{equation}
 
The weighting, \(\beta_{\rm whole\ face}\), attached to head orientation obtained in the whole face images reflects the aggregate effect of head orientation on eye-region information (indirect cue: modelled as α) and as an explicit cue to gaze direction in its own right (direct cue: modelled as β).
For Experiment 2 (eyes-only images), we can model the perceived direction of gaze, \(G_{\rm eye\ only}\), as
\begin{equation}\tag{5}{G_{{\rm{eye\ only}}}} = \left( {1 - {\beta _{{\rm{eye\ only}}}}} \right)E + {\beta _{{\rm{eye\ only}}}} \times H\end{equation}
where  
\begin{equation}\tag{6}{\beta _{{\rm{eye\ only}}}} = \alpha \end{equation}
 
The weighting, \(\beta_{\rm eye\ only}\), attached to head orientation obtained in the eyes-only images reflects only the effect of head orientation on eye-region information (indirect cue: modelled as α).
Solving Equations 4 and 6 for β gives:  
\begin{equation}\tag{7}\beta = {{{\beta _{{\rm{whole\ face}}}} - {\beta _{{\rm{eye\ only}}}}} \over {1 - {\beta _{{\rm{eye\ only}}}}}}\end{equation}
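As a numeric illustration of Equation 7 with made-up weightings (the measured values appear only graphically in Figures 4b, 5c, 7b, and 8c): if \(\beta_{\rm whole\ face} = -0.75\) and \(\beta_{\rm eye\ only} = -0.90\), then
\begin{equation*}\beta = {{ - 0.75 - \left( { - 0.90} \right)} \over {1 - \left( { - 0.90} \right)}} = {{0.15} \over {1.90}} \approx 0.08,\end{equation*}
a small positive (attractive) direct-cue weighting even though both measured weightings are negative.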
 
Based on Equation 7, the weighting of the direct cue was calculated separately for each eye condition and task. The calculated weighting of the direct cue for each condition is shown in Figure 9. We found no significant difference between the conditions among the weightings of the direct cue in either task (ps > 0.05). These results are inconsistent with the direct cue account, as they indicate that the greater repulsive effect found for the perceived gaze direction of the far eye than for the near eye in Experiment 1 was not due to differences in the weighting of the head as a direct cue. Instead, the similarity of the weightings of the influence of head orientation as a direct cue between the near eye and far eye in both tasks supports the role of the information within the eye region of stimuli posited by the indirect cue account.
Figure 9
 
Estimated direct cue weighting based on the results of the pointer task (Experiments 1A and 2A) and the categorization task (Experiments 1B and 2B).
Image Analysis: The influence of head orientation on the stimulus eye region
Here, we examined properties of our stimulus images. The facial images for Experiments 1 and 2 were produced by controlling the orientation of realistic 3D model heads and eyeballs. Taking advantage of the ease of coloring and placement of features in such models, we performed geometrical analysis of the stimulus eye region.
The psychophysical results from whole face images (Experiments 1A and B) showed that the repulsive effect of head orientation is greater for the far eye than for the near eye. Further, the psychophysical results from eye-only images (Experiments 2A and B) suggested that this difference could be attributed to a greater repulsive effect of head orientation on the eye region of the far eye than of the near eye (indirect cue). If this is the case, a greater repulsive effect for the far eye should be found in our stimulus image eye region. 
Methods
The 3D eyeballs used to produce the stimulus images in the psychophysical experiment each consisted of an opaque sphere corresponding to the sclera, with a transparent protruding region corresponding to the cornea, and a surface structure corresponding to the iris with a pupil hole in the center. 
In order to measure the position and the extent of the eye opening area, we rendered eye opening images by making the surface of the face and the background color black and the eyeball color white for each stimulus head orientation and face identity in Blender (Figure 10, top row). There were small gaps between the eyeball and the inner corner of the eye opening for the near-eye image with a head orientation of ±20°. A previous study suggested that the differential visibility of the inner corner of the eye due to epicanthal folds influences perceived gaze direction (West et al., 2008). Thus, we decided to include the gap between the eyelid and eyeball within the eye-opening region, and to fill this gap manually with white using Adobe Photoshop. In each of these images, we measured the position of the left and right extrema of each of the two white areas corresponding to the eye openings using MATLAB.
Figure 10
 
Examples of images rendered for the analysis of the stimulus eye area.
In order to measure the iris/pupil position in our stimuli, we rendered three types of images (whole iris/visible iris/pupil) for each of the 108 stimuli consisting of face identity (4), head orientation (3), and eye orientation (9). We rendered two types of images depicting only the area of the iris (including the pupil) by making the color of the cornea covering both the iris and pupil black, and the color of everything else white in Blender. To create the whole iris images, we rendered images without the context of the face (Figure 10, second row), revealing the whole iris area including the areas hidden by the eyelid in the stimulus images used in the psychophysical experiments. To create the visible iris images, we rendered images in the context of the face (Figure 10, third row), revealing only the iris area visible in the stimulus images used in the psychophysical experiment. In addition, we rendered the pupil images by making the area behind the pupil hole black and everything else white (pupil images, Figure 10, bottom row). The pupil images were rendered without the context of the face, and depicted the whole pupil area. In each of these images, we measured the centroid position of each of the two black areas corresponding to the eyes using MATLAB.
By combining the estimates of the eye opening areas and the centroid positions of the iris and pupil, we calculated the relative horizontal position of the iris and pupil centroid within the eye opening. The relative position was calculated in such a way that 0 corresponded to the middle of the opening area, while −50 and 50 corresponded to the left and right edges of that area, respectively. As in the psychophysical data, we sorted the iris and pupil centroid data from each eye into the near-eye and far-eye conditions.
In addition, we analyzed the noncircularity of the iris for each of the 108 stimuli. This was done by measuring the aspect ratio (the minor axis length divided by the major axis length) of each of the two black areas corresponding to the two eyes in the whole iris images using MATLAB. We then calculated the difference from one (perfect circle) as a measure of noncircularity. 
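The measurements described above could be implemented along the following lines in MATLAB (a sketch under our assumptions: the file name is hypothetical, xLeft and xRight stand for the measured horizontal extrema of the corresponding eye opening, and the render is assumed to be RGB):

```matlab
% Centroid and axis lengths of one dark iris region in a two-tone render
img = imread('whole_iris_render.png');    % hypothetical file name
BW = ~imbinarize(rgb2gray(img));          % black iris area -> true
stats = regionprops(BW, 'Centroid', 'MajorAxisLength', 'MinorAxisLength');
cx = stats(1).Centroid(1);                % horizontal iris centroid (pixels)

% Relative position within the eye opening: 0 = middle, -50/50 = the edges
relPos = 100 * (cx - (xLeft + xRight) / 2) / (xRight - xLeft);

% Noncircularity: deviation of the aspect ratio from one (a perfect circle)
noncirc = 1 - stats(1).MinorAxisLength / stats(1).MajorAxisLength;
```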
Results and discussion
Figure 11 shows example plots of the estimated eye opening area and the centroids measured based on the whole iris and visible iris, superimposed on the corresponding stimulus images. The plots illustrate that the centroids estimated from the two types of iris image diverged in some combinations of head and eye orientation. 
Figure 11
 
Example plots of the estimated eye area and the centroids of the whole iris and visible iris, superimposed on the corresponding stimulus images. Square outline: eye opening area; circle: centroid estimated based on the whole iris; cross: centroid estimated based on the visible iris.
Figure 12a shows the relative horizontal position of the iris and pupil centroids within the eye opening as a function of eye orientation for each head orientation. We performed multiple regression analyses on the relative position of the iris and pupil with explanatory variables of eye and head orientation. The analysis was done separately for eye type (far eye and near eye) and iris/pupil image type (whole iris/visible iris/pupil).
Figure 12
 
Results from the analysis of the position of the iris and pupil in the images of the current study. (a) The relative horizontal position of the iris/pupil centroid within the eye opening, averaged across the four stimulus faces. The centroid of the whole iris (left two panels), visible iris (central two panels), and pupil (right two panels). (b) The relative weighting of the head orientation information in determining the relative position of the iris for the near eye and the far eye in each image type (left). For comparison, the weighting of head orientation on the eye region information inferred from the psychophysical results (Experiment 2A and B) is shown in the right panel.
The resulting equations were:
  •  Relative position of the whole iris (near eye) = 0.28 + 0.92 × eye − 0.39 × head
  •  Relative position of the whole iris (far eye) = 0.21 + 0.99 × eye − 0.65 × head
  •  Relative position of the visible iris (near eye) = 0.29 + 0.84 × eye − 0.33 × head
  •  Relative position of the visible iris (far eye) = 0.19 + 0.82 × eye − 0.41 × head
  •  Relative position of the pupil (near eye) = 0.28 + 0.86 × eye − 0.39 × head
  •  Relative position of the pupil (far eye) = 0.23 + 0.92 × eye − 0.65 × head
The percentage of variance explained by the model was greater than 99% across all six data sets. 
We used the weights of the linear regression for the relative position of the iris centroid to estimate the relative weighting of eye orientation, E, and head orientation, H, in determining the relative position of iris. The relative position of the iris, P, is modelled as  
\begin{equation}\tag{8}P = \beta E + \left( {1 - \beta } \right)H.\end{equation}
Weights were derived separately for each of the eye types (far eye and near eye) and the iris/pupil image types (whole iris/visible iris/pupil). Figure 12b (left panel) shows the calculated relative weightings of the head orientation information in determining the relative position of the iris and pupil for the near eye and far eye in each image type. The weightings of the head orientation tended to be more negative for the far eye compared to the near eye; this trend was more pronounced for the weightings derived from the whole iris images and the pupil images than for those from the visible-iris images. The greater repulsive effect, evidenced by the larger negativity of the head weighting for the far eye, was consistent with the psychophysical results (shown in the right panel of Figure 12b). 
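As a sketch of this two-step estimation, the following illustrates an ordinary least-squares fit followed by rescaling of the eye and head coefficients so that they sum to one, which is one reading of the constraint in Equation 8; the normalization step and variable names are our illustration, not the study's code:

import numpy as np

def head_weighting(eye_deg, head_deg, rel_pos):
    # eye_deg, head_deg: stimulus eye and head orientations (degrees);
    # rel_pos: relative iris/pupil position within the eye opening (-50..50).
    X = np.column_stack([np.ones_like(eye_deg), eye_deg, head_deg])
    (b0, b_eye, b_head), *_ = np.linalg.lstsq(X, rel_pos, rcond=None)
    # Rescale so that the two orientation weights sum to one, matching
    # P = beta * E + (1 - beta) * H (Equation 8); the intercept b0 is
    # not used in the weighting.
    return b_head / (b_eye + b_head)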
Figure 13 shows the noncircularity of the whole iris measured as the deviation of the aspect ratio from one. The noncircularity is close to zero within the eye orientation range of ±5°, and increases symmetrically as eye orientation deviates further from 0°. There is little difference across the head orientations. This is consistent with the notion of Balsdon and Clifford (2017, ECVP) that iris/pupil circularity may serve as a cue for the directness of eye gaze across head orientation. The general trend is similar between the near eye and far eye, although the change is slightly sharper for the near eye. The deviation of the aspect ratio from one for the iris of the near eye is about 1.7% and 3.0% at eye orientations of ±10° and ±15°, respectively. Considering that the detection threshold of the aspect ratio change from a perfect circle is 2%–3% (Regan & Hamstra, 1992; Zanker & Quenzer, 1999), the noncircularity of the iris of the near eye would likely only be detectable with an eye deviation of over 10°. It should be noted that the above analysis was based on the whole iris. The circularity of the iris could act as a cue only if observers could somehow infer the whole iris area. 
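These values are close to what simple foreshortening would predict. Assuming, for illustration, that the iris is a flat disc rotating with the eye, its projected aspect ratio is approximately cos θ at eye orientation θ, giving a noncircularity of  
\begin{equation*}1 - \cos 10^\circ \approx 0.015\quad{\rm and}\quad 1 - \cos 15^\circ \approx 0.034,\end{equation*}
in rough agreement with the measured 1.7% and 3.0%. This back-of-envelope check is ours, not an analysis from the study. 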
Figure 13
 
Noncircularity of the whole iris area in the stimulus images as a function of eye orientation for each head orientation. As an aspect ratio of one indicates perfect circularity of the iris, the noncircularity was calculated as one minus the aspect ratio of the iris area.
Image Analysis: The influence of head orientation on eye region in a simulation of the stimulus of Noll (1976)
In both the pointer task (Experiment 1A) and the categorization task (Experiment 1B), we consistently found that the repulsive influence of head orientation on the perceived gaze direction was greater in the far-eye condition than in the near-eye condition. Both the psychophysical results (Experiment 2) and the image analysis consistently showed that this difference arises from the greater influence of head orientation on the eye region of the far eye than on that of the near eye in the current stimulus setting. 
In contrast to the current results, Noll (1976) reported that the repulsive effect on the perceived gaze direction was greater when only the near eye of the model was visible than when only the far eye was visible. The discrepancy between the findings might be explained by differences in the stimuli. In our stimuli, the model's two eyes were perfectly converged on the gazing target 40 cm away from the face, and the viewing distance of the observer was 57 cm. In Noll (1976), on the other hand, the gazing target was located 145 cm from the model's face, while the viewing distance of the observer was 44 cm. In addition, since Noll used photographs of real human faces as stimuli, the model's eyes may not have appeared perfectly converged, but rather slightly diverged from the gazing target due to the normal discrepancy of about 5° between the visual and pupillary axes (Park et al., 2012). The distance between the model and the gazing target, together with the structural characteristics of the human eye, suggests that the stimulus eyes in Noll's experiment would have been much less converged than those in the current study. Further, whereas the pupil in our stimuli was located at the center of the iris, the pupil center of a real human eye is shifted nasally relative to the limbal center by about 0.2 mm, on average (Mathur, Gehrmann, & Atchison, 2014; Tabernero, Atchison, & Markwell, 2009). Although the magnitude of this pupil decentration may seem trivial, a previous study reported that small pupil decentration influenced the perceived gaze direction from faces with light-colored irises (West, 2011). 
In order to examine whether these stimulus differences could account for the discrepant findings between the studies, we examined the relative position of the iris and pupil within the eye opening in stimulus images simulating those used in Noll's experimental setting. 
Methods
We rendered iris and pupil images using one of our female face models. We chose a female face model for the simulation because Noll (1976) used photographs of two female faces as stimuli. Iris images were rendered in the same way as for the analysis of the stimulus eye area, with the following modifications. The interpupillary distance of the model was set to 6.1 cm, to approximate the average interpupillary distance in the adult female population (Fesharaki, Rezaei, Farrahi, Banihashem, & Jahanbakhshi, 2012). The gazing target was positioned on a plane 145 cm away from the model for each gaze direction. In addition, we set each eye of the model to track a direction 5° divergent from the gazing target, to simulate the vergence of eyes with an average angle kappa (Park et al., 2012). The relative positions of the whole iris and the visible iris within the eye opening were analyzed in the same way as for the stimuli of the current study (Figure 14). 
Figure 14
 
Example plots of the estimated eye area and the centroids of the whole iris and visible iris, superimposed on the corresponding facial images simulating the stimulus image settings of Noll (1976). Square outline: eye opening area; circle: centroid estimated based on the whole iris; cross: centroid estimated based on the visible iris.
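As a rough check on the geometry being simulated (our own back-of-envelope calculation, assuming symmetric fixation), the apparent vergence angle is approximately the geometric vergence minus twice the angle kappa. With the values above (interpupillary distance 6.1 cm, target at 145 cm, κ = 5°),  
\begin{equation*}2\arctan \left( {{3.05} \over {145}} \right) - 2 \times 5^\circ \approx 2.4^\circ - 10^\circ \approx -7.6^\circ ,\end{equation*}
i.e., an apparently divergent gaze, consistent with the divergent end of the vergence range explored later (Figure 17). 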
Pupil images were rendered with the following additional changes. As the pupil center of the real human eye is shifted nasally relative to the limbal center by about 0.2 mm, on average (Mathur et al., 2014; Tabernero et al., 2009), we added this 0.2-mm nasal displacement to the position of the pupil hole structure in our 3D eye model. In addition, we set an index of refraction (IOR) of 1.4 in the material setting of the cornea structure to simulate the magnifying power of the cornea of the real eye (Artal, 2014; Patel, Marshall, & Fitzke, 1995). 
Results and discussion
Figure 14 shows example plots of the estimated eye opening area and the centroids measured based on the whole iris and visible iris, superimposed on the corresponding images simulating Noll's stimuli (Noll, 1976). Figure 15a summarizes the results obtained from the analysis of the simulated relative iris position in Noll's stimuli. We performed multiple regression analyses on the relative position of the iris and pupil within the eye opening with explanatory variables of eye orientation and head orientation. The analysis was done separately for each eye type (far eye and near eye) and iris/pupil image type (whole iris/visible iris/pupil). 
Figure 15
 
The relative horizontal position of the iris centroid within the eye opening in the simulated stimuli of Noll (1976). (a) The centroid of the whole iris, visible iris, and pupil. (b) The relative weighting of head orientation determining the relative position of the iris and pupil.
The resulting equations were:
  •  Relative position of the whole iris (near eye) = −0.38 + 0.85 × eye − 0.57 × head
  •  Relative position of the whole iris (far eye) = −0.05 + 0.91 × eye − 0.28 × head
  •  Relative position of the visible iris (near eye) = −0.29 + 0.78 × eye − 0.51 × head
  •  Relative position of the visible iris (far eye) = −0.04 + 0.80 × eye − 0.16 × head
  •  Relative position of the pupil (near eye) = −0.48 + 0.83 × eye − 0.53 × head
  •  Relative position of the pupil (far eye) = 0.32 + 0.89 × eye − 0.34 × head
The percentage of variance explained by the model was greater than 99% across all six data sets. Using these weights of the linear regression for the relative position of the iris/pupil centroid, we estimated the relative weighting of eye orientation, E, and head orientation, H, in determining the relative position of the iris. 
The calculated weightings of the head orientation information in determining the relative position of the iris and pupil were more negative for the near eye than for the far eye (Figure 15b). This pattern of results is consistent with the psychophysical results of Noll (1976), who found a greater repulsive effect when the model's near eye was visible than when the model's far eye was visible. The difference in the weighting of head orientation between the near eye and far eye is thus opposite to that obtained for our own stimuli and psychophysical results. Taken together, the analysis of simulated stimuli based on those of Noll shows that the differences in stimulus properties can account for the discrepancy between Noll (1976) and the current study. 
General discussion
In both Experiments 1A (pointer task) and 1B (categorization task), we consistently found that the repulsive influence of head orientation on the perceived gaze direction was greater in the far-eye visible condition than in the near-eye visible condition in the whole face images. In Experiment 2, using eyes-only images, we found a similar pattern of results. The weighting of the influence of head orientation as an indirect cue in the dual-route model, estimated from the psychophysical data, tended to be greater for the far eye than the near eye across both Experiments 2A (pointer task) and 2B (categorization task). On the other hand, the weightings of the influence of head orientation as a direct cue were similar between the near eye and far eye in both tasks. These results suggest that the greater repulsive effect found for the perceived gaze direction of the far eye than the near eye in Experiment 1 occurred due to the differential influence of head orientation on the information within the eye region of the stimuli. Analysis of the relative position of the iris and pupil within the eye opening in our stimulus images produced results generally consistent with the psychophysical data, showing that the repulsive influence of head orientation corresponding to the indirect cue was greater for the far eye than the near eye. 
For the image analysis, we used three image features to obtain estimates of the relative position of the iris and pupil within the eye opening: the whole iris, visible iris, and pupil. The weighting of the influence of head orientation for the near eye was similar across image features. On the other hand, there was a substantial difference between the weightings for the far eye, with the repulsive influence being most pronounced for the pupil and least pronounced for the visible part of the iris. The reduced repulsive influence for the visible part of the iris was related to the underestimation of the iris deviation from the center of the eye opening with this estimate, as discussed in the Introduction (Figure 2). It is interesting to note that the weighting of the indirect cue estimated based on the psychophysical data (Experiment 2A and B) for the far eye lay between those estimated based on the visible and whole iris centroids. Thus, the repulsive influence of head orientation estimated based on the psychophysical data may be too large to be accounted for by the visible iris in the image. 
In addition, if the visual system were using the position of the pupil, the weighting of the indirect cue for the far eye in perceptual performance might have been even more negative. As the irises in our stimulus images were relatively dark, the pupil position may not have been readily detectable given the short image presentation duration employed in the current study. When considered together, our results suggest that the visual system makes some effort to estimate the actual position of the iris (i.e., the center of the whole iris) relative to the eye opening, albeit imperfectly. 
Estimation of the centroid of the iris may be achieved through a process of amodal completion of the whole iris behind the eyelid. Computationally, the visual system may employ a strategy similar to the Circular Hough Transform, which enables detection of an incomplete circle based on the distribution of edges within an image (Ballard, 1981). In fact, several computer vision algorithms that operate with facial images in the visible spectrum as input utilize the Circular Hough Transform for the accurate localization of the iris (e.g., George & Routray, 2016; Zadeh & Harimi, 2017). 
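To illustrate the idea, the following toy example (ours, using scikit-image; the image and parameter values are fabricated for demonstration) recovers the center and radius of a disc even when its upper portion is occluded, much as an eyelid occludes the iris:

import numpy as np
from skimage.draw import disk
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks

# A toy "iris": a disc of radius 25 centered at (row 60, col 80),
# with its top occluded as an eyelid would occlude the iris.
img = np.zeros((120, 160))
rr, cc = disk((60, 80), 25)
img[rr, cc] = 1.0
img[:45, :] = 0.0

edges = canny(img, sigma=1.0)
radii = np.arange(15, 40)
accumulator = hough_circle(edges, radii)
_, cx, cy, r = hough_circle_peaks(accumulator, radii, total_num_peaks=1)
print(cx[0], cy[0], r[0])  # approximately 80, 60, 25 despite the occlusion

The accumulator votes cast by the visible arc of the circle suffice to localize its center, which is what makes the transform robust to partial occlusion.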
Although we focused on the geometrical analysis of the eye region, several previous studies have suggested that the luminance distribution in this region also provides a cue to eye orientation (e.g., Ando, 2002, 2004; Langton, Watt, & Bruce, 2000; Watt, Craven, & Quinn, 2007; Weidenbacher, Layher, Strauss, & Neumann, 2007). In order to find out how head orientation influences luminance-based cues, we performed a luminance-based analysis of the eye regions in the stimulus images used for the psychophysical experiments. To do so, we estimated eye orientation by convolving the eye region of the images with a complex Gabor filter and calculating the phase of the output at the point of maximum response amplitude within the eye region. The resultant luminance-based estimates of eye orientation are shown in Figure 16a. Based on polynomial curve fitting to the data, we estimated the stimulus eye orientation corresponding to the zero-crossing of the luminance-based estimates of eye orientation for each head orientation (Figure 16b). To these points, we fitted a linear regression as a function of the degree of head orientation. We used the slope of the regression line, m, to estimate the relative weighting of eye orientation, E, and head orientation, H, in determining the luminance-based estimate of eye orientation, LB. The luminance-based estimate of eye orientation was modelled as  
\begin{equation}\tag{9}LB = \left( {{1 \over {1 - m}}} \right)E + \left( {{m \over {m - 1}}} \right)H.\end{equation}
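A minimal sketch of such a phase-based estimate follows; the kernel parameters and horizontal orientation are placeholders, not the values used in the study:

import numpy as np
from scipy.signal import fftconvolve

def gabor_phase(eye_region, wavelength=16.0, sigma=6.0):
    # Horizontal complex Gabor: Gaussian envelope times complex carrier.
    half = int(3 * sigma)
    x = np.arange(-half, half + 1)
    xx, yy = np.meshgrid(x, x)
    envelope = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    carrier = np.exp(1j * 2 * np.pi * xx / wavelength)
    response = fftconvolve(eye_region, envelope * carrier, mode="same")
    # Phase of the response at the point of maximum response amplitude.
    peak = np.unravel_index(np.argmax(np.abs(response)), response.shape)
    return np.angle(response[peak])

The phase at the amplitude peak shifts as the dark iris moves within the brighter sclera, which is what allows it to serve as a proxy for eye orientation.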
 
Figure 16
 
Results of the luminance-based analysis, averaged across four stimulus faces. (a) Estimated eye orientation based on the luminance distribution within eyes together with polynomial curve fitting to the data. (b) Stimulus eye orientation at the zero-crossing of the estimated eye orientation for each head orientation, together with the linear fit. (c) Calculated relative head weighting in determining the eye orientation based on the dual-route model.
The calculated weighting of head orientation in determining the luminance-based estimates of eye orientation is shown in Figure 16c. The repulsive influence of head orientation corresponding to the indirect cue was greater for the far eye than for the near eye. The results show that luminance-based estimates of eye orientation can provide a pattern of results qualitatively similar to the geometrical analysis, although the difference between the near eye and far eye was much more pronounced than for any of the estimates obtained from the geometrical analysis. This large difference between the weightings for the near and far eyes may depend partly on the precise details of how eye orientation was calculated here. The results of the current study are thus consistent with the employment of either geometrical cues or luminance distribution cues to eye gaze direction, or both. 
It is important to note that estimation of eye orientation based on the luminance distribution crucially depends on accurate localization of the eye region. In the luminance distribution analysis above, we used images that contained only the eye region (Figure 6) as input. This was because the filter would often return a point of maximum response amplitude outside the eye region if whole face images (Figure 3) were used as input. Such an observation suggests that luminance information alone may not be sufficient for the localization of the eye region. Thus, it is likely that the visual system requires some geometrical structural information, even for the processing of luminance-based cues of eye gaze direction. 
Our results in no way suggest that the influence of the repulsive indirect cue is always greater for the far eye than for the near eye. Rather, our findings suggest that the way in which the indirect cue influences the near and far eyes varies depending on aspects of the experimental setting such as the vergence angle of stimulus faces. Unlike in the current study, Noll (1976) reported that the repulsive effect was greater when only the near eye of the model was visible, compared to when only the far eye was visible. Our analysis of relative iris position in the images simulating the apparent vergence angle (angle between the pupillary axes of the two eyes) in Noll's stimuli, based on the typical adult female interpupillary distance and angle kappa, revealed a greater repulsive effect for the near eye than for the far eye. This result showed that the differences in the stimulus between Noll (1976) and the current study could account for the different pattern of results between studies. 
Our stimuli differed from Noll (1976) in terms of the apparent vergence angle, interpupillary distance, and the distance of the camera from the model. In order to find out how these factors affect the difference in the repulsive indirect cue between eyes, we performed image analysis on sets of simulated images that varied in the apparent vergence angle, interpupillary distance, and viewing distance (strength of linear perspective). We calculated head weighting for the near and far eyes of these image sets in the same way as for the analysis of the whole iris images of the current and simulated stimuli of Noll (1976). Figure 17 summarizes the results of this image analysis. The results suggest that the difference in head weighting between the two eyes depends on all of the factors considered here. In general, the greater repulsive effect for the far eye, as in the current stimuli (indicated as the darkened area of each graph in Figure 17), occurred for a narrower interpupillary distance, and for a farther viewing distance. In addition, the greater repulsive effect for the far eye occurred more consistently when the apparent vergence angle was positive (convergent) than when it was negative (divergent). 
Figure 17
 
Summary results of the image analysis on simulated images with 15 levels of apparent vergence angle (−7° to 7°, 1° steps), five levels of interpupillary distance (5–7cm, 0.5-cm steps), and three viewing distances (distance of camera from the model 57.3 cm, 120 cm, and orthographic projection = infinite distance). The change in relative weighting of head orientation in determining the relative position of the whole iris of each eye as a function of apparent vergence angle is depicted for each camera distance and interpupillary distance (IPD). The darkened area in each graph denotes the data points where greater negativity for the far eye than for the near eye was obtained, as in the current study.
Given that convergence occurs when a person fixates on objects at a close distance, the greater repulsive effect for the far eye found in the current study is more likely to be observed when the model fixates at a close distance. On the other hand, the opposite pattern of results, as in Noll (1976), is likely to occur when the model fixates at a greater distance. The apparent vergence angle is also affected by the magnitude of the discrepancy between the pupillary and visual axes (angle kappa). Although the typical angle kappa is about 5° (Hashemi et al., 2010; Park et al., 2012), there is large variation in angle kappa among individuals. While angle kappa is positive in most people (fovea displaced temporally to the optical axis; Hashemi et al., 2010), it can be null or even negative in some individuals (Park et al., 2012). If the angle kappa of an individual were null or negative, the apparent vergence angle would not become negative even at extended fixation distances. Thus, if an individual with a null or negative angle kappa were used as a model, a greater repulsive effect for the far eye would be likely across a wide range of fixation distances and observer viewing distances, unless the model had an exceptionally wide interpupillary distance (≥7 cm). Further studies measuring perceived gaze direction would be required to validate the predictions made on the basis of our simulated images. 
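To make the distance dependence concrete (our illustration, using the same symmetric-fixation approximation as above): the apparent vergence angle crosses zero when the geometric vergence equals 2κ. For an interpupillary distance of 6.1 cm and κ = 5°, this occurs at a fixation distance of roughly  
\begin{equation*}d = {{3.05} \over {\tan 5^\circ }} \approx 35{\rm \ cm},\end{equation*}
so fixation much beyond arm's length would already yield an apparently divergent gaze, whereas with κ ≤ 0° the apparent vergence remains positive at any fixation distance. 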
Although we have devoted much of our discussion to how the information in each eye is affected by head orientation, or how such information could be extracted by the visual system, it is likely that the visual system employs information from both eyes for the estimation of gaze direction when both are visible. Information from two eyes would provide not only information about the direction of gaze, but also about fixation distance through the vergence angle (Nguyen et al., 2018). 
On the other hand, some previous studies have suggested that information from one of the eyes may be predominantly used in the processing of others' gaze direction. For example, Noll (1976) reported that the perceived gaze direction from both eyes was more similar to that from the near eye than the far eye. More recently, West (2015) suggested that the perceived gaze direction from the two eyes follows that of the temporally turned eye. In this study, we manipulated both eye orientation (the orientation of the gazing target of the model) and head orientation of our stimuli in the range of ±20° from the viewpoint of the observer, and the model's eyes were perfectly converged on the gazing target. In our stimuli, therefore, the far eye was always turned in the nasal direction, while the near eye was turned in the temporal direction except when the head and eye orientations were matched. 
In Experiment 1B, the head weighting was similar between the both-eyes and near-eye conditions, although both differed significantly from the far-eye condition. This result would be consistent with the claim of West (2015). However, in Experiment 1A, we found that the head weighting for the both-eyes condition differed significantly from that for the near-eye condition, showing that the perceived gaze direction from both eyes does not always follow that of the near eye (temporally turned eye). Thus, West's notion that the perceived gaze direction from the two eyes follows that of the temporally turned eye may be limited to the stimulus settings and task used in that study. Based on their finding that the weighting of the direct attractive influence of head orientation varies depending on the task, Balsdon and Clifford (2017a) suggested that the judgment of gaze direction depends on how the perceptual information can be optimally processed for the task at hand. Consistent with such a notion, our results suggest flexibility in the use of head and eye orientation information for eye gaze direction depending on the task requirements. 
Acknowledgments
This work was supported by the Australian Research Council Discovery Project [DP160102239] to CC, and by a Grant-in-Aid for Research Activity Start-up [15H06456], a Grant-in-Aid for Young Scientists (B) [17K13963] from the Japan Society for the Promotion of Science, and a Grant-in-Aid for Scientific Research on Innovative Areas [17H06344] “Construction of the Face-Body Studies in Transcultural Conditions” from MEXT, Japan to YO. We thank Alysha Nguyen, Tarryn Balsdon, and Sophia Kwan for their help in data collection. 
Commercial relationships: none. 
Corresponding author: Yumiko Otsuka. 
Address: Faculty of Law and Letters, Ehime University, Matsuyama, Ehime, Japan. 
References
Ando, S. (2002). Luminance-induced shift in the apparent direction of gaze. Perception, 31 (6), 657–674, https://doi.org/10.1068/p3332.
Ando, S. (2004). Perception of gaze direction based on luminance ratio. Perception, 33 (10), 1173–1184, https://doi.org/10.1068/p5297.
Anstis, S. (2018). The role of the pupil, corneal reflex, and iris in determining the perceived direction of gaze. i-Perception, 9 (4), 1–4, https://doi.org/10.1177/2041669518765852.
Anstis, S., Mayhew, J., & Morley, T. (1969). The perception of where a face or television ‘portrait’ is looking. The American Journal of Psychology, 82 (4), 474–489.
Artal, P. (2014). Optics of the eye and its impact in vision: A tutorial. Advances in Optics and Photonics, 6 (3), 340–367, https://doi.org/10.1364/AOP.6.000340.
Balsdon, T., & Clifford, C. W. G. (2017a). A bias-minimising measure of the influence of head orientation on perceived gaze direction. Scientific Reports, 7, 1–10, https://doi.org/10.1038/srep41685.
Balsdon, T., & Clifford, C. W. G. (2017b). Detecting and identifying offset gaze. Attention, Perception, & Psychophysics, 79 (7), 1993–2006, https://doi.org/10.3758/s13414-017-1347-0.
Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10, 433–436.
Fesharaki, H., Rezaei, L., Farrahi, F., Banihashem, T., & Jahanbakhshi, A. (2012). Normal interpupillary distance values in an Iranian population. Journal of Ophthalmic and Vision Research, 7 (3), 231–234.
Florey, J., Clifford, C. W. G., Dakin, S. C., & Mareschal, I. (2015). Peripheral processing of gaze. Journal of Experimental Psychology: Human Perception and Performance, 41 (4), 1084–1094, https://doi.org/10.1037/xhp0000068.
Gamer, M., & Hecht, H. (2007). Are you looking at me? Measuring the cone of gaze. Journal of Experimental Psychology: Human Perception and Performance, 33 (3), 705–715, https://doi.org/10.1037/0096-1523.33.3.705.
Gibson, J. J., & Pick, A. D. (1963). Perception of another person's looking behavior. The American Journal of Psychology, 76 (3), 386–394, https://doi.org/10.2307/1419779.
Hashemi, H., Khabazkhoob, M., Yazdani, K., Mehravaran, S., Jafarzadehpur, E., & Fotouhi, A. (2010). Distribution of angle kappa measurements with Orbscan II in a population-based survey. Journal of Refractive Surgery, 26 (12), 966–971, https://doi.org/10.3928/1081597X-20100114-06.
Kluttz, N. L., Mayes, B. R., West, R. W., & Kerby, D. S. (2009). The effect of head turn on the perception of gaze. Vision Research, 49 (15), 1979–1993, https://doi.org/10.1016/j.visres.2009.05.013.
Kobayashi, H., & Kohshima, S. (2001). Unique morphology of the human eye and its adaptive meaning: Comparative studies on external morphology of the primate eye. Journal of Human Evolution, 40 (5), 419–435, https://doi.org/10.1006/jhev.2001.0468.
Langton, S., Watt, R., & Bruce, V. (2000). Do the eyes have it? Cues to the direction of social attention. Trends in Cognitive Sciences, 4 (2), 50–59.
Mathur, A., Gehrmann, J., & Atchison, D. A. (2014). Influences of luminance and accommodation stimuli on pupil size and pupil center location. Investigative Ophthalmology & Visual Science, 55, 2166–2172, https://doi.org/10.1167/iovs.13-13492.
Nguyen, A. T. T., Palmer, C. J., Otsuka, Y., & Clifford, C. W. G. (2018). Biases in perceiving gaze vergence. Journal of Experimental Psychology: General, 147 (8), 1125–1133, https://doi.org/10.1037/xge0000398.
Noll, A. M. (1976). The effects of visible eye and head turn on the perception of being looked at. The American Journal of Psychology, 89 (4), 631–644.
Otsuka, Y., Mareschal, I., Calder, A. J., & Clifford, C. W. G. (2014). Dual-route model of the effect of head orientation on perceived gaze direction. Journal of Experimental Psychology: Human Perception and Performance, 40(4), https://doi.org/10.1037/a0036151.
Otsuka, Y., Mareschal, I., & Clifford, C. W. G. (2015). Gaze constancy in upright and inverted faces. Journal of Vision, 15 (1): 21, 1–14, https://doi.org/10.1167/15.1.21.
Otsuka, Y., Mareschal, I., & Clifford, C. W. G. (2016). Testing the dual-route model of perceived gaze direction: Linear combination of eye and head cues. Journal of Vision, 16 (8): 8, 1–12, https://doi.org/10.1167/16.8.8.
Palmer, C. J., & Clifford, C. W. G. (2017). The visual system encodes others' direction of gaze in a first-person frame of reference. Cognition, 168, 256–266, https://doi.org/10.1016/j.cognition.2017.07.007.
Park, C. Y., Oh, S. Y., & Chuck, R. S. (2012). Measurement of angle kappa and centration in refractive surgery. Current Opinion in Ophthalmology, 23 (4), 269–275, https://doi.org/10.1097/ICU.0b013e3283543c41.
Patel, S., Marshall, J., & Fitzke, F. W., III (1995). Refractive index of the human corneal epithelium and stroma. Journal of Refractive Surgery, 11 (2), 100–141, https://doi.org/10.3928/1081-597X-19950301-09.
Regan, D., & Hamstra, S. J. (1992). Shape discrimination and the judgement of perfect symmetry: Dissociation of shape from size. Vision Research, 32 (10), 1845–1864.
Symons, L. A., Lee, K., Cedrone, C., & Nishimura, C. M. (2004). What are you looking at? Acuity for triadic eye gaze. Journal of General Psychology, 131 (4), 451–469.
Tabernero, J., Atchison, D. A., & Markwell, E. L. (2009). Aberrations and pupil location under corneal topography and Hartmann-Shack illumination conditions. Investigative Ophthalmology & Visual Science, 50 (4), 1964–1970.
Todorović, D. (2006). Geometrical basis of perception of gaze direction. Vision Research, 46 (21), 3549–3562, https://doi.org/10.1016/j.visres.2006.04.011.
Todorović, D. (2009). The effect of face eccentricity on the perception of gaze direction. Perception, 38 (1), 109–132, https://doi.org/10.1068/p5930.
Watt, R., Craven, B., & Quinn, S. (2007). A role for eyebrows in regulating the visibility of eye gaze direction. Quarterly Journal of Experimental Psychology, 60 (9), 1169–1177, https://doi.org/10.1080/17470210701396798.
Weidenbacher, U., Layher, G., Strauss, P.-M., & Neumann, H. (2007). A comprehensive head pose and gaze database. 3rd IET International Conference on Intelligent Environments (IE 07), 2007, 455–458, https://doi.org/10.1049/cp:20070407.
West, R. W. (2010). Differences in the perception of monocular and binocular gaze. Optometry and Vision Science, 87 (2), E112–E119, https://doi.org/10.1097/OPX.0b013e3181ca345b.
West, R. W. (2011). Perceived direction of gaze from eyes with dark vs. light irises. Optometry and Vision Science, 88 (2), 303–311.
West, R. W. (2013). The effect of head turn and illumination on the perceived direction of gaze. Perception, 42 (5), 495–507. https://doi.org/10.1068/p7343.
West, R. W. (2015). Differences in the judged direction of gaze from heads imaged in 3-d versus 2-d. Perception, 44 (7), 727–742, https://doi.org/10.1177/0301006615594702.
West, R. W., Salmon, T. O., & Sawyer, J. K. (2008). Influence of the epicanthal fold on the perceived direction of gaze. Optometry and Vision Science, 85 (11), 1064–1073, https://doi.org/10.1097/OPX.0b013e31818b963b.
Wollaston, W. H. (1824). On the apparent direction of eyes in a portrait. Philosophical Transactions of the Royal Society of London, 114, 247–256.
Wyatt, H. J. (1995). The form of the human pupil. Vision Research, 35 (14), 2021–2036, https://doi.org/10.1016/0042-6989(94)00268-Q.
Zanker, J. M., & Quenzer, T. (1999). How to tell circles from ellipses: Perceiving the regularity of simple shapes. Naturwissenschaften, 86 (10), 492–495, https://doi.org/10.1007/s001140050661.