April 2018
Volume 18, Issue 4
Open Access Article
Eye gaze direction shows a positive serial dependency
Author Affiliations
  • David Alais
    School of Psychology, The University of Sydney, New South Wales, Australia
    David.Alais@sydney.edu.au
  • Garry Kong
    School of Psychology, The University of Sydney, New South Wales, Australia
    Science Division, New York University - Abu Dhabi, Abu Dhabi, United Arab Emirates
  • Colin Palmer
    School of Psychology, UNSW Sydney, Sydney, Australia
  • Colin Clifford
    School of Psychology, UNSW Sydney, Sydney, Australia
Journal of Vision April 2018, Vol.18, 11. doi:10.1167/18.4.11

      David Alais, Garry Kong, Colin Palmer, Colin Clifford; Eye gaze direction shows a positive serial dependency. Journal of Vision 2018;18(4):11. doi: 10.1167/18.4.11.

Abstract

Recent work from several groups has shown that perception of various visual attributes in human observers at a given moment is biased towards what was recently seen. This positive serial dependency is a kind of temporal averaging which exploits short-term correlations in visual scenes to reduce noise and stabilize perception. Here we test for serial dependencies in perception of head and eye direction using a simple reproduction method to measure perceived head/eye gaze direction in rapid sequences of briefly presented face stimuli. In a series of three experiments, our results reveal that perceived eye gaze direction shows a positive serial dependency for changes in eye direction, along both the vertical and horizontal dimensions, although more strongly for horizontal gaze shifts. By contrast, we found no serial dependency at all for horizontal changes in head position. These findings show that a perception-stabilizing ‘continuity field’ operates on eye position—well known to be quite variable over short timescales—while the more inherently stable signal from head position does not.

Introduction
The retinal image is surprisingly noisy, with sources such as shading, occlusion, eye movements, blinks, etc., adding to neural noise to increase variability and degrade signal strength. Yet, despite this noise, our perception of the world tends to be stable and reliable. One process that might help improve signal reliability is positive serial dependence (Corbett, Fischer, & Whitney, 2011). A number of recent studies have shown that perception of current stimuli is biased towards the recent past, effectively a short-term temporal averaging process which improves signal-to-noise ratio and stabilizes perception (Kiyonaga, Scimeca, Bliss, & Whitney, 2017). A bias to the recent past (a positive serial dependency) has been demonstrated for basic stimuli such as orientation (Fischer & Whitney, 2014) and motion (Alais, Leung, & Van der Burg, 2017), for various aspects of face perception (Kok, Taubert, Van der Burg, Rhodes, & Alais, 2017; Liberman, Fischer, & Whitney, 2014; Taubert & Alais, 2016; Taubert, Alais, & Burr, 2016; Taubert, Van der Burg, & Alais, 2016; Xia, Leib, & Whitney, 2016), as well as for numerosity (Cicchini, Anobile, & Burr, 2014; Corbett et al., 2011) and scene perception (Manassi, Liberman, Chaney, & Whitney, 2017). In all these cases, current perception exhibits an attractive bias towards recently seen stimuli, an effective way to discount moment-to-moment fluctuations in favor of a temporally stable percept. 
Not all stimulus sequences elicit positive serial dependencies. For some stimuli, there is a repulsion from the previously seen stimulus, so that the difference between the current and previous stimulus is exaggerated. This effect is similar to traditional repulsive aftereffects seen after sustained exposure to an adapting stimulus, such as the tilt (Clifford, Wenderoth, & Spehar, 2000; Gibson & Radner, 1937) and motion aftereffects (Addams, 1834; Mather, Anstis, & Verstraten, 1998). Sequences of varying auditory frequency sweeps, for example, cause the perceived direction of a given frequency sweep to exhibit a repulsive (negative) dependency on the preceding one (Alais, Orchard-Mills, & Van der Burg, 2015). Similarly, sequences of brief audio-visual stimuli varying in relative timing cause a repulsive shift in temporal order perception (Van der Burg, Alais, & Cass, 2013; Van der Burg, Orchard-Mills, & Alais, 2015). As with positive serial dependencies, negative dependencies may also serve a useful perceptual function, in this case by helping individuate successive stimuli and improving our sensitivity to change. Both positive and negative dependencies are thus useful, and there are examples of positive and negative dependencies arising simultaneously from different attributes of a single stimulus, as observed in motion perception (Alais et al., 2017) and face perception (Taubert, Alais, et al., 2016). In Taubert, Alais, et al.'s study, face gender showed a positive serial dependency while face expression showed a negative dependency. In Alais et al.'s study, the motion component showed a positive perceptual dependency while the orientation component showed a negative one. 
One aspect of perception that might benefit from a positive dependency is gaze perception, the perceived direction of another person's eyes. Gaze is important because, as primates, we are highly social beings who are keenly interested in determining where the humans we interact with are looking. Perceiving the eye gaze of another person helps us evaluate their intentions, mood, and level of engagement. Detecting gaze direction is thus a critical component of social cognition, as it helps us understand the mental state of the gazer (Becchio, Bertone, & Castiello, 2008; Itier & Batty, 2009). It can also alert us to sudden changes in the visual environment that capture the gazer's attention and trigger a reflexive shift of their eyes towards a new location. Because eye position varies continually, due to saccades at a rate of 3–4 Hz and to microsaccades at much higher (though typically not perceptible) frequencies (Rolfs, 2009), a stabilizing recency bias could be beneficial. Moreover, large-amplitude gaze shifts involve both eye and head movements (Freedman & Sparks, 2000), and our perception of others' gaze direction depends on both their eye deviation and head orientation (Otsuka, Mareschal, Calder, & Clifford, 2014). In this study, we examine how perceived eye and head position is influenced by preceding stimuli, testing whether the separate components of eye and head orientation show a serial dependency, and whether any such dependency is positive or negative. To preview the results, our findings show a significant positive dependency for eye gaze, but no dependency for head position. 
Methods
Participants
Fifteen participants (four male, 11 female; mean age = 20.2, ranging from 18 to 30) took part in Experiment 1; 15 participants (five male, 10 female; mean age = 21.1, ranging from 19 to 30) took part in Experiment 2; and 15 participants (one male, 14 female; mean age = 22.0, ranging from 19 to 30) took part in Experiment 3. Participants had normal or corrected-to-normal vision. All participants were naïve as to the purpose of the experiment. 
Apparatus and stimuli
Stimuli were presented on a 19-in. DiamondDigital CRT monitor (100 Hz refresh rate, 1,024 × 768 pixels), which was placed 50 cm away from the participant. Face stimuli were generated using 3D graphical modelling software. Textures and models for six identities (three male, three female) were generated using FaceGen Modeller 3.5. These were imported into a scene-based modelling program (Blender 2.70), in which the eyes were modelled separately from the rest of the face. This separation allowed the deviation of the eyes and head relative to the viewer to be set precisely. 
Images were presented on a gray screen (11.05 cd m−2) at life-size, defined by a distance between pupils corresponding to the human average of approximately 6.3 cm (Rosenfeld & Logan, 2009). The convergence of the eyes was set such that the face stimulus was fixating at a depth that matched the participant viewing distance. This meant that a stimulus with direct gaze appeared to be looking at the viewer, rather than at a point in front of or behind the viewer. In Experiments 1 and 2, the head faced directly towards the viewer while the deviation of the eyes varied across trials. In Experiment 1, eye deviation was varied in the horizontal plane in 3° increments between 9° leftwards and 9° rightwards (Figure 1A). In Experiment 2, eye deviation was varied in the vertical plane in 3° increments between 9° downwards and 9° upwards (Figure 1B). In Experiment 3, the deviation of the head varied across trials while the eyes were always visible and directed towards the viewer (previous studies of head orientation perception have varied in whether the eyes were occluded or not, e.g., Fang & He, 2005; Lawson, Clifford, & Calder, 2011). Head deviation was varied in the horizontal plane in 2° increments between 6° leftwards and 6° rightwards (Figure 1C). 
Figure 1
 
(A–C) Exemplars of the face stimuli used in Experiments 1–3. The difference in gaze directions is more apparent when presented at life-size (as was the case in our experiments). These examples have enough resolution to allow zooming in to life-size scale. (D) Participants reported the perceived direction of gaze (Experiments 1 and 2) or head deviation (Experiment 3) by rotating an on-screen pointer.
Images were presented so that the participant's eyes were at the same level as the eyes of the image. To discourage change-detection strategies, the image location was then jittered at random, up to 0.8° left or right and up to 0.8° above or below eye level, giving nine possible locations in total. 
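The nine jitter locations can be sketched as a simple 3 × 3 grid of offsets (our own illustration, not the authors' presentation code; the step values follow from the 0.8° limits and nine-location count stated above):

```python
import itertools

# A minimal sketch of the 3 x 3 jitter grid: on each trial the image is
# offset by -0.8, 0, or +0.8 deg horizontally and vertically relative to
# eye level, giving nine possible locations in total.
JITTER_STEPS_DEG = (-0.8, 0.0, 0.8)

def jitter_locations():
    """Return all nine possible (horizontal, vertical) offsets in degrees."""
    return list(itertools.product(JITTER_STEPS_DEG, repeat=2))

locations = jitter_locations()
```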
Participants responded by indicating the perceived eye or head deviation using a 3D pointer (as used in a previous study by Otsuka, Mareschal, & Clifford, 2016), consisting of a shaded sphere with a red dot, initially facing the participant (neutral). Participants manipulated the rotation of the sphere using a mouse and were instructed to point the red dot in the direction in which the eyes were looking (Experiments 1 and 2) or the head was facing (Experiment 3). Rotation of the sphere was restricted to the horizontal plane in Experiments 1 and 3, and to the vertical plane in Experiment 2. 
Design and procedure
Each trial began with presentation of the face stimulus for 300 ms, followed by presentation of the 3D pointer in the center of the screen to record the participant's response. A blank screen was then presented for 150 ms before the next trial was initiated. Participants completed 490 trials. Both the direction of the eyes (or head in Experiment 3) and the identity of the face were chosen at random on each trial. Participants had the opportunity to take a break every 30 trials. The experiment was preceded by a practice session of up to 30 trials to familiarize participants with the task. 
Computational modelling
In modelling the perceived direction of eye gaze or the perceived orientation of a head, we considered two potential sources of bias captured by a total of four model parameters. The first source of bias was the possibility that participants would have a systematic tendency to misperceive gaze deviation (Calder, Jenkins, Cassel, & Clifford, 2008), and that this tendency might itself vary with the deviation of the stimulus (Anstis, Mayhew, & Morley, 1969; Otsuka et al., 2016). To this end, we modelled perceived direction as a linear function of stimulus deviation, \({f_0}\). This linear function was described by two model parameters, the "overall bias," \({b_{overall}}\), in perceived direction across the range of stimulus deviations and the "slope," \(s\), such that the systematic bias in direction, \({b_{systematic}}\), is given by  
\begin{equation}{b_{systematic}}\left( {f_0} \right) = {b_{overall}} + {f_0} \times s{\rm {.}}\end{equation}
Veridical perception would correspond to an overall bias of zero and a slope of one.  
The second source of bias that we modelled was serial dependence, i.e., the tendency for the perception of the current stimulus to be influenced by the immediately preceding stimulus (e.g., Fischer & Whitney, 2014; Liberman et al., 2014; Taubert, Alais, et al., 2016). To capture serial dependence in the data, we modelled the (signed) error in perceived direction as a derivative of Gaussian function of the difference between the previous and the present stimulus. Underlying the use of a derivative of Gaussian function to model serial dependence (Fischer & Whitney, 2014) is the assumption that the strength of any interaction between successive stimuli will decrease monotonically and symmetrically with the difference between those stimuli. This qualitative behavior is conveniently described by a Gaussian function, such that the perceived direction of a stimulus is effectively modelled as a weighted sum of the present and previous stimuli, where the magnitude of the weight attached to the previous stimulus is a Gaussian function of the difference in their directions. For a given weighting, \(w\), of the preceding stimulus, \({f_{ - 1}}\), the bias it induces, \({b_{serial}}\), on the perceived direction of the present stimulus, \({f_0}\), will be proportional to the difference in their directions:  
\begin{equation}{b_{serial}}\left( {{f_{ - 1}} \to {f_0}} \right) = \left[ {w \times {f_{ - 1}} + \left( {1 - w} \right) \times {f_0}} \right] - {f_0} = w \times ({f_{ - 1}} - {f_0}){\rm {.}}\end{equation}
Thus, if the weight attached to the previous stimulus is a Gaussian function of the difference in their directions, then the bias it induces will be a derivative of Gaussian function of that difference, i.e., if  
\begin{equation}w\sim \exp \left[ { - {1 \over 2}{{\left( {{{{f_{ - 1}} - {f_0}} \over \sigma }} \right)}^2}} \right]{\rm {,}}\end{equation}
then  
\begin{equation}{b_{serial}}\left( {{f_{ - 1}} \to {f_0}} \right) = \left( {M \over \sigma } \right)\exp \left[ {1 \over 2} \right] \times \left( {{f_{ - 1}} - {f_0}} \right) \times \exp \left[ { - {1 \over 2}{{\left( {{{{f_{ - 1}} - {f_0}} \over \sigma }} \right)}^2}} \right]{}\end{equation}
where \(\sigma \) is the model parameter representing the direction difference at which the effect of serial dependence is maximal. The fourth model parameter, \(M\), represents the peak magnitude of this serial dependence. Veridical perception (i.e., no influence from the previous trial) would correspond to a peak magnitude of zero. Together, the effects of these two sources of bias sum such that their net effect, \({b_{net}}\), is given by  
\begin{equation}{b_{net}}\left( {{f_0}|{f_{ - 1}}} \right) = {b_{systematic}}\left( {f_0} \right) + {b_{serial}}\left( {{f_{ - 1}} \to {f_0}} \right){\rm {.}}\end{equation}
 
Geometrically, the fitted model corresponds to a surface in three dimensions where perceived direction is modelled as a function of the direction of the present stimulus, \({f_0}\), and the direction difference, \({f_{ - 1}} - {f_0}\), between the previous and present stimulus. However, for the purpose of visualization, the experimental data will be presented in the Results section collapsed across either (a) the possible range of stimulus directions, \({f_0}\), for each difference in direction, \({f_{ - 1}} - {f_0}\), between the previous and present stimulus, or (b) the possible range of direction differences, \({f_{ - 1}} - {f_0}\), between the previous and present stimulus for each direction, \({f_0}\), of the present stimulus. 
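The four-parameter model can be written out concretely as follows. This is our own sketch in Python (function and variable names are ours, not the authors' code), implementing the equations above term by term:

```python
import math

def b_systematic(f0, b_overall, s):
    """Linear systematic bias for a stimulus at deviation f0 (degrees)."""
    return b_overall + f0 * s

def b_serial(f_prev, f0, M, sigma):
    """Derivative-of-Gaussian serial bias: peaks with magnitude M at a
    previous-minus-present difference of +/- sigma degrees."""
    d = f_prev - f0
    return (M / sigma) * math.exp(0.5) * d * math.exp(-0.5 * (d / sigma) ** 2)

def b_net(f_prev, f0, b_overall, s, M, sigma):
    """Net effect: sum of the systematic and serial bias components."""
    return b_systematic(f0, b_overall, s) + b_serial(f_prev, f0, M, sigma)
```

The normalization factor \((M/\sigma)\exp(1/2)\) ensures that `b_serial` reaches exactly M when the previous-minus-present difference equals σ; for example, `b_serial(4, 0, 1.43, 4)` returns 1.43.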
For each experiment, model fitting was performed by minimizing the sum of squared error to the ensemble of data from all participants. To establish 95% CI on the resulting parameter estimates, bootstrap resampling with replacement (Efron & Tibshirani, 1993) was carried out across participants to create a surrogate data set of the same size as the original. The process was repeated to create 10,000 such data sets. The same analyses were then run on these surrogate data sets as on the original and the 2.5 and 97.5 percentiles of the resulting parameter distributions were taken as the respective lower and upper limits of the confidence interval. 
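The percentile bootstrap described above can be sketched as follows (a minimal illustration assuming a hypothetical `fit_model` callable that maps a list of per-participant data to a scalar parameter estimate; names are ours, not the authors'):

```python
import random

def bootstrap_ci(data, fit_model, n_boot=10_000, seed=None):
    """95% percentile CI from resampling participants with replacement
    (Efron & Tibshirani, 1993)."""
    rng = random.Random(seed)
    n = len(data)
    # Refit the model to each surrogate data set of n resampled participants
    estimates = sorted(
        fit_model([data[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot)
    )
    # 2.5th and 97.5th percentiles of the bootstrap distribution
    return estimates[int(0.025 * n_boot)], estimates[int(0.975 * n_boot)]
```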
Results
Experiment 1: Horizontal gaze direction
Experiment 1 investigated the perception of horizontal gaze direction. The resulting data are shown in Figure 2, represented in two different ways. Figure 2A shows perceived direction of gaze, as indicated by the mean pointer response across subjects (N = 15), as a function of the gaze deviation of the stimulus. The data show a monotonic and near linear relationship between response and stimulus. However, responding is far from veridical (compare data with dotted line in Figure 2A), showing a marked tendency to overestimate gaze deviation. 
Figure 2
 
Data from Experiment 1 (horizontal gaze deviation). (A) Mean direction of the pointer response across subjects (N = 15) as a function of the gaze deviation of the stimulus. Dotted line indicates veridical responding. (B) Mean (signed) error in response across subjects as a function of the difference in gaze deviation of the previous and present stimulus. Difference is represented as the direction of the previous stimulus minus the direction of the present stimulus, such that a positive difference indicates that the previous stimulus was more rightwards than the present one. Error is represented as the pointer response minus the actual gaze deviation of the stimulus, such that a positive error indicates a response more rightwards than the actual gaze direction of the stimulus. Dotted line represents the linear component of the model fit capturing the systematic bias to overestimate gaze deviation. In both panels, solid lines show the fit of the model; error bars are ± one standard error of the mean across subjects.
Figure 2B shows the same data now in terms of the mean (signed) error in response as a function of the difference in gaze direction between the current and previous stimulus. Specifically, error is represented as the pointer response minus the actual gaze deviation of the stimulus, such that a positive error indicates a response more rightwards than the actual gaze direction of the stimulus. The data can be seen to exhibit systematic biases as a function of the difference between the current and previous stimulus. However, it is important to understand that this is to be expected on the basis of the systematic overestimation of gaze evident in Figure 2A and is not, of itself, evidence of serial dependency in gaze perception. This point merits further explanation. 
By way of illustration, consider the data point on the far left of Figure 2B. This represents the response to stimuli preceded by a stimulus that was gazing 18° further to the left. Given that the range of stimuli was limited to ±9°, a gaze difference of 18° could only be produced by presenting a stimulus at −9° followed by one at +9°. Since the data from Figure 2A show a strong tendency to overestimate the magnitude of gaze deviation of averted stimuli, this necessarily translates to a systematic bias in the perception of stimuli at extreme gaze deviations. Importantly, this potential bias contamination is inevitably present not only for extremes but, to a lesser extent, throughout any data set collected as a function of an extensive (as opposed to circular) dimension. 
Parenthetically, a uniform distribution of stimulus trials presented in pseudorandom order, as employed here, leads to a triangular distribution of trials as a function of the difference in the direction of the present and previous stimulus, such that the extremes are underrepresented. Thus, the extreme data points do little to constrain the model fit and might be expected to show larger intersubject variability, as evident in Figure 2B. 
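This triangular distribution is easy to verify by enumerating all ordered pairs of the Experiment 1 deviation levels (a quick check of the counting argument, not part of the original analysis):

```python
import itertools
import collections

# Enumerate all ordered (previous, present) pairs of the Experiment 1
# deviation levels (-9 to +9 deg in 3 deg steps) and count each
# previous-minus-present difference.
deviations = range(-9, 10, 3)
counts = collections.Counter(
    prev - curr for prev, curr in itertools.product(deviations, repeat=2)
)
# A difference of 0 deg arises from seven pairs; +/-18 deg from one pair each.
```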
By modelling the data as a linear function of current stimulus gaze deviation (“systematic bias”) summed with a term dependent on the difference between previous and present stimuli (“serial bias”) we are able to parcel out the effect of systematic bias (indicated by the dotted line in Figure 2B). The data in Figure 2B can be seen to deviate systematically from the dotted line in a manner captured by the solid line, which represents the modelled net effect of systematic and serial bias. Quantitatively, the best-fitting model parameters are shown in Table 1 (variance accounted for: 96.0%). 
Table 1
 
Fitted parameters to the data from Experiment 1 (horizontal gaze deviation).
The model fits indicate that there was a significant serial dependency in the data, in that the (signed) peak magnitude was significantly greater than zero, indicating a tendency for a stimulus to be reported as more similar to the previous stimulus than was in fact the case. This effect reached its peak magnitude of 1.43° for gaze differences of around ±4° between the previous and present stimulus. Subjects displayed no significant overall leftwards or rightwards bias in their perception of gaze. The mean of \({b_{overall}}\) was 1.03°, indicating a small rightward value, but it was nonsignificant as the 95% confidence limits for \({b_{overall}}\) bracketed zero [−0.57°, 2.75°]. However, there was a significant tendency to overestimate the magnitude of gaze deviation from direct, as indicated by a slope, \(s\), significantly greater than 1. The best-fitting slope parameter of 1.97 represents an overestimation of 97%. 
Experiment 2: Vertical gaze direction
Experiment 2 followed the same procedure as Experiment 1 except that it investigated the perception of vertical rather than horizontal gaze deviation. This is an important extension, not only because the geometry of the eyes and head is very different along the horizontal and vertical directions, but also because of the different social signals conveyed by horizontally and vertically averted gaze. For example, downwards gaze can signal shame or embarrassment (Darwin, 1872), and gaze can be averted downwards while still being directed at the viewer (Lawson et al., 2011). 
The data from Experiment 2 are shown in Figure 3; best-fitting model parameters and their associated confidence intervals are given in Table 2 (variance accounted for 94.4%). Qualitatively, the pattern of results is very similar to that for Experiment 1. There is again a monotonic and near linear relationship between response and stimulus with a significant tendency to overestimate the deviation of gaze from subjective direct (Figure 3A). There is also a significant serial bias, indicated by the systematic deviation of the data in Figure 3B from the dotted line, corresponding to a tendency for a stimulus to be reported as similar to the previous one. Quantitatively, the mean overestimation of gaze deviation was 56%, and the serial dependency reached its peak magnitude of 0.61° for gaze differences of around ±4° between the previous and present stimulus. 
Figure 3
 
Data from Experiment 2 (vertical gaze deviation). (A) Mean direction of the pointer response across subjects (N = 15) as a function of the gaze deviation of the stimulus. Dotted line indicates veridical responding. (B) Mean (signed) error in response across subjects as a function of the difference in gaze deviation of the previous and present stimulus. Format as for Figure 2.
Table 2
 
Fitted parameters to the data from Experiment 2 (vertical gaze deviation).
One apparent difference from Experiment 1, which with hindsight we might have expected, was a significant overall bias of 6.21°. This appears to be a consequence of positioning the stimulus images so that the eyes were at the same level as the participant's, while presenting the pointer in the center of the screen. As a result, the pointer was positioned approximately 6.8° of visual angle below the eyes of the stimulus. Thus, to be seen as pointing in the same direction as the stimulus's eyes from the point of view of the participant, the pointer would have to be rotated upwards by approximately 6.8°. This is well within the 95% CI for our measure of overall bias [4.92°, 7.78°], indicating that there is no evidence for any genuine overall bias in the perception of vertical eye deviation. The data plotted in Figure 3 have been shifted vertically by subtracting 6.8°. 
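As a back-of-envelope check of this geometry (our own arithmetic, not the authors' code; the ~6 cm on-screen offset is inferred from the reported angle and viewing distance, not stated in the text):

```python
import math

def visual_angle_deg(offset_cm, distance_cm=50.0):
    """Angle subtended at the eye by an on-screen offset at a given
    viewing distance."""
    return math.degrees(math.atan(offset_cm / distance_cm))

# An offset of about 6 cm at the 50 cm viewing distance subtends
# roughly 6.8 deg, matching the reported pointer displacement.
angle = visual_angle_deg(6.0)
```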
Experiment 3: Horizontal head direction
Experiment 3 followed a similar procedure to the previous experiments except that participants were now required to report the direction of the stimulus's head rather than its eye gaze. The horizontal deviation of the head varied across trials while the eyes were always directed towards the viewer. The resulting data are shown in Figure 4. As with eye gaze, the data for head direction show a monotonic and near-linear relationship between response and stimulus (Figure 4A). However, a purely linear fit was able to account for 97.3% of the variance in the data plotted in Figure 4B, indicating that there was no appreciable serial dependency in the data. 
Figure 4
 
Data from Experiment 3 (horizontal head direction). (A) Mean direction of the pointer response across subjects (N = 15) as a function of the direction of the stimulus's head. Dotted line indicates veridical responding. (B) Mean (signed) error in response across subjects as a function of the difference in head direction of the previous and present stimulus. Format as for Figures 2 and 3.
Fitting the model to bootstrapped data yielded confidence intervals that bracketed zero for both serial dependency parameters, confirming the lack of significant serial dependence in the data. The parameters presented in Table 3 are therefore for a linear fit capturing only systematic bias. They reveal a significant overestimation of the magnitude of head direction deviation, by 62%, but no significant left-right bias. 
Table 3
 
Fitted parameters to the data from Experiment 3 (horizontal head deviation).
Response times for Experiments 1, 2, and 3
Finally, we analyzed reaction times (RTs) and found that RTs for head direction trials were not longer than for gaze direction trials. The mean RTs (and standard deviations) for each experiment were as follows: Experiment 1, mean RT = 1,529 ms (258 ms); Experiment 2, mean RT = 1,342 ms (204 ms); Experiment 3, mean RT = 1,572 ms (191 ms). A one-way ANOVA revealed a significant main effect of experiment, F(2, 42) = 4.6, p = 0.015, and post hoc tests indicated that RTs were significantly faster in Experiment 2 (vertical gaze deviation) than in Experiment 3 (horizontal head deviation), p = 0.006, but not than in Experiment 1 (horizontal gaze deviation), p = 0.024 (Bonferroni-corrected alpha level = 0.017). There was also no significant difference in RTs between horizontal gaze deviation (Experiment 1) and horizontal head deviation (Experiment 3), p = 0.60. 
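The Bonferroni logic behind these post hoc comparisons can be made explicit. The p values are the ones quoted above; the family-wise alpha of 0.05 divided across three pairwise comparisons gives the corrected threshold of ~0.017.

```python
# Bonferroni correction for the three pairwise post hoc comparisons above:
# the family-wise alpha of 0.05 is divided by the number of comparisons.
alpha = 0.05
n_comparisons = 3
corrected_alpha = alpha / n_comparisons  # ~0.0167, the 0.017 quoted in the text

p_values = {
    "Exp 2 vs Exp 3": 0.006,  # falls below the corrected threshold
    "Exp 1 vs Exp 2": 0.024,  # below 0.05 but NOT below the corrected threshold
    "Exp 1 vs Exp 3": 0.60,
}
for pair, p in p_values.items():
    verdict = "significant" if p < corrected_alpha else "not significant"
    print(f"{pair}: p = {p}, {verdict}")
```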
Discussion
In three experiments testing for serial dependencies in perceived eye and head direction, we find clear evidence of a positive dependency for eye gaze direction in the horizontal dimension (Figure 2B) but not for horizontal changes in head direction (Figure 4B). We also found a positive serial dependency for eye gaze in the vertical dimension (Figure 3B), although it was weaker than for horizontal gaze shifts. These findings add to other recent studies reporting that perception exhibits a positive serial dependency for a wide range of perceptual stimuli (Alais et al., 2017; Chang, Kim, & Cho, 2017; Cicchini et al., 2014; Corbett et al., 2011; Fischer & Whitney, 2014; Kiyonaga et al., 2017; Liberman et al., 2014; Manassi et al., 2017; Taubert, Alais, et al., 2016; Taubert, Van der Burg, et al., 2016; Xia et al., 2016). The advantage of this strategy is that integrating current stimulus estimates with preceding estimates can improve signal-to-noise ratios and stabilize vision. This would be particularly useful for gaze perception: the eyes move about jerkily from moment to moment, even when focused on a single object, task, or interlocutor, so averaging over the recent past would stabilize perception. Biasing perception towards the recent past also limits the possibility of the visual system “overfitting” the data by treating every fluctuation in stimulus input as meaningful signal. A positive dependency thus functions like a Kalman filter in system control models (Kalman, 1960) or like a predictive coding strategy, operating over a short time-frame to build stability by using the previous input to predict current stimuli (Friston, 2005; Rao & Ballard, 1999). 
While we found a clear serial dependency for eye gaze direction, head direction showed no hint of a serial dependency and was overwhelmingly accounted for by a simple linear model (Figure 4B). Serial dependency effects are time dependent, but timing differences can be ruled out as an explanation for the lack of a head direction dependency: response times did not differ significantly between Experiment 1 (horizontal eye direction) and Experiment 3 (horizontal head direction), and the intertrial timing was the same in both experiments, yet there was an effect for eye direction but not for head direction. On the view that a positive dependency functions as a “continuity field” to stabilize momentary fluctuations in perception (Fischer & Whitney, 2014; Kiyonaga et al., 2017), such a mechanism would be more beneficial to perception of eye direction than of head direction. This follows simply because eye movements occur much more frequently than head movements. Saccades occur at a rate of 3–4 Hz while the head is often stable for periods of several seconds at a time, and the much higher mass and inertia of the head relative to the eye means the head can be held still without jittery fluctuations from moment to moment (Cullen, 2009). The most common context in which we view someone making head movements is during conversation. Head movements are most frequent when viewing another person who is speaking, but even then the rate is low relative to saccades (0.75–1 Hz), and it is several times lower again when it is that person's turn to listen (Hadar, Steiner, Grant, & Rose, 1984; Hadar, Steiner, & Rose, 1985). These differences are informative in relation to the failure to find a head direction dependency for two reasons. First, there is less reason to integrate the recent past into current perception because, relative to the eyes, the head moves less often and is more stable between movements. 
Second, the elapsed time between head movements is large relative to eye movements and may well exceed the integration period of the continuity field operation. 
While there may be less benefit of a positive serial dependency for head position than for eye position, given the relative timescales of change typical of these stimuli, a negative (repulsive) serial dependency is thought to improve sensitivity to change. This follows because repulsion from the recent stimulus increases the apparent magnitude of change, as in the tilt and direction-of-motion aftereffects (Clifford, Wyatt, Arnold, Smith, & Wenderoth, 2001; Curran, Clifford, & Benton, 2006), which should increase discrimination performance provided noise variability does not increase by the same factor (Clifford et al., 2001). Given that head movements may be a more stable signal than eye movements, a negative dependency would be a useful way to improve perception of changes in head orientation. This would mean having two opposite dependencies at work within the same stimulus, which is possible if each operates on a different stimulus attribute. Such a situation has been reported in two recent studies, one finding opposite dependencies for motion (positive) and orientation (negative) in sequences of translating dot images (Alais et al., 2017), and another finding opposite dependencies for gender (positive) and expression (negative) in sequences of face images (Taubert, Alais, et al., 2016). Although we did not find any serial dependency for head position, the theoretical point remains that a negative dependency would be a beneficial strategy for detecting change, and future studies may yet find such an effect (or a positive serial dependency effect). It may be, for example, that a larger range of head angles, or perhaps longer stimulus presentations, would reveal a repulsive dependency for head direction. The latter point is relevant as serial dependencies can be positive for short durations (<1 s) and negative for longer durations of several seconds (Fischer & Whitney, 2014). 
It is possible, therefore, that the null result for head direction occurred because our stimulus duration fell at the point of the time-course where positive and negative dependencies cancel each other, although this seems unlikely given our brief stimuli (300 ms). 
The range of angles used in this study also provides important context for interpreting the magnitude of the effects reported here. In the horizontal gaze data (see Table 1), the perceptual attraction of the present trial towards the preceding trial peaks at an intertrial gaze difference of 4.03°, where it reaches a magnitude of 1.43°. Given that the range of gaze directions used in these experiments was relatively small (−9° to +9°), and that the average gaze difference between trials is just 6.86°, an effect size of 1.43° is considerable. The average of 6.86° is a weighted average, taking account of the fact that the distribution of gaze differences between trials is triangular: while differences as large as 18° are possible, smaller differences are more likely. Put into this context, an effect of 1.43° represents 20.8% of the average intertrial gaze difference. An open question that remains is whether the peak attraction effect we found at 4° would generalize to other stimulus ranges. While the peak could conceivably shift to larger (or smaller) angular differences if the stimulus range were increased (or decreased), our conjecture is that it would remain more or less constant, as it probably reflects a limit within which averaging over previous gaze angles is beneficial but beyond which a positive dependency would impair perception of significant changes in gaze. 
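The weighted average of 6.86° can be reproduced directly from the triangular distribution of intertrial differences. The seven gaze levels in 3° steps are an assumption on our part, inferred from the stated −9° to +9° range and consistent with the quoted average.

```python
from itertools import product

# Expected absolute intertrial gaze difference under the triangular
# distribution described above. Assumption: seven equally likely gaze
# directions from -9 to +9 deg in 3-deg steps on every trial.
directions = [-9, -6, -3, 0, 3, 6, 9]
abs_diffs = [abs(prev - cur) for prev, cur in product(directions, repeat=2)]
mean_abs_diff = sum(abs_diffs) / len(abs_diffs)
print(f"{mean_abs_diff:.2f}")  # 6.86
```

Large differences (up to 18°) occur in only a few of the 49 trial pairings, which is what pulls the weighted average down to 6.86° rather than the 9° midpoint of the possible range.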
It is important to note that for a quantity such as horizontal gaze, or any quantity that varies along a linear dimension (e.g., from left to right, as opposed to a circular variable such as orientation), the sampling of present and past stimuli will exhibit a degree of covariance. For example, whatever the range of stimulus gazes that is used, if the current trial presents the leftmost gaze direction, it will almost always have been preceded by a more rightward gaze direction. Thus, current gaze direction, \(f_0\), and the gaze direction difference from the previous trial, \(f_{-1} - f_0\), inevitably covary. Consequently, a plot of perceived gaze as a function of the direction of the current stimulus would be contaminated by any serial dependence in the data. For example, a positive serial dependency might, in isolation, produce a range effect resembling “regression to the mean,” which would be most evident for large gaze differences (although such an effect is not evident in our data: see Figures 2A and 3A). Conversely, plotting gaze error as a function of the gaze difference between the previous and current stimuli (as in Figures 2B and 3B) would be influenced by any systematic bias in perceived gaze that depends on the present stimulus alone. Note that this is not a problem for circular quantities such as orientation (Fischer & Whitney, 2014), as it is easy to balance the joint distribution of the present stimulus and its difference from the previous stimulus. It is, however, relevant to all serial dependence studies investigating stimuli on linear dimensions, because it is not necessarily true that all bias evident in a plot of response error as a function of intertrial difference can be attributed to serial dependence alone. Thus we anticipate that the modelling approach taken here to separate serial from systematic biases will be of general applicability to studies across a broad range of stimulus domains. 
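The covariance argument can be checked with a quick simulation: even when successive trials are drawn independently, the current direction \(f_0\) and the intertrial difference \(f_{-1} - f_0\) are negatively correlated on a bounded linear dimension. The seven gaze levels below are an assumption based on the stated −9° to +9° range.

```python
import random

# Simulation of the covariance noted above. With independent draws,
# cov(f0, f_-1 - f0) = cov(f0, f_-1) - var(f0) = -var(f0), i.e. clearly
# negative: a leftmost current stimulus was almost always preceded by a
# more rightward one. Gaze levels are assumed (-9 to +9 deg, 3-deg steps).
random.seed(0)
directions = [-9, -6, -3, 0, 3, 6, 9]
trials = [random.choice(directions) for _ in range(200_000)]

f0 = trials[1:]                                          # current trial
diff = [p - c for p, c in zip(trials[:-1], trials[1:])]  # f_-1 - f0
m0 = sum(f0) / len(f0)
md = sum(diff) / len(diff)
cov = sum((a - m0) * (b - md) for a, b in zip(f0, diff)) / len(f0)
print(f"cov(f0, f_-1 - f0) = {cov:.1f}")  # close to -var(f0) = -36
```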
In each of the three experiments, we observed a significant tendency to overestimate the deviation of the stimulus's eye gaze or head direction from direct. Previous studies have consistently reported that when the stimulus is clearly visible, as in the current experiments, there is a tendency to overestimate horizontal gaze deviation (Anstis et al., 1969; Otsuka et al., 2016; Vine, 1971), and the analogous observation has been made for head rotation (Loomis, Kelly, Pusch, Bailenson, & Beall, 2008). A similar tendency to overestimate vertical gaze deviation is evident in the data of Mareschal, Otsuka, and Clifford (2014), although it was not commented upon there. Interestingly, Anstis et al. (1969) found no such overestimation of vertical gaze deviation when subjects were required to indicate the gaze direction of an actual person. They attributed this to the strong cue provided by the upper eyelid tracking the upper edge of the pupil during vertical eye movements. No such cue was present in our study or that of Mareschal et al. (2014). 
Under conditions of high uncertainty, such as when the eye region is indistinct, gaze tends to appear more direct (Mareschal, Calder, Dadds, & Clifford, 2013; Mareschal et al., 2014; Martin & Rovira, 1982; Vine, 1971) such that the overestimation bias is reduced or reversed. This effect of uncertainty on perceived gaze direction has been taken as evidence that humans have a prior expectation that gaze is directed towards them (Mareschal et al., 2013; Mareschal et al., 2014). It would be interesting to see whether this prior expectation extends to other cues to the direction of another's attention, such as head orientation, given that when clearly visible it is overestimated in a similar manner to gaze direction. 
In principle, biases indicated by the setting of a pointer could result not from biases in perception but from biases in the reproduction of the perceived direction. Indeed, using the same pointer, Otsuka et al. (2016) found a slight but significant tendency for overestimation even when the stimulus whose (horizontal) orientation was to be reproduced was identical to the pointer itself: slope 1.17, 95% CI [1.07, 1.30]. However, the upper bound on the 95% CI on the slope of their regression line for reproduced versus actual pointer orientation was smaller than the slope in each of the three experiments reported here, indicating the existence of genuine perceptual overestimation biases that cannot be attributed to the process of reproducing the perceived direction with a pointer. 
Our modelling of the current datasets assumed that biases in perceived gaze direction can be decomposed into systematic and serial factors, and furthermore that any systematic bias is a linear function of the actual gaze direction. While this is likely a simplification, the model accounted for around 95% of the variance of the data in each of Experiments 1 and 2. Furthermore, it has previously been shown that the dependence of perceived gaze direction on eye deviation can be well accounted for by a linear model (Otsuka et al., 2016), although other work has reported a modest but significant nonlinearity in the vicinity of the category boundary between direct and averted gaze (Sweeny & Whitney, 2017). 
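The two-part decomposition can be sketched as follows. The derivative-of-Gaussian shape for the serial term is an assumption borrowed from Fischer and Whitney (2014), not necessarily the exact form fitted in this paper, and the slope is purely illustrative; only the peak location (4.03°) and amplitude (1.43°) are the values quoted for the horizontal gaze data.

```python
import math

# Hedged sketch of the decomposition described above: response error as a
# linear systematic bias in the current direction f0 plus a serial term in
# the intertrial difference d = f_-1 - f0. Functional form and slope are
# illustrative assumptions; width and amplitude are the quoted peak values.
def predicted_error(f0, d, slope=0.3, intercept=0.0, amplitude=1.43, width=4.03):
    systematic = intercept + slope * f0
    # Scaled first derivative of a Gaussian: peaks at d = width with value `amplitude`.
    serial = amplitude * (d / width) * math.exp(0.5 * (1.0 - (d / width) ** 2))
    return systematic + serial

# At f0 = 0 the systematic term vanishes; the serial term peaks at d = 4.03 deg.
print(f"{predicted_error(0.0, 4.03):.2f}")  # 1.43
```

The attraction of this parameterization is that the two serial parameters map directly onto the quantities discussed in the text: the gaze difference at which attraction peaks, and the magnitude of that peak.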
In sum, we have shown evidence of a positive or “attractive” serial dependency of perceived gaze direction using sequences of brief stimuli and a reproduction task. Both vertical and horizontal gaze exhibit the effect, with horizontal being stronger. A model combining a systematic bias plus a serial dependency component accounts very well for the data. This assimilative serial dependency contrasts with previous reports of repulsive gaze aftereffects following prolonged adaptation to gaze (Jenkins, Beaver, & Calder, 2006; Palmer & Clifford, 2017; Seyama & Nagayama, 2006) and head direction (Fang & He, 2005; Lawson & Calder, 2016). This difference is consistent with the notion that weak or brief stimulation (e.g., by the test stimulus on the previous trial) serves to prime perception and leads to attractive effects whereas strong, prolonged stimulation tends to habituate neural mechanisms and leads to repulsive effects (Pearson, Clifford, & Tong, 2008). 
Acknowledgments
This work is supported by Australian Research Council grants DP150101731 to DA and DP160102239 to CC. We are grateful to Yumiko Otsuka for helpful comments during the writing of this manuscript. 
Commercial relationships: none. 
Corresponding author: David Alais. 
Address: The University of Sydney, School of Psychology, New South Wales, Australia. 
References
Addams, R. (1834). An account of a peculiar optical phenomenon seen after having looked at a moving body. London and Edinburgh Philosophical Magazine and Journal of Science, 5, 373–374.
Alais, D., Leung, J., & Van der Burg, E. (2017). Linear Summation of Repulsive and Attractive Serial Dependencies: Orientation and Motion Dependencies Sum in Motion Perception. Journal of Neuroscience, 37 (16), 4381–4390.
Alais, D., Orchard-Mills, E., & Van der Burg, E. (2015). Auditory frequency perception adapts rapidly to the immediate past. Attention, Perception, & Psychophysics, 77 (3), 896–906.
Anstis, S. M., Mayhew, J. W., & Morley, T. (1969). The perception of where a face or television “portrait” is looking. American Journal of Psychology, 82 (4), 474–489.
Becchio, C., Bertone, C., & Castiello, U. (2008). How the gaze of others influences object processing. Trends in Cognitive Sciences, 12 (7), 254–258.
Calder, A. J., Jenkins, R., Cassel, A., & Clifford, C. W. (2008). Visual representation of eye gaze is coded by a nonopponent multichannel system. Journal of Experimental Psychology: General, 137 (2), 244–261.
Chang, S., Kim, C. Y., & Cho, Y. S. (2017). Sequential effects in preference decision: Prior preference assimilates current preference. PLoS One, 12 (8), e0182442.
Cicchini, G. M., Anobile, G., & Burr, D. C. (2014). Compressive mapping of number to space reflects dynamic encoding mechanisms, not static logarithmic transform. Proceedings of the National Academy of Sciences, USA, 111 (21), 7867–7872.
Clifford, C. W., Wenderoth, P., & Spehar, B. (2000). A functional angle on some after-effects in cortical vision. Proceedings of the Royal Society B: Biological Sciences, 267 (1454), 1705–1710.
Clifford, C. W., Wyatt, A. M., Arnold, D. H., Smith, S. T., & Wenderoth, P. (2001). Orthogonal adaptation improves orientation discrimination. Vision Research, 41 (2), 151–159.
Corbett, J. E., Fischer, J., & Whitney, D. (2011). Facilitating stable representations: Serial dependence in vision. PLoS One, 6 (1), e16701.
Cullen, K. E. (2009). Eye and head movements. In Squire L. R. (Ed.), Encyclopedia of neuroscience (Vol. 10, pp. 157–167). Oxford, UK: Academic Press.
Curran, W., Clifford, C. W., & Benton, C. P. (2006). The direction aftereffect is driven by adaptation of local motion detectors. Vision Research, 46 (25), 4270–4278.
Darwin, C. (1872). The expression of the emotions in man and animals (3rd ed.). New York, NY: Oxford University Press.
Efron, B., & Tibshirani, R. J. (1993). An introduction to the bootstrap (3rd ed.). New York, NY: Chapman & Hall.
Fang, F., & He, S. (2005). Viewer-centered object representation in the human visual system revealed by viewpoint aftereffects. Neuron, 45 (5), 793–800.
Fischer, J., & Whitney, D. (2014). Serial dependence in visual perception. Nature Neuroscience, 17 (5), 738–743.
Freedman, E. G., & Sparks, D. L. (2000). Coordination of the eyes and head: Movement kinematics. Experimental Brain Research, 131 (1), 22–32.
Friston, K. (2005). A theory of cortical responses. Philosophical Transactions of the Royal Society of London B: Biological Sciences, 360 (1456), 815–836.
Gibson, J. J., & Radner, M. (1937). Adaptation, after-effect and contrast in the perception of tilted lines. I. Quantitative studies. Journal of Experimental Psychology, 20, 453–467.
Hadar, U., Steiner, T. J., Grant, E. C., & Rose, F. C. (1984). The timing of shifts in head posture during conversation. Human Movement Science, 3, 237–245.
Hadar, U., Steiner, T. J., & Rose, F. C. (1985). Head movement during listening turns in conversation. Journal of Nonverbal Behavior, 9, 214–228.
Itier, R. J., & Batty, M. (2009). Neural bases of eye and gaze processing: The core of social cognition. Neuroscience Biobehavioral Review, 33 (6), 843–863.
Jenkins, R., Beaver, J. D., & Calder, A. J. (2006). I thought you were looking at me: Direction-specific aftereffects in gaze perception. Psychological Science, 17 (6), 506–513.
Kalman, R. E. (1960). A new approach to linear filtering and prediction problems. Journal of Basic Engineering, 82, 35–45.
Kiyonaga, A., Scimeca, J. M., Bliss, D. P., & Whitney, D. (2017). Serial dependence across perception, attention, and memory. Trends in Cognitive Sciences, 21 (7), 493–497.
Kok, R., Taubert, J., Van der Burg, E., Rhodes, G., & Alais, D. (2017). Face familiarity promotes stable identity recognition: Exploring face perception using serial dependence. Royal Society Open Science, 4 (3), 160685.
Lawson, R. P., & Calder, A. J. (2016). The “where” of social attention: Head and body direction aftereffects arise from representations specific to cue type and not direction alone. Cognitive Neuroscience, 7 (1–4), 103–113.
Lawson, R. P., Clifford, C. W., & Calder, A. J. (2011). A real head turner: Horizontal and vertical head directions are multichannel coded. Journal of Vision, 11 (9): 17, 1–17, https://doi.org/10.1167/11.9.17. [PubMed] [Article]
Liberman, A., Fischer, J., & Whitney, D. (2014). Serial dependence in the perception of faces. Current Biology, 24 (21), 2569–2574.
Loomis, J. M., Kelly, J. W., Pusch, M., Bailenson, J. N., & Beall, A. C. (2008). Psychophysics of perceiving eye-gaze and head direction with peripheral vision: Implications for the dynamics of eye-gaze behavior. Perception, 37 (9), 1443–1457.
Manassi, M., Liberman, A., Chaney, W., & Whitney, D. (2017). The perceived stability of scenes: Serial dependence in ensemble representations. Scientific Reports, 7 (1): 1971.
Mareschal, I., Calder, A. J., Dadds, M. R., & Clifford, C. W. (2013). Gaze categorization under uncertainty: Psychophysics and modeling. Journal of Vision, 13 (5): 18, 1–10, https://doi.org/10.1167/13.5.18. [PubMed] [Article]
Mareschal, I., Otsuka, Y., & Clifford, C. W. (2014). A generalized tendency toward direct gaze with uncertainty. Journal of Vision, 14 (12): 27, 1–9, https://doi.org/10.1167/14.12.27. [PubMed] [Article]
Martin, W., & Rovira, M. (1982). Response biases in eye-gaze perception. Journal of Psychology, 110 (2), 203–209.
Mather, G., Anstis, S. M., & Verstraten, F. A. (Eds.). (1998). The motion aftereffect: A modern perspective. Cambridge, MA: MIT Press.
Otsuka, Y., Mareschal, I., Calder, A. J., & Clifford, C. W. (2014). Dual-route model of the effect of head orientation on perceived gaze direction. Journal of Experimental Psychology: Human Perception & Performance, 40 (4), 1425–1439.
Otsuka, Y., Mareschal, I., & Clifford, C. W. (2016). Testing the dual-route model of perceived gaze direction: Linear combination of eye and head cues. Journal of Vision, 16 (8): 8, 1–12, https://doi.org/10.1167/16.8.8. [PubMed] [Article]
Palmer, C. J., & Clifford, C. W. G. (2017). Functional mechanisms encoding others' direction of gaze in the human nervous system. Journal of Cognitive Neuroscience, 29 (10), 1725–1738.
Pearson, J., Clifford, C. W., & Tong, F. (2008). The functional impact of mental imagery on conscious perception. Current Biology, 18 (13), 982–986.
Rao, R. P., & Ballard, D. H. (1999). Predictive coding in the visual cortex: A functional interpretation of some extra-classical receptive-field effects. Nature Neuroscience, 2 (1), 79–87.
Rolfs, M. (2009). Microsaccades: Small steps on a long way. Vision Research, 49 (20), 2415–2441.
Rosenfield, M., & Logan, N. (2009). Optometry: Science, techniques and clinical management (2nd ed.). New York, NY: Elsevier Health Sciences.
Seyama, J., & Nagayama, R. S. (2006). Eye direction aftereffect. Psychological Research, 70 (1), 59–67.
Sweeny, T. D., & Whitney, D. (2017). The center of attention: Metamers, sensitivity, and bias in the emergent perception of gaze. Vision Research, 131, 67–74.
Taubert, J., & Alais, D. (2016). Serial dependence in face attractiveness judgements tolerates rotations around the yaw axis but not the roll axis. Visual Cognition, 24, 103–114.
Taubert, J., Alais, D., & Burr, D. (2016). Different coding strategies for the perception of stable and changeable facial attributes. Scientific Reports, 6: 32239.
Taubert, J., Van der Burg, E., & Alais, D. (2016). Love at second sight: Sequential dependence of facial attractiveness in an on-line dating paradigm. Scientific Reports, 6: 22740.
Van der Burg, E., Alais, D., & Cass, J. (2013). Rapid recalibration to audiovisual asynchrony. Journal of Neuroscience, 33 (37), 14633–14637.
Van der Burg, E., Orchard-Mills, E., & Alais, D. (2015). Rapid temporal recalibration is unique to audiovisual stimuli. Experimental Brain Research, 233 (1), 53–59.
Vine, I. (1971). Judgement of direction of gaze: An interpretation of discrepant results. The British Journal of Social and Clinical Psychology, 10 (4), 320–331.
Xia, Y., Leib, A. Y., & Whitney, D. (2016). Serial dependence in the perception of attractiveness. Journal of Vision, 16 (15): 28, 1–8, https://doi.org/10.1167/16.15.28. [PubMed] [Article]
Figure 1
 
(A–C) Exemplars of the face stimuli used in Experiments 1–3. The difference in gaze directions is more apparent when presented at life size (as was the case in our experiments). These examples have enough resolution to allow zooming in to life-size scale. (D) Participants reported the perceived direction of gaze (Experiments 1 and 2) or head deviation (Experiment 3) by rotating an on-screen pointer.
Figure 2
 
Data from Experiment 1 (horizontal gaze deviation). (A) Mean direction of the pointer response across subjects (N = 15) as a function of the gaze deviation of the stimulus. Dotted line indicates veridical responding. (B) Mean (signed) error in response across subjects as a function of the difference in gaze deviation of the previous and present stimulus. Difference is represented as the direction of the previous stimulus minus the direction of the present stimulus, such that a positive difference indicates that the previous stimulus was more rightwards than the present one. Error is represented as the pointer response minus the actual gaze deviation of the stimulus, such that a positive error indicates a response more rightwards than the actual gaze direction of the stimulus. Dotted line represents the linear component of the model fit capturing the systematic bias to overestimate gaze deviation. In both panels, solid lines show the fit of the model; error bars are ± one standard error of the mean across subjects.
Figure 3
 
Data from Experiment 2 (vertical gaze deviation). (A) Mean direction of the pointer response across subjects (N = 15) as a function of the gaze deviation of the stimulus. Dotted line indicates veridical responding. (B) Mean (signed) error in response across subjects as a function of the difference in gaze deviation of the previous and present stimulus. Format as for Figure 2.
Table 1
 
Fitted parameters to the data from Experiment 1 (horizontal gaze deviation).