A generalized tendency toward direct gaze with uncertainty
Author Affiliations
  • Isabelle Mareschal
    School of Biological and Chemical Sciences, Psychology, Queen Mary University of London, London, UK
    School of Psychology, The University of Sydney, Sydney, Australia
    i.mareschal@qmul.ac.uk
  • Yumiko Otsuka
    School of Psychology, UNSW Australia, Sydney, Australia
    yumikoot@gmail.com
  • Colin W. G. Clifford
    School of Psychology, UNSW Australia, Sydney, Australia
    colin.clifford@unsw.edu.au
Journal of Vision October 2014, Vol. 14(12), 27. https://doi.org/10.1167/14.12.27
Abstract

Joint gaze behavior plays a crucial role in nonverbal communication and enhances group interactions. We recently reported that under conditions of uncertainty, observers assume that another person's (left/right averted) gaze is directed toward them, a prior for direct gaze. Given that people's gaze can deviate in many directions during social interactions, we developed a versatile method to examine how the influence of the prior for direct gaze varies across a range of gaze directions, in both forward facing and rotated heads. We find that observers tend to report another's gaze along all axes as more direct when uncertainty is introduced by adding noise to the stimulus. We also find that the influence of the prior is stronger for rotated heads than for forward facing heads. This is consistent with the idea that, when uncertain, humans tend to judge gaze as being directed at them, regardless of head rotation or axis of deviation.

Introduction
Social interactions require the ability to parse and interpret the plethora of cues that reveal other people's intentions. Of the nonverbal signals driving this task, estimating the direction of another person's gaze plays a critical role (George & Conty, 2008; Kleinke, 1986; Senju & Johnson, 2009). This fundamental role of gaze is supported by the discovery of a specific brain area dedicated to encoding directions of gaze (Calder et al., 2007; Pelphrey, Morris, & McCarthy, 2005; Perrett et al., 1985). Of particular relevance to joint communication is when we judge that another person's gaze is directed at us (direct gaze; Emery, 2000; Kleinke, 1986; Pfeiffer, Vogeley, & Schilbach, 2013). Direct gaze has been reported to draw attention quickly (Conty et al., 2006; von Grunau & Anston, 1995), to elicit observer responses (Adams & Kleck, 2005; Senju & Johnson, 2009) and to reach conscious awareness (Stein & Sterzer, 2011) faster than averted gaze, and it may not even require attention to be accurately processed in the periphery (Yokoyama, Sakai, Noguchi, & Kita, 2014). Supporting this idea of a special role for direct gaze, we recently found that when observers were uncertain as to another person's direction of gaze (noise added to the stimulus), they tended to perceive it as being directed toward them (Mareschal, Calder, & Clifford, 2013a).
It is well known that people use more than just eye information when they judge a person's direction of gaze. Wollaston (1824) reported that head rotation influences gaze perception by showing that identically configured eyes placed into differently rotated heads appeared to be gazing in different directions. Since then, some authors have reported repulsive effects of head rotation (perceived gaze is shifted away from the head direction; e.g., Anstis, Mayhew, & Morley, 1969; Gamer, Hecht, Seipp, & Hiller, 2011; Gibson & Pick, 1963; Mareschal et al., 2013a) while others have reported attractive effects (perceived gaze is shifted toward the head rotation; e.g., Cline, 1967; Langton, Honeyman, & Tessler, 2004; Todorovic, 2006). We recently showed how observers combine eye deviation and head rotation cues when making judgments of eye gaze, with the repulsive effect prevailing in conditions where a whole head is viewed (Otsuka, Mareschal, Calder, & Clifford, 2014). Importantly, we also reported an associated increase in observers' uncertainty with head rotation, suggesting that the prior may have a greater influence for rotated heads.
Given that in natural interactions another person's gaze can deviate in many directions, we sought to examine the influence of the prior for direct gaze when gaze deviations were presented along cardinal (horizontal and vertical) and noncardinal (oblique) axes, in forward facing and rotated heads. We were interested in assessing whether increased stimulus uncertainty led to noncardinal directions of gaze being perceived as more “direct” in forward facing and rotated heads, and whether there was any evidence of a prior toward cardinal directions (as has been reported for the orientation of gratings or arrays of texture elements; Girshick, Landy, & Simoncelli, 2011; Tomassini, Morgan, & Solomon, 2010). To test this, we developed a new method of measuring perceived gaze direction that can be easily applied to populations that are more difficult to test, such as children (Vida & Maurer, 2012) or certain clinical populations (e.g., autism spectrum disorder; Langdon et al., 2006; Ristic et al., 2005).
Methods
Observers
Two of the authors (IM, YO) and 10 naïve observers served as subjects. Four observers (IM, JF, HJ, RP) performed both experiments. All wore optical correction as necessary. Approval for the experiments was obtained from the Institutional Ethics Committees and experiments adhered to the provisions of the Declaration of Helsinki. All observers gave informed consent. 
Apparatus and stimuli
A Dell XPS computer running Matlab (MathWorks Ltd., Natick, MA) was used for stimulus generation, experiment control, and recording subjects' responses. The programs controlling the experiment incorporated elements of the PsychToolbox (Brainard, 1997). Stimuli were displayed either on a Sony Trinitron 20SE monitor (1024 × 768 pixels, refresh rate: 75 Hz; Sony, Tokyo, Japan) driven by the computer's built-in NVIDIA GeForce GTS 240 graphics card, or on a LaCie Electron Blue monitor (1024 × 768 pixels, refresh rate: 75 Hz; LaCie, Paris, France) driven by an AMD Radeon HD7470 graphics card (Dell, Round Rock, TX). Displays were calibrated using a photometer and linearized using look-up tables in software. At the viewing distance of 57 cm, one pixel subtended 2.2 arcmin.
Stimuli
Face stimuli
Eight gray-scale faces (four male, four female) with neutral expressions were created with Daz software (http://www.daz3d.com/). One of the female faces is shown in Figure 1a. The hair was cropped and the face was presented in the middle of the monitor. The faces were either forward facing (Experiment 1) or rotated 15° to the left using FaceGen software (Experiment 2). The stimuli subtended on average 15.1° × 11.2° and were viewed at 57 cm in a dimly lit room. In order to control the direction of gaze, the original eyes in the faces were replaced, using Gimp software, by grayscale eye stimuli created in Matlab. The deviation of each eye was independently controlled using Matlab procedures that gave us precision down to the nearest pixel, as sketched below.
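As an illustration of how gaze angles can be mapped onto pixel-level iris displacements, the following Python sketch assumes a simple spherical-eyeball model in which rotating the eye by θ displaces the iris centre by r·sin(θ); the eyeball radius (in pixels) and the sine mapping are our assumptions for illustration, since the paper states only that each eye's deviation was controlled to the nearest pixel.

```python
import numpy as np

def iris_offset_px(gaze_deg, eyeball_radius_px=12.0):
    """Iris-centre displacement (in pixels) when a spherical eyeball
    is rotated by gaze_deg: offset = r * sin(theta). The radius is
    purely illustrative; the paper does not specify this mapping."""
    return eyeball_radius_px * np.sin(np.deg2rad(gaze_deg))

# The seven gaze deviations used in the experiments, rounded to the
# nearest pixel as described in the text:
for dev in (-9, -6, -3, 0, 3, 6, 9):
    print(dev, int(round(float(iris_offset_px(dev)))))
```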
Figure 1
 
Sample stimuli and procedure. (a) Example of a computer-generated female face looking 9° to the right. (b) Same face with noise added to the eyes. (c) Example of pointer used to indicate direction of gaze, with a red line to facilitate indicating the gaze direction. Note the pointer always appeared after the face stimulus was extinguished, at the same location as the face. (d) Schematic of the eight axes along which the stimuli were presented.
Noisy eyes
Fractal noise (1/f amplitude spectrum) was added to the eyes of the same faces (e.g., Figure 1b). The noise was held constant at 6% RMS contrast and, for all observers, the Michelson contrast between sclera and pupil was 7.5%.
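The paper specifies the noise only by its amplitude spectrum and contrast. As a hedged illustration, the Python sketch below synthesizes 1/f-amplitude noise via random phases in the Fourier domain and scales it to a target RMS contrast, taking RMS contrast as SD/mean and assuming a mean luminance of 0.5 in normalized [0, 1] units (the mean luminance is our assumption).

```python
import numpy as np

def fractal_noise(size, rms_contrast=0.06, mean_lum=0.5, rng=None):
    """1/f-amplitude-spectrum noise scaled to a target RMS contrast
    (SD / mean luminance). Mean luminance of 0.5 is an assumption."""
    rng = np.random.default_rng() if rng is None else rng
    fy = np.fft.fftfreq(size)[:, None]
    fx = np.fft.fftfreq(size)[None, :]
    f = np.hypot(fx, fy)
    f[0, 0] = 1.0                       # avoid division by zero at DC
    phase = rng.uniform(0.0, 2.0 * np.pi, (size, size))
    noise = np.real(np.fft.ifft2(np.exp(1j * phase) / f))
    noise -= noise.mean()               # zero-mean fluctuation
    noise *= rms_contrast * mean_lum / noise.std()
    return mean_lum + noise

patch = fractal_noise(64)
print(patch.std() / patch.mean())       # approximately 0.06
```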
Procedure
The observers' task was to indicate the direction of gaze using a pointer that appeared on the screen after the stimulus was extinguished (Figure 1c). The direction of gaze of the stimuli varied along eight axes: Left/Right (LR); LeftUp/RightDown at 157.5° (LU1/RD1); LeftUp/RightDown at 135° (LU2/RD2); LeftUp/RightDown at 112.5° (LU3/RD3); Up/Down (UD); LeftDown/RightUp at 22.5° (LD1/RU1); LeftDown/RightUp at 45° (LD2/RU2); LeftDown/RightUp at 67.5° (LD3/RU3) (Figure 1d).
Each gazing stimulus was presented for 500 ms and then extinguished, followed by the presentation of a pointer whose direction could be varied through 360° and whose length indicated the magnitude of the gaze deviation. The pointer appeared at the same location as where the face had appeared. Stimuli were presented using a method of constant stimuli with seven different gaze deviations selected from the set: {−9°, −6°, −3°, 0°, 3°, 6°, 9°}. Each gaze deviation was sampled four times within a run, with equal presentation of the male and female faces. Observers performed four runs per experiment, resulting in a total of 1,792 trials (7 gaze deviations × 8 direction axes × 4 faces × 4 runs × 2 conditions).
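The full design can be enumerated directly from these numbers. The sketch below builds such a trial list in Python; the face labels and the interleaving of conditions are our assumptions for illustration, as the paper does not specify how runs and conditions were ordered.

```python
import itertools
import random

deviations = [-9, -6, -3, 0, 3, 6, 9]       # gaze deviations (deg)
axes = [i * 22.5 for i in range(8)]         # 0 (LR) ... 157.5 (LU1/RD1)
faces = ["F1", "F2", "M1", "M2"]            # hypothetical labels: 2 female, 2 male
conditions = ["noiseless", "noisy"]

trials = [
    {"condition": c, "run": r, "face": f, "axis_deg": a, "deviation": d}
    for c, r, f, a, d in itertools.product(
        conditions, range(4), faces, axes, deviations)
]
random.shuffle(trials)                      # constant stimuli: randomized order
print(len(trials))                          # 1792 = 7 * 8 * 4 * 4 * 2
```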
Results
Figure 2 shows the distribution of responses for one example axis (LU3/RD3, 112.5°) for the seven gaze deviations tested {−9°, −6°, −3°, 0°, 3°, 6°, 9°}. Gray-filled squares represent the vector average of the data in the noiseless condition (top) and noisy condition (bottom). In order to characterize how responses changed when noise was added to the stimulus, we calculated the shift in the vector average between the noiseless and noisy data.
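Concretely, the vector average treats each pointer setting as a 2-D endpoint (direction and length) and averages those endpoints; the shift is then the difference between the noisy and noiseless averages. A minimal Python sketch with illustrative numbers, assuming pointer angles are measured counterclockwise from rightward (the coordinate convention is our assumption):

```python
import numpy as np

def vector_average(angles_deg, lengths):
    """Mean 2-D pointer endpoint from response directions and lengths."""
    a = np.deg2rad(np.asarray(angles_deg, dtype=float))
    lengths = np.asarray(lengths, dtype=float)
    xy = np.column_stack([lengths * np.cos(a), lengths * np.sin(a)])
    return xy.mean(axis=0)

# Illustrative responses to one stimulus on the 112.5 degree axis:
noiseless = vector_average([110, 115, 112, 108], [8.5, 9.2, 9.0, 8.8])
noisy = vector_average([105, 118, 100, 95], [5.0, 6.1, 4.2, 5.5])
shift = noisy - noiseless        # the line drawn from each dot in Figure 3
print(noiseless, noisy, shift)
```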
Figure 2
 
Pointer responses to stimuli presented along the LU3/RD3 (112.5°) axis in the noiseless condition (top) and noisy condition (bottom). Gray squares are average of data points. Red points in each polar plot are the 16 pointer positions to leftwards deviated stimuli; green are pointer positions to the direct stimuli and blue are the pointer positions to the rightwards stimuli.
This shift is plotted for each observer in Figure 3a. In this format each dot represents the average pointer position in a given noiseless condition; the line projecting from the dot extends to the average position calculated for the corresponding noisy condition. This provides a straightforward visualization of the change in each observer's perceived gaze deviation when noise was added to the stimuli. We first estimated the bias in the noiseless condition by calculating the average pointer positions (solid dots in Figure 3b) for physically direct stimuli, resulting in a bias of −2.12° (or a bias of −1.97° when responses are averaged across all stimuli). This indicates that a physically direct gaze looks slightly leftward, consistent with previous reports (Calder et al., 2008; Mareschal et al., 2013a; Mareschal, Calder, Dadds, & Clifford, 2013b).
Figure 3
 
Flow fields derived from forward facing heads. (a) Individual observers' flow fields. Each black dot represents the average position of the pointer in the noiseless trials with the line extending to the average position in the corresponding noisy condition. The change in perceived gaze with noise is therefore represented by the overall direction of the lines, creating a “flow field.” (b) Averaged flow field data, across all observers. Affine fit to the average flow field and Residuals of the affine fit.
In order to quantify the direction of the flow fields, we fit an affine transformation to the average data (Figure 3b):

n = A(z − c) + c

where n and z are the responses to the noisy and noiseless stimuli, respectively, and c is the central focus of the affine transformation A. Note that n, z, and c are two-element vectors and A is a 2 × 2 matrix. The elements on the leading diagonal of A describe the scaling of the data in the x- and y-dimensions. The other two elements are involved in rotational or shearing transformations but are close to zero in the fits to the data here, so they will not be discussed further.
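Rewriting the transformation as n = Az + b, with b = (I − A)c, reduces the fit to linear least squares, after which the central focus can be recovered as c = (I − A)⁻¹b. The Python sketch below is our illustration of this procedure (not the authors' code), checked against synthetic data generated with the parameters reported in the next paragraph.

```python
import numpy as np

def fit_affine(z, n):
    """Least-squares fit of n = A @ (z - c) + c.

    z, n : (N, 2) arrays of average noiseless / noisy pointer positions.
    Fits n = A @ z + b, then recovers the focus from b = (I - A) @ c."""
    design = np.hstack([z, np.ones((len(z), 1))])   # columns: z_x, z_y, 1
    coef, *_ = np.linalg.lstsq(design, n, rcond=None)
    A = coef[:2].T                                  # 2 x 2 linear part
    b = coef[2]                                     # offset vector
    c = np.linalg.solve(np.eye(2) - A, b)           # central focus
    return A, c

# Sanity check on synthetic data built from the reported parameters:
rng = np.random.default_rng(0)
A_true = np.diag([0.66, 0.75])
c_true = np.array([3.86, 0.02])
z = rng.uniform(-10.0, 10.0, (100, 2))
n = (z - c_true) @ A_true.T + c_true
A_fit, c_fit = fit_affine(z, n)
print(np.round(A_fit, 2), np.round(c_fit, 2))       # recovers A_true, c_true
```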
Visual inspection of the average data (Figure 3b, left panel) shows the lines converging toward a central focus when noise is added to the stimulus. This is consistent with the calculated central focus, c, of the affine fit, which lies slightly to the right of direct (x = 3.86°; y = 0.02°), with a scaling of the data of 0.66 along the x-axis and 0.75 along the y-axis. If there were no scaling of the data (i.e., no influence of a prior), the affine transformation would return the identity matrix (i.e., the x-axis and y-axis scaling would both be 1). We also note that the focus of the scaling is shifted slightly to the right of direct, in accord with our previous finding of a prior, measured using a forced-choice discrimination task, that also tended to peak slightly to the right of direct (Mareschal et al., 2013a). There is little structure in the residuals (what is left in the data after the affine component is removed), consistent with a general tendency to perceive gaze closer to direct (0°) with noise being the major trend in the data. The lack of residual structure also indicates that gaze deviations in all directions are simply drawn toward direct rather than biased toward cardinal directions. This is further reinforced by the affine fit to the averaged data accounting for 80.2% of the variance. To examine the possibility that only a handful of observers' data were driving the overall trend, particularly since there is some interobserver variability in the spread of the data, we repeated the analysis on normalized data, in which each observer's data were normalized by the spread of their noiseless data. This had little effect on the averaged results: the central focus of the affine fit remained slightly to the right of direct (x = 3.68°; y = 0.00°), with a scaling of the data of 0.69 along the x-axis and 0.80 along the y-axis.
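For concreteness, a short sketch of the two quantities used in this paragraph: the fraction of variance accounted for by the affine fit, and the per-observer normalization. Using the standard deviation of the noiseless positions as the measure of spread is our assumption; the paper states only that each observer's data were normalized by the spread of their noiseless data.

```python
import numpy as np

def variance_explained(n, n_pred):
    """Fraction of variance in the noisy positions captured by the
    affine fit (cf. the 80.2% reported for forward facing heads)."""
    resid = ((n - n_pred) ** 2).sum()
    total = ((n - n.mean(axis=0)) ** 2).sum()
    return 1.0 - resid / total

def normalize_observer(z, n):
    """Rescale one observer's noiseless (z) and noisy (n) positions by
    the spread of the noiseless data (SD is an assumed spread measure)."""
    s = np.asarray(z, dtype=float).std()
    return np.asarray(z) / s, np.asarray(n) / s
```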
When a rotated head is used (head oriented 15° to the left) without noise, all observers' perceived direction of gaze shifts away from the head rotation, as evidenced by most dots being located to the right of direct (Figure 4a). The mean bias in the average noiseless data for physically direct gaze deviations is 7.23° to the right (or 7.67° averaged across all deviations). However, adding noise to the eyes does not simply increase the weighting of the head orientation, which was at −15° (leftwards), since many of the lines point away from the direction of the head. Rather, it acts to enhance a tendency to see the gaze as being direct (see Mareschal et al., 2013a). We fit an affine function to the data and find that the central focus is again slightly to the right of direct (x = 2.92°; y = 0.26°), much nearer to direct than to the head orientation (−15°). In this rotated-head condition, the affine fit to the averaged data accounted for 84% of the variance, with no systematic variation in the residuals with gaze direction. The scaling of the data is stronger: 0.44 along the x-axis and 0.37 along the y-axis. Normalization did not appreciably alter the data, with the central focus located at (x = 3.04°; y = −0.11°) and the scaling of the data 0.45 along the x-axis and 0.38 along the y-axis. These scaling values are further from the identity matrix than those for forward facing heads, suggesting a greater influence of the prior in rotated heads than in forward facing heads, affecting both the cardinal and noncardinal directions. The finding of a greater influence of the prior in rotated heads is consistent with Otsuka et al. (2014), who reported greater uncertainty in gaze judgments with rotated heads than forward facing heads.
Figure 4
 
Flow fields derived using rotated heads (left 15°) (a) in eight observers. (b) Average response, Affine fit to the average flow field and Residuals of the affine fit.
Discussion
In this study we developed a novel method for testing biases in gaze perception and find that uncertainty makes gaze look more direct regardless of head rotation, consistent with an earlier finding using a different task (gaze discrimination; Mareschal et al., 2013a). We also report that the tendency to perceive uncertain gaze as direct that we observed previously for horizontally averted eye gaze generalizes across all directions of gaze deviation, and to an intuitively simpler task that we believe would be well suited to test a wider range of participants. Crucially, our estimates of bias are very similar using either method, supporting the generality of this method. Finally, consistent with our previous study where we demonstrated the existence of cardinal and noncardinal mechanisms that code for gaze (Cheleski, Mareschal, Calder, & Clifford, 2013), we find here, too, no special status for cardinal directions. It is worth noting that the biases we report here do not appear to reflect a general bias for centering or symmetry (e.g., of the iris within the sclera). In an earlier study we found that increasing uncertainty on a nonsocial stimulus (a gray circle within a larger white one) resulted in only very small (nonsignificant) biases (Mareschal et al., 2013a). 
We recently proposed a Bayesian framework to account for gaze perception under conditions of uncertainty, and suggested that a prior for direct gaze could serve an ecological role, since the cost of failing to notice that someone is looking at you (e.g., a possible threat) would exceed the cost of a false alarm, that is, judging that someone was looking at you when in fact they were not (Mareschal et al., 2013a; see also Langton et al., 2004). As such, relying on the prior would be the wisest strategy when one is unsure about one's surroundings (in this case, about someone, or possibly something, nearby). There is also a growing interest in linking the existence of a prior to the statistics of our natural environment, something that has already been established for the perception of basic visual attributes, such as lighting (e.g., we assume light comes from above us; Mamassian & Goutcher, 2001), motion (e.g., we assume slow speeds for moving objects; Stocker & Simoncelli, 2006), and orientation (e.g., we assume orientation is mainly along cardinal axes; Girshick et al., 2011; Tomassini et al., 2010).
However, quantifying the types of gaze we experience in a natural environment is intrinsically complicated, and would require one person to wear a head-mounted eye tracker to measure with very high precision the fine eye movements other people make. As a first approximation, most researchers have used static images and quantified what people look at. In this case, it has been shown that people's gaze deviations depend on a number of factors, such as the task they are performing (Ballard & Hayhoe, 2009), the presence of social cues (faces or gaze; e.g., Birmingham, Bischof, & Kingstone, 2009), as well as how important the object is ('t Hart, Schmidt, Roth, & Einhauser, 2013). When watching films, observers' eye movements are mainly along the horizontal axis (Dorr, Martinez, Gegenfurtner, & Barth, 2010), and when watching other people (in static or dynamic settings), observers' eye movements are concentrated on the eye region (Risko, Laidlaw, Freeth, Foulsham, & Kingstone, 2012). At this stage, we can only speculate that the interaction between head orientation and prior results from the combinations of head rotation and gaze deviation that we encounter in the real world. Our experience of social interactions may also account for the differences we find between the rotated and forward facing heads. Specifically, we typically do not attend to people looking away from us in the same manner as we do to people facing us. This could result in higher overall uncertainty about a number of the features that we use to judge their direction of gaze (e.g., their eye direction, their head direction, their body direction, their intentionality), leading to a greater influence of the prior for direct gaze.
In this study, we introduced uncertainty by adding noise to the eyes of the avatar stimuli. Interestingly, it has recently been shown that attributing intentionality or mental states to a social stimulus can modulate the strength of basic low-level perceptual processes such as adaptation (Teufel et al., 2009) or gaze cueing (Wiese, Wykowska, Zwickel, & Muller, 2012; Wykowska, Wiese, Prosser, & Muller, 2014). For example, Teufel et al. (2009) found that observers' merely believing that the glasses worn by an avatar were opaque (and hence that the avatar could not see them) was sufficient to reduce the magnitude of adaptation. It is likely, therefore, that rather than adding noise to our stimuli, we could introduce uncertainty by giving our stimuli glasses that observers believe to be opaque (so that the avatar cannot see them).
We find that in forward facing heads the peak of the prior is to the right of direct, consistent with our earlier study (Mareschal et al., 2013a) using a very different method. In our earlier study, we estimated the standard deviation of the prior for our subjects and determined that it was quite broad across observers (on average about 10° SD). Although one might expect the peak of the prior to fall to the left of direct for some observers and to the right for others, this does not appear to be the case: in our previous study, only one observer out of a total of six had a peak to the left of direct. Here, too, the bias is to the right of direct for the majority of observers, although we have as yet no functional explanation for why this may be.
Finally, we developed a method for measuring gaze along cardinal and noncardinal axes for two specific reasons. First, some (categorization) tasks are inevitably limited by the number of response options that a participant can remember; as such, the response options sometimes fail to encompass the range of responses an observer may wish to make. Second, we wished to develop a simpler and more natural task that is particularly useful for testing children or certain clinical populations. Although gaze categorization has been successfully applied to adults and children (e.g., Mareschal et al., 2013b; Vida & Maurer, 2012), we suggest that the present task may prove more intuitive and allows testing over a greater range of deviations. We also anticipate that it would be more easily applied to clinical populations with reported visual deficits in judging other people's gaze (e.g., Gamer et al., 2011; Jun et al., 2013; Neumann, Spezio, Piven, & Adolphs, 2006; Spezio, Adolphs, Hurley, & Piven, 2006).
Acknowledgments
This work is supported by an Australian Research Council Discovery Project (DP120102589) and an Australian Research Council Future Fellowship to CC. 
Commercial relationships: none. 
Corresponding author: Isabelle Mareschal. 
Email: i.mareschal@qmul.ac.uk. 
Address: School of Biological and Chemical Sciences, Psychology, Queen Mary University of London, London, UK. 
References
Adams R. B. Jr. Kleck R. E. (2005). Effects of direct and averted gaze on the perception of facially communicated emotion. Emotion, 5, 3–11. [CrossRef] [PubMed]
Anstis S. M. Mayhew J. W. Morley T. (1969). The perception of where a face or television “portrait” is looking. American Journal of Psychology, 82, 474–489. [CrossRef] [PubMed]
Ballard D. H. Hayhoe M. M. (2009). Modelling the role of task in the control of gaze. Visual Cognition, 17, 1185–1204. [CrossRef] [PubMed]
Birmingham E. Bischof W. F. Kingstone A. (2009). Saliency does not account for fixations to eyes within social scenes. Vision Research, 49, 2992–3000. [CrossRef] [PubMed]
Brainard D. H. (1997). The psychophysics toolbox. Spatial Vision, 10, 433–436. [CrossRef] [PubMed]
Calder A. J. Beaver J. D. Winston J. S. Dolan R. J. Jenkins R. R. Eger E. (2007). Separate coding of different gaze directions in the superior temporal sulcus and inferior parietal lobule. Current Biology, 17, 20–25. [CrossRef] [PubMed]
Calder A. J. Jenkins R. Cassel A. Clifford C. W. (2008). Visual representation of eye gaze is coded by a nonopponent multichannel system. Journal of Experimental Psychology: General, 137, 244–261. [CrossRef] [PubMed]
Cheleski D. J. Mareschal I. Calder A. J. Clifford C. W. (2013). Eye gaze is not coded by cardinal mechanisms alone. Proceedings of the Royal Society, B, 280 (1764).
Cline M. G. (1967). The perception of where a person is looking. American Journal of Psychology, 80, 41–50. [CrossRef] [PubMed]
Conty L. Tijus C. Hugueville L. Coelho E. George N. (2006). Searching for asymmetries in the detection of gaze contact versus averted gaze under different head views: A behavioural study. Spatial Vision, 19, 529–545. [CrossRef] [PubMed]
Dorr M. Martinez T. Gegenfurtner K. R. Barth E. (2010). Variability of eye movements when viewing dynamic natural scenes. Journal of Vision, 10 (10): 28, 1–17, http://www.journalofvision.org/content/10/10/28, doi:10.1167/10.10.28. [PubMed] [Article]
Emery N. J. (2000). The eyes have it: The neuroethology, function, and evolution of social gaze. Neuroscience & Biobehavioral Reviews, 24, 581–604. [CrossRef]
Gamer M. Hecht H. Seipp N. Hiller W. (2011). Who is looking at me? The cone of gaze widens in social phobia. Cognition and Emotion, 25, 756–764. [CrossRef] [PubMed]
George N. Conty L. (2008). Facing the gaze of others. Neurophysiologie Clinique, 38, 197–207. [CrossRef] [PubMed]
Gibson J. J. Pick A. D. (1963). Perception of another person's looking behavior. The American Journal of Psychology, 76, 386–394. [CrossRef] [PubMed]
Girshick A. R. Landy M. S. Simoncelli E. P. (2011). Cardinal rules: Visual orientation perception reflects knowledge of environmental statistics. Nature Neuroscience, 14 (7), 926–932. [CrossRef] [PubMed]
Jun Y. Y. Mareschal I. Clifford C. W. Dadds M. R. (2013). Cone of direct gaze as a marker of social anxiety. Psychiatry Research, 30, 193–198. [CrossRef]
Kleinke C. L. (1986). Gaze and eye contact: A research review. Psychological Bulletin, 100, 78–100. [CrossRef] [PubMed]
Langdon R. Corner T. McLaren J. Coltheart M. Ward P. B. (2006). Attentional orienting triggered by gaze in schizophrenia. Neuropsychologia, 44, 417–429. [CrossRef] [PubMed]
Langton S. R. H. Honeyman H. Tessler E. (2004). The influence of head contour and nose angle on the perception of eye gaze direction. Perception & Psychophysics, 66, 752–771. [CrossRef] [PubMed]
Mamassian P. Goutcher R. (2001). Prior knowledge on the illumination position. Cognition, 81, B1–B9. [CrossRef] [PubMed]
Mareschal I. Calder A. J. Clifford C. W. G. (2013a). Humans have an expectation that gaze is directed at them. Current Biology, 23 (8), 717–721. [CrossRef]
Mareschal I. Calder A. J. Dadds M. R. Clifford C. W. G. (2013b). Gaze categorization under uncertainty: Psychophysics and modelling. Journal of Vision, 13 (5): 18, 1–10, http://www.journalofvision.org/content/13/5/18, doi:10.1167/13.5.18. [PubMed] [Article]
Neumann D. Spezio M. L. Piven J. Adolphs R. (2006). Looking you in the mouth: Abnormal gaze in autism resulting from impaired top-down modulation of visual attention. Social Cognitive and Affective Neuroscience, 1, 194–202. [CrossRef] [PubMed]
Otsuka Y. Mareschal I. Calder A. J. Clifford C. W. G. (2014). Dual route model of the effect of head orientation on perceived gaze direction. Journal of Experimental Psychology: Human Perception and Performance, 40, 1425–1439. [CrossRef] [PubMed]
Pelphrey K. A. Morris J. P. McCarthy G. (2005). Neural basis of eye gaze processing deficits in autism. Brain, 128, 1038–1048. [CrossRef] [PubMed]
Perrett D. I. Smith P. A. J. Potter D. D. Mistlin A. J. Head A. S. Milner A. D. Jeeves M. A. (1985). Visual cells in the temporal cortex sensitive to face view and gaze direction. Proceedings of the Royal Society of London B, 223, 292–317.
Pfeiffer U. J. Vogeley K. Schilbach L. (2013). From gaze cueing to dual eye-tracking: Novel approaches to investigate the neural correlates of gaze in social interaction. Neuroscience & Biobehavioral Reviews, 37, 2516–2528. [CrossRef]
Risko E. F. Laidlaw K. E. W. Freeth M. Foulsham T. Kingstone A. (2012). Social attention with real versus reel stimuli: Toward an empirical approach to concerns about ecological validity. Frontiers in Human Neuroscience, 6, 1–11. [CrossRef] [PubMed]
Ristic J. Mottron L. Friesen C. K. Iarocci G. Burack J. A. Kingstone A. (2005). Eyes are special but not for everyone: The case of autism. Cognitive Brain Research, 24, 715–718. [CrossRef] [PubMed]
Senju A. Johnson M. H. (2009). The eye contact effect: Mechanisms and development. Trends in Cognitive Science, 13, 127–134. [CrossRef]
Spezio M. L. Adolphs R. Hurley R. S. E. Piven J. (2006). Abnormal use of facial information in high-functioning autism. Journal of Autism and Developmental Disorders, 37, 929–939. [CrossRef]
Stein T. Sterzer P. (2011). High-level face shape adaptation depends on visual awareness: Evidence from continuous flash suppression. Journal of Vision, 11 (8): 5, 1–14, http://www.journalofvision.org/content/11/8/5, doi:10.1167/11.8.5. [PubMed] [Article]
Stocker A. A. Simoncelli E. P. (2006). Noise characteristics and prior expectations in human visual speed perception. Nature Neuroscience, 9, 578–585. [CrossRef] [PubMed]
Teufel C. Alexis D. M. Todd H. Lawrence-Owen A. J. Clayton N. S. Davis G. (2009). Social cognition modulates the sensory coding of observed gaze direction. Current Biology, 19, 1274–1277. [CrossRef] [PubMed]
Todorovic D. (2006). Geometric basis of perception of gaze direction. Vision Research, 46, 3549–3562. [CrossRef] [PubMed]
Tomassini A. Morgan M. J. Solomon J. A. (2010). Orientation uncertainty reduces perceived obliquity. Vision Research, 50, 541–547. [CrossRef]
‘t Hart B. M. Schmidt H. C. Roth C. Einhauser W. (2013). Fixations on objects in natural scenes: Dissociating importance from salience. Frontiers in Psychology, 4, 455.
Vida M. D. Maurer D. (2012). The development of fine-grained sensitivity to eye contact after 6 years of age. Journal of Experimental Child Psychology, 112, 243–256. [CrossRef] [PubMed]
von Grunau M. Anston C. (1995). The detection of gaze direction: A stare-in-the-crowd effect. Perception, 24, 1297–1313. [CrossRef] [PubMed]
Wiese E. Wykowska A. Zwickel J. Muller H. J. (2012). I see what you mean: How attentional selection is shaped by ascribing intentions to others. PLoS One, 7, e45391.
Wollaston W. H. (1824). On the apparent direction of eyes in a portrait. Philosophical Transactions of the Royal Society of London, 114, 247–256. [CrossRef]
Wykowska A. Wiese E. Prosser A. Muller H. (2014). Beliefs about the minds of others influence how we process sensory information. PLoS One, 9, e94339.
Yokoyama T. Sakai H. Noguchi Y. Kita S. (2014). Perception of direct gaze does not require focus of attention. Scientific Reports, 4, 3858. [PubMed]