Article | December 2011
Roles of the upper and lower bodies in direction discrimination of point-light walkers
Journal of Vision December 2011, Vol.11, 8. doi:10.1167/11.14.8
      Kohske Takahashi, Haruaki Fukuda, Hanako Ikeda, Hirokazu Doi, Katsumi Watanabe, Kazuhiro Ueda, Kazuyuki Shinohara; Roles of the upper and lower bodies in direction discrimination of point-light walkers. Journal of Vision 2011;11(14):8. doi: 10.1167/11.14.8.

Abstract

We can easily recognize human movements from very limited visual information (biological motion perception). The present study investigated how the upper and lower body areas contribute to direction discrimination of a point-light (PL) walker. Observers judged the direction that the PL walker was facing. The walker performed either normal walking or hakobi, a walking style used in traditional Japanese performing arts, in which the amount of local motion of the extremities is much smaller than that in normal walking. Either the upper, lower, or full body of the PL walker was presented. Discrimination performance was found to be better for the lower body than for the upper body. We also found that discrimination performance for the lower body was affected by walking style and/or the amount of local motion signals. Additional eye movement analyses indicated that observers initially inspected the region corresponding to the upper body, and then the gaze shifted toward the lower body. This held true even when the upper body was absent. We conjectured that the upper body subserved localization of the PL walker and the lower body discrimination of walking direction. We concluded that the upper and lower bodies play different roles in direction discrimination of a PL walker.

Introduction
Observing human movements does more than simply provide us with kinematic information. The human ability to perceive various aspects of the state of another from their motions is incredibly robust. In fact, motion itself is sufficient to infer the gender (Kozlowski & Cutting, 1977; Troje, 2002), emotion (Atkinson, Dittrich, Gemmell, & Young, 2004; Dittrich, Troscianko, Lea, & Morgan, 1996; Pollick, Paterson, Bruderlin, & Sanford, 2001), and animacy of others (Chang & Troje, 2008). Johansson (1973) developed a point-light (PL) display where human motion was visually represented only by a small number of dots on the joints. The PL display demonstrated that the presence of a person could be easily detected on the basis of PL movements only. A number of psychological and neuroscientific studies have investigated the underlying mechanisms and neural bases of the robust perception of biological motion (for a review, see Blake & Shiffrar, 2007; Giese & Poggio, 2003). 
Existing studies aimed to reveal how much and what kind of information is minimal and sufficient to support biological motion perception. For example, the importance of local motion (Hirai, Saunders, & Troje, 2011; Thurman & Grossman, 2008; Troje & Westhoff, 2006), global configuration (Beintema, Georg, & Lappe, 2006; Beintema & Lappe, 2002; Lange, Georg, & Lappe, 2006; McKay, Simmons, McAleer, & Pollick, 2009), or both (Baccus, Mozgova, & Thompson, 2009; Thurman, Giese, & Grossman, 2010) in perceiving PL motion has been suggested. Other studies examined task dependence of the dominant cue (Chang & Troje, 2009; Thirkettle, Benton, & Scott-Samuel, 2009). Although it is not yet fully understood how local motion and global configuration are processed, this minimalistic approach to biological motion perception has begun to reveal the complicated mechanisms of human motion perception successfully. 
Another line of research has focused on revealing which body parts are crucial in biological motion perception by omitting parts of the PL display. Some studies have suggested that omitting the extremities drastically impairs direction discrimination of a PL walker and, hence, that the PL display of the extremities (particularly the feet) is crucial (Mather, Radford, & West, 1992; Troje & Westhoff, 2006). However, this does not necessarily mean that information from body parts other than the extremities is left unused. Rather, the task determines which body parts are the crucial ones. For example, omission of mid-limb points (elbows and knees) and central points (shoulders and hips) impaired detection of the PL walker much more than omission of the extremities (wrists and ankles; Pinto & Shiffrar, 1999). The upper and middle bodies are more informative with regard to ascertaining the gender of the PL walker (Johnson & Tassinary, 2005; Kozlowski & Cutting, 1977). More recently, classification image and bubbles techniques, as well as gaze pattern analyses, have directly examined information from different body parts, thereby establishing which are used in biological motion perception. Lu and Liu (2006) used the classification image technique to investigate which points were correlated with task performance in forward/backward discrimination of a PL walker. The resulting classification image suggested that all PL points had comparable effects and, hence, implied that global processing of the PL display subserves biological motion perception. Thurman et al. (2010) adopted the bubbles technique and indicated that observers relied not only on lower body motion but also on upper body posture in direction discrimination of the PL walker. Saunders, Williamson, and Troje (2010) examined gaze patterns in direction and gender discrimination of the PL walker. Although the feet were inspected more frequently in direction discrimination than in gender discrimination, the shoulders and hip were inspected more frequently than the feet in both tasks. Thus, while the importance of the extremities was suggested for direction discrimination, gaze pattern and classification image analyses showed that the entire body was inspected in biological motion perception. 
It is obvious that each body part plays a different role in producing different actions, and action experience trains biological motion perception (Jacobs, Pinto, & Shiffrar, 2004; Pinto & Shiffrar, 2009). Given that the crucial body part is task-dependent and yet the entire body is inspected in biological motion perception, information from each body part might play a different role in the perception of biological motion. For example, the upper and lower bodies have different effects on the facing bias of a depth-ambiguous PL walker, depending on spatiotemporal characteristics and the type of action performed (Schouten, Troje, & Verfaillie, 2011; Vanrie & Verfaillie, 2006). 
In the present study, we aimed to investigate the roles of, as well as the information usage for, the upper and lower body areas in direction discrimination of PL walkers. For this purpose, apart from normal walking, we also used hakobi, a variant of walking performed in Kyogen, a traditional Japanese performing art. Hakobi walking is described as “designed to avoid up and down movements while moving along. The heels are kept in contact with the floor at all times and slide along the stage” (Cavaye, Griffith, & Senda, 2005, p. 188). The hakobi walking performed by professional Kyogen actors can be recognized as a walking motion, yet the trajectories and magnitudes of local motion differ from those of normal walking (see Figures 1 and 2). In particular, while the amounts of local motion of the head and trunk points (head, shoulder, hip, and knee) do not differ greatly between normal and hakobi walking, the local motion of the extremities (e.g., elbow, wrist, ankle, and toe) in hakobi walking is much smaller than that in normal walking. Therefore, comparing these two walking motions enables us to investigate how the local motion of the upper and lower bodies contributes to direction discrimination of biological motion. 
Figure 1
 
Visual representation of the amount of local motion (i.e., the distance a dot travels along its local trajectory as it completes one step cycle) at each point of a normal and a hakobi PL walker, respectively. The right panel shows the amount of local motion, its ratio, and difference between normal and hakobi walking.
Figure 2
 
Motion trajectories of normal and hakobi walking.
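The local motion measure in Figure 1 (the distance a dot travels along its trajectory over one step cycle) amounts to a path-length computation over the sampled 2D positions. A minimal sketch, using hypothetical stand-in trajectories rather than the recorded motion-capture data:

```python
import math

def local_motion_amount(trajectory):
    """Path length of one PL dot over a step cycle: the sum of Euclidean
    distances between successive 2D samples (the measure shown in Figure 1)."""
    return sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(trajectory, trajectory[1:])
    )

# Hypothetical stand-in trajectories (NOT the recorded data): a normal-walking
# ankle sweeping a wide arc vs. a hakobi-like ankle that slides with a much
# smaller horizontal excursion and no vertical movement.
normal_ankle = [(50 * math.cos(t / 10.0), 20 * abs(math.sin(t / 10.0)))
                for t in range(63)]
hakobi_ankle = [(15 * math.cos(t / 10.0), 0.0) for t in range(63)]

print(local_motion_amount(normal_ankle) > local_motion_amount(hakobi_ankle))  # → True
```

With this measure, the ratio and difference between normal and hakobi walking reported in Figure 1 follow directly from the two per-dot path lengths.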
In Experiment 1, in order to confirm that the basic characteristics of biological motion perception are common between hakobi and normal walking, we examined the effect of body area as well as the inversion effect of hakobi and normal walking on a PL walker. In Experiment 2, the effects of upper and lower bodies on direction discrimination were examined by combining the upper and lower bodies of PL walkers performing the two walking types. Finally, in Experiments 3A and 3B, we investigated observers' visual strategies for direction discrimination by measuring eye movements. 
Experiment 1
In Experiment 1, we examined qualitative and quantitative differences in direction discrimination of hakobi and normal PL walkers. PL stimuli of the full or lower body were presented in upright or inverted forms, and we measured discrimination thresholds for each stimulus condition. 
Methods
Participants
Five observers participated in the experiment. All participants reported normal or corrected-to-normal vision. 
Apparatus and stimuli
Prior to the experiment, we recorded the 3D motion trajectories of 16 joint points of normal and hakobi walking at a 200-Hz sampling rate (Hawk Digital System, Motion Analysis). Hakobi walking was performed by three professional male Kyogen actors, and normal walking was performed by four male adults. Thereafter, we converted the 3D motion trajectories into 2D trajectories from an edge-on viewpoint, normalized walking velocity to 1 cycle/s, and resampled the trajectories at 85 Hz (see Figure 2). 
Visual stimuli were presented on a 21-inch CRT monitor (refresh rate: 85 Hz; viewing distance: 57 cm). A red fixation point was presented at the center of the screen throughout the experiment. The PL walker was represented by white dots on a black background. The height of the full-body stimuli was approximately 6.9°. The initial frame of the PL animation was randomly determined for each trial, with a range from 0 to 530 ms. The location of the PL stimulus was also randomly determined. The walking direction was either leftward or rightward. Further, we presented random motion noise by adding extra moving dots identical in size and color to those of the PL walker. The trajectories of the noise dots were randomly sampled from all other PL stimuli; hence, the noise itself was not predictive of the facing direction. The initial position of each noise dot was randomly determined, and the trajectory of each noise dot was randomly rotated in the 2D image plane. 
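The noise-dot construction (a trajectory sampled from another PL stimulus, randomly rotated in the image plane, starting at a random position) could be sketched as below. The field size and the choice to rotate about the trajectory's centroid are assumptions for illustration; the paper does not specify these details:

```python
import math
import random

def make_noise_dot(source_trajectory, field_w=640, field_h=480):
    """One noise dot: a trajectory taken from another PL stimulus, rotated by
    a random angle in the 2D image plane and placed at a random start position.
    field_w/field_h are assumed display dimensions (pixels)."""
    theta = random.uniform(0.0, 2.0 * math.pi)
    # Rotate about the trajectory's centroid (an assumption), then translate.
    cx = sum(x for x, _ in source_trajectory) / len(source_trajectory)
    cy = sum(y for _, y in source_trajectory) / len(source_trajectory)
    x0 = random.uniform(0, field_w)
    y0 = random.uniform(0, field_h)
    rotated = []
    for x, y in source_trajectory:
        dx, dy = x - cx, y - cy
        rx = dx * math.cos(theta) - dy * math.sin(theta)
        ry = dx * math.sin(theta) + dy * math.cos(theta)
        rotated.append((x0 + rx, y0 + ry))
    return rotated

# Usage with a toy source trajectory.
random.seed(0)
src = [(30 * math.cos(t / 10.0), 10 * math.sin(t / 10.0)) for t in range(50)]
noise = make_noise_dot(src)
```

Because rotation and translation are isometries, each noise dot keeps the local motion statistics of a real PL dot while carrying no directional information.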
Procedure
In each trial, a blank display for 250 ms was followed by a PL animation for 1 s, after which the observer judged the direction that the PL walker was facing. The next trial began immediately after the observer's response. The participants were tested with all combinations of walking type (normal vs. hakobi), vertical orientation (upright vs. inverted), and body areas (full body vs. lower body). A PL stimulus was randomly chosen out of four normal actors in the normal walker condition and out of three hakobi actors in the hakobi walker condition. In the inverted condition, the PL walker was mirror-reversed along the horizontal axis. In the lower body condition, only the 8 joint points of the lower body were presented. The full-body and lower body conditions were conducted in separate sessions. The other experimental conditions (walking type and vertical orientation) were randomly presented in the experimental session. 
We measured the discrimination threshold by using the 3-up 1-down staircase method. The noise level (i.e., the number of random motion noise dots) was increased after three consecutive correct responses and decreased after one incorrect response. The amount of increase/decrease of noise level was 4 dots until 12 reversals, which was reduced to 2 dots thereafter. The initial noise level was zero, and the staircase was terminated after 18 reversals or after the noise level reached 192 dots; hence, the number of trials varied depending on the participant's performance. A single staircase run was performed for each experimental condition (on average, 114 trials for each condition and 916 trials for each participant). We defined the discrimination threshold as the logarithms of the average noise level of the last 6 reversals, which converge to a correct discrimination rate of approximately 79.4%. 
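The staircase procedure can be sketched as follows. The observer model passed in is hypothetical, and the paper's log transform of the final threshold is omitted so the toy output stays in dot units:

```python
import math
import random

def run_staircase(p_correct_at, max_noise=192, n_reversals_stop=18):
    """3-up/1-down staircase on the number of noise dots, as in Experiment 1:
    noise increases after 3 consecutive correct responses and decreases after
    one error; the step is 4 dots until the 12th reversal, then 2 dots.
    p_correct_at maps a noise level to a simulated probability of a correct
    response (a hypothetical observer model, for illustration only)."""
    noise, streak, reversals, last_direction = 0, 0, [], None
    while len(reversals) < n_reversals_stop and noise < max_noise:
        step = 4 if len(reversals) < 12 else 2
        if random.random() < p_correct_at(noise):
            streak += 1
            if streak < 3:
                continue            # no move yet
            streak, direction = 0, +1
            noise = min(noise + step, max_noise)
        else:
            streak, direction = 0, -1
            noise = max(noise - step, 0)
        if last_direction is not None and direction != last_direction:
            reversals.append(noise)  # record a reversal of direction
        last_direction = direction
    # Threshold: mean noise level over the last 6 reversals (~79.4% correct).
    return sum(reversals[-6:]) / len(reversals[-6:]) if reversals else noise

# Hypothetical observer whose accuracy falls from ~1 toward chance near 60 dots.
random.seed(7)
thr = run_staircase(lambda n: 0.5 + 0.5 / (1 + math.exp((n - 60) / 10.0)))
print(0.0 <= thr <= 192.0)  # → True
```

For this observer, the staircase should settle near the noise level where accuracy is about 79.4%, which is the convergence point of a 3-up/1-down rule.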
Results
Figure 3 illustrates the discrimination threshold for each experimental condition. A three-way repeated-measures ANOVA revealed significant main effects of walking type (F(1, 4) = 18.8, p < 0.05), vertical orientation (F(1, 4) = 22.9, p < 0.01), and body area (F(1, 4) = 16.0, p < 0.05), as well as a significant three-way interaction (F(1, 4) = 12.0, p < 0.05). The discrimination threshold was higher in normal walking than in hakobi walking, in the full-body display than in the lower body display, and in the upright display than in the inverted display. Further tests revealed that the two-way interaction between walking type and vertical orientation was significant in the full-body condition (F(1, 4) = 16.3, p < 0.05) but not in the lower body condition (F(1, 4) = 1.11, p = 0.35). This indicated that the inversion effect for the full-body display was larger for hakobi walking than for normal walking, although the inversion effect was significant or marginally significant in all combinations of walking type and body area (normal–full body: t(4) = 3.97, p < 0.05; normal–lower body: t(4) = 2.76, p = 0.05; hakobi–full body: t(4) = 6.53, p < 0.01; hakobi–lower body: t(4) = 6.37, p < 0.01). These results suggest that PL hakobi walking provided less information about facing direction than normal walking did; however, the participants perceived the walking direction of the normal and hakobi walkers in a qualitatively similar way: inversion as well as lower body-only display impaired direction discrimination. Furthermore, the fact that omission of the upper body impaired discrimination performance implies that the upper body subserved direction discrimination of the PL walker. 
Figure 3
 
Discrimination threshold in Experiment 1. Error bars indicate the standard errors of the mean.
Experiment 2
The results of Experiment 1 suggested that the upper body as well as the lower body contributed to direction discrimination for both normal and hakobi walking. In the following experiments, therefore, we focused on the roles of and information usage for lower body and upper body in direction discrimination. First, in Experiment 2, in order to investigate the effects of the upper and lower bodies directly, we presented PL stimuli of the lower body, upper body, or both and separately measured the discrimination thresholds for each body area. Moreover, we examined whether discrimination performance in the full-body condition could be predicted by a simple probabilistic summation of performance in the lower and upper body conditions. 
Methods
Nine observers participated in Experiment 2. All participants reported normal or corrected-to-normal vision. Stimuli and apparatus were identical to those in Experiment 1. In Experiment 2, we tested the discrimination thresholds of all combinations of four types of upper body condition (normal, hakobi, static, and absent) and three types of lower body condition (normal, hakobi, and absent), except for the absent–absent combination; this resulted in 11 experimental conditions (on average, 137 trials for each condition and 1505 trials for each participant). In the static upper body condition, the display frame was randomly chosen from the hakobi walking and was presented without motion. In order to combine the lower and upper bodies of different walkers, the heights of all walkers were standardized and the positions of the lower and upper bodies were aligned at the PL hip point. 
Results
Figure 4 presents the discrimination threshold for each experimental condition. Since the experimental design was quasi-factorial and the absent–absent condition was missing, we performed two-way repeated-measures ANOVAs separately for the lower body-present conditions (4 upper body × 2 lower body) and the upper body-present conditions (3 upper body × 3 lower body). In the lower body-present conditions, we found significant main effects of lower body (F(1, 8) = 79.5, p < 0.001) and upper body (F(3, 24) = 8.85, p < 0.001); however, the interaction was not significant (F(3, 24) = 1.91, p = 0.15). Pairwise comparison (Ryan's method) of the upper body conditions revealed that the discrimination threshold was lower in the upper body-absent condition than in the other conditions (ps < 0.05). As for the effect of lower body, the discrimination threshold for hakobi walking was lower than that for normal walking. 
Figure 4
 
Discrimination threshold in Experiment 2. Error bars indicate the standard errors of the mean.
With regard to the upper body-present conditions, we found a significant main effect of lower body (F(2, 16) = 123.8, p < 0.001), but neither the main effect of upper body (F(2, 16) = 0.08, p = 0.92) nor the interaction (F(4, 32) = 0.53, p = 0.72) was significant. Pairwise comparison of the lower body conditions revealed that the discrimination threshold for normal walking was the highest, followed by hakobi walking, and then by the absent condition. 
Thus, discrimination performance was affected by the presence and type of lower body stimulus. In contrast, the effect of upper body stimulus was different from that of lower body stimulus. While the presence of the upper body improved discrimination performance, the type of upper body stimulus did not matter. Moreover, the congruency of the upper and lower bodies had no effect on direction discrimination. 
Thereafter, we examined whether discrimination performance in the full-body conditions could be predicted by performance in the upper and lower body conditions. For this purpose, we performed probit analyses for each participant and each condition and compared the discrimination threshold in the full-body conditions to those predicted by a probabilistic summation of performances from the upper and lower body conditions (Watson, 1979, see 1). Figure 5 indicates the 75% correct discrimination threshold estimated from the results of the full-body condition (i.e., the real data, indicated by circles) and those predicted from the corresponding upper and lower body conditions (indicated by triangles). A three-way repeated-measures ANOVA of upper body, lower body, and estimation type (observation vs. prediction) revealed that the main effects of estimation type (F(1, 8) = 43.6, p < 0.001) and lower body (F(1, 8) = 31.2, p < 0.001), as well as the two-way interaction of lower body and estimation type (F(1, 8) = 5.32, p < 0.05), were significant. With regard to the interaction, further analyses revealed that the effect of estimation type was significant when the lower body was hakobi walking (F(1, 8) = 36.1, p < 0.001) and marginally significant when the lower body was normal walking (F(1, 8) = 4.59, p = 0.06). These results indicated that discrimination performance in the full-body conditions was higher than that predicted from the upper and lower body conditions. The benefit of full-body presentation was larger when the lower body was hakobi walking than when it was normal walking. Furthermore, congruency of the type of upper and lower body stimuli did not affect the benefit of full-body presentation. 
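The idea behind the probability-summation prediction can be illustrated with a simplified high-threshold model for a two-alternative task; this is a stand-in for the Watson (1979) probit-based computation actually used, not a reproduction of it:

```python
def prob_summation_2afc(p_upper, p_lower, guess=0.5):
    """Predicted two-alternative accuracy for the full body from upper- and
    lower-body accuracies under independent probability summation.
    A simplified high-threshold sketch: accuracies are corrected for guessing,
    combined as independent detectors, then mapped back to percent correct."""
    d_u = (p_upper - guess) / (1 - guess)   # guess-corrected "use the cue" prob.
    d_l = (p_lower - guess) / (1 - guess)
    d_full = 1 - (1 - d_u) * (1 - d_l)      # either cue alone suffices
    return guess + (1 - guess) * d_full

print(round(prob_summation_2afc(0.75, 0.75), 4))  # → 0.875
```

A full-body threshold better than this independent-combination prediction, as the authors report, indicates a benefit of presenting the whole body beyond the sum of its parts.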
Figure 5
 
Observed and predicted discrimination thresholds for full-body PL walker. Error bars indicate the standard errors of the mean.
Experiment 3A
In Experiments 3A and 3B, we recorded eye movements during the direction discrimination task in order to examine how observers inspected the upper and lower bodies of the PL walker. 
Methods
Fourteen observers participated in the experiment. The stimuli and apparatus were identical to those in Experiment 2. In Experiment 3A, we tested all combinations of three types of upper body stimulus (normal, hakobi, and absent) and three types of lower body stimulus (normal, hakobi, and absent), except for the absent–absent combination, leading to 8 experimental conditions (on average, 111 trials for each condition and 884 trials for each participant). We tracked observers' gaze position with an EyeLink II tracker (SR Research; sampling rate: 250 Hz). We removed the fixation point, and the participants were free to view any position on the PL display. Calibration of the eye tracker was performed before the experimental session, and each trial began after a drift correction. 
Results
Figure 6 illustrates the discrimination threshold for each experimental condition. Three participants were excluded from the following analyses because their discrimination threshold was zero in some conditions. As in Experiment 2, we performed two-way repeated-measures ANOVAs separately for the lower body-present and upper body-present conditions. In the lower body-present conditions, we found significant main effects of lower body (F(1, 10) = 28.8, p < 0.001) and upper body (F(2, 20) = 9.18, p < 0.01) and a significant interaction between them (F(2, 20) = 3.97, p < 0.05). Pairwise comparisons of the upper body conditions revealed that the discrimination threshold was lower in the upper body-absent condition than in the other conditions. With regard to the upper body-present conditions, the main effect of lower body (F(2, 20) = 22.5, p < 0.001) and the two-way interaction (F(2, 20) = 6.24, p < 0.01) were significant; however, the main effect of upper body (F(2, 16) = 0.08, p = 0.92) was not. Pairwise comparisons of the lower body conditions revealed that the discrimination threshold was lower in the lower body-absent conditions than in the other conditions. The effect of upper body did not attain significance in any lower body condition. Thus, for the most part, the discrimination performance results of Experiments 3A and 3B replicated those of Experiment 2. 
Figure 6
 
Discrimination threshold in Experiments 3A and 3B. Error bars indicate the standard errors of the mean.
In the gaze analyses, as the position of the PL stimulus varied across trials, we first offset the vertical gaze position so that the hip point of the initial PL motion frame was zero for each trial. We then pooled the vertical positions of all valid samples for each condition and divided them into two bins covering the first (0–500 ms) and second (500–1000 ms) periods of the PL display. Finally, we fitted the gaze density with unimodal and bimodal normal distributions. Figure 7 illustrates the density of vertical gaze position for each condition. Visual inspection of gaze density (Figure 7) and model selection using AIC (Table 1) indicated that there were two peaks in the lower body-present conditions and one peak in the lower body-absent conditions. In the lower body-present conditions, one peak was positioned around the vertical center (elbows, wrists, and hip) and the other was in the lower body (ankle and toe). In the lower body-absent conditions, the peak was positioned around the vertical center, slightly higher than the upper peak of the lower body-present conditions. Thus, it was clear that the gaze patterns differed qualitatively depending on the presence or absence of the lower body. 
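The unimodal-vs.-bimodal model comparison summarized in Table 1 can be illustrated with a generic sketch: fit a single Gaussian and a two-component Gaussian mixture (via a plain EM loop) to synthetic gaze samples and compare AIC values. This is an illustration of the technique, not the authors' fitting code:

```python
import math
import random

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def aic(log_lik, k):
    """Akaike Information Criterion: 2k - 2*logL; lower is better."""
    return 2 * k - 2 * log_lik

def unimodal_aic(xs):
    mu = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - mu) ** 2 for x in xs) / len(xs)) or 1e-9
    ll = sum(math.log(normal_pdf(x, mu, sd)) for x in xs)
    return aic(ll, 2)  # parameters: mean, sd

def bimodal_aic(xs, iters=200):
    """Two-component 1D Gaussian mixture fitted by EM."""
    xs_sorted = sorted(xs)
    lo, hi = xs_sorted[: len(xs) // 2], xs_sorted[len(xs) // 2:]
    mus = [sum(lo) / len(lo), sum(hi) / len(hi)]       # init from halves
    sds = [(max(xs) - min(xs)) / 4 or 1.0] * 2
    w = [0.5, 0.5]
    for _ in range(iters):
        resp = []                                      # E-step: responsibilities
        for x in xs:
            p = [w[k] * normal_pdf(x, mus[k], sds[k]) for k in (0, 1)]
            s = sum(p) or 1e-300
            resp.append([pi / s for pi in p])
        for k in (0, 1):                               # M-step: update params
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(xs)
            mus[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            sds[k] = math.sqrt(
                sum(r[k] * (x - mus[k]) ** 2 for r, x in zip(resp, xs)) / nk) or 1e-6
    ll = sum(math.log(sum(w[k] * normal_pdf(x, mus[k], sds[k]) for k in (0, 1))
                      or 1e-300) for x in xs)
    return aic(ll, 5)  # parameters: 2 means, 2 sds, 1 mixing weight

# Synthetic gaze samples: two clusters at roughly "hip" and "ankle" heights.
random.seed(0)
samples = ([random.gauss(20, 15) for _ in range(300)]
           + [random.gauss(-170, 15) for _ in range(300)])
print(bimodal_aic(samples) < unimodal_aic(samples))  # → True
```

For clearly two-clustered data such as this, the bimodal model's likelihood gain far outweighs its three extra parameters, so AIC selects it, mirroring the selection reported for the lower body-present conditions.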
Figure 7
 
Probability density of vertical gaze position (pooled across all participants) in Experiment 3A. The horizontal axes indicate the gaze density (probability per 1 pixel). The vertical axes indicate the vertical position (pixel). For example, a gaze density of 0.005 (5 × 10−3) indicates that 0.5% of all gaze samples was in the 1-pixel bin. Red and blue areas indicate the density during the first (0–500 ms) and second (500–1000 ms) periods, respectively. The white dots and sticks indicate the PL position (normal walking at the time of 765 ms; see Figure 2), which helps visual comparison of vertical gaze position over the PL.
Table 1
 
Coefficients of the unimodal and bimodal fits of gaze density in Experiments 3A and 3B. For each condition, the upper row gives the first period (0–500 ms) and the lower row the second period (500–1000 ms); for the lower body-absent conditions, only the unimodal fit is reported. AIC indicates Akaike Information Criterion.
Upper body   Lower body   Vertical position (pixel)   Peak density (×10⁻³)    AIC (×10⁴)
                          Upper mode   Lower mode     Upper mode  Lower mode  Unimodal  Bimodal
Experiment 3A
Normal       Normal        22.61       −167.74        4.54        1.97        −1.25     −1.51
                           13.74       −179.37        2.90        3.21        −1.27     −1.47
Normal       Hakobi        23.83       −170.20        4.64        2.06        −1.23     −1.51
                           12.15       −176.88        3.04        3.33        −1.26     −1.45
Normal       None          32.67       —              5.48        —           −1.40     —
                           49.52       —              4.50        —           −1.48     —
Hakobi       Normal        22.57       −176.40        4.58        1.76        −1.26     −1.60
                            6.93       −184.88        2.90        3.04        −1.28     −1.43
Hakobi       Hakobi        27.06       −163.55        4.44        1.79        −1.28     −1.64
                           10.13       −178.45        2.75        3.05        −1.32     −1.53
Hakobi       None          34.49       —              5.57        —           −1.41     —
                           66.66       —              4.52        —           −1.52     —
None         Normal        19.68       −167.32        4.49        2.12        −1.25     −1.63
                           −0.14       −168.33        3.32        3.52        −1.28     −1.52
None         Hakobi        22.68       −165.07        4.84        1.95        −1.23     −1.49
                            1.35       −163.65        3.19        3.46        −1.31     −1.49
Experiment 3B
Normal       Normal         6.48        −95.91        4.59        3.90        −1.36     −1.61
                            4.52       −125.08        3.79        3.38        −1.35     −1.47
Normal       Hakobi         5.23       −113.03        4.43        3.48        −1.36     −1.60
                          −20.27       −165.96        3.50        3.14        −1.39     −1.52
Hakobi       Normal        10.85        −81.67        4.60        3.91        −1.35     −1.59
                           −8.71       −149.13        3.62        3.42        −1.38     −1.54
Hakobi       Hakobi         7.99       −109.33        4.44        3.70        −1.35     −1.52
                            3.16       −142.37        3.23        3.73        −1.34     −1.45
Static       Normal         9.55        −96.74        4.63        3.97        −1.33     −1.55
                            5.75       −129.82        3.67        3.78        −1.34     −1.51
Static       Hakobi         8.37       −113.57        4.62        3.62        −1.31     −1.54
                            8.18       −138.93        3.37        3.56        −1.33     −1.46
None         Normal         7.34        −99.75        4.49        4.08        −1.35     −1.57
                            1.90       −105.21        3.64        3.99        −1.39     −1.54
None         Hakobi         9.59       −107.01        4.43        4.03        −1.34     −1.62
                            9.04       −127.30        3.34        4.17        −1.34     −1.56
In order to examine the gaze patterns over time, we analyzed the gaze density of each participant separately and performed a three-way (condition, period: first vs. second, and peak: upper vs. lower) repeated-measures ANOVA for the lower body-present conditions and a two-way (condition and period) ANOVA for the lower body-absent conditions (Figure 8). With regard to the time course of the gaze patterns, the positions of the two peaks in the lower body-present conditions largely did not shift between the first and second periods (F(1, 10) = 2.86, p = 0.12); however, peak density changed over time. The upper peak density was far higher than the lower peak density in the first period (F(1, 10) = 40.8, p < 0.001); the upper and lower peak densities then became similar in the second period (F(1, 10) = 0.29, p = 0.60). In the lower body-absent conditions, the position of the peak moved up slightly (F(1, 10) = 21.7, p < 0.001), and peak density decreased over time (F(1, 10) = 4.94, p = 0.05). Thus, the gaze analyses suggested that observers initially looked at the middle or upper body area; the gaze position then moved to the lower body area when the lower body was present, whereas it stayed around the upper body when the lower body was absent. Notably, the observers were able to determine whether the lower body was present or missing while barely orienting toward its expected location (Figures 7 and 8). Finally, neither the type and presence of the upper body stimulus nor the type of lower body stimulus modulated gaze patterns; only the presence or absence of the lower body did. 
Figure 8
 
Vertical position and density of peaks in Experiments 3A and 3B. Lower body-absent (one peak depicted with triangle) and lower body-present conditions (two peaks depicted with circle) were averaged separately among the participants. Red and blue symbols indicate the peak during the first (0–500 ms) and second (500–1000 ms) periods, respectively. The white dots and sticks indicate the PL position for visual comparison of vertical gaze position over the PL. Error bars indicate the standard errors of the mean.
Experiment 3B
In Experiment 3A, the lower body was absent in some trials. Although Experiment 3A clearly showed that the upper body area was always inspected, even when the upper body was absent, the knowledge that the lower body could be absent might have induced a strategic bias toward the upper body area. In Experiment 3B, the lower body was present throughout the experiment, and we investigated whether the gaze patterns would change. In addition, we also presented the static upper body condition used in Experiment 2 to examine whether the gaze patterns were similar between dynamic and static upper bodies. 
Methods
Thirteen observers participated in the experiment. The stimuli and apparatus were identical to those in Experiment 3A. In Experiment 3B, we tested the discrimination threshold of all combinations of 4 types of upper body stimulus (normal, hakobi, static, and absent) and 2 types of lower body (normal and hakobi), thereby leading to 8 experimental conditions (on average, 120 trials for each condition and 963 trials for each participant). 
Results
Figure 6 illustrates the discrimination threshold of each experimental condition. A two-way repeated-measures ANOVA revealed significant main effects of the upper body (F(3, 36) = 9.17, p < 0.001) and lower body (F(1,12) = 48.2, p < 0.001). Pairwise comparisons of the upper body conditions revealed that the discrimination threshold was lower in the upper body-absent condition than in the other conditions. 
Figure 9 illustrates the density of vertical gaze position (Footnote 3). Although the peaks were apparently less distinct than those in Experiment 3A, the bimodal distribution was identified as a better model than the unimodal distribution in all conditions (see Table 1). As in Experiment 3A, we analyzed the gaze density of each participant separately and performed a three-way repeated-measures ANOVA (Figure 8). The upper peak did not shift over time (F(1, 12) = 1.09, p = 0.32), while the lower peak moved slightly downward, from −108 pixels in the first period to −124 pixels in the second period; this difference was marginally significant (F(1, 12) = 4.63, p = 0.05). The density of the upper peak was higher than that of the lower peak (F(1, 12) = 7.37, p < 0.05) in both the first and second periods, indicating that the upper body was preferentially inspected throughout the stimulus presentation. The density of the upper peak was slightly higher in the first period than in the second period (F(1, 12) = 3.61, p = 0.08). Thus, these gaze analyses suggested that the initial gaze bias toward the upper body was weaker than in Experiment 3A; nevertheless, the upper body was inspected in both the first and second periods. 
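The unimodal-versus-bimodal model comparison via AIC (Table 1) can be illustrated with a Gaussian mixture fit. The paper does not specify the exact fitting procedure, so scikit-learn's GaussianMixture and the sample values below are assumptions for demonstration, loosely mimicking the two gaze clusters near the upper and lower body.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Hypothetical vertical gaze samples (pixels): a cluster near the upper body
# (~ +20 px) and one near the lower body (~ -170 px), as in Table 1
gaze_y = np.concatenate([
    rng.normal(20, 40, 5000),    # upper-body fixations
    rng.normal(-170, 40, 3000),  # lower-body fixations
]).reshape(-1, 1)

unimodal = GaussianMixture(n_components=1, random_state=0).fit(gaze_y)
bimodal = GaussianMixture(n_components=2, random_state=0).fit(gaze_y)

# A lower AIC identifies the better model; for data like these the
# bimodal fit wins, matching the model selection reported in Table 1
print("unimodal AIC:", unimodal.aic(gaze_y))
print("bimodal AIC:", bimodal.aic(gaze_y))
print("bimodal means:", sorted(bimodal.means_.ravel()))
```

The recovered component means then play the role of the "upper mode" and "lower mode" peak positions listed in Table 1.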
Figure 9
 
Density of vertical gaze position (pixel) in Experiment 3B. Light gray and dark gray areas indicate the density during first (0–500 ms) and second (500–1000 ms) periods, respectively. The white dots indicate the PL position.
General discussion
In the present study, we investigated the relative contributions of the upper and lower bodies to direction discrimination in normal and hakobi walking. First, in Experiment 1, we confirmed that hakobi and normal walking shared common characteristics of biological motion perception; that is, inversion of the PL display and removal of the upper body impaired direction discrimination. This implies that direction discrimination for the normal and hakobi PL walkers is unlikely to be qualitatively different. In Experiments 2 and 3A, omission of the lower body impaired discrimination performance far more than omission of the upper body. This was true for both normal and hakobi walking, suggesting a general tendency to depend largely on information from the lower body for direction discrimination of the PL walker. Perhaps the local motion of the lower body, particularly of the ankle and toe, is most informative about the facing direction of the PL walker (Mather et al., 1992; Troje & Westhoff, 2006). 
However, the upper body was also informative for direction discrimination of the PL walker, since removal of the upper body impaired direction discrimination and since the observers performed above chance even for the static upper body. Unlike for the lower body, however, global configuration appears to play the key role for the upper body. Discrimination performance for the static upper body was comparable with that for the upper body with local motion (Experiments 2 and 3B). Moreover, upper body performance for a normal walker was no better than that for a hakobi walker, despite the fact that the magnitude of local motion of the hakobi walker was smaller than that of the normal walker (see Figure 1). Thus, discrimination performance did not seem to depend on the magnitude of local motion of the upper body; we therefore conjectured that global configuration is the crucial piece of information for direction discrimination from the upper body, just as local motion is for the lower body (Thurman et al., 2010). While there has been a long discussion regarding which factor, local motion or global configuration, is dominant in biological motion perception (Blake & Shiffrar, 2007), some research has suggested that both are used in a task-dependent manner (Chang & Troje, 2009; Thirkettle et al., 2009). Our results imply that the relative importance of local motion and global configuration also depends on the body area in question. 
In addition to this dissociation of dominant information between the upper and lower bodies, our dynamic gaze pattern analyses indicated that the upper and lower bodies might play different roles in direction discrimination. In Experiment 3A, the gaze was initially biased toward the upper or middle body area and then shifted toward the lower body area if it was present. Thus, the upper body, perhaps specifically its configurational information, was processed preferentially at first, and the lower body came into use afterward. Further, the initial inspection of the upper body would involve judging the presence or absence of the lower body without the use of the fovea (Ikeda, Blake, & Watanabe, 2005), since the region of the lower body was rarely visited when the lower body was absent. In Experiment 3B, where the lower body was present in all trials, the initial gaze bias toward the upper body was weakened (see Figure 8); this bias can therefore be partially attributed to a top-down attentional strategy (Thornton, Rensink, & Shiffrar, 2002). Nevertheless, the region of the upper body was inspected to an extent comparable with that of the lower body, and this held even when the upper body was absent and the observers knew that the lower body was certainly present (Experiment 3B). The initial gaze preference for the upper body is surprising given that the region of the lower body was rarely inspected when the lower body was absent (Experiment 3A) and that the upper body provided much less information than the lower body for direction discrimination. 
Why was the less informative upper body inspected first? Perhaps the upper body serves to localize the PL walker in dynamic motion noise. Since the position of the PL walker varied randomly from trial to trial, the observers had to localize the walker before discriminating its direction. Therefore, intrinsically or strategically, the upper body would be inspected first in order to determine the location of the PL walker, consistent with previous research indicating that the upper body contributes to detection of the PL walker (Pinto & Shiffrar, 1999). 
This conjecture is also supported by the enhanced discrimination of the full-body PL walker (Experiment 2). If the upper and lower bodies were processed independently in the full-body condition, discrimination performance could not exceed the prediction derived from performance in the upper and lower body conditions. However, we found that discrimination performance for the full-body PL walker was better than predicted by the probability summation of discrimination performance for the upper body alone and the lower body alone (see Figure 5, observation vs. prediction). This superiority of the full-body display is consistent with previous research demonstrating that perceptual sensitivity for biological motion can exceed probability summation over space and time (Neri, Morrone, & Burr, 1998). Since we observed this "full-body enhancement" even when a dynamic lower body was combined with a static upper body, the enhancement cannot be attributed to an increase in the motion coherence of the upper and lower bodies. Rather, in conjunction with the gaze pattern analyses, our current interpretation is that the full-body display facilitates localization of the PL walker, mostly through the initial gaze bias toward the upper body. For the full-body PL walker, detection of the upper body could reduce the positional uncertainty of the lower body and vice versa. Therefore, the initial inspection of the upper body could determine the location of the lower body and allow resources to be dedicated to direction discrimination of the lower body. In the upper body-absent conditions, by contrast, where the region of the upper body was still inspected (Experiments 3A and 3B), identifying the absence of the upper body and localizing the lower body had to precede direction discrimination of the lower body, which obviously required extra processing cost. 
Thus, although the lower body provided most of the directional information, observers effectively used full-body information in direction discrimination of the PL walker (Lu & Liu, 2006; Thurman et al., 2010). Further investigation, for example with misaligned upper and lower bodies, would help reveal in greater detail how localization and direction discrimination are performed for the PL display. 
In Experiments 2, 3A, and 3B, we presented only upright normal and hakobi walkers. Therefore, it remains open whether the "full-body enhancement" (Figure 5) would occur, and whether the dynamic gaze shift from the upper to the lower body area would be observed, when the PL display is inverted. Troje and Westhoff (2006) showed that the inversion effect occurred even for a scrambled PL display and hypothesized that the impairment for inverted PL displays is due to the inefficiency of a visual filter tuned to upright local motion (particularly of the feet). However, their results do not rule out the possibility that inversion impairs efficient processing of the upper and lower bodies (e.g., efficient gaze guidance by the upper body). In order to reveal how the local visual filters and global mechanisms work interactively in biological motion perception, it would be worthwhile to examine the effect of body areas on direction discrimination for an inverted PL walker, in which the spatial relation of the upper and lower bodies is unnatural. 
In summary, the present study demonstrated a qualitative difference in information usage between the upper and lower bodies in biological motion perception. Global configuration is the dominant information for the upper body, while local motion is dominant for the lower body. Furthermore, the gaze pattern analyses indicated that the upper and lower bodies play different roles in direction discrimination: we conjectured that the upper body is inspected first to localize the PL walker, and the lower body is then examined to discriminate its direction. Thus, observers seem to use full-body information effectively in perceiving biological motion. Along with examining the minimal information sufficient for biological motion perception, this holistic approach can provide further insights into how we understand the physical actions of others in daily life. 
Appendix A
In order to calculate the probability summation of the lower and upper body conditions, we applied a GLM estimation with a binomial probit model. The response, correct or incorrect, is a random variable that follows a binomial distribution whose probability is described as a function of noise strength: 
$\mathrm{Response}(n) \sim \mathrm{Be}\left(\Pr(n)\right), \quad \Pr(n) = \frac{\Phi(\alpha + \beta n) + 1}{2},$
(A1)
where n is noise strength, Be is the Bernoulli distribution, Φ is the cumulative Gaussian distribution, and Pr(n) yields a psychometric function giving the correct response rate as a function of noise strength. The predicted psychometric function is derived by probability summation of the psychometric functions in the lower and upper body conditions, as follows: 
$\Pr_{\mathrm{pred}}(n) = \frac{\left[1 - \left\{1 - \left(2\Pr_{\mathrm{lower}}(n) - 1\right)\right\}\left\{1 - \left(2\Pr_{\mathrm{upper}}(n) - 1\right)\right\}\right] + 1}{2},$
(A2)
where Pr_lower and Pr_upper denote the psychometric functions in the lower body and upper body conditions, respectively. 
Finally, the predicted and observed thresholds for the full-body condition were derived from these psychometric functions (Figure 5): 
$\mathrm{Thres}_{0.75} = \underset{n}{\operatorname{arg\,min}} \left|\Pr(n) - 0.75\right|.$
(A3)
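Equations A1-A3 can be sketched numerically as follows. The formulas match the appendix; the α and β values are hypothetical illustrations, not fitted values from the experiments.

```python
import numpy as np
from scipy.stats import norm

def pr_correct(n, alpha, beta):
    # Equation A1: Pr(n) = (Phi(alpha + beta*n) + 1) / 2,
    # ranging from 0.5 (chance) to 1
    return (norm.cdf(alpha + beta * n) + 1.0) / 2.0

def pr_predicted(n, params_lower, params_upper):
    # Equation A2: probability summation over the lower- and
    # upper-body psychometric functions
    p_low = 2 * pr_correct(n, *params_lower) - 1
    p_up = 2 * pr_correct(n, *params_upper) - 1
    return ((1 - (1 - p_low) * (1 - p_up)) + 1) / 2.0

def threshold_075(pr_func, noise_levels):
    # Equation A3: the noise level whose correct rate is closest to 0.75
    p = pr_func(noise_levels)
    return noise_levels[np.argmin(np.abs(p - 0.75))]

noise = np.linspace(0, 100, 1001)
lower = (2.0, -0.05)   # hypothetical lower-body fit: performance falls with noise
upper = (0.5, -0.03)   # hypothetical upper-body fit: weaker overall

t_lower = threshold_075(lambda n: pr_correct(n, *lower), noise)
t_upper = threshold_075(lambda n: pr_correct(n, *upper), noise)
t_pred = threshold_075(lambda n: pr_predicted(n, lower, upper), noise)
print(t_lower, t_upper, t_pred)
```

With these illustrative parameters, the predicted full-body threshold exceeds either single-body threshold, which is the probability-summation baseline that the observed full-body performance was compared against in Figure 5.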
 
Acknowledgments
This research was partially supported by the Japan Society for the Promotion of Science, the Japan Science and Technology Agency, and a Toray Science and Technology Grant. We thank Norishige Yamamoto, Norihide Yamamoto, and Takashi Wakamatsu—all of whom are Kyogen actors of the Yamamoto family in the Ohkura School—for permitting us to measure their hakobi walking. We also thank NAC Image Technology for their cooperation in measuring walking motion using their motion capture device. 
Commercial relationships: none. 
Corresponding authors: Kohske Takahashi and Kazuhiro Ueda. 
Emails: ktakahashi@fennel.rcast.u-tokyo.ac.jp; ueda@gregorio.c.u-tokyo.ac.jp. 
Addresses: Research Center for Advanced Science and Technology, The University of Tokyo, 4-6-1, Komaba, Meguro-ku 153-8904, Tokyo, Japan; Interfaculty Initiative in Information Studies, The University of Tokyo, 3-8-1, Komaba, Meguro-ku, 153-8902, Tokyo, Japan. 
Footnotes
1  We measured 8 points of the upper body (top and rear of the head, left and right shoulders, elbows, and wrists) and 8 points of the lower body (left and right hips, knees, ankles, and toes). The markers were placed as closely as possible to these joints.
2  Sample size was 186,605 on average.
3  Sample size was 186,076 on average.
References
Atkinson A. P. Dittrich W. H. Gemmell A. J. Young A. W. (2004). Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception, 33, 717–746.
Baccus W. Mozgova O. Thompson J. C. (2009). Early integration of form and motion in the neural response to biological motion. Neuroreport, 20, 1334–1338.
Beintema J. A. Georg K. Lappe M. (2006). Perception of biological motion from limited-lifetime stimuli. Perception & Psychophysics, 68, 613–624.
Beintema J. A. Lappe M. (2002). Perception of biological motion without local image motion. Proceedings of the National Academy of Sciences of the United States of America, 99, 5661–5663.
Blake R. Shiffrar M. (2007). Perception of human motion. Annual Review of Psychology, 58, 47–73.
Cavaye R. Griffith P. Senda A. (2005). A guide to the Japanese stage: From traditional to cutting edge. Tokyo, Japan: Kodansha International.
Chang D. H. F. Troje N. F. (2008). Perception of animacy and direction from local biological motion signals. Journal of Vision, 8(5):3, 1–10, http://www.journalofvision.org/content/8/5/3, doi:10.1167/8.5.3.
Chang D. H. F. Troje N. F. (2009). Characterizing global and local mechanisms in biological motion perception. Journal of Vision, 9(5):8, 1–10, http://www.journalofvision.org/content/9/5/8, doi:10.1167/9.5.8.
Dittrich W. H. Troscianko T. Lea S. E. Morgan D. (1996). Perception of emotion from dynamic point-light displays represented in dance. Perception, 25, 727–738.
Giese M. A. Poggio T. (2003). Neural mechanisms for the recognition of biological movements. Nature Reviews Neuroscience, 4, 179–192.
Hirai M. Saunders D. R. Troje N. F. (2011). Allocation of attention to biological motion: Local motion dominates global shape. Journal of Vision, 11(3):4, 1–11, http://www.journalofvision.org/content/11/3/4, doi:10.1167/11.3.4.
Ikeda H. Blake R. Watanabe K. (2005). Eccentric perception of biological motion is unscalably poor. Vision Research, 45, 1935–1943.
Jacobs A. Pinto J. Shiffrar M. (2004). Experience, context, and the visual perception of human movement. Journal of Experimental Psychology: Human Perception and Performance, 30, 822–835.
Johansson G. (1973). Visual perception of biological motion and a model for its analysis. Perception & Psychophysics, 14, 201–211.
Johnson K. L. Tassinary L. G. (2005). Perceiving sex directly and indirectly: Meaning in motion and morphology. Psychological Science, 16, 890–897.
Kozlowski L. Cutting J. (1977). Recognizing the sex of a walker from a dynamic point-light display. Perception & Psychophysics, 21, 575–580.
Lange J. Georg K. Lappe M. (2006). Visual perception of biological motion by form: A template-matching analysis. Journal of Vision, 6(8):6, 836–849, http://www.journalofvision.org/content/6/8/6, doi:10.1167/6.8.6.
Lu H. Liu Z. (2006). Computing dynamic classification images from correlation maps. Journal of Vision, 6(4):12, 475–483, http://www.journalofvision.org/content/6/4/12, doi:10.1167/6.4.12.
Mather G. Radford K. West S. (1992). Low-level visual processing of biological motion. Proceedings: Biological Sciences, 249, 149–155.
McKay L. S. Simmons D. R. McAleer P. Pollick F. E. (2009). Contribution of configural information in a direction discrimination task: Evidence using a novel masking paradigm. Vision Research, 49, 2503–2508.
Neri P. Morrone M. C. Burr D. C. (1998). Seeing biological motion. Nature, 395, 894–896.
Pinto J. Shiffrar M. (1999). Subconfigurations of the human form in the perception of biological motion displays. Acta Psychologica, 102, 293–318.
Pinto J. Shiffrar M. (2009). The visual perception of human and animal motion in point-light displays. Social Neuroscience, 4, 332–346.
Pollick F. E. Paterson H. M. Bruderlin A. Sanford A. J. (2001). Perceiving affect from arm movement. Cognition, 82, B51–B61.
Saunders D. R. Williamson D. K. Troje N. F. (2010). Gaze patterns during perception of direction and gender from biological motion. Journal of Vision, 10(11):9, 1–10, http://www.journalofvision.org/content/10/11/9, doi:10.1167/10.11.9.
Schouten B. Troje N. F. Verfaillie K. (2011). The facing bias in biological motion perception: Structure, kinematics, and body parts. Attention, Perception & Psychophysics, 73, 130–143.
Thirkettle M. Benton C. P. Scott-Samuel N. E. (2009). Contributions of form, motion and task to biological motion perception. Journal of Vision, 9(3):28, 1–11, http://www.journalofvision.org/content/9/3/28, doi:10.1167/9.3.28.
Thornton I. M. Rensink R. A. Shiffrar M. (2002). Active versus passive processing of biological motion. Perception, 31, 837–853.
Thurman S. M. Giese M. A. Grossman E. D. (2010). Perceptual and computational analysis of critical features for biological motion. Journal of Vision, 10(12):15, 1–14, http://www.journalofvision.org/content/10/12/15, doi:10.1167/10.12.15.
Thurman S. M. Grossman E. D. (2008). Temporal "Bubbles" reveal key features for point-light biological motion perception. Journal of Vision, 8(3):28, 1–11, http://www.journalofvision.org/content/8/3/28, doi:10.1167/8.3.28.
Troje N. F. (2002). Decomposing biological motion: A framework for analysis and synthesis of human gait patterns. Journal of Vision, 2(5):2, 371–387, http://www.journalofvision.org/content/2/5/2, doi:10.1167/2.5.2.
Troje N. F. Westhoff C. (2006). The inversion effect in biological motion perception: Evidence for a "life detector." Current Biology, 16, 821–824.
Vanrie J. Verfaillie K. (2006). Perceiving depth in point-light actions. Perception & Psychophysics, 68, 601–612.
Watson A. B. (1979). Probability summation over time. Vision Research, 19, 515–522.
Figure 1
 
Visual representation of the amount of local motion (i.e., the distance a dot travels along its local trajectory as it completes one step cycle) at each point of a normal and a hakobi PL walker, respectively. The right panel shows the amount of local motion, its ratio, and difference between normal and hakobi walking.
Figure 2
 
Motion trajectories of normal and hakobi walking.
Figure 3
 
Discrimination threshold in Experiment 1. Error bars indicate the standard errors of the mean.
Figure 4
 
Discrimination threshold in Experiment 2. Error bars indicate the standard errors of the mean.
Figure 5
 
Observed and predicted discrimination thresholds for full-body PL walker. Error bars indicate the standard errors of the mean.
Figure 6
 
Discrimination threshold in Experiments 3A and 3B. Error bars indicate the standard errors of the mean.
Figure 7
 
Probability density of vertical gaze position (pooled across all participants) in Experiment 3A. The horizontal axes indicate the gaze density (probability per 1 pixel). The vertical axes indicate the vertical position (pixel). For example, a gaze density of 0.005 (5 × 10−3) indicates that 0.5% of all gaze samples was in the 1-pixel bin. Red and blue areas indicate the density during the first (0–500 ms) and second (500–1000 ms) periods, respectively. The white dots and sticks indicate the PL position (normal walking at the time of 765 ms; see Figure 2), which helps visual comparison of vertical gaze position over the PL.
Figure 8
 
Vertical position and density of peaks in Experiments 3A and 3B. Lower body-absent (one peak depicted with triangle) and lower body-present conditions (two peaks depicted with circle) were averaged separately among the participants. Red and blue symbols indicate the peak during the first (0–500 ms) and second (500–1000 ms) periods, respectively. The white dots and sticks indicate the PL position for visual comparison of vertical gaze position over the PL. Error bars indicate the standard errors of the mean.
Figure 9
 
Density of vertical gaze position (pixel) in Experiment 3B. Light gray and dark gray areas indicate the density during first (0–500 ms) and second (500–1000 ms) periods, respectively. The white dots indicate the PL position.
Table 1
 
Coefficients of the fits of gaze density with unimodal and bimodal distributions in Experiments 3A and 3B. Each condition has two rows, for the first (0–500 ms) and second (500–1000 ms) periods. AIC indicates Akaike Information Criterion.

Upper body | Lower body | Period | Upper-mode position (pixel) | Lower-mode position (pixel) | Upper-mode density (×10−3) | Lower-mode density (×10−3) | AIC unimodal (×104) | AIC bimodal (×104)

Experiment 3A
Normal | Normal | First | 22.61 | −167.74 | 4.54 | 1.97 | −1.25 | −1.51
Normal | Normal | Second | 13.74 | −179.37 | 2.90 | 3.21 | −1.27 | −1.47
Normal | Hakobi | First | 23.83 | −170.20 | 4.64 | 2.06 | −1.23 | −1.51
Normal | Hakobi | Second | 12.15 | −176.88 | 3.04 | 3.33 | −1.26 | −1.45
Normal | None | First | 32.67 | – | 5.48 | – | −1.40 | –
Normal | None | Second | 49.52 | – | 4.50 | – | −1.48 | –
Hakobi | Normal | First | 22.57 | −176.40 | 4.58 | 1.76 | −1.26 | −1.60
Hakobi | Normal | Second | 6.93 | −184.88 | 2.90 | 3.04 | −1.28 | −1.43
Hakobi | Hakobi | First | 27.06 | −163.55 | 4.44 | 1.79 | −1.28 | −1.64
Hakobi | Hakobi | Second | 10.13 | −178.45 | 2.75 | 3.05 | −1.32 | −1.53
Hakobi | None | First | 34.49 | – | 5.57 | – | −1.41 | –
Hakobi | None | Second | 66.66 | – | 4.52 | – | −1.52 | –
None | Normal | First | 19.68 | −167.32 | 4.49 | 2.12 | −1.25 | −1.63
None | Normal | Second | −0.14 | −168.33 | 3.32 | 3.52 | −1.28 | −1.52
None | Hakobi | First | 22.68 | −165.07 | 4.84 | 1.95 | −1.23 | −1.49
None | Hakobi | Second | 1.35 | −163.65 | 3.19 | 3.46 | −1.31 | −1.49

Experiment 3B
Normal | Normal | First | 6.48 | −95.91 | 4.59 | 3.90 | −1.36 | −1.61
Normal | Normal | Second | 4.52 | −125.08 | 3.79 | 3.38 | −1.35 | −1.47
Normal | Hakobi | First | 5.23 | −113.03 | 4.43 | 3.48 | −1.36 | −1.60
Normal | Hakobi | Second | −20.27 | −165.96 | 3.50 | 3.14 | −1.39 | −1.52
Hakobi | Normal | First | 10.85 | −81.67 | 4.60 | 3.91 | −1.35 | −1.59
Hakobi | Normal | Second | −8.71 | −149.13 | 3.62 | 3.42 | −1.38 | −1.54
Hakobi | Hakobi | First | 7.99 | −109.33 | 4.44 | 3.70 | −1.35 | −1.52
Hakobi | Hakobi | Second | 3.16 | −142.37 | 3.23 | 3.73 | −1.34 | −1.45
Static | Normal | First | 9.55 | −96.74 | 4.63 | 3.97 | −1.33 | −1.55
Static | Normal | Second | 5.75 | −129.82 | 3.67 | 3.78 | −1.34 | −1.51
Static | Hakobi | First | 8.37 | −113.57 | 4.62 | 3.62 | −1.31 | −1.54
Static | Hakobi | Second | 8.18 | −138.93 | 3.37 | 3.56 | −1.33 | −1.46
None | Normal | First | 7.34 | −99.75 | 4.49 | 4.08 | −1.35 | −1.57
None | Normal | Second | 1.90 | −105.21 | 3.64 | 3.99 | −1.39 | −1.54
None | Hakobi | First | 9.59 | −107.01 | 4.43 | 4.03 | −1.34 | −1.62
None | Hakobi | Second | 9.04 | −127.30 | 3.34 | 4.17 | −1.34 | −1.56

Note: In the lower body-absent (None) conditions of Experiment 3A, the gaze density had a single mode; its position, density, and AIC are listed in the upper-mode and unimodal columns.