Article | February 2013
Complex interactions between spatial, orientation, and motion cues for biological motion perception across visual space
Author Affiliations
  • Steven M. Thurman
    Department of Psychology, University of California, Los Angeles, Los Angeles, CA, USA
    sthurman@ucla.edu
  • Hongjing Lu
    Departments of Psychology and Statistics, University of California, Los Angeles, Los Angeles, CA, USA
    hongjing@ucla.edu
Journal of Vision February 2013, Vol.13, 8. doi:https://doi.org/10.1167/13.2.8
Abstract
Human observers are adept at perceiving complex actions in point-light biological motion displays that represent the human form with a sparse array of moving points. However, the neural computations supporting action perception remain unclear, particularly with regard to central versus peripheral vision. We created novel action stimuli composed of Gabor patches to examine the contributions of various competing visual cues to action perception across the visual field. The Gabor action stimulus made it possible to pin down form processing at two levels: (a) local information about limb angle represented by Gabor orientations and (b) global body structure signaled by the spatial arrangement of Gabor patches. This stimulus also introduced two types of motion signals: (a) local velocity represented by Gabor drifting motion and (b) joint motion trajectories signaled by position changes of Gabor disks over time. In central vision, the computational analysis of global cues based on the spatial arrangement of joints and joint trajectories dominated processing, with minimal influence of local drifting motion and orientation cues. In the periphery, we found that local drifting motion and orientation cues interacted with spatial cues in sophisticated ways, depending on the particular discrimination task and location within the visual field, to influence action perception. This dissociation was evident in several experiments showing phantom action percepts in the periphery that contradicted central vision. Our findings suggest a highly flexible and adaptive system for processing visual cues at multiple levels for biological motion and action perception.

Introduction
The ability to detect and recognize human movements after catching a glimpse of others reflects the sophistication of visual processing and plays a central role in bridging perception and cognition. Given its complexity, it is not surprising that a large-scale cortical network spanning the ventral and dorsal streams is recruited to implement key computational components of action perception (Grossman et al., 2000; Jastorff & Orban, 2009; Peelen, Wiggett, & Downing, 2009; Peuskens, Vanrie, Verfaillie, & Orban, 2005; Saygin, Wilson, Hagler, Bates, & Sereno, 2004; Vaina, Solomon, Chowdhury, Sinha, & Belliveau, 2001). Though this system equips human observers with an exquisite ability for action detection and recognition (Blake & Shiffrar, 2007), its complexity has made it challenging to pin down how action representations are formed and utilized to guide behavior. 
Studies of single cells in monkey temporal cortex have revealed intermixed populations of neurons in the superior temporal polysensory (STP) region that encode actions in terms of either static snapshots of body form (Singer & Sheinberg, 2010), complex body movements, or both form and motion (Oram & Perrett, 1996; Vangeneugden, Pollick, & Vogels, 2009). Current computational models of point-light biological motion have correspondingly proposed that action representations can be derived from either sequences of global body postures (Lange & Lappe, 2006) or hierarchically from analysis of local motion and local form signals (Casile & Giese, 2005; Giese & Poggio, 2003). However, it has proven difficult to distinguish between these distinct theoretical mechanisms experimentally using standard point-light displays (Johansson, 1973), because motion and form signals are inherently confounded (Lu, 2010; Pinto & Shiffrar, 1999; Shiffrar, Lichtey, & Heptulla Chatterjee, 1997). That is, global changes in body posture necessarily coincide with local changes in dot positions that carry low-level motion signals. 
As a result, researchers have developed unique methods for manipulating point-light stimuli to influence the relative strength of visual cues relating to global configural processing and/or motion processing. For instance, previous studies have measured the perceptual effects of disrupting the spatial and phase relations between the points via random scrambling (Hiris, Humphrey, & Stout, 2005; Troje & Westhoff, 2006), manipulating the temporal interval between frames (Mather, Radford, & West, 1992) or the overall stimulus duration (Thirkettle, Benton, & Scott-Samuel, 2009; Thurman, Giese, & Grossman, 2010), changing orientation (Chang & Troje, 2009; Pavlova & Sokolov, 2000), masking in noise dots (Bertenthal & Pinto, 1994; Pinto & Shiffrar, 1999) and white noise (Lu & Liu, 2006), reassigning point positions to different locations on the limb on each frame of the sequence (Beintema, Georg, & Lappe, 2006; Beintema & Lappe, 2002), replacing the points with more complex tokens and objects (Hunt & Halper, 2008; Wittinghofer, de Lussanet, & Lappe, 2012), and masking in space/time with aperture masks (Lu, 2010; Shiffrar et al., 1997; Thurman et al., 2010). Using inventive methods and stimulus manipulations, these studies and several others have shed some light on the computational mechanisms underlying perception of biological actions in point-light displays. 
However, despite the multitude of behavioral studies there is no clear consensus about the underlying mechanisms of action perception. This is due, in part, to the fact that while many studies support a model of biological motion based on global processing, others have instead shown an important role for local motion. A potentially profitable way of reconciling this debate is to consider that there are multiple ways in which body actions can be processed and represented based on analysis of various visual cues, such as the global spatial arrangement of joints, the orientation of limb segments, the motion signals induced by body movements, and the local joint trajectories. Reports showing that observers in behavioral tasks appear to transition between form-based and motion-based processing strategies based on stimulus conditions (e.g., stimulus duration) support this notion and further suggest that these systems operate together in parallel (Thirkettle, Benton, & Scott-Samuel, 2009; Thurman et al., 2010). Also, there is evidence for significant interactions between these processing networks, for instance, with local motion facilitating the extraction of global form in the presence of noise (Lu, 2010; Thurman & Grossman, 2008) and global form influencing judgments of local motion coherence (Tadin, Lappin, Blake, & Grossman, 2002). 
In the present study, we placed Gabor patches on the joints of point-light animations to examine the contribution of form cues (local orientation and spatial configuration), motion cues (local grating drift within Gabor patches and global motion trajectories of Gabor patches), and the interaction between these cues to biological motion perception. In several experiments, we manipulated the informational content of human actions defined by Gabor patches by systematically putting these cues into conflict. At the same time, we investigated how action processing differs in the central versus peripheral parts of the visual field, allowing us to assess the possibility that different neural mechanisms may be recruited to achieve robust action recognition across visual space (Thompson & Baccus, 2011). The findings we report here demonstrate significant differences between central and peripheral vision in the analysis of biological movements, with implications for computational theories of action perception. In short, the current results reveal complex interactions between spatial position cues, local orientation cues, and motion cues for perceiving and constructing representations of human action across visual space. 
Experiment 1
Methods
Participants
Twenty participants were recruited through the UCLA Department of Psychology subject pool and given course credit for participation. All participants had normal or corrected vision, gave informed consent approved by the UCLA Institutional Review Board, and were naïve to the purpose and stimuli used in the studies. 
Stimulus and procedure
All stimuli were created using Matlab (MathWorks Inc.) and the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) and were displayed on a calibrated monitor (85 Hz, background luminance 16.2 cd/m²) powered by a Dell PC running Windows XP. Experiments were conducted in a dark room with a chin rest to maintain a constant viewing distance (35 cm). 
The biological motion pattern of human walking was obtained from the Carnegie Mellon Graphics Lab Motion Capture Database available free online (http://mocap.cs.cmu.edu). Software developed in our laboratory was used to convert the raw motion capture files to point-light format, with nine points representing the head, elbows, wrists, knees, and feet. Point lights were connected to create skeletal representations (e.g., stick figures) for estimating the angular orientation of body shape near each joint location. Orientations for the head, elbows, wrists, knees, and feet points were respectively extracted from neck, upper arms, lower arms, upper legs, and lower legs of the human skeletal display. Thus, orientations at the joint locations represented the orientation of a single adjacent limb segment and not the angle between connected limbs (e.g., knee or elbow angles). Local motion was computed directly from differences in the known locations of point lights across subsequent frames (Giese & Poggio, 2003; Lu, 2010). Leftward and rightward walkers were created by reflecting across the vertical axis, and backward walkers were played in reverse temporal order. The walker comprised 120 frames presented at 85 Hz, resulting in a natural gait speed (1.4 s/cycle). In all experiments, the walker was presented for one cycle and starting phase was randomized. The horizontal translation component was removed so that the animation appeared to walk in place as if on a treadmill. 
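As a minimal illustration of these two preprocessing steps, the MATLAB sketch below derives per-joint velocities and adjacent-limb orientations from a marker array; the function name, variable names, and the simple frame-to-frame difference are our own assumptions rather than the laboratory's conversion software.

```matlab
function [vel, ori] = jointCuesFromSkeleton(skel, pts, segEnds, frameRate)
% jointCuesFromSkeleton  Sketch of the cue-extraction step (assumed names).
%   skel:      M x 2 x nFrames (x, y) positions of all skeleton markers
%   pts:       indices (length 9) of the displayed points within the skeleton
%   segEnds:   9 x 2 marker indices spanning the limb segment adjacent to each
%              displayed point (e.g., hip and knee for a knee point)
%   frameRate: presentation rate in Hz (85 in the experiments)
nFrames = size(skel, 3);
% Local motion: 2-D joint velocity from position differences across frames
vel = diff(skel(pts, :, :), 1, 3) * frameRate;
% Orientation: angle of the single limb segment adjacent to each point
ori = zeros(numel(pts), nFrames);
for f = 1:nFrames
    for j = 1:numel(pts)
        d = skel(segEnds(j, 2), :, f) - skel(segEnds(j, 1), :, f);
        ori(j, f) = atan2(d(2), d(1));
    end
end
end
```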
Human walker stimuli were constructed from Gabor patches placed at the joint locations (Gaussian envelope SD = 0.85°; 0.5 cycles/deg; 33% contrast). Gabor diameter was 1.8°, while the biological animations subtended 4.2° by 9.2° at their greatest extent. Spatial frequency was chosen carefully so that phase shifts in all animations were below the Nyquist limit. Adopting the stimulus description from Amano, Edwards, Badcock, and Nishida (2009), we computed the one-dimensional (component) motion of each Gabor patch according to the underlying 2-D biological motion signals extracted from the point-light sequence (Figure 1a, lower panel). That is, since Gabor patches can only display unidirectional drifting motion orthogonal to the grating orientation due to the aperture problem (Marr & Ullman, 1981), and because orientation was independently determined by the corresponding limb orientation on each frame, the drifting speed imposed on each Gabor patch was calculated by scaling its assigned 2-D speed by the sine of the angle between the Gabor orientation and the assigned 2-D motion direction (e.g., Lu, 2010). This is a common method for imposing global motion signals on multiple-aperture stimuli composed of randomly oriented Gabor patches (Amano et al., 2009; Lee & Lu, 2010; Lu, 2010; Rider, McOwan, & Johnston, 2009). In some experimental conditions, plaids were created by superimposing two orthogonal Gabor patches, each with half contrast and offset ±45° from the assigned limb orientation. Like oriented Gabor patches, plaids carry local velocity signals but without the influence of local orientation cues that were used here to represent the underlying shape of the human skeleton. Although plaids unambiguously represent 2-D motion signals (Adelson & Movshon, 1982), and oriented Gabors inherit local motion ambiguity due to the aperture problem, there is evidence that structural information represented by oriented gratings provides a reference frame that allows unambiguous interpretation of local one-dimensional (1-D) motions into global 2-D motion for moving shapes (Lin & He, 2012) and biological motion (Lu, 2010). Comparing performance with Gabor elements to plaids allowed the dissociation of local motion cues from orientation cues for the walker discrimination task. 
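Written out (our notation for the computation just described), the drift speed assigned to the patch at joint i on each frame is the component of its assigned 2-D biological motion velocity along the axis orthogonal to the grating:

\[ s_i = \lVert \mathbf{v}_i \rVert \, \sin(\phi_i - \theta_i), \]

where \(\mathbf{v}_i\) is the assigned 2-D velocity, \(\phi_i\) its direction, and \(\theta_i\) the Gabor orientation; the sign of \(s_i\) sets the drift direction along that orthogonal axis.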
Figure 1
Stimuli and results discriminating forward/backward walking direction in Experiment 1. (a) Schematic illustration of walkers with and without orientation signals. The circular outlines are meant for illustrative purposes to make apparent the sequence of postures defined by spatial cues. Contrast was also enhanced for illustration. Red arrows indicate the antagonistic local 2-D motion vectors drifting in the direction opposite to global disk movement. For demonstration of Gabor stimulus with local motion scalar = 2.5, see Movie 1. The diagram in the lower panel illustrates the difference between 2-D biological motion vectors and the 1D motion vectors that determined the actual drifting speeds of individual gratings. (b) and (c) Mean proportion of responses consistent with the spatially defined direction as a function of the amount of antagonistic local motion, with psychometric fits. Error bars represent SEM.
In the first experiment, observers discriminated forward and backward walkers, a task that requires spatiotemporal analysis of the dynamic action sequence and cannot be solved on the basis of a single static body posture (Lange & Lappe, 2006; Wittinghofer et al., 2012). The Gabor patches were spatially arranged frame by frame into postures reflecting the movements of a walker moving either forward or backward. The orientation of each grating represented the limb angle of the underlying walker, providing local shape cues to support analysis of the body form. However, the motion imposed on the gratings was opposite to the true movements of the joints frame by frame. Thus, the key manipulation was that the 2-D drifting velocity of the local elements was set opposite (−180°) to the positional shifts of the patches (Figure 1a) and scaled by a factor ranging from zero to 2.5 (i.e., a motion scalar of zero implies no drifting in the Gabor/plaid patches, whereas a scalar of two means that the speed of the antagonistic local motion was twice the speed of the patch movements). 
Two separate groups performed the forward/backward walking direction discrimination task with either oriented Gabors (n = 10) or plaids (n = 10) presented centrally (0°) and peripherally (20°). Participants performed one block of 140 trials in central vision and two blocks of 140 trials in the periphery. In all experiments reported below, the order of experimental conditions was randomized and counterbalanced across trials within each block. No feedback was provided and trials were self-paced. A brief description and a few sample trials were given to familiarize subjects with the stimuli prior to starting the experiment. 
A red fixation cross was placed 20° to the left of the screen center, and observers were instructed to fixate for the duration of each trial. Biological animations presented peripherally were always discriminated in the right visual field. To help ensure that observers maintained fixation, a relatively simple target detection task was employed simultaneously with the biological motion task. This task involved continually monitoring the red fixation cross and detecting a brief (246 ms) color change that could randomly occur (or not) during the trial. Observers had no difficulty in completing the target detection task, with error rates confirmed to be less than 5%–10% for all subjects in all experiments reported. This result suggests that eye movements were likely minimal during peripheral viewing. 
Results
A mixed-design analysis of variance (ANOVA) with two locations (central, peripheral) and six drifting speeds as within-subjects factors, and two patch types (oriented Gabors, plaids) as a between-subjects factor, revealed highly significant main effects of location, F(1, 18) = 246.3, p < 0.001, Greenhouse-Geisser corrected, and local motion scalar, F(5, 90) = 284.20, p < 0.001, as well as an interaction between location and local motion, F(5, 90) = 119.0, p < 0.001. Results (Figures 1b and c) show that observers readily ignored the local grating signals and consistently identified the spatially defined walking direction when presented centrally. However, performance in the periphery quickly declined to chance levels as the local motion scalar increased to about 1.5 times the amount of positional shift of the patches (see Movie 1), indicating that the local drifting speeds perceptually cancelled out the action percept derived from the spatial configuration and movement of the Gabor patches. Moreover, response proportion dipped significantly below chance levels with greater scalars and faster drifting speeds (one-sample t tests, all ps < 0.05), indicating that observers reliably perceived an illusory reversal of forward/backward walking opposite to that perceived when viewed centrally. There were no significant differences between the oriented Gabor and plaid conditions for any factors (all ps > 0.5), confirming that orientation did not play a prominent role in discriminating forward/backward walkers or in causing perceptual reversals. 
Discussion
To explain the results in Experiment 1, we consider possibilities based on two prominent aspects of peripheral vision. Previous studies have demonstrated that local drifting motion is automatically integrated with global movement of a Gabor patch when viewed peripherally and that drifting gratings can also induce illusory spatial shifts (De Valois & De Valois, 1991; Mather & Pavan, 2009; Shapiro, Lu, Huang, Knight, & Ennis, 2010; Zhang, Yeh, & De Valois, 1993). To generate the illusory reversal of forward/backward walking in the periphery found in Experiment 1, the first possibility proposes that the peripheral action is analyzed on the basis of integrated motion signals that serve as input to a process that recognizes actions from spatiotemporal patterns of local motion (Casile & Giese, 2005; Giese & Poggio, 2003). As a key example we consider the horizontal movements of a foot point, since the feet have been shown to carry significant diagnostic information for walker discrimination tasks (Casile & Giese, 2005; Chang & Troje, 2010; Mather et al., 1992; Thurman et al., 2010; Thurman & Grossman, 2008; Troje & Westhoff, 2006). Figure 2a illustrates the horizontal motion components (Vx) of a foot point as it changes over time for a forward and backward walker, and Figure 2b shows the estimated integrated motion signal at each time point of a forward walker with varying amounts of antagonistic local motion. The integrated motion signal was taken as the weighted sum of the global patch and local grating motion components, with local motion weighted by 0.66 times global motion. This value is within the range of previous weight estimates (Tse & Hsieh, 2006) and was chosen to achieve perceptual motion cancellation with a local motion scalar of 1.5 to be most consistent with our group data from Experiment 1. On the basis of this analysis, integrated motion signals would predict both the drop in performance as local motion scalar increased and the illusory reversals observed with very fast antagonistic local motion (scalars greater than 1.5). That is, as the motion scalar increased, the integrated motion vectors became more correlated with the opposite walking direction and less correlated with the walking direction represented by explicit spatial cues. 
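To make the cancellation point explicit (our notation, treating the local drift as the full antagonistic 2-D vector for simplicity), the integrated signal for a patch with local motion scalar k is

\[ \mathbf{v}_{\mathrm{int}} = \mathbf{v}_{\mathrm{global}} + 0.66\,\mathbf{v}_{\mathrm{local}} = (1 - 0.66\,k)\,\mathbf{v}_{\mathrm{global}}, \]

which vanishes near \(k = 1/0.66 \approx 1.5\), the cancellation point in the group data, and reverses sign for larger scalars, consistent with the illusory reversals.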
Figure 2
Analysis of integrated motion and illusory spatial positions for stimuli used in Experiment 1. (a) Profile of horizontal motion vectors (Vx) for a foot point of forward (purple) and backward (blue) walkers over time. (b) Integrated motion vectors (black) representing the weighted sum of the local and global motion components for a forward walking stimulus. (c) Spatiotemporal trajectory profile of the foot point of forward and backward walkers with arrows indicating overall direction of rotation for clarity. (d) Trajectories of a forward walking stimulus based on estimated illusory spatial positions with arrows to indicate the overall direction of rotation. The independent variable representing the amount of antagonistic local motion is labeled in the top left of the panels in (b) and (d).
The second possibility suggests that the illusory positions induced by antagonistic local motion signals could serve as input to a global template-matching process that analyzes the sequence of body postures based on (illusory) spatial position cues (Lange & Lappe, 2006) and that walking direction judgments can be made on the basis of posture changes from one frame to another. To examine this account, we measured the true spatial position of each point frame by frame and also estimated an illusory offset in position depending on the amount of antagonistic drifting motion. The illusory offset was equal to 0.66 times the instantaneous 2-D velocity of the drifting grating (Rider et al., 2009). For instance, a grating drifting to the right at 4°/s would result in a rightward spatial shift of 2.64°. This is an extremely generous amount of spatial shift considering that previous studies have measured much smaller (up to only 0.25°) illusory shifts for a Vernier acuity task performed 8° in the periphery (De Valois & De Valois, 1991). Thus, our estimates of illusory spatial shift likely represent an overestimation of the actual perceptual effect. Nevertheless, our analysis suggests it is very unlikely that illusory position shifts can account for the results observed in Experiment 1. Figure 2c shows the true profiles of spatial position and motion vectors derived from the foot point of a forward and backward walker, and Figure 2d shows the estimated illusory positions and motion vectors derived from frame-to-frame changes in illusory position for a forward walker. Despite corrupting the signal of the true walker to some degree, the overall trajectory based on illusory positions appears consistent with a forward walker for all levels of antagonistic local motion. This result holds true for all body joints, not just the foot joint, and is corroborated by the observation that a forward moving point-light walker synthetically created from these illusory positions still appears to be walking forwards (i.e., at the global level, the sequence of illusory postures defined by illusory shifts does not result in a perceptual reversal of walking direction). 
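A minimal MATLAB sketch of this illusory-position analysis is given below; the variable names are assumptions, and the 0.66 weighting follows the rule stated above (an instantaneous drift velocity in deg/s treated directly as a positional offset in deg).

```matlab
% pos:      9 x 2 x nFrames true patch positions (deg)
% driftVel: 9 x 2 x nFrames instantaneous 2-D drift velocity of each grating (deg/s)
w = 0.66;                           % weight on local drift (Rider et al., 2009)
posIll = pos + w * driftVel;        % estimated illusory positions, frame by frame
velIll = diff(posIll, 1, 3) * 85;   % motion vectors implied by illusory posture changes (85 Hz)
```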
Experiment 2
A notable feature of the current stimulus paradigm is the ability to decouple spatial information from local motion and orientation cues using drifting Gabor patches. Results from the first experiment support a prominent role for integrated motion signals in perceiving actions in the periphery and suggest that action recognition may be possible from drifting gratings that reflect joint movements through local 2-D motion signals but that are spatially stationary. In Experiment 2, we configured the nine patches into the shape of a generic human body pose and kept their spatial locations constant over time, creating a stationary array of patches throughout the animation sequence (Figure 3a and Movie 2). As such, the stimulus did not explicitly yield spatial movements of joints but required the visual system to utilize local drifting motion and limb orientation cues to construct a global, dynamic representation of walkers moving forward or backward. As in the previous experiment, the drifting velocity and orientation of Gabor patches were manipulated over time using the local 2-D motion and orientation signals extracted from human walkers. Based on the findings of Experiment 1, we hypothesized that local cues would support discrimination of walking direction, particularly when presented in the periphery, given the inclination of peripheral visual processing to utilize integrated motion signals for action perception. 
Figure 3
Stimuli and results discriminating forward/backward walking direction in Experiment 2. (a) Schematic illustration of spatially stationary walkers arranged in the shape of a neutral walking pose or in the shape of an unnatural posture (rectangle). The circular outlines are meant for illustrative purposes to make apparent the stationary arrangement of spatial cues. Red arrows indicate local 2-D motion vectors of a walker and circular outlines illustrate that the spatial configuration was stationary over time. For demonstration of Gabor stimulus in the walking posture arrangement, see Movie 2. (b) and (c) Mean proportion correct discriminating forward/backward walkers as a function of viewing location for orientation present (Gabor) and absent (plaid) conditions. Data using the walking posture arrangement is shown in (b) and the rectangular arrangement in (c). Dotted lines represent chance performance and error bars represent SEM.
Methods
A group of nine naïve subjects participated in Experiment 2. We investigated perception of walking direction (forward/backward) in a range of location eccentricities across the visual field (0°–26°), as well as the influence of orientation by comparing conditions with oriented Gabors to plaids. Participants performed two blocks of 144 trials. The generic body posture was created by taking a characteristic posture from the same time point in the leftward and rightward walking sequences and averaging the postures across space. An intermediate posture in the gait cycle was chosen so that the Gabor patches would not overlap in space but also would not be at their most extended positions during the stance phase of the walking action. The generic posture contained no spatial information to bias perception of facing or walking direction. All other methods and parameters of stimulus creation and presentation were the same as in the previous experiment. 
Results
Overall, observers discriminated walking direction significantly above chance (Figure 3b) for both patch types at all locations (one-sample t tests, all ps < 0.05). A repeated-measures ANOVA with factors patch type (Gabor, plaid) and eccentricity revealed a significant interaction effect, F(5, 40) = 2.74, p = 0.032, but no main effects. For oriented Gabors, observers showed similar discrimination performance in both central vision and the periphery, F(5, 40) = 0.58, p = 0.718. However, when orientation information was removed in the plaid condition, accuracy varied significantly across the visual field, F(5, 40) = 2.78, p = 0.030, with performance lowest in central locations (0° and 2°) and higher in the periphery. This result may reflect that orientation cues consistent with the human skeleton can provide marginal support for analysis of global body postures, particularly in central vision. However, when compared directly at each eccentricity tested, paired-samples t tests revealed no statistically significant differences between the Gabor and plaid conditions (two-tailed, all ps > 0.05). Overall, these results demonstrate that local drifting motion can invoke perceptual representations of human walkers in the absence of explicit spatial cues and that orientation signals may provide additional support for action analysis in central vision. 
To assess the possibility that these results were influenced by the particular arrangement of the points into a static posture derived from the average of a leftward and rightward walker, we ran an additional experiment (n = 6) that instead configured the eight limb points into the shape of a vertically oriented rectangle (Figure 3a, bottom). This spatial arrangement was not derived from a naturalistic body pose (e.g., a walker) and only crudely matched the true organization of the eight limbs on the human body. In other words, it would be very unnatural for the joints of the human body to align perfectly into the exact shape of a rectangle. Nonetheless, with the rectangular arrangement we replicated the main results of Experiment 2 (Figure 3c). An ANOVA revealed a nonsignificant main effect of group (static walker arrangement vs. rectangular arrangement), F(1, 13) = 0.003, p = 0.95. In fact, mean performance was nearly identical across all conditions for both groups (walker arrangement: M = 0.672, SD = 0.04; rectangular arrangement: M = 0.676, SD = 0.05). This result highlights that the fixed locations of the Gabor and plaid disks in this stimulus served primarily to provide a basic spatial organization to the local drifting motion and/or orientation signals. These local signals ultimately were used by the visual system to construct dynamic representations of human action. 
Discussion
We again consider two potential hypotheses for explaining this pattern of results. Figure 4a illustrates horizontal local motion signals derived from the foot point of a forward and backward walker, and Figure 4b shows changes in integrated motion signals over time. The global motion component was always zero due to the stationary Gabor disks, so integrated motion signals simply represent the local drifting motion scaled by 0.66. Hence, the pattern of integrated motion derived from a forward walking stimulus correlates with the actual local motion signals of a forward walker, consistent with perceived walking direction. In contrast, the illusory positions induced by drifting motion in the periphery do not accurately reflect the true spatial movements of a forward walker (Figures 4c and d). Since the Gabor disks were stationary, the illusory positions are centered on a fixed position without following a smooth trajectory like that of the actual walking pattern. As in Experiment 1, it seems unlikely that the results could be explained in terms of global form analysis of the illusory posture sequence. Instead, both sets of results can be explained by a parsimonious account based on the pattern of integrated motion signals in the periphery. 
Figure 4
Analysis of integrated motion and illusory spatial positions for stimuli used in Experiment 2. (a) Horizontal motion vectors for a foot point of forward and backward walkers over time. (b) Motion vectors representing the integrated weighted sum of the local and global motion components for a forward walking stimulus. (c) Spatiotemporal trajectory of the foot point of forward and backward walkers with arrows indicating overall direction of rotation. (d) Trajectories of a forward walking stimulus based on estimated illusory spatial positions.
Experiment 3
Experiments 1 and 2 suggest that local orientation cues may only provide marginal support for discrimination of forward/backward walking and that integrated motion signals appear to play a predominant role in the periphery. This is supported by results showing minimal differences between experimental conditions with oriented Gabors and plaids. However, the influence of local orientation may have been weak due to unique task demands that require analysis of global body motion and the dynamic posture sequence for discriminating forward/backward walkers. We examined whether orientation cues would play a more prominent role in a leftward/rightward walker discrimination task that is more dependent on static form cues and does not necessarily require analysis of the dynamic sequence. That is, leftward/rightward heading can be accurately discriminated from form cues evident in a single static posture (Lange & Lappe, 2006; Thirkettle, Benton, & Scott-Samuel, 2009; Thurman & Grossman, 2008). Hybrid Gabor walkers were constructed in such a way that the local drifting motion and orientation cues represented human walkers heading (leftward/rightward) in the direction opposite to the information provided by spatial position cues. The spatial configuration of Gabor patches was arranged frame-by-frame to reflect either a leftward or rightward walker, while the local features of the Gabor patches (orientation and drifting velocity) were set according to those extracted from a walker with opposite leftward/rightward walking direction (Figure 5a and Movie 3). The mappings between the limbs of leftward and rightward walkers were also reversed such that, for instance, the front leg of the leftward walker corresponded to the back leg of the rightward walker and vice versa. The reversed mapping was done because when leftward and rightward walkers are spatially overlaid, the location of the front leg of a leftward walker roughly matches the back leg of the rightward walker across the gait cycle. 
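The construction of the hybrid stimulus can be summarized by the following MATLAB sketch; the variable names (and the index vector swapIdx pairing front and back limbs) are illustrative assumptions, not the original stimulus code.

```matlab
% posL:         positions of the leftward walker (9 x 2 x nFrames, deg)
% oriR, driftR: orientations (9 x nFrames) and 2-D drift velocities
%               (9 x 2 x nFrames) extracted from the rightward walker
% swapIdx:      index vector pairing each joint with its front/back
%               counterpart on the opposite-direction walker
posHybrid   = posL;                   % patch centres follow the leftward walker
oriHybrid   = oriR(swapIdx, :);       % grating orientations taken from the rightward walker
driftHybrid = driftR(swapIdx, :, :);  % drift velocities taken from the rightward walker
```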
Figure 5
Stimuli and results discriminating leftward/rightward walking direction in Experiment 3. (a) Schematic illustration of hybrid walker stimuli with orientation only, local motion only, and with local motion and orientation. The circular outlines are meant for illustrative purposes to make apparent the sequence of postures defined by spatial cues. Contrast was also enhanced for illustration. Red arrows indicate the local 2-D motion vectors representing biological movements. For demonstration of Gabor stimulus with orientation and motion cues, see Movie 3. (b) Mean proportion of responses consistent with the spatially-defined leftward/rightward walking direction as a function of viewing location, with psychometric fits. (c) Data from the control condition with size-scaled stimuli (1.5x and 2x) presented with data from the main experiment (solid black line) for comparison. (d) Proportion of rightward walking direction responses as a function of eccentricity as an indicator of facing bias. Dotted line represents null hypothesis of no facing bias (equal number of leftward/rightward responses). All error bars represent SEM.
To investigate the specific influence of orientation, we again compared performance with walking stimuli composed of either oriented Gabor or plaid patches. In a follow-up control condition (n = 10), we examined the influence of local orientation in isolation with stationary gratings that induced no local drifting motion. Observers reported the perceived leftward/rightward walking direction of stimuli presented in a range of location eccentricities across the visual field (0°–26°). Experimental conditions were randomized and counterbalanced within two blocks of trials (144 total trials). 
Results
Figure 5b shows the proportion of trials in which the observer's response matched the leftward/rightward walking direction represented by spatial cues as a function of location eccentricity. Values below the 50% chance level (dotted line) indicate that observers were more likely to report the leftward/rightward walking direction signaled by local cues. A repeated-measures ANOVA revealed significant main effects of patch type, Gabor versus plaid, F(1, 9) = 63.83, p < 0.001, and eccentricity, F(5, 45) = 71.89, p < 0.001, as well as an interaction, F(5, 45) = 43.02, p < 0.001. When the stimulus was presented near the fovea (less than 2°), all observers consistently reported the leftward/rightward walking direction defined by spatial positional shifts of Gabor patches over time. However, in the oriented Gabor condition (solid line in Figure 5b), the proportion reporting the spatially-defined leftward/rightward walking direction decreased significantly below chance levels in the far periphery (26°, one-sample t test, p < 0.001). This result indicates that, as the stimulus was moved further from fixation, observers increasingly perceived an illusory leftward/rightward walking direction consistent with the local motion and orientation cues and opposite to the direction perceived when viewed centrally. 
Five additional subjects performed the task with hybrid walker stimuli containing local motion and orientation cues to induce perceptual reversals, but scaled to 1.5 and 2 times the original size used in the main experiment (height: 13.8° and 18.4°, respectively), to assess the possibility that size-scaling would compensate for performance impairments resulting from decreased spatial resolution and/or increased spatial integration in the peripheral visual field (Gurnsey, Roddy, Ouhnana, & Troje, 2008; Gurnsey & Troje, 2010b). The size of the Gabor envelope and the grating wavelength were also scaled proportionally (by 1.5 and 2 times) so that the relative sizes and spatial frequencies of the enlarged stimuli were the same as those in the main experiment. Results (Figure 5c) with the enlarged stimuli showed a similar decrease in the proportion of responses consistent with spatial position cues, confirming the reliance on the combination of local motion and orientation cues in the peripheral visual field, F(5, 85) = 59.3, p < 0.001. However, an ANOVA with factors eccentricity (within subjects) and stimulus size (between subjects) revealed that size-scaling did result in a significant reduction of the reversal effect, F(10, 85) = 4.73, p < 0.001, suggesting that limits to spatial processing in the peripheral field did play a role in modulating the relative influence of local cues and spatial cues on action perception. However, doubling the size to over 18° clearly did not equate performance between central and peripheral vision. 
Previous reports have shown a significant bias in the perceived leftward/rightward walking direction of point-light stimuli presented in the periphery, whereby rightward facing walkers are perceived more accurately in the right visual field and leftward walkers in the left visual field (De Lussanet et al., 2008; Michels, Kleiser, de Lussanet, Seitz, & Lappe, 2009). We analyzed our data to examine the possible influence of this facing direction bias in the current experiment. We measured the proportion of rightward direction responses across all trials in the key experimental condition that caused perceptual reversals (e.g., with local motion and orientation cues). An ANOVA revealed a marginal but statistically nonsignificant effect of eccentricity on the proportion of rightward direction responses, F(5, 45) = 2.43, p = 0.096, Greenhouse-Geisser corrected. Although the overall pattern of results is consistent with previous reports (Figure 5d), with a small bias to report rightward walkers presented in the right visual field and a general increase in this bias as a function of eccentricity (De Lussanet et al., 2008), one-sample t tests at each test location revealed no significant differences (all ps > 0.05, two-tailed) from the null hypothesis (i.e., an equal number of leftward/rightward responses). Further, we were careful to counterbalance all conditions in the current experimental design to produce an equal number of rightward and leftward walkers defined by spatial/local cues. Thus, the facing direction bias should have influenced all conditions equally on average across the experiment and could not account for the significant number of perceptual reversals resulting from the incongruence of local orientation cues with spatial cues. 
Discussion
What factors account for the reversal effect found in Experiment 3, which appears fundamentally different from the effects observed in Experiments 1 and 2 that relied upon integrated motion signals? In fact, analysis of the integrated motion signals elicited by the foot point in a hybrid leftward/rightward stimulus shows a correlation with the local motion signals of both leftward and rightward walkers (Figure 6a). This is due to the fact that the front leg of the leftward walker moves in the same horizontal direction, with the same phase, as the back leg of the rightward walker (e.g., see start points in Figure 6c). Thus analysis solely based on integrated motion would not predict reliable perceptual reversals of leftward/rightward walking direction. Likewise, the proposition of global form analysis on illusory spatial positions would also not predict perceptual reversals as shown in Figures 6c and d. Instead the reversal appears driven primarily by incongruent local orientation cues. However, this cannot be the entire story because a follow-up control experiment with stationary gratings (orientation only condition, Figure 5b) revealed no reversal in the perception of leftward/rightward walking direction when local drifting motion was absent. This result reveals that, unlike Experiments 1 and 2, an interaction between orientation and local motion cues was necessary to induce a perceptual reversal of leftward/rightward walking direction in the periphery. 
Figure 6
Analysis of integrated motion and illusory spatial positions for stimuli used in Experiment 3. (a) Horizontal motion vectors for a foot point of leftward and rightward walkers over time. (b) Motion vectors representing the integrated weighted sum of the local and global motion components for a leftward walking stimulus. (c) Spatiotemporal trajectory of the foot point of leftward and rightward walkers with arrows indicating overall direction of rotation. (d) Trajectories of a leftward walking stimulus based on estimated illusory spatial positions. The “start point” illustrates that local motion and orientation were extracted from the opposite limb of the walker with opposite leftward/rightward walking direction (See Experiment 3, Methods).
We propose that integrated motion signals, and perhaps illusory spatial shifts, played a slightly different role in causing perceptual reversals in Experiment 3. While the integrated motion signals were indeed correlated with both leftward and rightward walkers, these signals did not readily discriminate the exact underlying motion of either walking direction. In other words, the local grating motion introduced some noise and uncertainty to the perceptual estimates of body motion and estimates of spatial position for each Gabor disk. We suggest that, because motion integration and spatial uncertainty each increase as a function of eccentricity, incongruent local orientation cues came to dominate in the periphery, generating action reversals, owing to a large decrease in the reliability of spatial cues relative to orientation cues. This increase in uncertainty in the periphery may also help explain why perceptual reversals only occurred about 70% of the time on average in Experiment 1, even with strong antagonistic local grating motion, and why performance was limited to only about 60%–75% correct in Experiment 2. Not only is the peripheral visual field susceptible to motion integration and illusory position shifts, but these effects necessarily produce an increase in perceptual uncertainty about the precise spatial locations of the disks themselves, which may serve to impose a limit on overall discrimination performance. 
General discussion
In a series of experiments we introduced a novel stimulus designed to independently manipulate various features of biological motion (orientation, spatial configuration, and local motion) in order to explore different computational mechanisms of action perception. When action characteristics (e.g., leftward/rightward and forward/backward walking direction) defined by local motion and orientation cues were put into conflict with global spatial cues, we discovered significant perceptual differences between central and peripheral vision. In central vision, the computational analysis of global cues based on the spatial arrangement of joints and their movement trajectories dominated, causing observers to report the spatially-defined facing and walking directions and ignore competing local signals in all conditions tested. In contrast, the periphery was influenced substantially by local drifting motion and orientation cues that signaled conflicting information about action characteristics. Experiment 1 showed that antagonistic local drifting motion could cancel or even reverse the perceived forward/backward walking direction when stimuli were presented in the periphery and drifting velocity was scaled sufficiently. The second experiment further highlighted the potency of orientation cues and local drifting motion for action processing, with above-chance performance discriminating forward/backward walking stimuli that lacked explicit spatial movements of the joints. In Experiment 3, we found that the combination of local motion and orientation cues together caused a significant reversal in the perceived leftward/rightward walking direction of hybrid walker stimuli, but that orientation and local motion in isolation were insufficient to cause perceptual reversals. Although previous studies have reported sensitivity differences between central and peripheral vision in detecting and discriminating biological motion (Gurnsey et al., 2008; Gurnsey & Troje, 2010b; Ikeda, Blake, & Watanabe, 2005; Thompson, Hansen, Hess, & Troje, 2007), our study is the first to show that central and peripheral viewing of the same action stimulus can produce very different percepts of action characteristics. 
In considering potential explanations for these peripheral action phantom illusions and the implications for computational theories of action perception, we first examine related work demonstrating significant perceptual differences between central and peripheral vision. Previous research using drifting Gabor patches for basic positional judgment tasks has shown that local drifting motion can induce an illusory shift in the perceived position of a Gabor patch (Chung, Patel, Bedell, & Yilmaz, 2007; De Valois & De Valois, 1991; Mather & Pavan, 2009; Zhang et al., 1993). This effect is strongest in the periphery for Gabor patches with a soft (Gaussian) boundary and is minimal for Gabor patches presented centrally or with a hard boundary (Zhang et al., 1993). In an additional control experiment, we likewise found that using a hard boundary did not result in perceptual reversals in the periphery for hybrid walker stimuli like those presented in Experiment 3 (data not shown). It has also been suggested that the periphery compulsively integrates local grating motion with the motion signal derived from the global movement of the patch envelope, resulting in a unified motion percept representing a weighted vector sum of the local and global motion components (Shapiro, Lu, Huang, Knight, & Ennis, 2010; Tse & Hsieh, 2006). In fact, researchers have discovered a variety of visual illusions (Anstis, 2012; Hedges et al., 2011; Levi & Klein, 1996; Shapiro, Knight, & Lu, 2011; Shapiro et al., 2010; Tse & Hsieh, 2006) that rely upon these fundamental differences between central and peripheral vision (e.g., in terms of motion integration and spatial uncertainty). 
Shapiro et al. (2011) recently characterized the tendency of peripheral vision to blend visual features, and its failure to dissociate features (such as first-order drifting motion and second-order motion of the Gabor envelope), as feature blur. The idea of peripheral feature blur is compatible with our results and may help explain why local grating motion and orientation cues contributed significantly to perception under peripheral viewing conditions. The drifting motion inside the Gabor patches introduced (a) motion signals that were automatically incorporated into the global motion percept and (b) increased uncertainty about the spatial locations of the disks themselves. Meanwhile, Gabor disks viewed centrally were not subjected to these effects. The perceptual reversals discovered in the current experiments were likely induced by a complex interaction between increased position uncertainty in the periphery and distorted body movements resulting from integrated motion signals. 
We suggest that the combination of these factors in the periphery helped tip the balance in favor of local cues in the face of competing (and uncertain) spatial cues to signal walking and heading direction. For example, in Experiment 3, perceptual reversals of leftward/rightward direction were induced only by the combination of local drifting motion and orientation cues incongruent with explicit spatial cues. If action perception relied upon global analysis of postures derived purely from spatial cues (Lange, Georg, & Lappe, 2006; Lange & Lappe, 2006), then even the assumption that local drifting motion could induce illusory spatial positions consistent with the reversed facing direction could not account for these results, because no reversals were observed in the local motion only condition (with plaids). We believe, however, that a modified version of the global template-matching model could potentially account for the observed results in Experiment 3 if orientation cues representing the angle of each limb segment were also incorporated with spatial cues in the model's representation of human body postures, and if the model combined spatial uncertainty and orientation uncertainty in a probabilistic (or Bayesian) manner. Indeed, our data suggest that limb orientation can be a very potent cue for activating neural representations of body posture (Lu, 2010) and that drifting motion in this case may have introduced sufficient uncertainty into the spatial position estimates of the individual markers to bias the form analysis toward limb orientation. 
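One simple way to sketch such a model (our notation, with Gaussian and independence assumptions; not a model fitted in this study) is to score each candidate posture template T by a log-likelihood that combines position and orientation evidence with separate uncertainties:

\[ \log P(\{\mathbf{p}_i, \theta_i\} \mid T) \;\propto\; -\sum_i \frac{\lVert \mathbf{p}_i - \mathbf{p}_i^{T} \rVert^{2}}{2\sigma_p^{2}} \;-\; \sum_i \frac{(\theta_i - \theta_i^{T})^{2}}{2\sigma_\theta^{2}}. \]

If the spatial uncertainty \(\sigma_p\) grows with eccentricity (and with local drifting motion) faster than the orientation uncertainty \(\sigma_\theta\), the best-matching template is increasingly determined by limb orientation in the periphery, as observed in Experiment 3.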
Interestingly, however, we found different effects in Experiment 1 in which limb orientation was consistent with the body postures implied by explicit spatial cues. In this case, local drifting motion in the opposite direction to body movements, if scaled appropriately, caused the action to perceptually stand still or even reverse direction. This indicates that the computation of global motion for each disk, representing the weighted sum of drifting motion and envelope movement, preceded spatial analysis of body postures and even overcame the information about body posture signaled by orientation cues. The fact that there were no significant differences between conditions with and without orientation cues (e.g., Gabor vs. plaid) in the periphery further suggests a prominent role of integrated motion signals at the expense of orientation cues and explicit spatial cues. We believe the difference in the primacy of orientation cues in Experiment 3, and integrated motion signals in Experiment 1, reflects differences in the visual strategies and computations involved in leftward/rightward versus forward/backward discrimination tasks, respectively (Beintema et al., 2006; Lange & Lappe, 2006; Wittinghofer et al., 2012). 
Experiment 2 provided a further demonstration of the potency, and limitations, of orientation cues and local drifting motion for forward/backward walking discriminations in the absence of explicit spatial cues. In the context of a globally stationary array of points crudely organized in the shape of a human body pose, the integrated motion signals elicited by drifting gratings in the periphery were sufficient to activate neural representations of walking direction. The complex spatial pattern of local motion signals could have activated dynamic motion-based templates (Casile & Giese, 2005; Giese & Poggio, 2003), or so-called sprites that have been proposed as the basis for recognition of dynamic events and biological motion (Cavanagh, Labianca, & Thornton, 2001). Orientation cues appeared to support action analysis primarily in central vision but provided only marginal support for performance in the periphery. This conclusion is based on the observation of no significant differences between the oriented Gabor and plaid conditions beyond central vision (0°–2°). Overall performance in Experiment 2, however, was limited to only about 60%–75% correct performance on average, highlighting the important role that global spatial cues generally play in biological motion perception. Together, these experiments demonstrate the flexibility and adaptability of the action processing system in using various visual cues to meet particular task demands and in accounting for changes in the relative reliabilities of these cues under changing experimental conditions (e.g., viewing location). 
This apparent dichotomy between central and peripheral vision also suggests that the action processing system employs different computational schemes, including body posture analysis of global spatial cues (Lange & Lappe, 2006) and analysis of more local spatiotemporal cues (Giese & Poggio, 2003), and weights the outputs of these systems differently depending on experimental conditions. In line with sensory cue integration research in other fields, these weights likely reflect the relative uncertainty, or reliability, of the visual cues (Alais & Burr, 2004; Ernst & Banks, 2002; Knill & Pouget, 2004). As discussed previously, the increased weight given to local grating motion and orientation in the periphery may be due to the low precision of spatial processing, compulsory motion integration, and feature blur. The data showing that size scaling resulted in weaker illusory reversals of leftward/rightward walking direction in Experiment 3 provide some support for this hypothesis, as increasing the size of biological motion stimuli in the periphery has been shown to compensate for limits in spatial processing and to equate performance with central vision for other discrimination tasks (Gurnsey et al., 2008; Gurnsey & Troje, 2010a). Although size scaling reduced the peripheral bias for local cues, it did not equate performance between central and peripheral vision, suggesting differential contributions from distinct mechanisms in central and peripheral vision. This interpretation also sheds light on results showing poor peripheral discrimination of biological motion embedded in scrambled dot masks (Ikeda et al., 2005; Thompson et al., 2007), because segmenting a point-light stimulus from a noisy background typically requires analysis of the global body shape from spatial cues (Bertenthal & Pinto, 1994; Gurnsey et al., 2008; Thompson et al., 2007). Our data suggest impairment of form processing on the basis of spatial cues in the periphery, especially in the face of competing dynamic and/or orientation cues.
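By way of illustration of this weighting principle (the present experiments did not estimate such weights), the standard maximum-likelihood formulation from the cue-combination literature combines two cue estimates as a reliability-weighted average,

\[
\hat{s} \;=\; \frac{r_{1}\hat{s}_{1} + r_{2}\hat{s}_{2}}{r_{1}+r_{2}}, \qquad r_{i} \;=\; \frac{1}{\sigma_{i}^{2}},
\]

so that a cue's weight falls as its variance grows (Ernst & Banks, 2002; Knill & Pouget, 2004). On this account, the loss of spatial precision, compulsory motion integration, and feature blur in the periphery would all act to inflate the effective variance of explicit spatial cues, reducing their weight relative to local drifting motion and orientation.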
We conjecture that the posterior superior temporal sulcus (pSTS) plays a fundamental role in this process: the pSTS receives cortical inputs from both the dorsal and ventral visual streams, allowing it to act as an intermediary that integrates the outputs of both systems during action perception (Giese & Poggio, 2003; Grossman et al., 2000; Kourtzi, Krekelberg, & van Wezel, 2008; Oram & Perrett, 1996). Recent fMRI studies designed specifically to disentangle these computational systems also seem to support this notion (Jastorff & Orban, 2009; Thompson & Baccus, 2011). Of particular significance in light of the current findings, Thompson and Baccus (2011) found that activation in response to the human body shape in the fusiform body area (FBA), a region in the ventral pathway, was dominant in central vision and weak in the periphery. The observation that spatial analysis of the human body form in the ventral stream (e.g., FBA) may be biased toward the central visual field is consistent with the present behavioral results, and it may explain why global analysis of spatial cues played such a dominant role in central vision in Experiments 1 and 3 and why performance was limited overall in Experiment 2 when global spatial cues were reduced. A recent fMRI study has also demonstrated significant clustering of brain activity in the fusiform gyrus for leftward versus rightward facing point-light walkers presented 20° in the periphery (Michels et al., 2009), suggesting that form analysis may indeed contribute to action perception across the entire visual field. We hypothesize that while responses from both cortical pathways may contribute to biological motion perception across the entire visual field and under various experimental conditions, the outputs of these distinct processes are integrated at a later stage with weights reflecting the relative uncertainty about the underlying computations and visual cues. Interestingly, Thompson and Baccus (2011) also reported that the pSTS was influenced strongly by body form in central vision but by both form and motion in the periphery, which is consistent with our behavioral data and further suggests the pSTS as a prime locus for integrating lower-level visual analyses from both processing streams for action perception.
Given a flexible action processing system capable of representing actions based on various competing visual cues, it is not surprising that particular stimulus conditions would favor one system or the other. For instance, global form analysis should prevail when local cues are disrupted or rendered less informative. Masking with motion-matched noise dots, for example, is known to be very effective at obscuring biological motion because it adds noise predominantly to local motion processes. This deficit in local processing may nevertheless be overcome by knowledge of body configuration and by form analysis (Bertenthal & Pinto, 1994; Lu, 2010; Pinto & Shiffrar, 1999), although specific midlevel motion cues may also be helpful in the segmentation process (Thurman & Grossman, 2008). In addition, the sequential-position stimulus disrupts local biological motion signals by reassigning dot positions on each limb frame by frame, yet observers can spontaneously perceive and discriminate such walkers (Beintema et al., 2006; Beintema & Lappe, 2002). These findings provide evidence that actions can be analyzed in terms of global form with minimal contribution from local motion.
Conversely, processing based on local features should be possible when global signals are perturbed. For example, when the global configuration of point lights is disrupted by spatial scrambling (Troje & Westhoff, 2006) or by adding random spatial noise (Casile & Giese, 2005), observers can still discriminate walking direction using specific dynamic features of the feet (Chang & Troje, 2010). Troje and Westhoff (2006) have theorized that mammalian brains contain a life-detection mechanism that is highly tuned to low-level motion features of locomoting terrestrial animals. This hypothesis seems to be supported by evidence that newly hatched chicks (Vallortigara & Regolin, 2006) and newborn human babies (Simion, Regolin, & Bulf, 2008) show a preference for biological motion, presumably before the development of adequate global templates. Studies of visual search and perceived temporal duration provide converging evidence for this hypothesis (van Boxtel & Lu, 2011, 2012; Wang & Jiang, 2012; Wang, Zhang, He, & Jiang, 2010). Global signals are also reduced in displays that selectively omit point lights or randomly obscure body points in space and time. Results from such experiments suggest that observers use perceptual strategies that favor local motion, particularly in the extremities of the lower body for walking direction tasks (Mather et al., 1992; Thurman et al., 2010), although the same extremity points have also proven important for form-based models that ignore local motion cues and exclusively analyze body postures (Lange et al., 2006; Lange & Lappe, 2006).
To summarize, the current experiments reveal complex interactions between visual cues and computational strategies for action processing by carefully manipulating local and global cues relating to body form and motion, as well as stimulus location within the visual field. These results suggest that current theoretical models should be refined to include computational mechanisms capable of representing actions through spatial analysis of global body structure together with analysis of local stimulus features (orientation, local motion), perhaps through weighted integration, although the nature of this integration mechanism in the human brain remains unclear and will require further study. We conclude that biological motion perception is the result of an elaborate, competitive dance between visual cues relating to form and motion processing, which likely reflects a more general property of how the visual system represents complex objects in motion.
Supplementary Materials
Acknowledgments
This research was supported by NSF grant BCS-0843880 to H. L. The human motion data used in this project were obtained from http://mocap.cs.cmu.edu. The database was created with funding from NSF EIA-0196217. These illusions were presented at the annual Vision Sciences Society Best Illusion of the Year contest in Naples, FL, in May 2012. We thank Jeroen van Boxtel and Alan Lee for helpful comments on earlier versions of the manuscript.
Commercial relationships: none. 
Corresponding author: Steven M. Thurman. 
Email: sthurman@ucla.edu 
Address: Department of Psychology, University of California, Los Angeles, Los Angeles, CA, USA. 
References
Adelson, E. H., & Movshon, J. A. (1982). Phenomenal coherence of moving visual patterns. Nature, 300(5892), 523–525.
Alais, D., & Burr, D. (2004). The ventriloquist effect results from near-optimal bimodal integration. Current Biology, 14(3), 257–262.
Amano, K., Edwards, M., Badcock, D. R., & Nishida, S. (2009). Adaptive pooling of visual motion signals by the human visual system revealed with a novel multi-element stimulus. Journal of Vision, 9(3):4, 1–25, http://www.journalofvision.org/content/9/3/4, doi:10.1167/9.3.4.
Anstis, S. (2012). The furrow illusion: Peripheral motion becomes aligned with stationary contours. Journal of Vision, 12(12):12, 1–11, http://www.journalofvision.org/content/12/12/12, doi:10.1167/12.12.12.
Beintema, J. A., Georg, K., & Lappe, M. (2006). Perception of biological motion from limited-lifetime stimuli. Perception & Psychophysics, 68(4), 613–624.
Beintema, J. A., & Lappe, M. (2002). Perception of biological motion without local image motion. Proceedings of the National Academy of Sciences of the United States of America, 99(8), 5661–5663.
Bertenthal, B. I., & Pinto, J. (1994). Global processing of biological motions. Psychological Science, 5(4), 221–224.
Blake, R., & Shiffrar, M. (2007). Perception of human motion. Annual Review of Psychology, 58, 47–73.
Brainard, D. H. (1997). The psychophysics toolbox. Spatial Vision, 10(4), 433–436.
Casile, A., & Giese, M. A. (2005). Critical features for the recognition of biological motion. Journal of Vision, 5(4):6, 348–360, http://www.journalofvision.org/content/5/4/6, doi:10.1167/5.4.6.
Cavanagh, P., Labianca, A. T., & Thornton, I. M. (2001). Attention-based visual routines: Sprites. Cognition, 80(1–2), 47–60.
Chang, D. H. F., & Troje, N. F. (2009). Acceleration carries the local inversion effect in biological motion perception. Journal of Vision, 9(1):19, 1–17, http://www.journalofvision.org/content/9/1/19, doi:10.1167/9.1.19.
Chang, D. H. F., & Troje, N. F. (2010). The local inversion effect in biological motion perception is acceleration-based. Journal of Vision, 8(6):911, http://www.journalofvision.org/content/8/6/911, doi:10.1167/8.6.911.
Chung, S. T. L., Patel, S. S., Bedell, H. E., & Yilmaz, O. (2007). Spatial and temporal properties of the illusory motion-induced position shift for drifting stimuli. Vision Research, 47, 231–243.
De Lussanet, M. H. E., Fadiga, L., Michels, L., Seitz, R. J., Kleiser, R., & Lappe, M. (2008). Interaction of visual hemifield and body view in biological motion perception. The European Journal of Neuroscience, 27(2), 514–522.
De Valois, R. L., & De Valois, K. K. (1991). Vernier acuity with stationary moving Gabors. Vision Research, 31(9), 1619–1626.
Ernst, M. O., & Banks, M. S. (2002). Humans integrate visual and haptic information in a statistically optimal fashion. Nature, 415(6870), 429–433.
Giese, M. A., & Poggio, T. (2003). Neural mechanisms for the recognition of biological movements. Nature Reviews Neuroscience, 4(3), 179–192.
Grossman, E., Donnelly, M., Price, R., Pickens, D., Morgan, V., & Neighbor, G. (2000). Brain areas involved in perception of biological motion. Journal of Cognitive Neuroscience, 12(5), 711–720.
Gurnsey, R., Roddy, G., Ouhnana, M., & Troje, N. F. (2008). Stimulus magnification equates identification and discrimination of biological motion across the visual field. Vision Research, 48(28), 2827–2834.
Gurnsey, R., & Troje, N. F. (2010a). Limits of peripheral direction discrimination of point-light walkers. Journal of Vision, 10(2):15, 1–17, http://www.journalofvision.org/content/10/2/15, doi:10.1167/10.2.15.
Gurnsey, R., & Troje, N. F. (2010b). Peripheral sensitivity to biological motion conveyed by first and second-order signals. Vision Research, 50(2), 127–135.
Hedges, J. H., Gartshteyn, Y., Kohn, A., Rust, N. C., Shadlen, M. N., & Newsome, W. T. (2011). Dissociation of neuronal and psychophysical responses to local and global motion. Current Biology, 21(23), 2023–2028.
Hiris, E., Humphrey, D., & Stout, A. (2005). Temporal properties in masking biological motion. Perception & Psychophysics, 67(3), 435–443.
Hunt, A. R., & Halper, F. (2008). Disorganizing biological motion. Journal of Vision, 8(9):12, 1–5, http://www.journalofvision.org/content/8/9/12, doi:10.1167/8.9.12.
Ikeda, H., Blake, R., & Watanabe, K. (2005). Eccentric perception of biological motion is unscalably poor. Vision Research, 45(15), 1935–1943.
Jastorff, J., & Orban, G. A. (2009). Human functional magnetic resonance imaging reveals separation and integration of shape and motion cues in biological motion processing. The Journal of Neuroscience, 29(22), 7315–7329.
Johansson, G. (1973). Visual perception of biological motion and a model for its analysis. Perception & Psychophysics, 14(2), 201–211.
Knill, D. C., & Pouget, A. (2004). The Bayesian brain: The role of uncertainty in neural coding and computation. Trends in Neurosciences, 27(12), 712–719.
Kourtzi, Z., Krekelberg, B., & van Wezel, R. J. A. (2008). Linking form and motion in the primate brain. Trends in Cognitive Sciences, 12(6), 230–236.
Lange, J., Georg, K., & Lappe, M. (2006). Visual perception of biological motion by form: A template-matching analysis. Journal of Vision, 6(8):6, 836–849, http://www.journalofvision.org/content/6/8/6, doi:10.1167/6.8.6.
Lange, J., & Lappe, M. (2006). A model of biological motion perception from configural form cues. The Journal of Neuroscience, 26(11), 2894–2906.
Lee, A. L. F., & Lu, H. (2010). A comparison of global motion perception using a multiple-aperture stimulus. Journal of Vision, 10(4):9, 1–16, http://www.journalofvision.org/content/10/4/9, doi:10.1167/10.4.9.
Levi, D. M., & Klein, S. A. (1996). Limitations on position coding imposed by undersampling and univariance. Vision Research, 36(14), 2111–2120.
Lin, Z., & He, S. (2012). Emergent filling in induced by motion integration reveals a high-level mechanism in filling in. Psychological Science, 23(12), 1534–1541.
Lu, H. (2010). Structural processing in biological motion perception. Journal of Vision, 10(12):13, 1–13, http://www.journalofvision.org/content/10/12/13, doi:10.1167/10.12.13.
Lu, H., & Liu, Z. (2006). Computing dynamic classification images from correlation maps. Journal of Vision, 6(4):12, 475–483, http://www.journalofvision.org/content/6/4/12, doi:10.1167/6.4.12.
Marr, D., & Ullman, S. (1981). Directional selectivity and its use in early visual processing. Proceedings of the Royal Society of London Biological Sciences, 211(1183), 151–180.
Mather, G., & Pavan, A. (2009). Motion-induced position shifts occur after motion integration. Vision Research, 49(23), 2741–2746.
Mather, G., Radford, K., & West, S. (1992). Low-level visual processing of biological motion. Proceedings of the Royal Society of London Biological Sciences, 249(1325), 149–155.
Michels, L., Kleiser, R., de Lussanet, M. H. E., Seitz, R. J., & Lappe, M. (2009). Brain activity for peripheral biological motion in the posterior superior temporal gyrus and the fusiform gyrus: Dependence on visual hemifield and view orientation. NeuroImage, 45(1), 151–159.
Oram, M. W., & Perrett, D. I. (1996). Integration of form and motion in the anterior superior temporal polysensory area (STPa) of the macaque monkey. Journal of Neurophysiology, 76(1), 109–129.
Pavlova, M., & Sokolov, A. (2000). Orientation specificity in biological motion perception. Perception & Psychophysics, 62(5), 889–899.
Peelen, M. V., Wiggett, A. J., & Downing, P. E. (2009). Patterns of fMRI activity dissociate overlapping functional brain areas that respond to biological motion. Neuron, 49(6), 815–822.
Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10(4), 437–442.
Peuskens, H., Vanrie, J., Verfaillie, K., & Orban, G. A. (2005). Specificity of regions processing biological motion. European Journal of Neuroscience, 21(10), 2864–2875.
Pinto, J., & Shiffrar, M. (1999). Subconfigurations of the human form in the perception of biological motion displays. Acta Psychologica, 102(2–3), 293–318.
Rider, A., McOwan, P., & Johnston, A. (2009). Motion-induced position shifts in global dynamic Gabor arrays. Journal of Vision, 9(13):8, 1–8, http://www.journalofvision.org/content/9/13/8, doi:10.1167/9.13.8.
Saygin, A. P., Wilson, S. M., Hagler, D. J., Bates, E., & Sereno, M. I. (2004). Point-light biological motion perception activates human premotor cortex. Journal of Neuroscience, 24(27), 6181–6188.
Shapiro, A. G., Knight, E. J., & Lu, Z.-L. (2011). A first- and second-order motion energy analysis of peripheral motion illusions leads to further evidence of "feature blur" in peripheral vision. PLoS ONE, 6(4), 10.
Shapiro, A., Lu, Z.-L., Huang, C.-B., Knight, E., & Ennis, R. (2010). Transitions between central and peripheral vision create spatial/temporal distortions: A hypothesis concerning the perceived break of the curveball. PLoS ONE, 5(10), 7.
Shiffrar, M., Lichtey, L., & Heptulla Chatterjee, S. (1997). The perception of biological motion across apertures. Perception & Psychophysics, 59(1), 51–59.
Simion, F., Regolin, L., & Bulf, H. (2008). A predisposition for biological motion in the newborn baby. Proceedings of the National Academy of Sciences of the United States of America, 105(2), 809–813.
Singer, J. M., & Sheinberg, D. L. (2010). Temporal cortex neurons encode articulated actions as slow sequences of integrated poses. Journal of Neuroscience, 30(8), 3133–3145.
Tadin, D., Lappin, J. S., Blake, R., & Grossman, E. D. (2002). What constitutes an efficient reference frame for vision? Nature Neuroscience, 5(10), 1010–1015.
Thirkettle, M., Benton, C. P., & Scott-Samuel, N. E. (2009). Contributions of form, motion and task to biological motion perception. Journal of Vision, 9(3):28, 1–11, http://www.journalofvision.org/content/9/3/28, doi:10.1167/9.3.28.
Thompson, B., Hansen, B. C., Hess, R. F., & Troje, N. F. (2007). Peripheral vision: Good for biological motion, bad for signal noise segregation? Journal of Vision, 7(10):12, 1–7, http://www.journalofvision.org/content/7/10/12, doi:10.1167/7.10.12.
Thompson, J. C., & Baccus, W. (2011). Form and motion make independent contributions to the response to biological motion in occipitotemporal cortex. NeuroImage, 59(1), 625–634.
Thurman, S. M., Giese, M. A., & Grossman, E. D. (2010). Perceptual and computational analysis of critical features for biological motion. Journal of Vision, 10(12):15, 1–14, http://www.journalofvision.org/content/10/12/15, doi:10.1167/10.12.15.
Thurman, S. M., & Grossman, E. D. (2008). Temporal "bubbles" reveal key features for point-light biological motion perception. Journal of Vision, 8(3):28, 1–11, http://www.journalofvision.org/content/8/3/28, doi:10.1167/8.3.28.
Troje, N. F., & Westhoff, C. (2006). The inversion effect in biological motion perception: Evidence for a "life detector"? Current Biology, 16(8), 821–824.
Tse, P. U., & Hsieh, P.-J. (2006). The infinite regress illusion reveals faulty integration of local and global motion signals. Vision Research, 46(22), 3881–3885.
Vaina, L. M., Solomon, J., Chowdhury, S., Sinha, P., & Belliveau, J. W. (2001). Functional neuroanatomy of biological motion perception in humans. Proceedings of the National Academy of Sciences of the United States of America, 98(20), 11656–11661.
Vallortigara, G., & Regolin, L. (2006). Gravity bias in the interpretation of biological motion by inexperienced chicks. Current Biology, 16(8), R279–R280.
van Boxtel, J. J. A., & Lu, H. (2011). Visual search by action category. Journal of Vision, 11(7):19, 1–14, http://www.journalofvision.org/content/11/7/19, doi:10.1167/11.7.19.
van Boxtel, J. J. A., & Lu, H. (2012). Signature movements lead to efficient search for threatening actions. PLoS ONE, 7(5), e37085.
Vangeneugden, J., Pollick, F., & Vogels, R. (2009). Functional differentiation of macaque visual temporal cortical neurons using a parametric action space. Cerebral Cortex, 19(3), 593–611.
Wang, L., & Jiang, Y. (2012). Life motion signals lengthen perceived temporal duration. Proceedings of the National Academy of Sciences of the United States of America, 109(11), E673–E677.
Wang, L., Zhang, K., He, S., & Jiang, Y. (2010). Searching for life motion signals: Visual search asymmetry in local but not global biological-motion processing. Psychological Science, 21(8), 1083–1089.
Wittinghofer, K., de Lussanet, M. H. E., & Lappe, M. (2012). Local-to-global form interference in biological motion perception. Attention, Perception, & Psychophysics, 74(4), 730–738.
Zhang, J., Yeh, S. L., & De Valois, K. K. (1993). Motion contrast and motion integration. Vision Research, 33(18), 2721–2732.
Figure 1
 
Stimuli and results discriminating forward/backward walking direction in Experiment 1. (a) Schematic illustration of walkers with and without orientation signals. The circular outlines are meant for illustrative purposes to make apparent the sequence of postures defined by spatial cues. Contrast was also enhanced for illustration. Red arrows indicate the antagonistic local 2-D motion vectors drifting in the direction opposite to global disk movement. For demonstration of Gabor stimulus with local motion scalar = 2.5, see Movie 1. The diagram in the lower panel illustrates the difference between 2-D biological motion vectors and the 1-D motion vectors that determined the actual drifting speeds of individual gratings. (b) and (c) Mean proportion of responses consistent with the spatially defined direction as a function of the amount of antagonistic local motion, with psychometric fits. Error bars represent SEM.
Figure 2
 
Analysis of integrated motion and illusory spatial positions for stimuli used in Experiment 1. (a) Profile of horizontal motion vectors (Vx) for a foot point of forward (purple) and backward (blue) walkers over time. (b) Integrated motion vectors (black) representing the weighted sum of the local and global motion components for a forward walking stimulus. (c) Spatiotemporal trajectory profile of the foot point of forward and backward walkers with arrows indicating overall direction of rotation for clarity. (d) Trajectories of a forward walking stimulus based on estimated illusory spatial positions with arrows to indicate the overall direction of rotation. The independent variable representing the amount of antagonistic local motion is labeled in the top left of the panels in (b) and (d).
Figure 3
 
Stimuli and results discriminating forward/backward walking direction in Experiment 2. (a) Schematic illustration of spatially stationary walkers arranged in the shape of a neutral walking pose or in the shape of an unnatural posture (rectangle). The circular outlines are meant for illustrative purposes to make apparent the stationary arrangement of spatial cues. Red arrows indicate local 2-D motion vectors of a walker and circular outlines illustrate that the spatial configuration was stationary over time. For demonstration of Gabor stimulus in the walking posture arrangement, see Movie 2. (b) and (c) Mean proportion correct discriminating forward/backward walkers as a function of viewing location for orientation present (Gabor) and absent (plaid) conditions. Data from the walking posture arrangement are shown in (b) and from the rectangular arrangement in (c). Dotted lines represent chance performance and error bars represent SEM.
Figure 4
 
Analysis of integrated motion and illusory spatial positions for stimuli used in Experiment 2. (a) Horizontal motion vectors for a foot point of forward and backward walkers over time. (b) Motion vectors representing the integrated weighted sum of the local and global motion components for a forward walking stimulus. (c) Spatiotemporal trajectory of the foot point of forward and backward walkers with arrows indicating overall direction of rotation. (d) Trajectories of a forward walking stimulus based on estimated illusory spatial positions.
Figure 5
 
Stimuli and results discriminating leftward/rightward walking direction in Experiment 3. (a) Schematic illustration of hybrid walker stimuli with orientation only, local motion only, and with local motion and orientation. The circular outlines are meant for illustrative purposes to make apparent the sequence of postures defined by spatial cues. Contrast was also enhanced for illustration. Red arrows indicate the local 2-D motion vectors representing biological movements. For demonstration of Gabor stimulus with orientation and motion cues, see Movie 3. (b) Mean proportion of responses consistent with the spatially defined leftward/rightward walking direction as a function of viewing location, with psychometric fits. (c) Data from the control condition with size-scaled stimuli (1.5× and 2×) presented with data from the main experiment (solid black line) for comparison. (d) Proportion of rightward walking direction responses as a function of eccentricity as an indicator of facing bias. Dotted line represents null hypothesis of no facing bias (equal number of leftward/rightward responses). All error bars represent SEM.
Figure 6
 
Analysis of integrated motion and illusory spatial positions for stimuli used in Experiment 3. (a) Horizontal motion vectors for a foot point of leftward and rightward walkers over time. (b) Motion vectors representing the integrated weighted sum of the local and global motion components for a leftward walking stimulus. (c) Spatiotemporal trajectory of the foot point of leftward and rightward walkers with arrows indicating overall direction of rotation. (d) Trajectories of a leftward walking stimulus based on estimated illusory spatial positions. The “start point” illustrates that local motion and orientation were extracted from the opposite limb of the walker with opposite leftward/rightward walking direction (See Experiment 3, Methods).