Smooth pursuit eye movements and motion perception share motion signals in slow and fast motion mechanisms
Kazumichi Matsumiya, Satoshi Shioiri
Journal of Vision August 2015, Vol.15, 12. doi:10.1167/15.11.12
      Kazumichi Matsumiya, Satoshi Shioiri; Smooth pursuit eye movements and motion perception share motion signals in slow and fast motion mechanisms. Journal of Vision 2015;15(11):12. doi: 10.1167/15.11.12.

Abstract

Pursuit eye movements correlate with perceived motion in both velocity and direction, even without retinal motion. Cortical cells in the monkey middle temporal (MT) area generate signals for initiating pursuit eye movements and respond to retinal motion for perception. However, recent studies suggest multiple motion processes, fast and slow, even for low-level motion. Here we investigated whether the relationship with pursuit eye movements differs between the fast and slow motion processes, using a motion aftereffect technique with superimposed low- and high-spatial-frequency gratings. A previous study showed that low- and high-spatial-frequency gratings adapt the fast and slow motion processes, respectively, and that a static test probes the slow motion process whereas a flicker test probes the fast motion process (Shioiri & Matsumiya, 2009). In the present study, an adaptation stimulus was composed of two gratings with different spatial frequencies and orientations but the same temporal frequency, moving in the orthogonal directions of ±45° from the vertical. We measured the directions of perceived motion and pursuit eye movements to a test stimulus presented after motion adaptation while changing the relative contrasts of the two adapting gratings. Pursuit eye movements were observed in the same direction as that of the motion aftereffects, independent of the relative contrasts of the two adapting gratings, for both the static and flicker tests. These results suggest that pursuit eye movements and perception share motion signals in both the slow and fast motion processes.

Introduction
Motion processing in early vision is often considered to have a temporal frequency tuning with a peak at around 5 Hz (Livingstone & Hubel, 1988; Pantle, 1974). However, several psychophysical studies have suggested two separate motion-processing channels, one sensitive to fast motion stimuli and the other sensitive to slow motion stimuli (Alais, Verstraten, & Burr, 2005; Gegenfurtner & Hawken, 1996; Hawken, Gegenfurtner, & Tang, 1994; Hirahara, 2006; Shioiri & Matsumiya, 2009; van der Smagt, Verstraten, & van de Grind, 1999; Verstraten, van der Smagt, & van de Grind, 1998). This reveals the existence of an additional motion mechanism, one that is sensitive to slow motion. Of these studies, the research of Verstraten and his colleagues has used motion aftereffects (MAEs) to examine characteristics of the two channels and has shown that fast and slow motion stimuli can produce independent MAEs. Indeed, after adaptation to a moving stimulus including fast and slow motion patterns moving in opposite directions, the perceived direction of the MAE depended on the temporal condition of the test stimulus, which strongly suggests the existence of distinct fast and slow motion processes (Alais et al., 2005; Shioiri & Matsumiya, 2009; van der Smagt et al., 1999; Verstraten et al., 1998). A more recent study (Shioiri & Matsumiya, 2009) showed that the two types of motion processes have different spatiotemporal frequency characteristics. After adaptation to a moving stimulus composed of low- and high-spatial-frequency gratings moving in opposite directions, observers perceived MAEs in the direction opposite to the high- and low-spatial-frequency moving gratings with static and flicker test stimuli, respectively. Moreover, differences in orientation tuning and in extraction of relative velocity were found between the fast and slow motion processes, suggesting qualitative differences between the two. 
Here we focus on the relationship between motion perception and pursuit eye movements, which may use two different motion processes. If the slow and fast motion processes are different channels in the visual system, such as one channel that is common for perception and pursuit and another that is specific only to perception, we would predict that the relationships between motion perception and pursuit eye movements for the slow and the fast motion processes would differ. 
The relationship between motion perception and smooth pursuit eye movements has been investigated across a variety of stimulus conditions. Many studies have suggested that visual motion signals are shared between perception and eye movements (Beutter & Stone, 1998, 2000; Braun, Pracejus, & Gegenfurtner, 2006; Kowler & McKee, 1987; Krauzlis & Stone, 1999; Krukowski & Stone, 2005; Lisi & Cavanagh, 2015; Steinbach, 1976; Stone & Krauzlis, 2003; Watamaniuk & Heinen, 2007; Wyatt & Pola, 1979; Yasui & Young, 1975). For example, Beutter and Stone (1998) measured the perceived motion direction and the direction of smooth pursuit eye movements using a moving plaid, consisting of two superimposed gratings with different orientations moving in different directions. They found that the perceptual and eye-movement responses showed similar directional biases depending on window shape; three types of spatial Gaussian windows were used: an elongated window tilted −40°, an elongated window tilted +40°, and a circularly symmetric window. Braun et al. (2006) showed that pursuit eye movements are elicited by the MAE from adaptation to a low-spatial-frequency drifting grating. 
Contrary to the studies mentioned so far, several studies have suggested that visual motion signals are dissociated between perception and pursuit eye movements (Barton, Sharpe, & Raymond, 1996; Gegenfurtner, Xing, Scott, & Hawken, 2003; Hawken & Gegenfurtner, 2001; Mack, Fendrich, & Pleune, 1979; Spering & Carrasco, 2012; Spering & Gegenfurtner, 2007; Spering, Pomplun, & Carrasco, 2011; Tavassoli & Ringach, 2010; for reviews, see Spering & Carrasco, 2015). Other studies have suggested dissociations between motion perception and the ocular following response for the same visual motion information (Bostrom & Warzecha, 2010; Glasser & Tadin, 2014; Simoncini, Perrinet, Montagnini, Mamassian, & Masson, 2012; for reviews, see Spering & Carrasco, 2015). For example, the early work by Mack et al. (1979) showed that in experimental conditions where there was a conflict between retinal and perceived motion, pursuit eye movements were controlled not by perceived motion but by retinal motion. Gegenfurtner et al. (2003) showed that there is no correlation between pursuit errors and perceptual errors for speed judgments. Spering and Gegenfurtner (2007) showed that under conditions that require the segmentation of target motion from background motion, perceived speed and the speed of pursuit eye movements are determined by different computations. Tavassoli and Ringach (2010) showed that pursuit eye movements respond to the velocity fluctuations of a moving target even when the corresponding target motion is perceptually invisible to the observer. 
These inconsistent results may be attributed to the existence of two different motion processes: one shared by perception and pursuit, and one in which perception and pursuit rely on separate signals (Spering & Carrasco, 2015). Moreover, these shared and separate processes might be related to the slow and fast motion processes. Although the results of previous studies cover a considerable range of moving speeds, none of them have attempted to separate the slow and fast motion processes. Commonly, there is substantial overlap in tuning between two channels with different spatiotemporal characteristics, so selective masking or selective adaptation is often used to isolate a single channel (Alais et al., 2005; Anderson & Burr, 1985; Fredericksen & Hess, 1998; Hammett & Smith, 1992; Hess & Snowden, 1992; Shioiri, Ono, & Sato, 2002). Therefore, it is still possible that the slow and fast motion processes are different in terms of control of pursuit eye movements, and that the inconsistent results seen in the previous studies may instead be explained by the different properties of the slow and fast motion processes. In the present study, we investigated whether motion perception and pursuit eye movements share motion signals in each of the slow and fast motion processes by making use of an MAE technique that was capable of isolating the fast and slow motion processes (Shioiri & Matsumiya, 2009). Our results show a similar effect of stimulus conditions between motion perception and pursuit eye movements for both the slow and fast motion processes, suggesting that both the slow and fast motion processes contribute to perception and pursuit. 
We examined the relationship between the perceived motion and smooth pursuit eye movement after presentation of the adaptation stimulus: perceived MAEs and pursuit MAEs. To compare the two MAEs directly and quantitatively, we evaluated the direction of the MAE after adaptation to plaid motion, involving two superimposed gratings with different orientations moving in different directions. Considering that the perceived direction of a static MAE is determined by the slow motion process and that that of a flicker MAE is determined by the fast motion process, we could thus separately examine the contributions of motion signals to pursuit eye movements and motion perception for the two motion processes. For the static test, we obtained MAEs caused by the high-spatial-frequency (HSF) grating, which we assumed to be the MAEs of the slow motion process. For the flicker test, we obtained MAEs caused by the low-spatial-frequency (LSF) grating, which we assumed to be the MAEs of the fast motion process. A change in the relative contrast between the two gratings in a plaid would change the MAE direction. If the same motion signals are used for pursuit eye movements and motion perception, the direction of the pursuit MAEs would change in accordance with that of the perceived MAEs, depending on the relative contrasts of the gratings. We used the static test with variable contrasts of either the HSF or LSF grating to examine the relationship between the pursuit and perceived MAEs in the slow motion process, and used the flicker test with variable contrasts of either the HSF or LSF grating to examine the relationship between the pursuit and perceived MAEs in the fast motion process. Our results show that the directions of the smooth pursuit eye movements and motion perception are similar in both static and flicker MAEs. 
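As an illustration only, and not a model proposed in this study, the expected dependence of the MAE direction on the relative adaptation strength of the two gratings can be sketched as a weighted vector sum of the two component aftereffect directions (±45° from the downward vertical, with positive defined as opposite to the HSF adapting motion). The weights are hypothetical stand-ins for adaptation strength:

```python
import math

def predicted_mae_direction(w_hsf, w_lsf, theta_hsf=45.0, theta_lsf=-45.0):
    """Direction of the combined MAE (deg from downward vertical), modeled as a
    weighted vector sum of the two component aftereffect directions.
    w_hsf, w_lsf are hypothetical adaptation strengths (e.g., relative contrasts)."""
    x = (w_hsf * math.sin(math.radians(theta_hsf))
         + w_lsf * math.sin(math.radians(theta_lsf)))
    y = (w_hsf * math.cos(math.radians(theta_hsf))
         + w_lsf * math.cos(math.radians(theta_lsf)))
    return math.degrees(math.atan2(x, y))

# Equal weights pull the predicted MAE straight down (0 deg); increasing the
# HSF weight biases the prediction toward +45 deg, the HSF-opposed direction.
```

Under this toy scheme, changing the relative contrast of the two adapting gratings sweeps the predicted direction between −45° and +45°, which is the qualitative pattern the experiments below test for both perception and pursuit.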
Note that the perceived motion of stationary stimuli generated by the static MAE drives pursuit eye movements when test stimuli without sharp edges are used (Braun et al., 2006), but not stimuli with sharp edges (Mack et al., 1979; Mack et al., 1987; Seidman, Leigh, & Thomas, 1992). In the present study, we used Gabor patches as stimuli to avoid the effect of sharp edges. 
Methods
Apparatus
Visual stimuli were presented on a CRT monitor (1024 × 768 pixel resolution; Sony Trinitron CPD-520) with a refresh rate of 80 Hz and controlled by a ViSaGe visual-stimulus generator (Cambridge Research Systems, Inc., Rochester, UK) and a computer. Observers were seated in a dark room and viewed the display (52° × 34°) from a distance of 38 cm. The observer's head was fixed with a combination of a chin rest and a forehead rest. 
Eye movements of the right eye were recorded using a video-based eye-tracking system (Cambridge Research Systems). Eye position was sampled at a frequency of 50 Hz. Because pursuit eye movements are slow, continuous movements rather than ballistic movements like saccades, eye position does not change dramatically between sampling points even at 50 Hz; this sampling rate should therefore not introduce a serious difference between the actual and sampled eye positions when the direction of pursuit eye movements is measured. Before each experimental session, we performed a calibration by having observers fixate on a series of nine dots arranged in a 10° × 10° grid, presented in a random order. We recorded eye positions for 2 s from the onset of a test stimulus. Eye-position data were stored for off-line analysis. 
Visual stimuli
An adaptation stimulus was composed of two drifting gratings with different spatial frequencies (LSF and HSF were 0.3 and 1.2 c/°, respectively; drift rate was 5 Hz) moving in the orthogonal directions of ±45° from the vertical (Figure 1). The orientations of the LSF and HSF gratings were randomly assigned to the two orientations of ±45° from the vertical. The contrast of each sine-wave grating was windowed by a Gaussian envelope along the radial direction. The peak of the Gaussian function was set at a distance of 4° from the center of the display, and its standard deviation was 1.5°. One half of the stimulus width was defined as the peak eccentricity (4°) plus twice the standard deviation (3°), and one half of the blank central region as the peak eccentricity (4°) minus twice the standard deviation (3°). The plaid stimulus therefore subtended 14° in diameter, with the central 2° spared. This spared region was included because it was possible that a static test stimulus would abolish pursuit eye movements: if the test stimulus had been presented without the spared region, observers might not have generated pursuit eye movements while it was presented. During the adaptation period, observers fixated on a white dot presented at the center of the display; during the test period, the white dot was not presented. If the adaptation and test stimuli had been presented without the spared region, the region of the white dot would have produced a nonadapted part of the test stimulus (see also Figure 2). 
This would have led to a situation in which the observers could not generate pursuit eye movements during the test period, because no motion aftereffect would have been induced in the region of the white dot where they directed their gaze. In addition, the spared region helped stabilize fixation during adaptation, because no visual motion was presented in the immediate vicinity of the observer's gaze. The average luminance of the gratings and of the background was 76.5 cd/m2. 
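The geometry above can be summarized in a minimal sketch of the radial contrast envelope, assuming a pure Gaussian profile; the function and variable names are ours, not from the original stimulus code:

```python
import numpy as np

# Parameters taken from the stimulus description.
PEAK_ECC = 4.0   # deg: eccentricity of the Gaussian peak
SIGMA = 1.5      # deg: standard deviation of the Gaussian

def radial_envelope(r):
    """Contrast weight at eccentricity r (deg): Gaussian along the radius."""
    return np.exp(-((r - PEAK_ECC) ** 2) / (2 * SIGMA ** 2))

# Stimulus extent defined as peak eccentricity +/- 2 sigma:
outer_radius = PEAK_ECC + 2 * SIGMA   # 7 deg -> 14-deg stimulus diameter
inner_radius = PEAK_ECC - 2 * SIGMA   # 1 deg -> 2-deg spared center
```

With these values the envelope peaks at 4° eccentricity and the annulus spans radii of 1° to 7°, matching the 14°-diameter plaid with its 2° spared center.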
Figure 1
 
Visual stimulus. The plaid stimulus was composed of two drifting gratings with different spatial frequencies (0.3 and 1.2 c/°), moving in the orthogonal directions of ±45° (orange arrows). The peak of the Gaussian function was set at a distance of 4° from the center of the display. The fixation point, a white central dot, was presented during the adaptation period.
Figure 2
 
Procedure. Observers adapted to the drifting gratings for 20 s while fixating on a white spot. Adaptation was followed by a 0.5-s blank. A test stimulus, either static or 4-Hz flickered, was then presented for 2 s, followed by a probe (a bar) presented on the display. The observers adjusted the direction of the probe to indicate the direction of the MAE. While the test stimulus was being presented, eye movements were measured.
Before the experiments, we measured the contrast thresholds for the motion detection of the HSF and LSF gratings using the method of adjustment. Observers were asked to adjust the contrast of the grating by pressing a button until the motion of the grating reached the threshold level while one of the moving gratings was continuously displayed. Four settings were made for each spatial frequency for each observer. 
In Experiment 1 (original), the contrast of the HSF grating was chosen from 7.5, 15, 30, or 60 times the threshold, with a fixed contrast of the LSF grating (30 times the threshold) during the adaptation phase before the static test. During the adaptation phase before the flicker test, the contrast of the LSF grating was chosen from 7.5, 15, 30, or 60 times the threshold, with a fixed contrast of the HSF grating (30 times the threshold). In Experiment 2 (additional contrast manipulation), the contrast of the LSF grating was chosen from 7.5, 15, 30, or 60 times the threshold, with a fixed contrast of the HSF grating during the adaptation phase before the static test, and the contrast of the HSF grating was chosen from 7.5, 15, 30, or 60 times the threshold, with a fixed contrast of the LSF grating during the adaptation phase before the flicker test. In the test stimulus, the contrasts of the HSF and LSF gratings were always 30 times the threshold for both the static and flicker tests. The test stimulus was the same as the adaptation stimulus except for motion and contrast. The contrast of the flicker test was sinusoidally modulated at 4 Hz, while the contrast of the static test was constant. 
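The crossed contrast manipulations across the two experiments can be summarized in a small lookup structure; this layout is our own summary of the conditions, not the authors' code:

```python
# Adaptation-grating contrasts, in motion-detection-threshold units.
VARIABLE = (7.5, 15, 30, 60)   # contrasts sampled for the varied grating
FIXED = 30                     # contrast of the non-varied grating

# Which grating's contrast was varied in each experiment x test combination.
conditions = {
    (1, "static"):  {"varied": "HSF", "fixed": "LSF"},
    (1, "flicker"): {"varied": "LSF", "fixed": "HSF"},
    (2, "static"):  {"varied": "LSF", "fixed": "HSF"},
    (2, "flicker"): {"varied": "HSF", "fixed": "LSF"},
}

# In the test stimulus itself, both gratings were always at 30x threshold.
```

The table makes the symmetry explicit: Experiment 2 swaps which grating is varied relative to Experiment 1, for each test type.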
We also conducted Experiment 3 (fixation control) to examine whether the adaptation stimulus elicits pursuit eye movements. In this experiment, the adaptation stimulus was presented for 20 s, and the contrasts of the HSF and LSF grating were chosen in the same manner as for the adaptation stimulus in the main experiment. Observers were instructed to keep looking at the fixation point on the adaptation stimulus. While the adaptation stimulus was being presented, observers' eye movements were measured. For each observer, eye-movement distance was defined as the average of the distance between the fixation point and the eye position from the start of the trial to the end of the trial, and eye-movement direction was defined as the average of the instantaneous eye direction from the start of the trial to the end of the trial. 
Observers
Five male observers were recruited for each of Experiments 1–3 (for Experiment 1: age range = 22–35 years, mean age = 25 years; for Experiments 2 and 3: age range = 23–43 years, mean age = 28.6 years). All observers had normal or corrected-to-normal vision. All were unaware of the purpose of the experiment except for one, who was an author of this study (KM). All observers were experienced with psychophysical experiments in which eye movements are measured. This study was approved by the Ethics Committee of the Research Institute of Electrical Communication, Tohoku University. 
Procedure
Figure 2 shows the stimulus sequence for the MAE experiment. The adaptation stimulus was presented for 20 s and followed by a blank screen for 0.5 s and then a test stimulus for 2 s, during which eye movements were measured. Observers were instructed to keep looking at the center of the plaid stimulus during the test period, tracking it with eye movements if the plaid appeared to move. After the test stimulus, a probe stimulus was presented on the display and the observers indicated the direction of the perceived MAE by adjusting the direction of the probe with a trackball. 
Each observer participated in three sessions of eight trials (4 contrasts × 2 directions) for both the static and flicker tests. Before the main experiment, the observers performed one session for each of the static and flicker tests for practice (a total of 16 trials; the data from these trials were discarded). In each session, the adaptation direction of the LSF grating's motion was randomly selected to be at ±45°, and that of the HSF grating's motion was determined as the direction of the sign opposite to the LSF grating's motion for each trial (see Figure 1). The run for the condition with the LSF direction of −45° and the HSF direction of +45° consisted of three trials, as did the run for the condition with the LSF direction of +45° and the HSF direction of −45°. We combined the data from the two runs, yielding a total of six trials for each contrast for each test.1 In the data analysis, a direction opposite to the HSF grating's adaptation motion was defined as positive. 
Analysis of eye-movement data
Figure 3a shows an example trajectory of eye movements in a trial. Figure 3b and c shows the time courses of eye-movement distance and eye velocity. Eye movements were measured while a test stimulus was presented. The eye-movement direction was defined as the angle of the line fitted to the eye trajectory: We calculated the slope of the line between the initial and end points of the eye positions for a given period and then took the angle from the downward vertical. Downward was defined as zero, and the direction opposite to the HSF stimulus motion in the adaptation phase was defined as positive. Before the analysis, eye-position data were smoothed with a low-pass filter with a 10-Hz cutoff to remove noisy fluctuations of about 20–30 Hz, which we learned were due to an incompatibility between the eye tracker and the computer. This filtering provided eye-movement data sufficient to describe pursuit eye movements in our experiments, and the spatial distance and direction of eye movements were not dramatically affected by it. We differentiated the eye-position data to obtain eye velocity. The onset of a pursuit was defined as the time when eye velocity exceeded 0.5°/s. Trials in which a saccade occurred during the test presentation, defined as eye velocity exceeding 30°/s, were discarded from the analysis (8% of trials from all observers). 
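The analysis steps above can be sketched as follows. This is a simplified reconstruction, not the authors' code: a moving-average smoother stands in for the 10-Hz low-pass filter, the smoother length and helper names are assumptions, and mapping the sign of the direction angle onto the per-trial stimulus assignment is handled separately:

```python
import numpy as np

FS = 50.0  # Hz: eye-tracker sampling rate

def smooth(sig, n=5):
    """Stand-in for the 10-Hz low-pass filter: edge-padded moving average."""
    pad = n // 2
    padded = np.concatenate([np.full(pad, sig[0]), sig, np.full(pad, sig[-1])])
    return np.convolve(padded, np.ones(n) / n, mode="valid")

def analyze_trial(x, y):
    """x, y: eye position in deg sampled at FS during the 2-s test.
    Returns (direction_deg, onset_s), or None if a saccade is detected."""
    x, y = smooth(np.asarray(x, float)), smooth(np.asarray(y, float))
    vx, vy = np.gradient(x) * FS, np.gradient(y) * FS
    speed = np.hypot(vx, vy)
    if speed.max() > 30.0:                 # saccade criterion: >30 deg/s
        return None                        # discard the trial
    above = np.nonzero(speed > 0.5)[0]     # pursuit onset: speed >0.5 deg/s
    if len(above) == 0:
        return None
    onset = above[0] / FS
    dx, dy = x[-1] - x[0], y[-1] - y[0]    # line between initial and end points
    direction = np.degrees(np.arctan2(dx, -dy))  # 0 deg = straight down
    return direction, onset
```

For a trial with a steady downward drift, the sketch returns a direction near 0° with onset at the first sample, and a large position jump within the trial trips the saccade criterion and rejects it.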
Figure 3
 
Analysis of eye-movement data. (a) An example trace of eye positions. The black line represents the eye position. The red line represents the line connected between the initial and end points of the eye positions. (b) Eye-movement distance as a function of time. (c) Eye velocity as a function of time. The dashed horizontal line represents the threshold for pursuit onset.
We analyzed the eye-movement directions separately for the initial 200 ms, a later 200 ms, and the whole 2000-ms period. During the initial period of about 200 ms, pursuit eye movements are controlled by motion signals, whereas visual feedback, such as retinal slip, influences eye movements during later periods (Lisberger, Morris, & Tychsen, 1987; Lisberger & Westbrook, 1985). This raises the possibility that the relationship between perception and pursuit eye movements may be different in the initial and late periods. Nevertheless, a previous study (Rasche & Gegenfurtner, 2009) has shown similarities between pursuit eye movements and motion perception, even during the initial period. If the underlying mechanisms for the slow and fast motion processing are the same as those reported in these previous studies, we expect that the directions of perceived and pursuit MAEs will be similar for both the initial and later periods. 
Results
Figure 4 shows the average eye-movement distance and velocity when looking at the fixation point on the adaptation stimulus as a function of adaptation direction. The eye-movement distance and velocity were almost zero across all the adaptation directions for both the static and flicker tests. Repeated-measures analysis of variance (ANOVA) showed that the effects of adaptation direction on eye-movement distance and velocity were not significant for either the static or the flicker tests (all ps > 0.3), indicating that the adaptation stimulus does not elicit tracking eye movements. 
Figure 4
 
Average eye-movement distance and velocity when looking at the fixation point on the adaptation stimulus. Eye-movement distance and eye velocity, respectively, as a function of the directions of the adaptation gratings for the static test (a–b) and the flicker test (c–d). N = 5. Error bars represent the standard error of the mean.
Figure 5 shows the average eye-movement trajectories of two observers for the static and flicker tests. Eye position is plotted on a two-dimensional space for a 2-s presentation of the test stimulus. Each colored curve represents a different contrast of the adaptation grating (the contrast of the HSF adaptation grating for the static test, and that of the LSF adaptation grating for the flicker test). As shown in Figure 5a, the eye-movement direction for the static test depended on the contrast of the HSF adaptation grating. The eye-movement direction shifted from the opposite direction of the LSF adaptation grating's motion (−45°) toward the opposite direction of the HSF adaptation grating's motion (+45°) with an increase in the contrast of the HSF adaptation grating. As shown in Figure 5b, the eye-movement direction for the flicker test depended on the contrast of the LSF adaptation grating; however, the effect of adaptation contrast differed among observers. In one observer, the eye-movement direction shifted to the opposite direction of the LSF adaptation grating's motion with an increase in the contrast of the LSF grating (left panel), whereas in the other observer, the eye-movement direction was relatively constant across the different contrasts of the LSF adaptation grating and was close to the opposite direction of the LSF adaptation grating's motion (right panel). The average latency among observers was 295 ms for both the static and flicker tests, and was independent of contrast. This value is consistent with the latencies measured in a previous study of pursuits elicited by an MAE (Braun et al., 2006). 
Figure 5
 
Average eye-movement traces for observers HN and KM for the static (a) and flicker (b) test stimuli. Orange arrows represent the direction of the high- or low-spatial-frequency grating's motion during the adaptation period. Eye position is plotted against two-dimensional space. The color of each curve represents a contrast of the adaptation grating. Eye positions were analyzed for the 2-s test presentation after adaptation.
Figure 6 compares the directions of perceived and pursuit MAEs for the static test during the initial period of the test presentation, which corresponds to the open-loop phase of pursuits, as a function of HSF grating contrast. As the contrast of the HSF grating increased, the directions of both perceived and pursuit MAEs shifted from the opposite direction of the LSF grating's motion (−45°) toward the opposite direction of the HSF grating's motion (+45°). The directions of perceived and pursuit MAEs were very close on average and were similar for all observers. An ANOVA showed that the directions of perceived and pursuit MAEs were not significantly different, F(1, 32) < 1, p = 0.47 (not significant), whereas the effect of contrast was highly significant, F(3, 32) = 27.16, p < 0.0001. 
Figure 6
 
Results for the static test in analyzing eye-movement data during the open-loop phase (the 200-ms period after pursuit onset). The graphs show the directions of motion perception and pursuit eye movements as a function of the contrast threshold unit for the high-spatial-frequency adaptation grating. The results of the individual observers and their averages are shown in the different panels. The horizontal axis represents the contrast in threshold units for the HSF grating. Solid and open symbols represent the results for perception and pursuits, respectively. Data are given for each observer (a–e) and as means of all five observers (f). Error bars represent the standard error of the mean.
Figure 7 compares the directions of perceived and pursuit MAEs for the flicker test. As the contrast of the LSF grating increased, the directions of both perceived and pursuit MAEs shifted from the opposite direction of the HSF grating's motion (+45°) toward the opposite direction of the LSF grating's motion (−45°) for observer HN, whereas both directions were largely constant for the other observers. The directions of perceived and pursuit MAEs were very close on average and were similar for all observers. An ANOVA showed that the directions of perceived and pursuit MAEs were not significantly different, F(1, 32) < 1, p = 0.94 (not significant), whereas the effect of contrast was significant, F(3, 32) = 3.04, p < 0.05. 
Figure 7
 
Results for the flicker test in analyzing eye-movement data during the open-loop phase (the 200-ms period after pursuit onset). The graphs show the directions of motion perception and pursuit eye movements as a function of the contrast threshold unit for the low-spatial-frequency adaptation grating. The horizontal axis represents the contrast in threshold units for the LSF grating. Solid and open symbols represent the results for perception and eye movements, respectively. Data are given for each observer (a–e) and as means of all five observers (f). Error bars represent the standard error of the mean.
To assess whether the direction of pursuit eye movements was constant over the test period, we analyzed eye-movement directions for the later 200 ms and the whole 2000-ms period in addition to the initial 200 ms. The directions of the pursuits for the late and whole periods of the test presentation were essentially the same as those for the initial period (Figure 8a, b). An ANOVA indicated that the directions of perceived and pursuit MAEs were not significantly different across periods: F(3, 64) < 1, p = 0.68 (not significant), for the static test; F(3, 64) < 1, p = 0.65 (not significant), for the flicker test. These results show no hint of a dissociation between the perceived and pursuit MAEs for either the static or flicker test, and also suggest that the pursuits did not systematically change direction from the initial to the late period.
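The windowed direction measure described above can be sketched as follows: the direction is taken as the angle of the vector from the first to the last eye position in each window (as in the Figure 3 analysis). The 1000-Hz sampling rate and the exact bounds of the "later" window are illustrative assumptions, not values taken from the text:

```python
import numpy as np

def pursuit_direction(ex, ey, start, stop, fs=1000.0):
    """Direction (deg) of the vector connecting the first and last eye
    positions inside the [start, stop) window, with start/stop in
    seconds after pursuit onset. fs is an assumed sampling rate."""
    i0, i1 = int(start * fs), int(stop * fs) - 1
    return np.degrees(np.arctan2(ey[i1] - ey[i0], ex[i1] - ex[i0]))

# The three analysis windows compared in the text (the bounds of the
# "later" window are a guess): initial 200 ms, later 200 ms, whole 2000 ms.
windows = {"initial": (0.0, 0.2), "later": (0.2, 0.4), "whole": (0.0, 2.0)}
```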
Figure 8
 
The mean directions of perception and pursuit eye movements as a function of contrast in threshold units for the static (a, c) and flicker (b, d) tests during the initial 200 ms, the later 200 ms, and the whole 2000-ms period. Solid, open-circle, open-square, and open-diamond symbols represent the results for perception, eye movements during the initial period, eye movements during the later period, and eye movements during the whole period, respectively. (e, f) The same data plotted as a function of contrast ratio. N = 5; error bars represent the standard error of the mean.
Figure 8c shows the contrast effect of the LSF adaptation grating with the static test. When the contrast of the LSF grating increased, the directions of both perceived and pursuit MAEs for the static test shifted from the opposite direction of the HSF grating's motion toward the opposite direction of the LSF grating's motion. Figure 8d shows the contrast effect of the HSF adaptation grating with the flicker test. When the contrast of the HSF grating increased, the directions of both perceived and pursuit MAEs for the flicker test were close to the opposite direction of the LSF grating's motion and were largely constant. For both the static and flicker tests, the directions of the pursuits for the late and whole periods of the test presentation were essentially the same as those of the initial period (Figure 8c, d). An ANOVA showed that the directions of perceived and pursuit MAEs were not significantly different across periods—F(3, 64) < 1, p = 0.74 (not significant), for the static test; F(3, 64) < 1, p = 0.69 (not significant), for the flicker test—whereas the effect of contrast was significant for the static test but not for the flicker test—F(3, 64) = 41.9, p < 0.0001, for the static test; F(3, 64) = 1.33, p = 0.28 (not significant), for the flicker test. 
We replotted the data against the contrast ratio, defined as the ratio of the contrast of the HSF adaptation grating to the contrast of the LSF adaptation grating (Figure 8e, f). Figure 8e combines the data shown in Figure 8a and c, and Figure 8f combines the data shown in Figure 8b and d. These results suggest that the ratio of the two gratings' contrasts is a determining factor of the directions of both perceived and pursuit MAEs. 
Discussion
This study reveals that the directions of smooth pursuit eye movements and motion perception are similar in both static and flicker MAEs after adaptation to two superimposed gratings with different spatial frequencies drifting in orthogonal directions. Since the static MAE reflects the adaptation effect of the slow motion process and the flicker MAE reflects the adaptation effect of the fast motion process (Alais et al., 2005; Shioiri & Matsumiya, 2009; van der Smagt et al., 1999; Verstraten et al., 1998), our results indicate that smooth pursuit eye movements are driven by motion signals from both slow and fast motion processes. Many studies have reported that visual motion signals for motion perception and pursuit eye movements are processed in the same way (Beutter & Stone, 1998, 2000; Braun et al., 2006; Kowler & McKee, 1987; Krauzlis & Stone, 1999; Krukowski & Stone, 2005; Spering & Gegenfurtner, 2008; Stone & Krauzlis, 2003) and share the same neural pathways (Krukowski & Stone, 2005; Lisberger & Movshon, 1999; Newsome, Wurtz, & Komatsu, 1988; Tychsen & Lisberger, 1986; for reviews, see Spering & Carrasco, 2015). Other studies have reported that visual motion signals for motion perception and pursuit eye movements are processed in different ways (Barton et al., 1996; Gegenfurtner et al., 2003; Hawken & Gegenfurtner, 2001; Mack et al., 1979; Spering et al., 2011; Spering & Carrasco, 2012; Spering & Gegenfurtner, 2007; Tavassoli & Ringach, 2010; for reviews, see Schutz et al., 2011; Spering & Montagnini, 2011; Spering & Carrasco, 2015). Recent studies have reported that visual motion signals for motion perception and the ocular following response are also processed in different ways (Bostrom & Warzecha, 2010; Glasser & Tadin, 2014; Simoncini et al., 2012; for reviews, see Spering & Carrasco, 2015).
These studies suggest that motion perception and pursuit eye movements generally use similar neural computations for visual motion analysis, although they can also use different ones. In the present study, we examined whether the similarities and differences between motion perception and pursuit eye movements are related to the slow and fast motion processes. Our results show similarities between motion perception and pursuit eye movements for both the slow and fast motion processes, even when the two processes are isolated by selective adaptation.
What roles do the slow and fast motion processes play in motion analysis? The slow motion process has been suggested to play a role in the processing of object motion, because it is sensitive to relative motion (Shioiri & Matsumiya, 2009). Relative motion processing can be modeled by a motion-sensitive unit with a center–surround antagonistic receptive field (Frost & Nakayama, 1983), and such a receptive-field structure is well suited for distinguishing an object's motion from background motion (Murakami & Shimojo, 1993; Shioiri, Ito, Sakurai, & Yaguchi, 2002; Shioiri, Ono, & Sato, 2002; Tadin, Lappin, Gilroy, & Blake, 2003). Several neurophysiological studies provide evidence that object motion is processed through antagonistic interactions in the center–surround receptive fields of motion-sensitive neurons in area MT and in the lateral region of the medial superior temporal area (MST; Born & Tootell, 1992; Eifuku & Wurtz, 1998). These neurons are also responsible for the generation of pursuit eye movements (Born, Groh, Zhao, & Lukasewycz, 2000; Komatsu & Wurtz, 1988). The present results show that the slow motion process underlies both the control of pursuit eye movements and visual motion perception, suggesting that the slow motion process may be involved in the neural processing of object motion.
The present results also show that visual motion signals are shared between motion perception and pursuit eye movements in not only the slow motion process but also the fast motion process. One possible explanation for this result is that these two motion processes might depend on distinct but overlapping neural substrates in the dorsal pathway. Previous psychophysical studies have suggested the existence of two parallel motion mechanisms for analyzing relative and uniform motion (Shioiri, Ito, et al., 2002; Shioiri, Ono, & Sato, 2002). The processing of relative motion is related to the analysis of object motion, as mentioned already, while the processing of uniform motion is closely related to the analysis of global motion generated by an observer's own movement (Morrone, Burr, & Vaina, 1995; Williams & Sekuler, 1984). Neurophysiological studies have also demonstrated that there are different neurons with different receptive-field properties within motion-sensitive brain areas (Berezovskii & Born, 2000; Born & Tootell, 1992; Eifuku & Wurtz, 1998). Born and Tootell (1992) showed that neurons with different receptive-field properties are segregated in columnar fashion within MT, suggesting anatomical segregation of neurons in MT with and without center–surround antagonistic receptive fields. On the other hand, Eifuku and Wurtz (1998) showed that the MST area can be divided into two regions. In this area, neurons in one region have center–surround antagonistic receptive fields, and neurons in the other region have center–surround summation receptive fields. 
Although Berezovskii and Born (2000) have suggested that the organization of the receptive fields in MST may not be simply related to the two different receptive-field properties in MT, these studies suggest that the different neurons with different receptive-field properties in MT or MST may have different functions: one for the processing of object motion and one for that of global motion. If the fast motion process has a completely different function from the slow motion process, we would expect that the fast motion process might be related to only the processing of uniform motion. However, the present results show that the fast motion process is also responsible for the generation of pursuit eye movements. As mentioned already, the processing of relative motion is related to the analysis of object motion (Murakami & Shimojo, 1993; Shioiri, Ito, et al., 2002; Shioiri, Ono, & Sato, 2002; Tadin et al., 2003), and the processing of object motion underlies both control of pursuit eye movements and visual motion perception (Born et al., 2000; Eifuku & Wurtz, 1998; Komatsu & Wurtz, 1988). Taken together, the present results imply that the fast motion process may also play a role in the analysis of object motion and may depend on overlapping neural substrates for slow and fast motion processing in the dorsal pathway. 
Gegenfurtner and Hawken (1996) proposed that slow and fast motion processing streams are parallel and independent in the brain; the neural mechanism for the slow motion pathway is likely associated with the ventral cortical processing stream, including V4, and the neural mechanism for the fast motion pathway is likely associated with the dorsal cortical processing stream, including areas MT and MST. On the other hand, Goodale and Milner (1992) proposed that the ventral and dorsal streams subserve separate respective functions, namely, vision for perception and vision for action. Based on these propositions, one possibility is that the slow and fast motion processes correspond to the visual functions for motion perception and pursuit eye movements, respectively. If the slow motion process were associated only with ventral cortical processing, we would expect that slow motion signals would not influence pursuit eye movements, at least not directly. This is because areas MT and MST, which are included in the dorsal cortical processing stream, are involved in the control of pursuit eye movements (Celebrini & Newsome, 1995; Dursteler & Wurtz, 1988; Ilg & Thier, 2003; Komatsu & Wurtz, 1988, 1989; Lisberger & Movshon, 1999; Pasternak & Merigan, 1994; Rudolph & Pasternak, 1999). However, the present results indicate that visual motion signals are shared between motion perception and pursuit eye movements even in the slow motion process, and therefore they do not support the view that the slow and fast motion processes correspond to the visual functions for motion perception and pursuit eye movements, respectively. 
Mack et al. (1979, 1982) demonstrated a dissociation between perception and eye movements with slow motion signals, which is inconsistent with our present results. The discrepancy could be explained by positing different slow motion processes in the dorsal and ventral streams. There is evidence for different motion mechanisms for luminance and color at slow speeds but not at fast speeds (Hawken et al., 1994). For slow motion stimuli, the dorsal stream may be responsible for luminance motion signals and the ventral stream for color motion signals (Gegenfurtner & Hawken, 1996), although color and luminance motion may share a common process in general (Cavanagh & Favreau, 1985; Cropper & Derrington, 1996; McKeefry, Laviers, & McGraw, 2006; Mullen, Yoshizawa, & Baker, 2003; Shioiri, Yoshizawa, Ogiya, Matsumiya, & Yaguchi, 2012; Yoshizawa, Mullen, & Baker, 2000, 2003). Moreover, Mack et al. (1979, 1982) used relative motion stimuli at very slow speeds and showed a dissociation between motion perception and eye movements. The effect of relative motion for slow stimuli appears to differ between perception and eye movements. Indeed, when there is a difference in motion between adjacent regions, the background motion can disturb accurate tracking of a moving object with eye movements (Masson, Proteau, & Mestre, 1995; Spering & Gegenfurtner, 2008). However, the difference in motion between adjacent regions can be detected perceptually even at speeds slower than 1 min/s (Shioiri, Ito, et al., 2002), whereas the slowest speed of pursuit eye movements is around 60 min/s (Ilg, 1997). The motion process that specializes in relative motion might therefore be closely related to the slow motion process in the ventral stream. Taken together, the slow motion process in the ventral stream could contribute solely to perception, not to the control of eye movements. Perception of the very slow motion stimuli used by Mack et al. might thus be mediated by the slow motion process in the ventral stream without influencing eye movements.
Our present findings are inconsistent with the view that the slow and fast motion processes correspond to the dichotomy of the parvo and magno pathways in early vision. Our previous study showed that the slow motion process is sensitive to low temporal and high spatial frequencies, whereas the fast motion process is sensitive to high temporal and low spatial frequencies (Shioiri & Matsumiya, 2009). In comparison, the parvo pathway is thought to convey the low-temporal- and high-spatial-frequency content of retinal images, while the magno pathway is thought to convey the high-temporal- and low-spatial-frequency content. One might therefore suppose that the slow motion process corresponds to the parvo pathway and the fast motion process to the magno pathway. However, our previous study showed a narrow orientation tuning only for the slow motion process (Shioiri & Matsumiya, 2009), even though motion-sensitive mechanisms in the magno pathway have been reported to have narrow orientation tuning (Gur, Kagan, & Snodderly, 2005). The present study suggests that both the slow and fast motion processes may depend on neural substrates in the dorsal pathway, which receives strong magno input. In addition, the stimulus configurations in the previous and present studies are consistent in that the fovea was excluded from stimulation. Given that the parvo pathway is primarily driven by central vision, the stimuli used in both studies seem well suited to the magno pathway. Thus, we suggest that it is inappropriate to characterize the slow and fast motion processes in terms of the parvo/magno dichotomy.
The present study suggests that the direction of pursuit eye movements does not change systematically over the 2-s test period (see Figure 8). However, it remains unclear whether the direction of the perceived MAE changes over the course of this period. In the present study, observers adjusted the direction of the probe to indicate the direction of the perceived MAE after the 2-s test presentation. This method makes the implicit assumption that the perceived direction is constant over the test period. A future investigation should examine whether the direction of the perceived MAE changes during the test period.
In summary, we simultaneously measured perceptual direction judgments and pursuit eye-movement responses to aftereffects caused by adaptation to two overlapping gratings with different motion directions and different spatial frequencies. This method helps to clarify the relationship between perception and pursuit for the fast and slow motion processes. The findings reveal that the directions of pursuit eye movements and motion perception are remarkably similar for both processes. The two processes differ in spatial and temporal properties, in orientation tuning, and in sensitivity to relative motion (Shioiri & Matsumiya, 2009), suggesting separate motion mechanisms operating in parallel. We conclude that motion perception and pursuit eye movements are processed by common motion mechanisms in both the fast and slow motion domains.
Acknowledgments
This work was partially supported by the Core Research for Evolutional Science and Technology (CREST) Program of the Japan Science and Technology Agency (JST) to SS and by the Ministry of Education, Culture, Sports, Science and Technology (MEXT) Brainware LSI Project to KM and SS.
Commercial relationships: none. 
Corresponding author: Kazumichi Matsumiya. 
Email: kmat@riec.tohoku.ac.jp. 
Address: Research Institute of Electrical Communication, Tohoku University, Aoba-ku, Sendai, Japan. 
References
Alais D., Verstraten F. A., Burr D. C. (2005). The motion aftereffect of transparent motion: Two temporal channels account for perceived direction. Vision Research, 45 (4), 403–412.
Anderson S. J., Burr D. C. (1985). Spatial and temporal selectivity of the human motion detection system. Vision Research, 25 (8), 1147–1154.
Barton J. J., Sharpe J. A., Raymond J. E. (1996). Directional defects in pursuit and motion perception in humans with unilateral cerebral lesions. Brain: A Journal of Neurology, 119 (5), 1535–1550.
Berezovskii V. K., Born R. T. (2000). Specificity of projections from wide-field and local motion-processing regions within the middle temporal visual area of the owl monkey. The Journal of Neuroscience, 20 (3), 1157–1169.
Beutter B. R., Stone L. S. (1998). Human motion perception and smooth eye movements show similar directional biases for elongated apertures. Vision Research, 38 (9), 1273–1286.
Beutter B. R., Stone L. S. (2000). Motion coherence affects human perception and pursuit similarly. Visual Neuroscience, 17 (1), 139–153.
Born R. T., Groh J. M., Zhao R., Lukasewycz S. J. (2000). Segregation of object and background motion in visual area MT: Effects of microstimulation on eye movements. Neuron, 26 (3), 725–734.
Born R. T., Tootell R. B. (1992). Segregation of global and local motion processing in primate middle temporal visual area. Nature, 357 (6378), 497–499.
Bostrom K. J., Warzecha A. K. (2010). Open-loop speed discrimination performance of ocular following response and perception. Vision Research, 50 (9), 870–882.
Braun D. I., Pracejus L., Gegenfurtner K. R. (2006). Motion aftereffect elicits smooth pursuit eye movements. Journal of Vision, 6 (7): 1, 671–684, doi:10.1167/6.7.1. [PubMed] [Article]
Cavanagh P., Favreau O. E. (1985). Color and luminance share a common motion pathway. Vision Research, 25 (11), 1595–1601.
Celebrini S., Newsome W. T. (1995). Microstimulation of extrastriate area MST influences performance on a direction discrimination task. Journal of Neurophysiology, 73 (2), 437–448.
Cropper S. J., Derrington A. M. (1996). Rapid colour-specific detection of motion in human vision. Nature, 379 (6560), 72–74.
Dursteler M. R., Wurtz R. H. (1988). Pursuit and optokinetic deficits following chemical lesions of cortical areas MT and MST. Journal of Neurophysiology, 60 (3), 940–965.
Eifuku S., Wurtz R. H. (1998). Response to motion in extrastriate area MSTl: Center-surround interactions. Journal of Neurophysiology, 80 (1), 282–296.
Fredericksen R. E., Hess R. F. (1998). Estimating multiple temporal mechanisms in human vision. Vision Research, 38 (7), 1023–1040.
Frost B. J., Nakayama K. (1983, May 13). Single visual neurons code opposing motion independent of direction. Science, 220 (4598), 744–745.
Gegenfurtner K. R., Hawken M. J. (1996). Interaction of motion and color in the visual pathways. Trends in Neurosciences, 19 (9), 394–401.
Gegenfurtner K. R., Xing D., Scott B. H., Hawken M. J. (2003). A comparison of pursuit eye movement and perceptual performance in speed discrimination. Journal of Vision, 3 (11): 19, 865–876, doi:10.1167/3.11.19. [PubMed] [Article]
Glasser D. M., Tadin D. (2014). Modularity in the motion system: Independent oculomotor and perceptual processing of brief moving stimuli. Journal of Vision, 14 (3): 28, 1–13, doi:10.1167/14.3.28. [PubMed] [Article]
Goodale M. A., Milner A. D. (1992). Separate visual pathways for perception and action. Trends in Neurosciences, 15 (1), 20–25.
Gur M., Kagan I., Snodderly D. M. (2005). Orientation and direction selectivity of neurons in V1 of alert monkeys: Functional relationships and laminar distributions. Cerebral Cortex, 15 (8), 1207–1221.
Hammett S. T., Smith A. T. (1992). Two temporal channels or three? A re-evaluation. Vision Research, 32 (2), 285–291.
Hawken M. J., Gegenfurtner K. R. (2001). Pursuit eye movements to second-order motion targets. Journal of the Optical Society of America A, 18 (9), 2282–2296.
Hawken M. J., Gegenfurtner K. R., Tang C. (1994). Contrast dependence of colour and luminance motion mechanisms in human vision. Nature, 367 (6460), 268–270.
Hess R. F., Snowden R. J. (1992). Temporal properties of human visual filters: Number, shapes and spatial covariation. Vision Research, 32 (1), 47–59.
Hirahara M. (2006). Reduction in the motion coherence threshold for the same direction as that perceived during adaptation. Vision Research, 46 (28), 4623–4633.
Ilg U. J. (1997). Slow eye movements. Progress in Neurobiology, 53 (3), 293–329.
Ilg U. J., Thier P. (2003). Visual tracking neurons in primate area MST are activated by smooth-pursuit eye movements of an “imaginary” target. Journal of Neurophysiology, 90 (3), 1489–1502.
Komatsu H., Wurtz R. H. (1988). Relation of cortical areas MT and MST to pursuit eye movements. I. Localization and visual properties of neurons. Journal of Neurophysiology, 60 (2), 580–603.
Komatsu H., Wurtz R. H. (1989). Modulation of pursuit eye movements by stimulation of cortical areas MT and MST. Journal of Neurophysiology, 62 (1), 31–47.
Kowler E., McKee S. P. (1987). Sensitivity of smooth eye movement to small differences in target velocity. Vision Research, 27 (6), 993–1015.
Krauzlis R. J., Stone L. S. (1999). Tracking with the mind's eye. Trends in Neurosciences, 22 (12), 544–550.
Krukowski A. E., Stone L. S. (2005). Expansion of direction space around the cardinal axes revealed by smooth pursuit eye movements. Neuron, 45 (2), 315–323.
Lisberger S. G., Morris E. J., Tychsen L. (1987). Visual motion processing and sensory-motor integration for smooth pursuit eye movements. Annual Review of Neuroscience, 10, 97–129.
Lisberger S. G., Movshon J. A. (1999). Visual motion analysis for pursuit eye movements in area MT of macaque monkeys. The Journal of Neuroscience, 19 (6), 2224–2246.
Lisberger S. G., Westbrook L. E. (1985). Properties of visual inputs that initiate horizontal smooth pursuit eye movements in monkeys. The Journal of Neuroscience, 5 (6), 1662–1673.
Lisi M., Cavanagh P. (2015, May). A dissociation of motion processing for saccades, smooth pursuit, and perception measured for the same target. Paper presented at the Vision Sciences Society, St. Pete Beach, FL, USA.
Livingstone M., Hubel D. (1988, May 6). Segregation of form, color, movement, and depth: Anatomy, physiology, and perception. Science, 240 (4853), 740–749.
Mack A., Fendrich R., Pleune J. (1979, Mar 30). Smooth pursuit eye movements: Is perceived motion necessary? Science, 203 (4387), 1361–1363.
Mack A., Fendrich R., Wong E. (1982). Is perceived motion a stimulus for smooth pursuit? Vision Research, 22 (1), 77–88.
Mack A., Goodwin J., Thordarsen H., Benjamin D., Palumbo D., Hill J. (1987). Motion aftereffects associated with pursuit eye movements. Vision Research, 27 (4), 529–536.
Masson G., Proteau L., Mestre D. R. (1995). Effects of stationary and moving textured backgrounds on the visuo-oculo-manual tracking in humans. Vision Research, 35 (6), 837–852.
McKeefry D. J., Laviers E. G., McGraw P. V. (2006). The segregation and integration of colour in motion processing revealed by motion after-effects. Proceedings of the Royal Society B, 273 (1582), 91–99.
Morrone M. C., Burr D. C., Vaina L. M. (1995). Two stages of visual processing for radial and circular motion. Nature, 376 (6540), 507–509.
Mullen K. T., Yoshizawa T., Baker C. L., Jr. (2003). Luminance mechanisms mediate the motion of red-green isoluminant gratings: The role of “temporal chromatic aberration.” Vision Research, 43 (11), 1235–1247.
Murakami I., Shimojo S. (1993). Motion capture changes to induced motion at higher luminance contrasts, smaller eccentricities, and larger inducer sizes. Vision Research, 33 (15), 2091–2107.
Newsome W. T., Wurtz R. H., Komatsu H. (1988). Relation of cortical areas MT and MST to pursuit eye movements. II. Differentiation of retinal from extraretinal inputs. Journal of Neurophysiology , 60 (2), 604–620.
Pantle A. (1974). Motion aftereffect magnitude as a measure of the spatio-temporal response properties of direction-sensitive analyzers. Vision Research, 14 (11), 1229–1236.
Pasternak T., Merigan W. H. (1994). Motion perception following lesions of the superior temporal sulcus in the monkey. Cerebral Cortex, 4 (3), 247–259.
Rasche C., Gegenfurtner K. R. (2009). Precision of speed discrimination and smooth pursuit eye movements. Vision Research, 49 (5), 514–523.
Rudolph K., Pasternak T. (1999). Transient and permanent deficits in motion perception after lesions of cortical areas MT and MST in the macaque monkey. Cerebral Cortex, 9 (1), 90–100.
Schutz A. C., Braun D. I., Gegenfurtner K. R. (2011). Eye movements and perception: A selective review. Journal of Vision, 11 (5): 9, 1–30, doi:10.1167/11.5.9. [PubMed] [Article]
Seidman S. H., Leigh R. J., Thomas C. W. (1992). Eye movements during motion after-effect. Vision Research, 32 (1), 167–171.
Shioiri S., Ito S., Sakurai K., Yaguchi H. (2002). Detection of relative and uniform motion. Journal of the Optical Society of America A, 19 (11), 2169–2179.
Shioiri S., Matsumiya K. (2009). Motion mechanisms with different spatiotemporal characteristics identified by an MAE technique with superimposed gratings. Journal of Vision, 9 (5): 30, 1–15, doi:10.1167/9.5.30. [PubMed] [Article]
Shioiri S., Ono H., Sato T. (2002). Adaptation to relative and uniform motion. Journal of the Optical Society of America A, 19 (8), 1465–1474.
Shioiri S., Yoshizawa M., Ogiya M., Matsumiya K., Yaguchi H. (2012). Low-level motion analysis of color and luminance for perception of 2D and 3D motion. Journal of Vision, 12 (6): 33, 1–13, doi:10.1167/12.6.33. [PubMed] [Article]
Simoncini C., Perrinet L. U., Montagnini A., Mamassian P., Masson G. S. (2012). More is not always better: Adaptive gain control explains dissociation between perception and action. Nature Neuroscience, 15 (11), 1596–1603.
Spering M., Carrasco M. (2012). Similar effects of feature-based attention on motion perception and pursuit eye movements at different levels of awareness. The Journal of Neuroscience, 32 (22), 7594–7601.
Spering M., Carrasco M. (2015). Acting without seeing: Eye movements reveal visual processing without awareness. Trends in Neurosciences, 38 (4), 247–258.
Spering M., Gegenfurtner K. R. (2007). Contrast and assimilation in motion perception and smooth pursuit eye movements. Journal of Neurophysiology, 98 (3), 1355–1363.
Spering M., Gegenfurtner K. R. (2008). Contextual effects on motion perception and smooth pursuit eye movements. Brain Research, 1225, 76–85.
Spering M., Montagnini A. (2011). Do we track what we see? Common versus independent processing for motion perception and smooth pursuit eye movement: A review. Vision Research, 51 (8), 836–852.
Spering M., Pomplun M., Carrasco M. (2011). Tracking without perceiving: A dissociation between eye movements and motion perception. Psychological Science, 22 (2), 216–225.
Steinbach M. J. (1976). Pursuing the perceptual rather than the retinal stimulus. Vision Research, 16 (12), 1371–1376.
Stone L. S., Krauzlis R. J. (2003). Shared motion signals for human perceptual decisions and oculomotor actions. Journal of Vision, 3 (11): 7, 725–736, doi:10.1167/3.11.7. [PubMed] [Article]
Tadin D., Lappin J. S., Gilroy L. A., Blake R. (2003). Perceptual consequences of centre-surround antagonism in visual motion processing. Nature, 424 (6946), 312–315.
Tavassoli A., Ringach D. L. (2010). When your eyes see more than you do. Current Biology, 20 (3), R93–R94.
Tychsen L., Lisberger S. G. (1986). Visual motion processing for the initiation of smooth-pursuit eye movements in humans. Journal of Neurophysiology, 56 (4), 953–968.
van der Smagt M. J., Verstraten F. A., van de Grind W. A. (1999). A new transparent motion aftereffect. Nature Neuroscience, 2 (7), 595–596.
Verstraten F. A., van der Smagt M. J., van de Grind W. A. (1998). Aftereffect of high-speed motion. Perception, 27 (9), 1055–1066.
Watamaniuk S. N., Heinen S. J. (2007). Storage of an oculomotor motion aftereffect. Vision Research, 47 (4), 466–473.
Williams D. W., Sekuler R. (1984). Coherent global motion percepts from stochastic local motions. Vision Research, 24 (1), 55–62.
Wyatt H. J., Pola J. (1979). The role of perceived motion in smooth pursuit eye movements. Vision Research, 19 (6), 613–618.
Yasui S., Young L. R. (1975, Nov 28). Perceived visual motion as effective stimulus to pursuit eye movement system. Science, 190 (4217), 906–908.
Yoshizawa T., Mullen K. T., Baker C. L., Jr. (2000). Absence of a chromatic linear motion mechanism in human vision. Vision Research, 40 (15), 1993–2010.
Yoshizawa T., Mullen K. T., Baker C. L., Jr. (2003). Failure of signed chromatic apparent motion with luminance masking. Vision Research, 43 (7), 751–759.
Footnotes
1  We think that the standard errors shown in Figures 6 and 7 ensure the reliability of the data even with six trials per contrast condition. In addition, all the observers who participated in our experiments were highly trained in psychophysical experiments in which eye movements are measured. Their fixation on the white spot during adaptation was very steady, and they rarely blinked during a trial. The rejection rate of eye-movement data was less than 8% on average across all experiments for the test phase.
Figure 1
 
Visual stimulus. The plaid stimulus was composed of two drifting gratings with different spatial frequencies (0.3 and 1.2 c/°), moving in the orthogonal directions of ±45° (orange arrows). The peak of the Gaussian function was set at a distance of 4° from the center of the display. The fixation point, a white central dot, was presented during the adaptation period.
Figure 2. Procedure. Observers adapted to the drifting gratings for 20 s while fixating on a white spot. Adaptation was followed by a 0.5-s blank. A test stimulus, either static or flickering at 4 Hz, was then presented for 2 s, followed by a probe (a bar) presented on the display. Observers adjusted the direction of the probe to indicate the direction of the motion aftereffect (MAE). Eye movements were recorded while the test stimulus was presented.
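The trial sequence described in the Figure 2 caption can be summarized as a simple timeline. This is only an illustrative sketch of the timing stated above; the phase names and the `timed_duration` helper are ours, not part of the authors' code:

```python
# Trial timeline from the Figure 2 caption (durations in seconds).
# The probe phase is observer-paced, so its duration is None.
TRIAL_PHASES = [
    ("adaptation", 20.0),  # drifting gratings; observer fixates a white spot
    ("blank", 0.5),
    ("test", 2.0),         # static or 4-Hz flicker; eye movements recorded
    ("probe", None),       # observer adjusts a bar to the MAE direction
]

def timed_duration(phases):
    """Total duration of the timed (non-observer-paced) phases."""
    return sum(d for _, d in phases if d is not None)
```

With the durations above, each trial thus takes 22.5 s plus the observer-paced probe adjustment.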
Figure 3. Analysis of eye-movement data. (a) An example trace of eye positions. The black line represents the eye position; the red line connects the initial and final eye positions. (b) Eye-movement distance as a function of time. (c) Eye velocity as a function of time. The dashed horizontal line represents the velocity threshold for pursuit onset.
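The analysis illustrated in Figure 3 (a velocity threshold defines pursuit onset) and the open-loop window used in Figures 6 and 7 (the 200 ms after onset) can be sketched roughly as follows. This is a minimal illustration under our own assumptions, not the authors' actual pipeline: the sampling rate, the velocity-threshold value, and the function name are all hypothetical.

```python
import numpy as np

def pursuit_onset_and_direction(x, y, fs=1000.0,
                                vel_threshold=1.0,
                                open_loop_ms=200):
    """Detect pursuit onset by a velocity threshold and measure the
    pursuit direction over the open-loop window.

    x, y          : eye-position samples in degrees
    fs            : sampling rate in Hz (assumed value)
    vel_threshold : onset threshold in deg/s (hypothetical value;
                    the dashed line in Figure 3c)
    Returns (onset_index, direction_deg), or (None, None) if the
    eye never exceeds the threshold.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Instantaneous 2-D eye velocity (deg/s) by finite differences.
    vx = np.gradient(x) * fs
    vy = np.gradient(y) * fs
    speed = np.hypot(vx, vy)
    # Pursuit onset: first sample where speed exceeds the threshold.
    above = np.flatnonzero(speed > vel_threshold)
    if above.size == 0:
        return None, None
    onset = above[0]
    # Open-loop phase: the 200 ms after onset (Figures 6 and 7).
    n = int(open_loop_ms / 1000.0 * fs)
    end = min(onset + n, len(x) - 1)
    # Direction of the straight line from onset to window end,
    # analogous to the red line in Figure 3a.
    direction = np.degrees(np.arctan2(y[end] - y[onset],
                                      x[end] - x[onset]))
    return onset, direction
```

For example, a trace that is stationary and then drifts at 45° would yield an onset near the start of the drift and a direction near 45°.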
Figure 4. Average eye-movement distance and velocity during fixation on the adaptation stimulus, as a function of grating contrast. Panels show eye-movement distance and eye velocity, respectively, for the static test (a–b) and the flicker test (c–d); N = 5. Error bars represent the standard error of the mean.
Figure 5. Average eye-movement traces for observers HN and KM for the static (a) and flicker (b) test stimuli. Orange arrows indicate the motion direction of the high- or low-spatial-frequency grating during the adaptation period. Eye position is plotted in two-dimensional space; the color of each curve indicates the contrast of the adaptation grating. Eye positions were analyzed over the 2-s test presentation after adaptation.
Figure 6. Results for the static test, based on eye-movement data from the open-loop phase (the 200-ms period after pursuit onset). The graphs show the directions of motion perception and pursuit eye movements as a function of the contrast, in threshold units, of the high-spatial-frequency (HSF) adaptation grating. Solid and open symbols represent the results for perception and pursuit, respectively. Data are given for each observer (a–e) and as means of all five observers (f). Error bars represent the standard error of the mean.
Figure 7. Results for the flicker test, based on eye-movement data from the open-loop phase (the 200-ms period after pursuit onset). The graphs show the directions of motion perception and pursuit eye movements as a function of the contrast, in threshold units, of the low-spatial-frequency (LSF) adaptation grating. Solid and open symbols represent the results for perception and eye movements, respectively. Data are given for each observer (a–e) and as means of all five observers (f). Error bars represent the standard error of the mean.
Figure 8. Mean directions of pursuit eye movements as a function of contrast in threshold units for the static (a, c) and flicker (b, d) tests, during the initial 200 ms, the later 200 ms, and the whole 2000-ms period. Solid, open-circle, open-square, and open-diamond symbols represent perception, eye movements during the initial period, eye movements during the later period, and eye movements during the whole period, respectively. (e–f) The same data plotted as a function of contrast ratio. N = 5; error bars represent the standard error of the mean.
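Figures 6–8 express the adapting gratings' contrasts in threshold units, and panels e–f of Figure 8 replot the same data against a contrast ratio. Assuming the standard convention that a threshold unit is physical contrast divided by the observer's detection threshold (the captions do not define it), the conversions are:

```python
def to_threshold_units(contrast, detection_threshold):
    """Contrast expressed as a multiple of the detection threshold
    (assumed convention; not stated in the captions)."""
    return contrast / detection_threshold

def contrast_ratio(contrast_hsf, contrast_lsf):
    """Ratio of the high- to low-spatial-frequency grating contrasts
    (our assumed definition of the ratio in Figure 8e-f)."""
    return contrast_hsf / contrast_lsf
```

For example, a grating at 20% contrast seen by an observer with a 5% detection threshold would be at 4 threshold units.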