Sensory integration of movements and their visual effects is not enhanced by spatial proximity
Author Affiliations
  • Nienke B. Debats
    Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
    Cognitive Interaction Technology Center of Excellence (CITEC), Universität Bielefeld, Bielefeld, Germany
    nienke.debats@uni-bielefeld.de
  • Herbert Heuer
    Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
    Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
    herbert.heuer@uni-bielefeld.de
Journal of Vision October 2018, Vol. 18(11), 15. doi:https://doi.org/10.1167/18.11.15
Abstract

Spatial proximity enhances the sensory integration of exafferent position information, likely because it indicates whether the information comes from a single physical source. Does spatial proximity also affect the integration of position information regarding an action (here a hand movement) with that of its visual effect (here a cursor motion), that is, when the sensory information comes from physically distinct objects? In this study, participants made out-and-back hand movements whereby the outward movements were accompanied by corresponding cursor motions on a monitor. Their subsequent judgments of hand or cursor movement endpoints are typically biased toward each other, consistent with an underlying optimal integration mechanism. To study the effect of spatial proximity, we presented the hand and cursor either in orthogonal planes (horizontal and frontal, respectively) or we aligned them in the horizontal plane. We did not find the expected enhanced integration strength in the latter spatial condition. As a secondary question we asked whether spatial transformations required for the position judgments (i.e., horizontal to frontal or vice versa) could be the origin of previously observed suboptimal variances of the integrated hand and cursor position judgments. We found, however, that the suboptimality persisted when spatial transformations were omitted (i.e., with the hand and cursor in the same plane). Our findings thus clearly show that the integration of actions with their visual effects is, at least for cursor control, independent of spatial proximity.

Introduction
Humans can easily control the position of a cursor on a computer monitor by moving a computer mouse or a finger on a trackpad. Successful cursor control requires that the brain effectively detects which visual input is reafferent, that is, relates to the executed hand movement. According to recent evidence on “visuomotor binding,” visual information related to action effects such as cursor motions is processed differently than exafferent visual information that is unrelated to hand movements (Reichenbach & Diedrichsen, 2015; Reichenbach, Franklin, Zatka-Haas, & Diedrichsen, 2014). Other studies have shown that the functional link between hand and cursor movements comes with systematic perceptual biases of the perceived hand position toward the cursor position, and of the perceived cursor position toward the hand position (e.g., Kirsch, Pfister, & Kunde, 2016; Ladwig, Sutter, & Müsseler, 2012, 2013; Rand & Heuer, 2013, 2016). These biases scale with the relative reliabilities of the unimodal hand and cursor position estimates, consistent with the reliability-based weighting mechanism of optimal multisensory integration (Debats, Ernst, & Heuer, 2017b). However, the integration here is partial rather than complete, and the strength of the hand-cursor integration was found to decline when the cursor trajectory was manipulated such that the otherwise perfect correlation of hand and cursor trajectories was reduced (Debats, Ernst, & Heuer, 2017a). Together, these findings suggest that in cursor control (and possibly in tool use in general) the brain establishes a link between sensory information regarding the action and its visual reafference based on the mechanisms of optimal sensory integration. The “relatedness evidence” driving this sensorimotor link seems to be based on the hand-cursor kinematic cross-correlations, possibly in combination with other factors. The effect of one of these factors, the spatial proximity of hand and cursor, is examined in the present experiments.
A rather unique feature of hand-cursor integration is that it occurs despite a considerable spatial separation between the hand (moving in the horizontal plane) and the cursor (moving in the frontal plane). Elsewhere, spatial separation has been identified as a factor that disrupts sensory integration (e.g., Gepshtein, Burge, Ernst, & Banks, 2005; Slutsky & Recanzone, 2001; Spence, 2013). These studies typically used tasks with only exafferent sensory signals. For example, participants judged the features of an object (e.g., its size based on visual and haptic information) or event (e.g., its location based on auditory and visual information). In such tasks, spatial proximity can be considered a strong indicator that the sensory signals arise from a single source and should therefore be integrated (for reviews, see e.g., Shams & Beierholm, 2010; van Dam, Parise, & Ernst, 2014). If the brain is provided with a plausible explanation for the spatial separation, such as a virtual tool bridging the distance, integration strength is restored (Helbig & Ernst, 2007; Takahashi, Diedrichsen, & Watt, 2009; Takahashi & Watt, 2014). In cursor control, however, the spatial separation of the action and its visual reafference is an inherent feature, and there is no visible link between them such as a mechanical—real or virtual—tool. Nevertheless, the brain might take the spatial proximity of hand and cursor into account and not rely only on the kinematic cross-correlation as a basis for the visuomotor link.
The primary purpose of the present experiments was to test the impact of spatial proximity, in addition to that of cross-correlations, on the strength of sensory integration in cursor control. We compared perceptual judgments in a cursor-control task (cf. Debats et al., 2017b, 2017a) between two spatial conditions that were tested in a single experimental setup (see Figure 1a). In the first condition, hand movements and cursor motions were in orthogonal planes, with the hand movements in the horizontal plane and the corresponding cursor motions in the frontal plane at some distance (condition OrthogPlane). This spatial arrangement is typical in everyday computer use. In the second condition, hand movements and cursor motions were in the same horizontal plane (condition SamePlane). Participants looked downward into a mirror where they saw a virtual image—projected from the monitor above the mirror—aligned with the hand movement plane. This spatial arrangement is frequently used in experimental setups (e.g., Bock, Schneider, & Bloomberg, 2001; Bock & Thomas, 2011; Synofzik, Thier, & Lindner, 2006; van Beers, Sittig, & Denier van der Gon, 1996). 
Figure 1
 
Setup, task, and transformations. (a) The experimental setup, in which participants made hand movements on the semicircular workspace on the horizontal digitizer tablet. The corresponding cursor motion could either be presented on the monitor in the frontal plane (OrthogPlane), or the cursor could appear on the horizontal monitor such that it was reflected by the mirror just below the chin rest to create a virtual image of the cursor in the exact plane of the hand movements (SamePlane). (b) The cursor was visible during the outward trajectory and disappeared once the hand reached the workspace boundary. This movement endpoint was later—after the return movement—to be judged by the participants. (c) The direction of the cursor motion deviated slightly from the direction of the hand movement, with eight levels of visuomotor rotation. This discrepancy was needed to observe the degree to which judgments of hand position were biased toward the cursor position, and vice versa. (d) The position judgments were provided either by placing a cursor on the remembered hand or cursor movement endpoint (upper panel, Cursor response task), where the cursor position along the semicircular path was controlled by small lateral movements of the stylus (inset in upper panel), or by placing the hand on the remembered hand or cursor movement endpoint (lower panel, Hand response task) without visual feedback (inset in lower panel). (e) When the hand and cursor appeared in separate planes of motion (condition OrthogPlane), the judgments made with the Cursor response task and the Hand response task involved either no transformations or both a modality and a spatial transformation (i.e., horizontal to frontal plane of motion, or vice versa). When the hand and cursor appeared in the same plane of motion (condition SamePlane—not illustrated here), the spatial transformation was no longer involved.
In both spatial conditions, participants made out-and-back movements from the center of a semicircular workspace to its boundary and back to the remembered start location. The cursor was shown during the outward movements only (see Figure 1b), and its directions deviated slightly and unpredictably from the directions of the hand movements (see Figure 1c). After each out-and-back movement in these bimodal trials (i.e., trials with concurrent hand movements and cursor motions) the participants judged either the cursor's (BiCursor trials) or the hand's (BiHand trials) position at the end of the outward movement. The discrepancy between the cursor and hand positions at the end of the outward movements allowed us to assess the mutual biases of the position judgments and the strength of the integration as the sum of these biases. Additionally, we ran unimodal trials in which there was only a cursor motion or a hand movement. In UniCursor trials an outward cursor motion was shown without accompanying hand movement, followed by the judgment of cursor endpoint after a delay corresponding to the duration of the backward hand movements in other trials. In UniHand trials the outward hand movement was made without accompanying cursor motion, followed by the judgment of hand endpoint after the backward movement to the remembered start position. These unimodal trials allowed us to derive optimal integration model predictions of the biases and judgment variability in bimodal trials. 
The second aim of the current study was to examine how transformation noise, which is associated with spatial separation, affects the variability of the biased position judgments. In a previous study using the OrthogPlane condition, we varied the way in which participants reported the remembered hand or cursor movement endpoints (Debats et al., 2017b). When participants reported these positions by means of placing a cursor on the monitor (top panel of Figure 1d), the judgments of hand position required both a spatial transformation (horizontal to frontal) and a modality transformation (proprioceptive to visual; see Figure 1e). When participants reported the remembered movement endpoints by placing their hand at the corresponding position on the tablet (lower panel of Figure 1d), the judgments of cursor position required such transformations (see Figure 1e). The noise that comes with these transformations clearly affected the variability of the hand and cursor position estimates in unimodal trials (UniHand and UniCursor trials) and, consistent with a reliability-based weighting mechanism, the relative biases of the position judgments in bimodal trials. Importantly, we also observed consistently more variable position judgments in bimodal trials than predicted for optimal partial integration (Debats et al., 2017b, 2017a; Debats & Heuer, 2018), with predictions derived from the Coupling Prior model (Debats et al., 2017b; Ernst, 2007, 2012), which allows for partial integration strengths. In particular, the variabilities in bimodal trials were in between those of the unimodal trials instead of being consistently lower than in the corresponding unimodal trials. This indicates a benefit of integration only for the bimodal estimates that correspond to the most variable unimodal estimates, yet a disadvantage of the integration for the bimodal estimates corresponding to the least variable unimodal estimates. Here we ask whether the suboptimality of the bimodal variability is related to the spatial transformation noise.
The current study thus addressed the role of spatial proximity for the strength of hand-cursor sensory integration as well as the role of spatial transformations for the previously observed suboptimal variances of biased hand and cursor position estimates. We used two main experimental manipulations: (a) In two experiments we varied the spatial separation of the hand and cursor movements by presenting them either in orthogonal planes (OrthogPlane) or in the same plane (SamePlane). We hypothesized that integration is strengthened by hand-cursor spatial proximity—in addition to integration strength based on kinematic cross-correlations—and thus expected stronger integration in condition SamePlane. (b) Across the two experiments we varied how the position judgments were given, that is, by placing either a cursor or the hand at the remembered movement endpoints (Cursor response task and Hand response task, respectively). If the previously observed suboptimal variability were due to the spatial transformation, as we hypothesized, then behavior should be optimal (or at least closer to optimal) in condition SamePlane.
Methods
Participants
There were 14 right-handed participants (aged 20 to 28 years; three male, 11 female) in Experiment 1 and 13 right-handed participants (aged 20 to 26 years; one male, 12 female) in Experiment 2. They gave written informed consent and were compensated with a payment of €6 per hour. The experiments were conducted in accordance with the Declaration of Helsinki and approved by the Bielefeld University local ethics committee.
Apparatus
The apparatus is shown in Figure 1a. Participants sat at a table with a digitizer tablet on top of it (Wacom Intuos4 XL; 48.8 by 30.5 cm). Supported by a chinrest, they kept a 60-cm viewing distance from the frontal screen of a computer monitor (Samsung MD230; 23 in.; 50.9 by 28.6 cm). A second monitor of the same type faced downward with a horizontal screen. Participants saw it in a horizontal half-silvered mirror placed halfway between the monitor screen and the digitizer tablet, at a 30-cm distance from each. The mirror was approximately 15 cm below the eyes of the participants. The virtual image of objects presented on the horizontal monitor thus appeared in the plane of the digitizer tablet. The mirror was covered when the frontal monitor was used. The mirror or the cover fully occluded the participants' arms and hands.
Participants held the digitizer stylus in their right hand. When required, they pressed a button on the stylus with their thumb or index finger. A semicircular workspace of 15-cm radius on the tablet was bordered by a 5-mm thick PVC template, which established a mechanical stop for the outward movements. It is referred to as the “stopper ring” (see dotted line in Figure 1b). The position of the stylus was recorded (sampling frequency: 60 samples/s; spatial resolution: 0.01 mm) and mapped online to the position of a cursor on one of the two monitors, using MATLAB (MathWorks, Natick, MA) with the Psychophysics Toolbox extension (Kleiner, Brainard, & Pelli, 2007). For the horizontal monitor the virtual image of the cursor appeared at the same location as the tip of the stylus as long as no visuomotor rotations were introduced. All images and text were presented light gray on a black background with only one exception described in section Detailed procedure
Movement task and visuomotor rotations
Participants performed outward movements on the digitizer that started at the center of the semicircular workspace and ended at the stopper ring, immediately followed by backward movements to the remembered start position. The cursor (a 6-mm diameter filled circle) was visible during the outward movement only. To prevent stereotyped movements, we defined eight ranges of movement directions with centers between −56° and +56° relative to straight-ahead in steps of 16°. In each trial we instructed participants to move into one of these eight ranges by means of a symbol shown prior to movement onset (see Detailed procedure for details). 
In all bimodal trials, the direction of the cursor motion was rotated relative to the direction of the hand movement. These visuomotor rotations varied randomly across trials between −17.5° and +17.5° in steps of 5°, with a mean of 0° (see Figure 1c). Visuomotor rotations were essential for assessing the biases in the position judgments. For participants, such small rotations generally remain unnoticed (Müsseler & Sutter, 2009; Rand & Heuer, 2013; Sülzenbrück & Heuer, 2009). 
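For concreteness, the mapping from hand position to displayed cursor position under such a rotation can be expressed in a few lines. The following Python/NumPy sketch is our illustration, not the authors' experiment code; the function name, the workspace coordinates, and the counterclockwise-positive sign convention are assumptions.

```python
import numpy as np

# Rotation levels used in bimodal trials: -17.5 to +17.5 deg in 5-deg steps (mean 0).
ROTATIONS_DEG = np.arange(-17.5, 17.6, 5.0)

def cursor_position(hand_xy, center_xy, rotation_deg):
    """Rotate the hand position about the workspace center to obtain the
    displayed cursor position (counterclockwise positive, an assumed convention)."""
    theta = np.radians(rotation_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return center_xy + rot @ (hand_xy - center_xy)

# Example: a hand position 15 cm straight ahead of the center, rotated by +5 deg.
print(cursor_position(np.array([0.0, 15.0]), np.array([0.0, 0.0]), 5.0))
```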
Spatial conditions
In condition OrthogPlane, the cursor was shown on the monitor in the frontal plane, and thus spatially separated from the horizontal plane in which the hand movements were made. In condition SamePlane the cursor was presented on the horizontal monitor screen, with its virtual image appearing in the horizontal plane of the hand movements. Otherwise these two conditions were identical. 
Response tasks
After the end of each out-and-back movement, participants judged the hand or cursor position at the end of the outward movement. The word “CURSOR” or “HAND” was shown on the monitor, instructing participants to report the remembered cursor movement endpoint or hand movement endpoint, respectively. In the Cursor response task, judgments were provided by matching the position of a visual cursor (6-mm diameter filled circle) to the remembered position of the cursor or hand at the end of the outward movement. The cursor initially appeared at the far left or right end of an invisible track that corresponded to all possible cursor endpoints (see dotted line in Figure 1d). Participants made small left/right movements with the stylus (< 1 cm) to control the movement speed of the cursor along this track. Once the cursor was in the desired position, participants pressed the stylus' button to confirm their judgment. In the Hand response task, judgments were provided by matching the felt hand position to the remembered position of the cursor or hand. An arrow appeared on the monitor instructing participants to move their hand to the far left or far right corner of the stopper ring. They followed the stopper ring and pressed the stylus' button when their hand was in the desired position. Note that in both response tasks the movements of the visual cursor or stylus differed from the outward movements; only the end positions of the outward movements were shared with the movements of cursor or hand during the response tasks. The Cursor response task was tested in Experiment 1, the Hand response task in Experiment 2. Neither task imposed time constraints.
Trial types
For each combination of spatial conditions and response tasks, there were four different types of trial. The first two types were bimodal in that the outward hand movement was accompanied by a (randomly rotated) visual cursor motion. In the first type, participants judged the hand's movement endpoint (BiHand trials). In the second type, they judged the cursor's movement endpoint (BiCursor trials). The third and fourth types of trial were unimodal. In the third type the out-and-back hand movement was performed without the cursor being shown. In the response task participants judged the hand's movement endpoint (UniHand trials). In the fourth type of trial the cursor motion was presented without any hand movement being made (the hand remained in the start position). The cursor motion presented was the trajectory recorded in a preceding BiCursor trial. In the response task participants judged the cursor's movement endpoint (UniCursor trials). 
Design
At the start of the experiments, participants were instructed on how to perform the task by means of up to 28 familiarization trials during which verbal instructions were given. They were not informed about the presence of the visuomotor rotation. Instead, they were told that the experiment was concerned with how well the brain is able to keep information obtained via the eyes separate from information obtained via the hand. We emphasized that it was therefore critical for them to pay attention to the word CURSOR or HAND displayed on the monitor, and to focus accordingly on either what they saw or what they felt when reporting the position judgments.
We conducted two experiments, one for each response task. For each response task we tested each of the two spatial conditions in a separate experimental session. Their order was balanced across participants. Each experimental session comprised all combinations of four trial types and eight visuomotor rotations, each combination being presented 10 times. This resulted in a total of 320 trials. The order of the 32 trials per repetition set was randomized with the constraint that each UniCursor trial occurred later in the sequence than the corresponding BiCursor trial, so that the cursor trajectory recorded in a BiCursor trial could be presented in the corresponding UniCursor trial. For each repetition set of 32 trials, each of the eight ranges of instructed movement directions was randomly combined with one of the eight visuomotor rotations. Hence there was no systematic relation between the direction of movement and the visuomotor rotation. The 320 recorded trials per session were organized in six blocks with short breaks in-between. One session took about 2.5 hours to complete. 
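One simple way to implement the constraint that each UniCursor trial follows its corresponding BiCursor trial is sketched below in Python. This is our illustration, not the authors' code, and it assumes that the BiCursor-UniCursor pairing within a repetition set is by visuomotor rotation.

```python
import random

TRIAL_TYPES = ("BiHand", "BiCursor", "UniHand", "UniCursor")

def make_repetition_set(n_rotations=8):
    """One repetition set of 32 trials (4 types x 8 rotations), shuffled so
    that every UniCursor trial comes later than the BiCursor trial whose
    cursor trajectory it replays."""
    trials = [(ttype, r) for r in range(n_rotations) for ttype in TRIAL_TYPES]
    random.shuffle(trials)
    for r in range(n_rotations):
        i = trials.index(("BiCursor", r))
        j = trials.index(("UniCursor", r))
        if j < i:  # UniCursor would come first: swap the pair's positions
            trials[i], trials[j] = trials[j], trials[i]
    return trials
```

Swapping only the two members of a pair leaves all other trials in place, so earlier pairs remain correctly ordered.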
Detailed procedure
Figure 2 illustrates the sequence of events in each trial. For reasons of clarity, not all items are shown at the correct scale. At the start of each trial an arrow was presented on the monitor that guided the participant to a randomly chosen initial position within a rectangular area of 1-cm height and 4-cm width, with its midpoint 1.5 cm below the center of the semicircular workspace (Figure 2a). The arrow disappeared when the stylus held by the participant was in the initial position (with a 4-mm margin) for 1 s. Next, we instructed the range of movement directions of the outward movement for the current trial by displaying a WiFi-like symbol (three circular arcs of 25° width at 18-, 24-, and 30-mm radial distance from the center position) for 1 s (Figure 2b). The color of the symbol indicated the type of trial: It was red for UniCursor trials (signaling participants not to move their hand), green for UniHand trials, and light gray for bimodal trials. Directly after the WiFi symbol disappeared, the center position was shown (a 7-mm diameter outline circle) as well as visual feedback of the hand position (a 6-mm filled circle; Figure 2c). This allowed participants to reach the center position swiftly. After maintaining this position for 250 ms (with a 2.5-mm margin), the center position circle disappeared and an auditory beep was presented as a cue to begin the outward movement to the stopper ring. If the cursor was shown (all trial types except UniHand), it was visible until 97% of the 15-cm distance between the start position and the stopper ring was covered. Its direction was rotated according to the magnitude of the visuomotor rotation selected for each specific trial (Figure 2d). The 97% criterion was required to detect all movement endpoints because, due to the thickness of the stylus and depending on its exact orientation in the hand, a slightly variable radial distance between 97% and 100% was reached. Immediately after hitting the stopper ring, participants moved the stylus back to the remembered start position of the outward movement without visual feedback (Figure 2e). They pressed the button on the stylus, which triggered an auditory beep, to confirm being back. Immediately thereafter the word CURSOR or HAND was presented on the monitor to instruct participants to judge the remembered cursor movement endpoint or hand movement endpoint, respectively (Figure 2f). After 500 ms, the appearance of the response cursor (Cursor response task) or the arrow (Hand response task) signaled that participants could report their position judgment.
Figure 2
 
Illustration of the detailed procedure of the bimodal trials. The tablet and monitor workspaces are presented overlapping here; white items indicate images shown on the monitor, and black items indicate positions that were defined in both reference frames. Further explanation is provided in the main text under “Detailed procedure.”
Data preprocessing
Both the cursor and hand movements were recorded. Our main interest was in the physical and judged positions of cursor and hand at the end of the outward movements. Their Cartesian coordinates were converted into angles of a polar coordinate system with the origin in the center position, which was the start position of the outward movements. These angles captured all variations of physical and judged end positions because their distance from the start position was kept constant throughout both experiments (by means of the stopper ring for physical end positions, and by means of having participants choose from positions along the stopper ring for judged end positions). 
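As an illustration of this conversion, here is a minimal Python/NumPy sketch (not the authors' analysis code; the zero-angle and sign conventions are assumed):

```python
import numpy as np

def endpoint_angle_deg(endpoint_xy, center_xy):
    """Polar angle of a movement endpoint around the workspace center.
    Assumed convention: 0 deg = straight ahead, positive = rightward."""
    dx = endpoint_xy[0] - center_xy[0]
    dy = endpoint_xy[1] - center_xy[1]
    return np.degrees(np.arctan2(dx, dy))
```

Because the radial distance is fixed by the stopper ring, this single angle fully describes each physical or judged end position.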
Data were screened for outliers defined by three criteria: (a) the direction of the outward movement in a trial deviated more than 35° from the center of the range of instructed movement directions, (b) the absolute angular deviation between the physical and judged positions of cursor or hand was larger than 35°, which is twice the maximal visuomotor rotation, and (c) the hand or cursor moved more than 2.5° to the left or right after reaching the stopper ring (or the equivalent distance on the monitor). 
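The three criteria translate directly into a boolean exclusion mask; the sketch below (Python/NumPy, our illustration with hypothetical array names) marks the trials to discard:

```python
import numpy as np

def exclusion_mask(move_dir, instructed_center, judged_dir, physical_dir, ring_drift):
    """Trials to exclude under the three outlier criteria (all in degrees)."""
    off_target  = np.abs(move_dir - instructed_center) > 35.0  # criterion (a)
    implausible = np.abs(judged_dir - physical_dir) > 35.0     # criterion (b): 2x max rotation
    slipped     = np.abs(ring_drift) > 2.5                     # criterion (c): drift after the stop
    return off_target | implausible | slipped
```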
After removal of outliers, judgments were corrected for hysteresis effects. In the response tasks, motions of the visual cursor (Experiment 1) or movements of the hand (Experiment 2) started either at the far left or right of the semicircular track or the stopper ring, respectively. Systematically different judgments for the different start positions could inflate the observed variances of the judgments. To remove such differences, for each experimental session (320 trials) the regression of the judged direction on the physical direction was computed with the constraint of a single slope, but two different intercepts for the right and left start positions of cursor motions or hand movements. The judged directions were corrected for the difference between the intercepts by adding or subtracting half the difference, respectively.
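A minimal sketch of this correction (Python/NumPy; our illustration, not the authors' code): fit a common slope with side-specific intercepts, then shift each judgment by half the intercept difference toward the common line.

```python
import numpy as np

def correct_hysteresis(judged, physical, from_right):
    """Regress judged on physical direction with one slope but separate
    intercepts for right-start (from_right = 1) and left-start (0) responses,
    then remove half the intercept difference from the judgments."""
    side = from_right.astype(float)
    X = np.column_stack([physical, side, np.ones_like(physical)])
    coef, *_ = np.linalg.lstsq(X, judged, rcond=None)
    half_diff = coef[1] / 2.0  # coef[1] = intercept difference (right - left)
    return judged - half_diff * (2.0 * side - 1.0)
```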
Dependent variables
We compared the two spatial conditions of both experiments with respect to integration strength, variability of the unimodal cursor and hand position judgments, and biases and variability of the bimodal cursor and hand position judgments. The bimodal biases and variabilities were compared with model predictions for partial optimal integration. Model predictions were derived from the Coupling Prior model proposed by Ernst (2006, 2007, 2012), for which the equations as they are provided here were derived by Debats and colleagues (2017b). The model captures the whole continuum of integration strengths, ranging from no integration (i.e., sensory independence) to complete integration (i.e., sensory fusion). It predicts the optimal weights (and thus the biases) and the associated variabilities, based on the integration strength and the variabilities of the position judgments in unimodal trials. An illustration of partial integration is provided in Figure 3a
Figure 3
 
Partial integration and regression analysis. (a) According to the so-called Coupling Prior model, partial integration fills the continuum between no integration (weights are zero) and complete fusion (the weights add up to one). The weights can be assessed experimentally through the relative judgment biases. For partial integration, the weights (and thus the relative biases) add up to a value between zero and one, with their sum indicating the integration strength. This illustration shows the predicted optimal biases and variability for an integration strength of 0.75; the unimodal variability is consistent with the average values for the OrthogPlane condition in Experiment 1. The dotted line indicates the optimal integration prediction for fusion (i.e., an integration strength of one). (b) The relative biases were assessed using regression analysis, specifically of the deviation between the true and judged movement endpoints on the deviation between the true hand and cursor endpoints. This is illustrated here for the BiHand judgments of two exemplary participants in the OrthogPlane condition of Experiment 2. If the position judgments had been scattered around the horizontal line, this would have indicated that the participants' hand position judgments were correctly centered on the true hand position. If they had been scattered around the diagonal line, this would have indicated that their judgments were located around the true cursor position. The left and right panels illustrate one participant with low and one with high judgment variability, respectively.
For each of the bimodal trial types we determined the angular deviation between the judged and the physical hand or cursor positions (for BiHand and BiCursor trials, respectively). We then regressed, for each trial type and spatial condition separately, the angular deviations on the visuomotor rotations (see Figure 3b). The slopes of these regressions indicate the proportional biases (i.e., the attraction of the judgments towards the other modality as proportions of the visuomotor rotation). The proportional biases or slopes correspond to weights assigned to hand-position and cursor-position estimates in sensory integration (see Figure 3a). The slope for BiHand trials corresponds to the weight wC_obs given to the cursor's end position, with wC_obs = 0 indicating that the judged hand positions match the physical hand positions and wC_obs = 1 indicating that they match the physical cursor positions. The slope for BiCursor trials corresponds to the weight wH_obs given to the hand's end position, with wH_obs = 0 indicating that the judged cursor positions match the physical cursor positions and wH_obs = 1 indicating that they match the physical hand positions. We used linear regressions because we observed no apparent dependency of the integration strength on the magnitude of the visuomotor rotations (see Figure 3c). 
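To make this concrete, here is a minimal Python/NumPy sketch of the regression (our illustration; the function name is hypothetical): the fitted slope is the observed weight, and the residual variance yields the judgment variability used below.

```python
import numpy as np

def weight_and_variance(judged_deg, physical_deg, rotation_deg):
    """Regress the angular judgment error on the visuomotor rotation.
    Returns the slope (observed weight given to the other modality) and
    the residual variance (judgment variability)."""
    error = judged_deg - physical_deg
    X = np.column_stack([rotation_deg, np.ones_like(rotation_deg)])
    coef, *_ = np.linalg.lstsq(X, error, rcond=None)
    residuals = error - X @ coef
    return coef[0], residuals.var(ddof=2)  # ddof=2: slope and intercept estimated
```

Applied to BiHand trials this slope estimates wC_obs, applied to BiCursor trials it estimates wH_obs, and their sum gives the integration strength λobs defined below.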
The intercepts of the regressions capture systematic judgment errors, which are unrelated to the strength of the sensory integration. They were generally small: For the participants of Experiment 1, averaged across BiCursor and BiHand trials, they ranged between −1.13° and 6.59° (mean OrthogPlane 1.67° ± 0.31°, and SamePlane 1.83° ± 0.54°) and for the participants of Experiment 2 they ranged between −7.77° and 7.75° (mean OrthogPlane −2.08° ± 0.95°, and SamePlane −2.21° ± 1.19°). These systematic errors are not of interest for the purpose of the present study and therefore not further analyzed. 
The variability of the position judgments was computed from the same regression analyses as the variance of the residuals. For this reason, the regressions were also computed for the unimodal trial types, with the visuomotor rotation as a dummy variable. We thus computed the judgment variances for each trial type: UniHand (\(\sigma_{\rm H\_obs}^2\)), UniCursor (\(\sigma_{\rm C\_obs}^2\)), BiHand (\(\sigma_{\rm H.C\_obs}^2\)), and BiCursor (\(\sigma_{\rm C.H\_obs}^2\)).
The integration strength λobs was computed as the sum of the observed weights: λobs = wH_obs + wC_obs. An integration strength of 1 indicates full sensory integration (i.e., hand and cursor are judged to be in the same position), an integration strength of 0 indicates independence (i.e., hand and cursor positions are not biased toward each other), and an integration strength between 0 and 1 indicates partial sensory integration. It should be noted that numerically the integration strength was not constrained to range between 0 and 1 because we determined wH_obs and wC_obs independently. 
Last, we derived model predictions for the biases and the variabilities of the position judgments in bimodal trials. Input to the model consisted of the observed variances in unimodal trials and the observed integration strength. In the model equations, the integration strength is converted into a parameter that is called coupling prior variance:  
\begin{equation}\tag{1}\sigma_{\rm prior}^{2} = \frac{1-\lambda_{\rm obs}}{\lambda_{\rm obs}}\left(\sigma_{\rm C\_obs}^{2} + \sigma_{\rm H\_obs}^{2}\right)\end{equation}
This parameter is essential for the variation of coupling strength between full integration (\(\sigma_{\rm prior}^2 = 0\)) and independence (\(\sigma_{\rm prior}^2 = \infty\)). It can be thought of as a measure of the uncertainty about whether the sensory signals arise from a single source and should therefore be integrated. The predicted optimal bimodal weights are computed as
\begin{equation}\tag{2}w_{\rm H\_pred} = \frac{\sigma_{\rm C\_obs}^{2}}{\sigma_{\rm C\_obs}^{2} + \sigma_{\rm H\_obs}^{2} + \sigma_{\rm prior}^{2}},\quad w_{\rm C\_pred} = \frac{\sigma_{\rm H\_obs}^{2}}{\sigma_{\rm C\_obs}^{2} + \sigma_{\rm H\_obs}^{2} + \sigma_{\rm prior}^{2}}\end{equation}
The predicted optimal variances for the bimodal position judgments are computed as  
\begin{equation}\tag{3}\sigma_{\rm C.H\_pred}^{2} = \frac{\sigma_{\rm C\_obs}^{2}\left(\sigma_{\rm H\_obs}^{2} + \sigma_{\rm prior}^{2}\right)}{\sigma_{\rm C\_obs}^{2} + \sigma_{\rm H\_obs}^{2} + \sigma_{\rm prior}^{2}},\quad \sigma_{\rm H.C\_pred}^{2} = \frac{\sigma_{\rm H\_obs}^{2}\left(\sigma_{\rm C\_obs}^{2} + \sigma_{\rm prior}^{2}\right)}{\sigma_{\rm C\_obs}^{2} + \sigma_{\rm H\_obs}^{2} + \sigma_{\rm prior}^{2}}\end{equation}
Note that Equations 2 and 3 are identical to the standard equations for full integration (e.g., Ernst & Banks, 2002) except for the variance of the coupling prior, \(\sigma_{\rm prior}^2\). The model is hence an extension of the full integration model (i.e., \(\sigma_{\rm prior}^2 = 0\)) in that it also captures partial integration.
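Combining Equations 1 through 3 takes only a few lines; the following Python/NumPy sketch (our illustration, not the authors' code; variable names are ours) derives the predicted weights and bimodal variances from the observed unimodal variances and the observed integration strength.

```python
import numpy as np

def coupling_prior_predictions(var_h, var_c, lam):
    """Predicted optimal weights (Equation 2) and bimodal variances
    (Equation 3), via the coupling prior variance (Equation 1).
    var_h, var_c: observed unimodal variances; lam: observed integration strength."""
    var_prior = (1.0 - lam) / lam * (var_c + var_h)      # Equation 1
    denom = var_c + var_h + var_prior
    w_h_pred = var_c / denom                             # Equation 2
    w_c_pred = var_h / denom
    var_ch_pred = var_c * (var_h + var_prior) / denom    # Equation 3 (BiCursor)
    var_hc_pred = var_h * (var_c + var_prior) / denom    # Equation 3 (BiHand)
    return w_h_pred, w_c_pred, var_ch_pred, var_hc_pred

# Round-number check: equal unimodal variances and lam = 0.75 give
# w_h_pred = w_c_pred = 0.375, so the predicted weights sum to lam.
print(coupling_prior_predictions(4.0, 4.0, 0.75))
```

By Equation 1, the predicted weights always sum to the observed integration strength, so the model constrains only their relative magnitudes; this property is used in the Results.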
Statistical analysis
We excluded three participants from all analyses. In Experiment 1, one participant had more than 25% of the BiCursor trials in the SamePlane condition identified as outliers, that is, 21 out of the 80 trials (8 visuomotor rotations × 10 repetitions). For the other twelve participants, between 0 and 6 trials out of the 80 trials per trial type were identified as outliers and excluded (mean 0.84 ± 0.17 trials). In Experiment 2, one participant showed no integration as indicated by slightly negative integration strengths (−0.16 and −0.29 for the OrthogPlane and SamePlane conditions, respectively). Another participant did not follow the instruction to return to the start position upon reaching the movement endpoint and instead rested his/her hand at the movement endpoint (1,763 ms averaged over the four trial types in contrast to 476 ± 107 ms for the other participants). For the remaining eleven participants, between 0 and 15 trials were excluded (mean 3.08 ± 0.50 trials). 
All comparisons of the dependent measures between the experimental conditions and between observed and predicted measures were done by means of t tests and two-way ANOVAs for repeated measurements. In addition, we assessed the interindividual covariation of observed and predicted weights and variances by means of correlations. 
Results
We first report the effect of the experimental manipulations on the integration strength, our primary measure of interest, and on the variability of the unimodal position estimates. Subsequently we present the bimodal position judgments' characteristics, specifically the biases of cursor position and hand position judgments and their variability. According to optimal integration, these measures depend on the individual participants' integration strengths and their individual unimodal variabilities. We compared our data with model predictions for partial integration to reveal potential additional effects of our experimental manipulations. 
Strength of sensory integration
The mean strength of sensory integration in each of the two spatial conditions is shown in Figure 4a, both for the Cursor response task (Experiment 1) and the Hand response task (Experiment 2). In both experiments the strength of sensory integration did not differ significantly between the conditions with orthogonal and same planes of motion of hand and cursor: t(12) = 0.22, p = 0.831; and t(10) = 0.89, p = 0.395, respectively. We expected stronger integration in condition SamePlane than in condition OrthogPlane, yet the nonsignificant tendency was in the opposite direction. 
Figure 4
 
Integration strength and unimodal variability. (a) The integration strength for condition OrthogPlane (light gray bars) and SamePlane (dark gray bars), for both Experiment 1 (Cursor response task) and Experiment 2 (Hand response task). (b) The standard deviation of the position judgments for the unimodal position judgments of cursor position (UniCursor trials) or hand position (UniHand trials), in both the OrthogPlane and SamePlane condition and both Experiment 1 (Cursor response task) and Experiment 2 (Hand response task). The line pattern indicates which transformations were involved in the position judgments.
Variability of unimodal position judgments
The mean standard deviations of the judgments in the unimodal trials (UniCursor and UniHand) are illustrated in Figure 4b. Compared across experiments, variability in the Hand response task was larger than in the Cursor response task. More importantly, within each experiment, variability was larger for judgments that involved transformations (indicated by the additional marking on the bars) than for judgments that involved no transformations. Specifically, for the Cursor response task (Experiment 1), the variability of the hand position judgments was much larger than that of the cursor position judgments, and for the Hand response task (Experiment 2), the variability of the cursor position judgments was slightly larger than that of the hand position judgments. There was also a slightly larger variability for judgments that involved both a spatial transformation and a modality transformation (indicated in the figure by the crossed marking) than for judgments that involved only a modality transformation (indicated by the striped marking). 
An ANOVA with the within-participant factors spatial condition (OrthogPlane vs. SamePlane) and position judged (hand vs. cursor) for Experiment 1 revealed no main effect of condition, F(1, 12) = 0.00, p = 0.998, but a significant main effect of position judged, F(1, 12) = 91.83, p < 0.001. The interaction just failed to reach statistical significance, F(1, 12) = 4.36, p = 0.059. The same type of ANOVA for Experiment 2 revealed no main effects of spatial condition, F(1, 10) = 1.17, p = 0.304, or position judged, F(1, 10) = 0.75, p = 0.406, and again a marginally significant interaction, F(1, 10) = 4.53, p = 0.059. We conducted post hoc pairwise comparisons on the difference between the two spatial conditions. These revealed a marginally significant difference only for the judgment variability in UniCursor trials of Experiment 2, t(10) = 2.15, p = 0.057; all other p > 0.188. Thus, the noise associated with the horizontal-to-frontal spatial transformation in Experiment 1 was statistically not significant, and the noise associated with the frontal-to-horizontal spatial transformation in Experiment 2 was small and only marginally significant.
Observed and predicted biases of bimodal position judgments
Figure 5a illustrates the mean observed biases of the bimodal hand position judgments (BiHand trials) toward the cursor as standing gray bars, and the mean observed biases of the bimodal cursor position judgments (BiCursor trials) toward the hand position as hanging gray bars. The markings of the gray bars indicate which transformations were involved in the different conditions. The mean predicted biases are illustrated by the accompanying white bars. 
Figure 5
 
Bimodal position judgments' characteristics. (a) The biases in the bimodal judgments of hand position toward the cursor position (BiHand trials; the standing bars) and the biases in the cursor position judgments toward the hand position (BiCursor trials; the hanging bars). The gray bars indicate the observed biases in the OrthogPlane (light gray bars) and SamePlane (dark gray bars) conditions; the open white bars indicate the biases predicted for optimal partial integration. The line pattern on the gray bars indicates which transformations were involved in the position judgments. (b) The standard deviation of the bimodal position judgments as observed (gray bars) and predicted (open bars). The color coding and line patterns are as in panel (a). The horizontal black lines indicate, for each bimodal variability, the variability of the corresponding unimodal trials.
For the Cursor response task (Experiment 1), the observed biases were strongly asymmetric, being more toward the cursor position than toward the hand position (i.e., higher standing than hanging bars), whereas they were more symmetric for the Hand response task (Experiment 2). For Experiment 1, a two-way ANOVA with the within-participant factors spatial condition (OrthogPlane vs. SamePlane) and position judged (hand vs. cursor) revealed no effect of condition, F(1, 12) = 0.05, p = 0.831, a significant main effect of position judged, F(1, 12) = 46.38, p < 0.001, and a marginally significant interaction, F(1, 12) = 4.45, p = 0.057. Post hoc comparisons indicated a significant difference between the spatial conditions for the bias in the cursor position judgments (i.e., the hanging bars), t(12) = 2.29, p = 0.041, but not for the bias in the hand position judgments (i.e., the standing bars), t(10) = 0.81, p = 0.433. For Experiment 2 there were no significant effects (all p > 0.371).
The comparison between the observed and predicted biases allows us to distinguish between two types of effects of the experimental conditions. The first type are indirect effects: According to the reliability-based weighting mechanism of optimal sensory integration, modulations of coupling strength and variabilities of unimodal estimates by the experimental conditions should have consequences for the biases. These are predicted by the model. The second type are direct effects of the experimental manipulations that are not predicted by the model, that is, that cannot be understood as consequences of the modulation of coupling strength and unimodal variabilities. 
The observed and predicted biases were highly similar, except for the SamePlane condition of Experiment 1, where the observed biases were somewhat more oriented toward the cursor than predicted. Note that the model predictions were constrained to have the observed integration strength. This means that the sums of the observed and predicted weights are equal and that only their relative magnitudes were unconstrained. We compared the differences between observed and predicted biases for the BiHand trials (which thus equal those for the BiCursor trials) by means of one-sample t tests. Only for the SamePlane condition of Experiment 1 was the difference significant: t(12) = 3.61, p = 0.004; all other p > 0.845. The correlations between the individual observed and predicted biases were high in all conditions of Experiment 1 (between 0.63 and 0.92; all p < 0.021), and somewhat lower in Experiment 2 (between 0.44 and 0.56; p between 0.096 and 0.177). Overall, the biases of participants' bimodal position judgments were rather well explained by the reliability-based weighting mechanism and were not directly affected by the experimental conditions, except that in the SamePlane condition of Experiment 1 there was a somewhat stronger-than-predicted bias toward the cursor position with the Cursor response task.
Observed and predicted variability of the bimodal position judgments
Figure 5b illustrates the mean standard deviations of the position judgments in the bimodal trials (BiCursor and BiHand). The gray bars indicate the observed standard deviation, the accompanying white bars the predicted ones. The black horizontal lines indicate the mean standard deviations in the corresponding unimodal trials (i.e., the standard deviations shown in Figure 4b). For the OrthogPlane condition, the pattern of standard deviations in bimodal trials was similar to what we observed before: The bimodal judgment that corresponded to the most reliable unimodal judgment was not reduced, or it even increased, in variability (BiCursor in the Cursor response task; BiHand in the Hand response task); the bimodal judgment corresponding to the less reliable unimodal judgment benefitted from the integration with a reduced variability (BiHand in the Cursor response task; BiCursor in the Hand response task). For the SamePlane condition, no reduction of variability in bimodal trials compared to the unimodal trials was observed at all. 
We hypothesized that the absence of a spatial transformation in the SamePlane condition might lead to optimal, or at least closer to optimal, variabilities in bimodal trials. This hypothesis was not confirmed. If anything, the difference between observed and predicted standard deviations was somewhat larger in condition SamePlane than in condition OrthogPlane. We compared the differences between predicted and observed standard deviations by means of ANOVAs with the within-participant factors spatial condition (OrthogPlane vs. SamePlane) and position judged (hand vs. cursor). Although this analysis revealed no significant main or interaction effects for Experiment 1 (all p > 0.159), the main effect of spatial condition was significant for Experiment 2, F(1, 10) = 5.11, p = 0.047; all other p > 0.383, confirming the larger difference between observed and predicted standard deviations in condition SamePlane. Finally, we found marginally significant correlations between the observed and predicted standard deviations for the BiHand trials in condition SamePlane of both experiments (respectively, r = 0.54, p = 0.055, and r = 0.57, p = 0.066) and significant correlations in all other conditions (r between 0.62 and 0.90; all p < 0.044).
Discussion
In the current study we explored the integration of sensory information about an action (here a hand movement) with sensory information about a visual effect of that action (here a cursor motion). The experiments were designed to answer two questions. The primary question was whether the integration strength depends on the hand-cursor spatial separation. The secondary question concerned the role of transformations between separate planes of motion for the suboptimal variances of integrated hand and cursor position judgments that we had observed in previous studies (Debats et al., 2017a, 2017b; Debats & Heuer, 2018). We discuss the answers to these questions in turn. 
To test the influence of spatial separation on the strength of sensory integration in cursor control, we compared two spatial conditions. In condition OrthogPlane, hand and cursor were spatially separated and moved in the horizontal and frontal plane, respectively, similar to everyday computer work. In condition SamePlane, in contrast, hand and cursor were in close spatial proximity and moved in the same horizontal plane. In two experiments, in which position judgments were provided by different response tasks, the strength of sensory integration did not differ between the two spatial conditions. Thus, we conclude that in a cursor-control task spatial proximity does not enhance sensory integration; in particular, it does not strengthen the underlying neural estimate of whether or not the signals belong together. 
This conclusion appears to contrast sharply with findings according to which spatial separation disrupts sensory integration (e.g., Gepshtein et al., 2005; Slutsky & Recanzone, 2001; Spence, 2013). Sensory integration presupposes a distinction between sensory signals that should be integrated and sensory signals that should not. Signals that have a common cause, and are thus redundant, are of the first type, whereas signals that arise from different objects or events are of the second type. Spatial proximity seems to be taken by the brain as evidence of a common cause, whereas spatial separation suggests different causes. From this perspective, sensory integration in cursor control appears quite unusual: Hand and cursor are clearly distinct objects that typically move in different, spatially separated planes of motion. 
Even though spatial separation generally reduces the strength of sensory integration, there are conditions that can mitigate the effects of separation. This happens, for example, when a visible tool relates the separated sources of sensory signals (Takahashi et al., 2009; Takahashi & Watt, 2014) or when some other relation is established between them (Helbig & Ernst, 2007). A particularly powerful driver of sensory integration seems to be a cross-correlation between sensory signals (e.g., Parise & Ernst, 2016). Different sensory signals that emanate from the same source typically exhibit spatiotemporal correlations, so that, conversely, correlations can serve as evidence of a single source. Cross-correlations evidently exist between kinematic variables of hand movements and cursor motions, even when hand and cursor are spatially separated and move in different planes. In fact, a reduction of kinematic cross-correlations between trajectories of hand and cursor has been shown to result in reduced integration strength in cursor-control tasks (Debats et al., 2017a). In the present experiments kinematic cross-correlations were present in each bimodal trial, and we hypothesized that the causality evidence provided by spatial proximity would add to the causality evidence provided by these correlations. The results did not bear this out. Possibly, these sources of causality evidence are not additive, so that the strong cross-correlation could have masked an influence of spatial proximity. A role of spatial proximity for the strength of sensory integration might thus still emerge when the kinematic cross-correlations are reduced. If it does not, this would suggest that sensory integration of actions and their visual effects (or visual reafferences) might differ in principle from the integration of exafferent sensory information. 
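As an illustration of the kind of relatedness evidence involved, the following Python sketch computes the zero-lag correlation between a simulated hand velocity profile and its noisy reafferent copy; the bell-shaped profile and the noise level are assumptions chosen only for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 100)            # one 1-s outward movement, 100 Hz
    hand_vel = np.sin(np.pi * t)              # bell-shaped velocity profile
    cursor_vel = hand_vel + 0.05 * rng.standard_normal(t.size)  # reafferent copy

    # zero-lag correlation as a simple index of hand-cursor relatedness
    r = np.corrcoef(hand_vel, cursor_vel)[0, 1]
    print(f"kinematic cross-correlation: r = {r:.3f}")  # close to 1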
Sensory integration bears a certain similarity to adaptation. In the cursor-control task, sensory integration reduces discrepancies between estimates of cursor and hand positions, and so does adaptation: The judged position of the hand is typically shifted toward the physical position of the cursor in the preceding adaptation period (Cressman & Henriques, 2009, 2010; Simani, McGuire, & Sabes, 2007; Synofzik, Lindner, & Thier, 2008; Wilke, Synofzik, & Lindner, 2013; Zbib, Henriques, & Cressman, 2016), and the judged position of the cursor can be shifted toward the physical position of the hand in the adaptation trials (cf. Hatada, Miall, & Rossetti, 2006; van Beers, Wolpert, & Haggard, 2002). After adaptation, sensory integration seems to be based on the adapted position estimates (Rand & Heuer, 2017). But there are also obvious differences between integration and adaptation. For example, sensory integration is an essentially immediate phenomenon, whereas adaptation develops more slowly. Moreover, the reduction of a sensory discrepancy by sensory coupling is generally not a precursor of adaptation (Smeets, van den Dobbelsteen, de Grave, van Beers, & Brenner, 2006). 
Similar to sensory integration, adaptation may also depend on sensory cross-correlations rather than on spatial proximity. For example, adaptation to visuomotor rotations does not occur only when vision and proprioception refer to the same object, as they do to the hand in prism-adaptation studies (for a review see, e.g., Redding, Rossetti, & Wallace, 2005), in which the felt and seen hand are in close spatial proximity and move in the same plane. Rather, adaptation to visuomotor rotations can also reliably be observed in cursor-control tasks where the direction of cursor motion is consistently rotated relative to the direction of hand movement (e.g., Cunningham, 1989; Cunningham & Welch, 1994; Krakauer, Pine, Ghilardi, & Ghez, 2000). Although adaptation to visuomotor rotations in cursor-control tasks has received a great deal of attention during the last few decades, we are not aware of a systematic comparison of adaptation effects when hand and cursor move in orthogonal versus the same plane. Adaptation studies have used one or the other type of setup, and to our knowledge there are no obvious discrepancies between the findings obtained with them. Thus, our conclusion regarding the effect of spatial proximity on sensory integration might generalize to the effect of spatial proximity on visuomotor adaptation. 
Our variation of spatial proximity involved two components: First, in condition SamePlane, movements of cursor and hand were in the same plane rather than in orthogonal planes; second, the distance between the felt hand and the visible cursor was much smaller than in condition OrthogPlane. Neither component affects the kinematic cross-correlations. Thus, it is certainly justified to generalize our results to variations of just one of the components; integration strength should not vary, for example, when the distance between a vertical monitor and a horizontal workspace is varied. For variations of temporal proximity, however, other results should be expected. When the cursor motion is delayed relative to the hand movement, zero-lag kinematic cross-correlations are reduced. Only lagged cross-correlations remain unaffected, and at sufficiently long lags these likely fall beyond the temporal window of integration (e.g., van Wassenhove, Grant, & Poeppel, 2007). This is also suggested by the finding that adaptation to a lateral displacement of egocentric direction by means of wedge prisms disappears when visual reafferences are delayed by 0.3 s or longer (Held, Efstathiou, & Greene, 1966), and by the finding that adaptation to a visuomotor rotation is reduced when terminal visual feedback is delayed by up to 1500 ms (Schween & Hegele, 2017). 
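The effect of a feedback delay on these correlations is easy to demonstrate with the same simulated velocity profile as above; the 300-ms delay in this sketch is an assumption chosen for illustration. The zero-lag correlation drops while the correlation at the matching lag is preserved.

    import numpy as np

    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 1.0, 100)                       # 100-Hz samples
    hand_vel = np.sin(np.pi * t) + 0.05 * rng.standard_normal(t.size)

    delay = 30                                           # 300-ms cursor delay
    cursor_vel = np.zeros_like(hand_vel)
    cursor_vel[delay:] = hand_vel[:-delay]               # delayed reafference

    r_zero = np.corrcoef(hand_vel, cursor_vel)[0, 1]     # reduced by the delay
    r_lag = np.corrcoef(hand_vel[:-delay], cursor_vel[delay:])[0, 1]  # preserved
    print(f"zero-lag r = {r_zero:.2f}, 300-ms-lag r = {r_lag:.2f}")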
The second question addressed by the present experiments concerns the noise that comes with the transformations required to perform the cursor-control task and to provide judgments of the positions of cursor and hand. For judging the hand position, the Cursor response task of Experiment 1 requires a transformation between modalities (proprioceptive to visual) and, when the monitor is in the frontal plane, an additional transformation between planes of motion (horizontal to frontal). The Hand response task of Experiment 2 requires the opposite transformations between modalities and planes of motion when the position of the cursor is judged. Debats et al. (2017b) used both response tasks in a single experiment with the frontal monitor setup. The cursor position judgments were much more variable with the Hand response task (entailing modality and spatial transformations) than with the Cursor response task. The hand position judgments, in contrast, were more variable with the Cursor response task (entailing modality and spatial transformations) than with the Hand response task. The findings in the SamePlane condition of the present experiments, which requires no spatial transformations, reveal that only a small fraction of the transformation-related variability results from transformations between separate planes of motion; by far the larger fraction results from transformations between modalities. 
Both in the earlier study (Debats et al., 2017b) and in the present experiments the modulations of variabilities (or reliabilities) in unimodal trials were largely consistent with the modulations of judgment biases in bimodal trials as predicted by the reliability-based-weighting mechanism of optimal integration models. Specifically, with the Cursor response task the variability of cursor-position judgments was smaller than the variability of hand-position judgments, and the bias of cursor judgments toward the position of the hand was smaller than the bias of hand judgments toward the position of the cursor. With the Hand response task, this was reversed, though the differences were smaller. Focusing on the biases, the bias of cursor-position judgments was smaller with the Cursor response task than with the Hand response task, whereas the bias of hand-position judgments was smaller with the Hand response task than with the Cursor response task. In other words, the bias was smaller when the judged position was in the same modality as the judgment than when a modality transformation was required. This observation corresponds to a finding of Ladwig and colleagues (2013) who studied the biases of judged amplitudes of hand movements and cursor motions when judgments were made by matching hand movements or cursor motions: Biases were stronger whenever a modality transformation was required. Although for that study the reliabilities of the unisensory estimates are not known, we suspect that the biases also obey a reliability-based-weighting mechanism. More generally, these findings indicate that sensory integration is governed not only by the characteristics of the sensory input, but also by the characteristics of the response tasks or the relation between sensory input and response tasks (see also Rohe & Noppeney, 2018). 
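For reference, the reliability-based weighting underlying these comparisons can be stated compactly. With unimodal variances \(\sigma_H^2\) (hand) and \(\sigma_C^2\) (cursor), the standard optimal-integration equations (cf. Ernst & Banks, 2002) give the weight of the cursor in the hand judgment and the variance of a fully fused estimate as

    w_C = \frac{\sigma_H^2}{\sigma_H^2 + \sigma_C^2}, \qquad
    \sigma_{fused}^2 = \frac{\sigma_H^2 \, \sigma_C^2}{\sigma_H^2 + \sigma_C^2};

for partial integration, the weights are rescaled so that \(w_C + w_H\) equals the integration strength rather than one, and the predicted variance reduction is correspondingly smaller. 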
The variability of the position judgments in bimodal trials was systematically larger than the predicted optimal variabilities. In the current study we asked whether the suboptimality, or the “excess variability,” of the biased position judgments in bimodal trials could be related to the transformation between planes of motion as required when reporting the judgments. This would be indicated by a smaller excess variability in condition SamePlane, where judgments require a transformation between modalities but no spatial transformation, than in condition OrthogPlane, where both types of transformation are required. This difference was not seen in our data; if anything, the difference between the two spatial conditions with respect to excess variability was opposite to expectations. The origin of the excess variability of the bimodal position judgments thus remains unclear. 
Both the biases and the variabilities of position judgments in bimodal trials can be affected by experimental manipulations, such as the two spatial conditions of the present experiments, for two quite different reasons. First, there can be indirect effects of the experimental conditions, which result from their effects on integration strength and on the variabilities of judgments in unimodal trials. Indirect effects are predicted by optimal-integration models that formalize reliability-based-weighting mechanisms. In the present experiments, as in previous ones (Debats et al., 2017a, 2017b; Debats & Heuer, 2018), there was considerable concordance between model predictions and observations. Second, there can be direct effects of experimental conditions, which are revealed by discrepancies between observations and model predictions. In addition to the excess variability of position judgments in bimodal trials, there was only one such discrepancy in the present experiments: In the SamePlane condition of Experiment 1 (Cursor response task) the position judgments had a somewhat stronger bias toward the cursor position than predicted from a reliability-based-weighting mechanism, whereas in Experiment 2 (Hand response task) there was no correspondingly stronger-than-predicted bias toward the hand position. It thus seems that when hand and cursor are closely adjacent (as in the SamePlane condition), visual information receives a stronger weight in sensory integration than is justified by its unimodal reliability when vision is the judgment modality, whereas the same is not true of proprioceptive information when proprioception is the judgment modality. The reason might be an attenuation of proprioceptive input in the presence of visual input (e.g., Heuer & Rapp, 2012; Müsseler & Sutter, 2009) that occurs specifically when hand and cursor are in close proximity, and only for the Cursor response task. 
In summary, the present experiments show that the strength of sensory integration in cursor-control tasks does not depend on the spatial separation of the hand and the cursor. The variability of cursor-position and hand-position judgments was strongly affected by required transformations between different sensory modalities, but only slightly by required transformations between spatially separated planes of motion. The typically observed suboptimal variability of the judgments of cursor and hand positions in bimodal trials was not reduced when the hand and cursor were in the same plane of motion. 
Acknowledgments
We thank A. Oppenborn and F. Steinbeck for assistance in running the experiments. The contribution of N.B. Debats was supported by German Research Foundation (DFG) grant HE1187/19-1. We furthermore acknowledge the financial support of the German Research Foundation and the Open Access Publication Fund of Bielefeld University for the article processing charges. 
Commercial relationships: none. 
Corresponding author: Nienke B. Debats. 
Address: Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany. 
References
Bock, O., Schneider, S., & Bloomberg, J. (2001). Conditions for interference versus facilitation during sequential sensorimotor adaptation. Experimental Brain Research, 138 (3), 359–365, https://doi.org/10.1007/s002210100704.
Bock, O., & Thomas, M. (2011). Proprioception plays a different role for sensorimotor adaptation to different distortions. Human Movement Science, 30 (3), 415–423, https://doi.org/10.1016/j.humov.2010.10.007.
Cressman, E. K., & Henriques, D. Y. P. (2009). Sensory recalibration of hand position following visuomotor adaptation. Journal of Neurophysiology, 102 (6), 3505–3518, https://doi.org/10.1152/jn.00514.2009.
Cressman, E. K., & Henriques, D. Y. P. (2010). Reach adaptation and proprioceptive recalibration following exposure to misaligned sensory input. Journal of Neurophysiology, 103 (4), 1888–1895, https://doi.org/10.1152/jn.01002.2009.
Cunningham, H. A. (1989). Aiming error under transformed spatial mappings suggests a structure for visual-motor maps. Journal of Experimental Psychology: Human Perception and Performance, 15 (3), 493–506.
Cunningham, H. A., & Welch, R. B. (1994). Multiple concurrent visual-motor mappings: Implications for models of adaptation. Journal of Experimental Psychology: Human Perception and Performance, 20 (5), 987–999.
Debats, N. B., Ernst, M. O., & Heuer, H. (2017a). Kinematic cross-correlation induces sensory integration across separate objects. European Journal of Neuroscience, 46 (12), 2826–2834, https://doi.org/10.1111/ejn.13758.
Debats, N. B., Ernst, M. O., & Heuer, H. (2017b). Perceptual attraction in tool use: evidence for a reliability-based weighting mechanism. Journal of Neurophysiology, 117 (4), 1569–1580, https://doi.org/10.1152/jn.00724.2016.
Debats, N. B., & Heuer, H. (2018). Optimal integration of actions and their visual effects is based on both online and prior causality evidence. Scientific Reports, 8 (1), 9796, https://doi.org/10.1038/s41598-018-28251-x.
Ernst, M. O. (2006). A Bayesian view on multimodal cue integration. In Knoblich, G. Thornton, I. M. Grosjean, M. & Shiffrar M. (Eds.), Human body perception from the inside out (pp. 105–131). New York, NY: Oxford University Press.
Ernst, M. O. (2007). Learning to integrate arbitrary signals from vision and touch. Journal of Vision, 7 (5): 7, 1–14, https://doi.org/10.1167/7.5.7.
Ernst, M. O. (2012). Optimal multisensory integration: Assumptions and limits. In Stein B. E. (Ed.), The new handbook of multisensory processes (pp. 1084–1124). Cambridge, MA: The MIT Press.
Ernst, M. O., & Banks, M. S. (2002, January 24). Humans integrate visual and haptic information in a statistically optimal fashion. Nature, 415 (6870), 429–433, https://doi.org/10.1038/415429a.
Gepshtein, S., Burge, J., Ernst, M. O., & Banks, M. S. (2005). The combination of vision and touch depends on spatial proximity. Journal of Vision, 5 (11): 7, 1013–1023, https://doi.org/10.1167/5.11.7.
Hatada, Y., Miall, R. C., & Rossetti, Y. (2006). Two waves of a long-lasting aftereffect of prism adaptation measured over 7 days. Experimental Brain Research, 169 (3), 417–426, https://doi.org/10.1007/s00221-005-0159-y.
Helbig, H. B., & Ernst, M. O. (2007). Knowledge about a common source can promote visual- haptic integration. Perception, 36 (10), 1523–1533.
Held, R. H., Efstathiou, A., & Greene, M. E. (1966). Adaptation to displaced and delayed visual feedback from the hand. Journal of Experimental Psychology, 72 (6), 887–891.
Heuer, H., & Rapp, K. (2012). Adaptation to novel visuo-motor transformations: Further evidence of functional haptic neglect. Experimental Brain Research, 218 (1), 129–140, https://doi.org/10.1007/s00221-012-3013-z.
Kirsch, W., Pfister, R., & Kunde, W. (2016). Spatial action-effect binding. Attention, Perception & Psychophysics, 78 (1), 133–142, https://doi.org/10.3758/s13414-015-0997-z.
Kleiner, M., Brainard, D., & Pelli, D. (2007). What's new in Psychtoolbox-3? Perception, 36 (ECVP Abstract Supplement 14), 1–16, https://doi.org/10.1068/v070821.
Krakauer, J. W., Pine, Z. M., Ghilardi, M. F., & Ghez, C. (2000). Learning of visuomotor transformations for vectorial planning of reaching trajectories. The Journal of Neuroscience: The Official Journal of the Society for Neuroscience, 20 (23), 8916–8924.
Ladwig, S., Sutter, C., & Müsseler, J. (2012). Crosstalk between proximal and distal action effects during tool use. Zeitschrift Für Psychologie, 220 (1), 10–15, https://doi.org/10.1027/2151-2604/a000085.
Ladwig, S., Sutter, C., & Müsseler, J. (2013). Intra- and intermodal integration of discrepant visual and proprioceptive action effects. Experimental Brain Research, 231 (4), 457–468, https://doi.org/10.1007/s00221-013-3710-2.
Müsseler, J., & Sutter, C. (2009). Perceiving one's own movements when using a tool. Consciousness and Cognition, 18 (2), 359–365, https://doi.org/10.1016/j.concog.2009.02.004.
Parise, C. V., & Ernst, M. O. (2016). Correlation detection as a general mechanism for multisensory integration. Nature Communications, 7, 11543, https://doi.org/10.1038/ncomms11543.
Rand, M. K., & Heuer, H. (2013). Implicit and explicit representations of hand position in tool use. PLoS One, 8 (7), e68471, https://doi.org/10.1371/journal.pone.0068471.
Rand, M. K., & Heuer, H. (2016). Effects of reliability and global context on explicit and implicit measures of sensed hand position in cursor-control tasks. Frontiers in Psychology, 6, https://doi.org/10.3389/fpsyg.2015.02056.
Rand, M. K., & Heuer, H. (2017). Contrasting effects of adaptation to a visuomotor rotation on explicit and implicit measures of sensory coupling. Psychological Research, 1–16, https://doi.org/10.1007/s00426-017-0931-1.
Redding, G. M., Rossetti, Y., & Wallace, B. (2005). Applications of prism adaptation: a tutorial in theory and method. Neuroscience and Biobehavioral Reviews, 29 (3), 431–444, https://doi.org/10.1016/j.neubiorev.2004.12.004.
Reichenbach, A., & Diedrichsen, J. (2015). Processing reafferent and exafferent visual information for action and perception. Journal of Vision, 15 (8): 11, 1–12, https://doi.org/10.1167/15.8.11.
Reichenbach, A., Franklin, D. W., Zatka-Haas, P., & Diedrichsen, J. (2014). A dedicated binding mechanism for the visual control of movement. Current Biology, 24 (7), 780–785, https://doi.org/10.1016/j.cub.2014.02.030.
Rohe, T., & Noppeney, U. (2018). Reliability-weighted integration of audiovisual signals can be modulated by top-down control. eNeuro, ENEURO.0315-17.2018, https://doi.org/10.1523/ENEURO.0315-17.2018.
Schween, R., & Hegele, M. (2017). Feedback delay attenuates implicit but facilitates explicit adjustments to a visuomotor rotation. Neurobiology of Learning and Memory, 140, 124–133, https://doi.org/10.1016/j.nlm.2017.02.015.
Shams, L., & Beierholm, U. R. (2010). Causal inference in perception. Trends in Cognitive Sciences, 14 (9), 425–432, https://doi.org/10.1016/j.tics.2010.07.001.
Simani, M. C., McGuire, L. M. M., & Sabes, P. N. (2007). Visual-shift adaptation is composed of separable sensory and task-dependent effects. Journal of Neurophysiology, 98 (5), 2827–2841, https://doi.org/10.1152/jn.00290.2007.
Slutsky, D. A., & Recanzone, G. H. (2001). Temporal and spatial dependency of the ventriloquism effect. Neuroreport, 12 (1), 7–10.
Smeets, J. B. J., van den Dobbelsteen, J. J., de Grave, D. D. J., van Beers, R. J., & Brenner, E. (2006). Sensory integration does not lead to sensory calibration. Proceedings of the National Academy of Sciences, USA, 103 (49), 18781–18786, https://doi.org/10.1073/pnas.0607687103.
Spence, C. (2013). Just how important is spatial coincidence to multisensory integration? Evaluating the spatial rule. Annals of the New York Academy of Sciences, 1296, 31–49, https://doi.org/10.1111/nyas.12121.
Sülzenbrück, S., & Heuer, H. (2009). Functional independence of explicit and implicit motor adjustments. Consciousness and Cognition, 18 (1), 145–159, https://doi.org/10.1016/j.concog.2008.12.001.
Synofzik, M., Lindner, A., & Thier, P. (2008). The cerebellum updates predictions about the visual consequences of one's behavior. Current Biology: CB, 18 (11), 814–818, https://doi.org/10.1016/j.cub.2008.04.071.
Synofzik, M., Thier, P., & Lindner, A. (2006). Internalizing agency of self-action: Perception of one's own hand movements depends on an adaptable prediction about the sensory action outcome. Journal of Neurophysiology, 96 (3), 1592–1601, https://doi.org/10.1152/jn.00104.2006.
Takahashi, C., Diedrichsen, J., & Watt, S. J. (2009). Integration of vision and haptics during tool use. Journal of Vision, 9 (6): 3, 1–13, https://doi.org/10.1167/9.6.3.
Takahashi, C., & Watt, S. J. (2014). Visual-haptic integration with pliers and tongs: Signal "weights" take account of changes in haptic sensitivity caused by different tools. Frontiers in Psychology, 5, 109, https://doi.org/10.3389/fpsyg.2014.00109.
van Beers, R. J., Sittig, A. C., & Denier van der Gon, J. J. (1996). How humans combine simultaneous proprioceptive and visual position information. Experimental Brain Research, 111 (2), 253–261.
van Beers, R. J., Wolpert, D. M., & Haggard, P. (2002). When feeling is more important than seeing in sensorimotor adaptation. Current Biology: CB, 12 (10), 834–837.
van Dam, L. C. J., Parise, C. V., & Ernst, M. O. (2014). Modeling multisensory integration. In Bennett D. J. & Hill C. S. (Eds.), Sensory integration and the unity of consciousness (pp. 209–229). Cambridge, MA: The MIT Press.
van Wassenhove, V., Grant, K. W., & Poeppel, D. (2007). Temporal window of integration in auditory-visual speech perception. Neuropsychologia, 45 (3), 598–607, https://doi.org/10.1016/j.neuropsychologia.2006.01.001.
Wilke, C., Synofzik, M., & Lindner, A. (2013). Sensorimotor recalibration depends on attribution of sensory prediction errors to internal causes. PLoS One, 8 (1), e54925, https://doi.org/10.1371/journal.pone.0054925.
Zbib, B., Henriques, D. Y. P., & Cressman, E. K. (2016). Proprioceptive recalibration arises slowly compared to reach adaptation. Experimental Brain Research, 234 (8), 2201–2213, https://doi.org/10.1007/s00221-016-4624-6.
Figure 1
 
Setup, task, and transformations. (a) The experimental setup, in which participants made hand movements on the half-circular workspace on the horizontal digitizer tablet. The corresponding cursor motion was either presented on the monitor in the frontal plane (OrthogPlane), or the cursor appeared on the horizontal monitor such that it was reflected by the mirror just below the chin rest, creating a virtual image of the cursor in the exact plane of the hand movements (SamePlane). (b) The cursor was visible during the outward trajectory and disappeared once the hand reached the workspace boundary. This movement endpoint was to be judged by the participants later, after the return movement. (c) The direction of the cursor motion deviated slightly from the direction of the hand movement, with eight levels of visuomotor rotation. This discrepancy was needed to observe the degree to which judgments of hand position were biased toward the cursor position, and vice versa. (d) The position judgments were provided either by placing a cursor on the remembered hand or cursor movement endpoint (upper panel, Cursor response task), where the cursor position along the semicircular path was controlled by small lateral movements of the stylus (inset in upper panel), or by placing the hand on the remembered hand or cursor movement endpoint without visual feedback (lower panel, Hand response task; inset in lower panel). (e) When the hand and cursor moved in separate planes of motion (condition OrthogPlane), the judgments made with the Cursor response task and the Hand response task involved either no transformations or both a modality and a spatial transformation (i.e., horizontal to frontal plane of motion, or vice versa). When the hand and cursor moved in the same plane of motion (condition SamePlane, not illustrated here), the spatial transformation was no longer involved.
Figure 2
 
Illustration of the detailed procedure of the bimodal trials. The tablet and monitor workspaces are presented as overlapping here, with the white items indicating images shown on the monitor and the black items indicating positions that were defined in both reference frames. Further explanation is provided in the main text under "Detailed procedure."
Figure 3
 
Partial integration and regression analysis. (a) According to the so-called Coupling Prior model, partial integration fills the continuum between no integration (the weights are zero) and complete fusion (the weights add up to one). The weights can be assessed experimentally through the relative judgment biases. For partial integration, the weights (and thus the relative biases) add up to a value between zero and one, with their sum indicating the integration strength. This illustration shows the predicted optimal biases and variability for an integration strength of 0.75; the unimodal variability is consistent with the average values for the OrthogPlane condition in Experiment 1. The dotted line indicates the optimal-integration prediction for fusion (i.e., an integration strength of one). (b) The relative biases were assessed using regression analysis, specifically regression of the deviation between the judged and true movement endpoint on the deviation between the true hand and cursor endpoints. This is illustrated here for the BiHand judgments of two exemplary participants in the OrthogPlane condition of Experiment 2. If the position judgments had fallen around the horizontal line, this would have indicated that the participants' hand position judgments were correctly scattered around the true hand position; if they had been scattered around the diagonal line, this would have indicated that the judgments were located around the true cursor position. The left and right panels illustrate one participant with low and one with high judgment variability, respectively.
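The regression estimator described in panel (b) can be sketched as follows; this is a minimal Python illustration in which the rotation levels, the underlying weight, and the noise level are hypothetical, chosen only to show that the fitted slope recovers the relative bias.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 200
    hand_end = rng.uniform(-20.0, 20.0, n)               # true hand endpoints (deg)
    rotation = rng.choice([-8., -6., -4., -2., 2., 4., 6., 8.], n)  # eight levels
    cursor_end = hand_end + rotation                     # true cursor endpoints

    w_true = 0.6                                         # hypothetical cursor weight
    judged = hand_end + w_true * rotation + rng.normal(0.0, 2.0, n)

    # regress (judged - true hand) on (cursor - hand); the slope is the bias
    slope, intercept = np.polyfit(cursor_end - hand_end, judged - hand_end, 1)
    print(f"estimated relative bias: {slope:.2f}")       # close to 0.6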
Figure 4
 
Integration strength and unimodal variability. (a) The integration strength for condition OrthogPlane (light gray bars) and SamePlane (dark gray bars), for both Experiment 1 (Cursor response task) and Experiment 2 (Hand response task). (b) The standard deviation of the position judgments for the unimodal position judgments of cursor position (UniCursor trials) or hand position (UniHand trials), in both the OrthogPlane and SamePlane condition and both Experiment 1 (Cursor response task) and Experiment 2 (Hand response task). The line pattern indicates which transformations were involved in the position judgments.
Figure 5
 
Bimodal position judgments' characteristics. (a) The biases in the bimodal judgments of hand position toward the cursor position (BiHand trials; the standing bars) and the biases in the bimodal judgments of cursor position toward the hand position (BiCursor trials; the hanging bars). The gray bars indicate the observed biases in the OrthogPlane (light gray) and SamePlane (dark gray) conditions; the open white bars indicate the biases predicted for optimal partial integration. The line pattern on the gray bars indicates which transformations were involved in the position judgments. (b) The standard deviations of the bimodal position judgments as observed (gray bars) and predicted (open bars). The color coding and line patterns are as in panel (a). The horizontal black lines indicate, for each bimodal variability, the variability in the corresponding unimodal trials.