Abstract
The neural bases of tactile space perception are poorly understood. While several neuroimaging studies have investigated the “what” pathway of tactile object perception, little is known about how the brain represents external spatial locations as perceived via touch.
In this functional magnetic resonance imaging (fMRI) study, we set out to explore the neural representation(s) of target locations as felt by the hand in the absence of visual input.
Subjects explored rectangular Lego brick platforms on which targets (smaller Lego blocks) were placed at various locations (e.g. the middle or the bottom left corner of the platform). Subjects performed a one-back matching task in which they compared the location of the target on one platform with that of the target on the next platform, and indicated whether the locations matched. An experimenter presented the stimuli, which subjects explored with their right hand. Targets could be narrow or wide; subjects were instructed to ignore target width and attend only to location. Subjects were blindfolded throughout the experiment and therefore relied solely on haptic perception.
In a control condition, using the same stimuli, subjects performed a one-back matching task in which they judged whether the width of a target matched that of the next target.
During baseline, subjects rested their hand.
A fronto-parietal network of activations was observed during the spatial exploration task. Interestingly, many areas normally involved in visual spatial perception, including parietal cortex, were activated during haptic spatial perception. Visual motion-related areas (MT+) were also activated. In some subjects, V1 activation was observed despite the absence of visual input.
The observed activation of parietal areas that are also recruited during visual spatial perception suggests the presence of multisensory spatial representations.