Abstract
A 2D image that is easily recognized by vision is not always recognized by touch. This is believed to be partly because haptic perception of 2D images makes heavy demands on working memory. During active exploration, we must not only store the latest local sensory information but also integrate it with kinesthetic information about hand and finger location to generate a coherent percept. This tactile integration over time has not been studied as extensively as visual shape integration. The initial purpose of our experiments was to compare the working-memory storage of tactile exploration with that of visual exploration, as measured in tactile and visual change detection tasks. We have reported a series of experiments showing much longer search times and slower processing speed per item when detecting a change during tactile exploration than during visual exploration (e.g., VSS '04). These results led to extremely small estimates of the tactile memory held during search, as if the haptic system were almost amnesic outside the fingertips, with little or no cross-position integration in 2D tactile perception. Here, we tested the hypothesis that an extremely narrow field of view can explain these results. Using a gaze-contingent moving-window display, an apparatus that actively restricts the visual field of view based on eye movements, we found that a narrow field of view also dramatically slowed processing speed per item during active visual scanning. We manipulated the size of the window and found that a 2-deg visual field predicted the haptic data. These results indicate that our previous findings can be attributed to the narrow field of view of the tactile domain.