Alexandra List, Lucica Iordanescu, Marcia Grabowecky, Satoru Suzuki; Haptic shape guides visual search. Journal of Vision 2012;12(9):1320. doi: https://doi.org/10.1167/12.9.1320.
Crossmodal research has shown that information from one sensory modality can influence perceptual and attentional processes in other modalities. Here, we demonstrate a novel crossmodal interaction between haptics and vision, in which haptic shape information influences visual attention. In our study, participants manually explored (unseen) objects and searched for a visual target while we recorded their eye movements. The shape of the manually held object facilitated search for similarly shaped visual items, whether the visual targets were typically graspable in size (e.g., a cell phone, a badge) or not (e.g., a planet, a high rise). This facilitation manifested as a reduction in both overall search times and initial saccade latencies when the haptic shape (e.g., a cone) was consistent with a visual target (e.g., a teepee, a party hat), compared to when it was inconsistent (e.g., a hockey puck, an orange). These haptic-visual facilitative effects occurred despite the fact that the manually held shapes were anti-predictive of the visual target's shape, suggesting that the influence is not due to expectation or bias. Additionally, when the haptic shape was consistent with a distracter (instead of the target) in the visual search array, initial saccades toward the target were disrupted. Together, these results suggest that this crossmodal influence is automatic and demonstrate a robust shape-specific haptic capture of visual attention.
Meeting abstract presented at VSS 2012