Abstract
When viewing the actions of others, we often see them imperfectly: they briefly disappear from view or are otherwise obscured. Previous research shows that individuals generate real-time action simulations that aid prediction of an action's future course, for example during brief occlusion from view (Graf et al., 2007).
The current study investigates whether such action simulations directly aid the perception of visually degraded actions. Dynamic human actions (e.g., a basketball shot) were presented as point-light (PL) actors embedded in a dynamic black-and-white visual noise background resembling “TV snow”. The PL actor was clearly visible at first (1–1.5 s), then briefly disappeared (400 ms ‘occlusion’), during which the participant generates a real-time action simulation, and then reappeared (360 ms test motion). Before occlusion, the PL actor's joints were easily visible squares of white pixels; in the test motion, the joints comprised dynamic random white and black pixels. Varying the percentage of white versus black pixels in the joints, and thus their contrast against the noise background, visually degraded the test motion. Test contrast was adjusted with an adaptive staircase method to measure contrast thresholds for detecting the appearance of the test motion. In the crucial manipulation, the test motion was either a natural continuation of the action as it would have progressed during occlusion, and thus temporally matched the simulation, or was shifted earlier or later in time (±300 ms). Contrast thresholds for detection were lower for natural than for shifted test motions, suggesting that a visually degraded test motion is more easily detected when it temporally matches the simulation. Overall, these results suggest that real-time simulation of human actions during occlusion aids the detection of visually degraded actions, indicating a strong perceptual role for action simulation in human motion processing.
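The adaptive staircase procedure mentioned above can be illustrated with a minimal sketch. This is a generic 1-up/2-down staircase, not the authors' exact procedure; the function name, step size, and starting contrast are hypothetical. Two consecutive detections lower the test contrast (making the task harder), and any miss raises it, so the contrast converges toward the detection threshold, typically estimated from the last few reversal points.

```python
# Minimal sketch of a 1-up/2-down adaptive staircase for contrast
# thresholds. Illustrative only; parameters are assumptions, not the
# authors' settings.

def run_staircase(detects, start_contrast=0.8, step=0.05,
                  min_contrast=0.0, max_contrast=1.0):
    """Adjust test-motion contrast trial by trial.

    `detects` is a sequence of booleans (True = participant detected
    the test motion). Returns the contrast level used on each trial.
    """
    contrast = start_contrast
    levels = []
    consecutive_hits = 0
    for hit in detects:
        levels.append(contrast)
        if hit:
            consecutive_hits += 1
            if consecutive_hits == 2:  # 2-down: two hits -> lower contrast
                contrast = max(min_contrast, contrast - step)
                consecutive_hits = 0
        else:                          # 1-up: any miss -> raise contrast
            contrast = min(max_contrast, contrast + step)
            consecutive_hits = 0
    return levels
```

A 1-up/2-down rule of this kind converges on the contrast at which detection probability is roughly 71%, a common target in psychophysical threshold estimation.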