Kazushi Maruya, Alex O. Holcombe, Shin'ya Nishida; Rapid encoding of relationships between spatially remote motion signals. Journal of Vision 2013;13(2):4. doi: 10.1167/13.2.4.
© 2017 Association for Research in Vision and Ophthalmology.
In visual processing, the temporal correlation of remote local motion signals is a strong cue for detecting meaningful large-scale structures in the retinal image, because related points are likely to move together regardless of their spatial separation. While the processing of multi-element motion patterns involved in biological motion and optic flow has been studied intensively, the encoding of simpler pairwise relationships between remote motion signals remains poorly understood. We investigated this process by measuring the temporal rate limit for perceiving the relationship between two motion directions presented simultaneously at different spatial locations. Compared to luminance or orientation, motion comparison was more rapid. Performance remained very high even when interstimulus separation was increased up to 100°. Motion comparison also remained rapid regardless of whether the two motion directions were similar to or different from each other. The exception was a dramatic slowing when the elements formed an orthogonal "T," a configuration in which the two motions do not perceptually group together. Motion presented at task-irrelevant positions did not reduce performance, suggesting that the rapid motion comparison cannot be ascribed to global optic flow processing. Our findings reveal the existence and unique nature of specialized processing that encodes long-range relationships between motion signals for quick appreciation of global dynamic scene structure.