Abstract
The scientific investigation of eye movements in natural or simulated "naturalistic" environments has historically operated at the limits of what head-worn eye tracking technology can do. In this presentation, I will review efforts by myself and collaborators at RIT to push these limits and to broaden the scope of scientific inquiry into natural visual and motor behavior. The talk will end with a brief discussion of emerging methods, most of which aim to resolve long-standing limitations of video-based eye tracking. Most notably, as a consequence of USB transfer limits and stringent power budgets, video-based eye trackers are restricted to either a high spatial resolution of the eye image, which improves the spatial accuracy of the final gaze estimate, or a high temporal sampling rate (i.e., a high number of eye frames per second), but not both.
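As a rough illustration of this bandwidth-driven trade-off (a minimal sketch with illustrative numbers of my own choosing, not figures from the talk or from any specific tracker), the snippet below computes the uncompressed video bandwidth implied by a given eye-image resolution and frame rate and compares it to an assumed usable USB budget. Raising either resolution or frame rate alone can exhaust the budget, which is why a fixed link forces a choice between the two.

```python
# Sketch only: shows how a fixed USB bandwidth budget forces a trade-off
# between eye-image resolution and frame rate. All numbers are assumptions.

def required_bandwidth_mbps(width: int, height: int, fps: int,
                            bits_per_pixel: int = 8) -> float:
    """Uncompressed video bandwidth in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

# Assumed usable share of a USB 2.0 link (~480 Mb/s nominal); hypothetical value.
USB_BUDGET_MBPS = 300.0

for width, height, fps in [(640, 480, 120),    # moderate resolution and rate
                           (1280, 960, 120),   # higher resolution, same rate
                           (640, 480, 500)]:   # same resolution, higher rate
    need = required_bandwidth_mbps(width, height, fps)
    status = "fits" if need <= USB_BUDGET_MBPS else "exceeds budget"
    print(f"{width}x{height} @ {fps} Hz -> {need:7.1f} Mb/s ({status})")
```

With these assumed values, only the moderate configuration fits; increasing either spatial resolution or sampling rate alone pushes the required bandwidth well past the budget.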
Funding: Meta Reality Labs and R15EY031090