Abstract
Objects are fundamental units of perception that structure our experience of space. A striking finding about object representation is that objecthood warps our perception of spatial distance: two dots perceived within an object appear farther apart than two dots perceived in empty space, an illusion known as “Object-based Warping” (Vickery & Chun, 2010). However, just as objects are fundamental to our experience of space, events are fundamental to our experience of time. Does spatial object-based warping have a temporal, event-based counterpart? Here, we show that it does: Just as two dots in an *object* are perceived as farther apart in *space*, two probes within an *event* are perceived as further apart in *time*, a phenomenon we introduce as “Event-based Warping.” In our first experiment, subjects judged the duration between two auditory probes (i.e., two tones) relative to a standard reference duration. There were two types of trials: In event trials, the probes were presented during an auditory event (a brief period of noise in an otherwise-silent soundtrack); in non-event trials, the probes were not presented during any auditory event but simply played in silence. Subjects judged probes within an event to be further apart in time than probes not within an event, showing that event representations warp perceived duration. Crucially, we further demonstrate that this temporal warping extends to vision: In a second, cross-modal experiment, observers judged the duration between two visual probes (i.e., two flashes) presented either within or outside an auditory event. Again, subjects judged probes on event trials as further apart in time than probes on non-event trials, showing that event-based warping occurs cross-modally. We suggest that object-based warping and event-based warping are instances of a more general phenomenon in which structural representations give rise to perceptual distortions.