Abstract
Experience unfolds as a stream of particular sensory events. Yet from such unstructured and specific input, humans build structured and generalizable representations such as algebraic rules (Marcus, G. F., Vijayan, S., Rao, B., & Vishton, P. (1999). Rule learning by seven-month-old infants. Science, 283(5398), 77–80). In Experiment 1, we probed several properties of learning in a similar scenario, in which participants viewed continuous streams of events with no instruction to find regularities. Events were visual changes of state (e.g., flashes of light, streams of bubbles) linked by weaker or stronger pairwise transition probabilities. We asked whether learners would be sensitive to directionality differences among pairwise relationships (AB vs. BA) and, additionally, whether they would interpret such asymmetrical predictive relations as causal. We found evidence of directionality sensitivity using a two-alternative forced-choice (2AFC) task (t(18) = 4.26, p < 0.001). Participants who were accurately aware of the predictive relationships also attributed causality to them when probed post-task (t(10) = 3.13, p = 0.01). This supports the idea that spontaneous sensitivity to event statistics can yield structured, and even causal, representations without any instruction to look for them. In Experiment 2, we investigated whether such event statistics could be used to construct novel categories of objects. Events took place around different novel objects, which sometimes moved. In the presence of each object, the event statistics varied: an object's movements either followed or preceded another event (e.g., a light flash), while remaining unrelated to other, equally frequent events. Thus, objects differed purely in the direction of their statistical contingency with a particular event. Participants reliably classified a further novel object according to this event structure, controlling for physical shape (binomial test, p = 0.005). We suggest that sensitivity to such event statistics can support the acquisition of functional categories of objects.
Meeting abstract presented at VSS 2017