Abstract
Throughout our lives, we accumulate defining perceptual characteristics about objects in our environment, such as their typical colours. There is debate about the degree to which representations of perceptual features involve neural mechanisms similar to those engaged when perceiving those features. Here, we explore this question using typical colour. Participants completed a target detection task using real and implied colour stimuli. In the real colour condition, we used isoluminant red and green shapes. In the implied colour condition, the stimulus set consisted of grey-scale, luminance- and size-matched pictures of fruits and vegetables which are typically red (e.g., tomatoes) or green (e.g., cucumbers). We recorded brain activity using magnetoencephalography (MEG) and applied multivariate pattern analysis (MVPA) to investigate the temporal dynamics of real and implied colour processing. We show that (1) real colour objects can be reliably classified into red and green categories on the basis of MEG data, (2) trials showing grey-scale pictures of fruits and vegetables can similarly be classified into their natural colour categories, and (3) implied object-colour can be decoded when a classifier is trained on distinguishing real colour, but only with a delay of ~57 ms. This delay suggests slower processing of implied colours compared with real colours. We tested and confirmed this prediction with a follow-up behavioural experiment, showing that participants are faster at judging whether an abstract shape is red or green than at indicating the typical colour of a grey-scale object. Together, these results demonstrate that activating object-colour knowledge and perceiving colour evoke a common brain activation pattern, but that the key difference lies in the time that it takes to access this common colour representation.
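The cross-decoding analysis described above (train a classifier to distinguish real colours, then test it on implied-colour trials at every timepoint) can be sketched on simulated data. Everything below is illustrative only: the data are synthetic, a minimal nearest-centroid classifier stands in for the MVPA pipeline actually used in the study, and all names (`simulate`, `gen`, the onset values) are our own assumptions, not details from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_sensors, n_times = 80, 20, 50   # trials x MEG sensors x timepoints

# One multivariate "colour" pattern shared by both conditions, so a classifier
# trained on one condition can in principle transfer to the other.
pattern = rng.normal(size=n_sensors)

def simulate(onset):
    """Simulate trials whose colour signal is present from `onset` onwards."""
    y = rng.integers(0, 2, n_trials)                    # 0 = red, 1 = green
    X = rng.normal(size=(n_trials, n_sensors, n_times)) # sensor noise
    for i, label in enumerate(y):
        X[i, :, onset:] += (1 if label else -1) * pattern[:, None]
    return X, y

X_real, y_real = simulate(onset=10)   # real colour: early signal
X_impl, y_impl = simulate(onset=15)   # implied colour: delayed signal

def centroids(X, y):
    """Fit a minimal nearest-centroid classifier (one mean per class)."""
    return X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)

def accuracy(c_red, c_green, X, y):
    pred = (np.linalg.norm(X - c_green, axis=1) <
            np.linalg.norm(X - c_red, axis=1)).astype(int)
    return float((pred == y).mean())

# Temporal generalization: train at each "real" timepoint, test at each
# "implied" timepoint. Above-chance cells with t_test > t_train correspond
# to delayed access to the shared colour representation.
gen = np.zeros((n_times, n_times))
for t_train in range(n_times):
    c_red, c_green = centroids(X_real[:, :, t_train], y_real)
    for t_test in range(n_times):
        gen[t_train, t_test] = accuracy(c_red, c_green,
                                        X_impl[:, :, t_test], y_impl)

print(gen[10, 15])  # train at the real onset, test at the implied onset
```

In this toy setup, decoding succeeds when training at the real-colour onset and testing at the later implied-colour onset, while testing before the implied onset stays at chance, mirroring the off-diagonal generalization pattern implied by the ~57 ms delay reported in the abstract.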
Meeting abstract presented at VSS 2018