Vision Sciences Society Annual Meeting Abstract  |  September 2024
Neural Responses to Natural Versus AI-generated Affective Images
Author Affiliations
  • Yujun Chen, University of Florida
  • Faith Gilbert, University of Florida
  • Ethan Smith, University of Florida
  • Ruogu Fang, University of Florida
  • Andreas Keil, University of Florida
  • Mingzhou Ding, University of Florida
Journal of Vision September 2024, Vol.24, 473. doi:https://doi.org/10.1167/jov.24.10.473
Citation:
      Yujun Chen, Faith Gilbert, Ethan Smith, Ruogu Fang, Andreas Keil, Mingzhou Ding; Neural Responses to Natural Versus AI-generated Affective Images. Journal of Vision 2024;24(10):473. https://doi.org/10.1167/jov.24.10.473.

© ARVO (1962-2015); The Authors (2016-present)

Abstract

The International Affective Picture System (IAPS) contains 1,182 well-characterized photographs depicting natural scenes that vary in affective content. These pictures are used extensively in affective neuroscience to investigate the neural correlates of emotional processing. Recently, in an effort to augment this dataset, we have begun generating synthetic emotional images by applying diffusion-based AI models to IAPS pictures. The goal of this study is to compare neural responses to IAPS pictures and matched AI-generated images. The stimulus set consisted of 60 IAPS pictures (20 pleasant, 20 neutral, 20 unpleasant) and 60 matched AI-generated images (20 pleasant, 20 neutral, 20 unpleasant). In each recording session, 30 IAPS pictures and 30 matched AI-generated images were presented in random order; each image was displayed for 3 seconds, with successive images separated by an interval of 2.8 to 3.5 seconds. Each experiment consisted of 10 recording sessions. The fMRI data were recorded on a 3T Siemens Prisma scanner, and pupil responses to image presentation were monitored with an MRI-compatible eyetracker. Our preliminary analysis of the fMRI data (N=3) showed that IAPS pictures and matched AI-generated images evoked similar neural responses in visual cortex. In particular, multivariate pattern analysis (MVPA) classifiers trained to decode emotional categories from neural responses to IAPS pictures could decode emotional categories from neural responses to AI-generated images, and vice versa. Efforts are underway to confirm these findings by recruiting additional participants, and the analysis is being expanded to compare additional measures, including functional connectivity and pupillometry.
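The cross-decoding logic described above can be illustrated with a minimal sketch: train a classifier on response patterns to one stimulus type and test it on the other. The data here are simulated, and the classifier choice (a regularized logistic regression on standardized voxel patterns) is an assumption for illustration only; the abstract does not specify the study's actual MVPA pipeline.

```python
# Minimal cross-decoding sketch with SIMULATED voxel patterns
# (hypothetical data; not the study's actual pipeline).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_per_class, n_voxels = 20, 100              # 20 images per emotion category
labels = np.repeat([0, 1, 2], n_per_class)   # pleasant / neutral / unpleasant

# Each category gets a distinct mean pattern shared across the two
# stimulus conditions; trial-level noise is added independently.
means = rng.normal(0.0, 1.0, (3, n_voxels))
X_iaps = means[labels] + rng.normal(0.0, 1.0, (60, n_voxels))
X_ai = means[labels] + rng.normal(0.0, 1.0, (60, n_voxels))

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Train on responses to "IAPS" patterns, test on "AI-generated" patterns...
clf.fit(X_iaps, labels)
acc_iaps_to_ai = clf.score(X_ai, labels)

# ...and vice versa.
clf.fit(X_ai, labels)
acc_ai_to_iaps = clf.score(X_iaps, labels)

print(f"IAPS->AI accuracy: {acc_iaps_to_ai:.2f} (chance = 0.33)")
print(f"AI->IAPS accuracy: {acc_ai_to_iaps:.2f}")
```

If category information generalizes across stimulus types, as the abstract reports, both transfer accuracies should exceed the three-way chance level of 0.33.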
