October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract | October 2020
Visual long-term memory for image style
Author Affiliations & Notes
  • Yana Yu
    Kyoto University
  • Yuki Takeda
    Kyoto University
  • Hiroyuki Tsuda
    National Institute of Advanced Industrial Science and Technology
  • Jun Saiki
    Kyoto University
  • Footnotes
Acknowledgements: This research was supported by JSPS KAKENHI Grant Number 18H05006.
Journal of Vision October 2020, Vol.20, 641. https://doi.org/10.1167/jov.20.11.641

      Yana Yu, Yuki Takeda, Hiroyuki Tsuda, Jun Saiki; Visual long-term memory for image style. Journal of Vision 2020;20(11):641. https://doi.org/10.1167/jov.20.11.641.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Previous studies have shown systematic bias in memory for visual information such as color, orientation, and location. For instance, a study on visual working memory for color using a method of adjustment found that responses drawn from working memory are significantly biased away from category boundaries and toward category centers (Bae, Olkkonen, Allred, & Flombaum, 2015). However, it is unclear whether systematic bias also occurs in memory for high-level visual features, such as the style of an image. Here we investigated long-term memory for image style using a method of adjustment. A deep-learning-based style transfer algorithm (Gatys, Ecker, & Bethge, 2016) was used to convert scene photographs into painting-like scene images. Based on the painting styles of four well-known artworks (by Rembrandt, Braque, Monet, and Kandinsky), we created image series (72 image styles for each of 72 scenes) in which the style of the painting changes continuously. The experiment consisted of a learning phase and a test phase separated by a 30-minute interval. In the learning phase, participants viewed 72 scene images, each shown for 10 s, and were asked to memorize the content and style of every image. In the test phase, the same set of scenes was shown in styles different from those in the learning phase, and participants adjusted the image style to reproduce the one that had appeared in the learning phase. Response frequency data show that more responses were made for near-prototypical styles than for mixed styles. This bias differs from the one we found in a previous short-term memory experiment on image style, which showed fewer responses for near-prototypical styles and more responses for mixed-style stimuli. Taken together, these results suggest that systematic bias also exists in memory for high-level features such as image style.
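
The abstract does not specify how the 72-step style continuum was parameterized or how adjustment responses were scored. As a minimal, purely illustrative sketch, one way to build such a continuum is to place the four anchor styles on a circular "style wheel" and linearly blend style weights between neighbouring anchors; the wrap-around layout, the 18 steps per segment, and the function names below are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a circular style continuum through four anchor styles.
# Assumptions (not from the abstract): 18 interpolation steps between consecutive
# anchors, a wrap-around arrangement, and linear blending of style weights that
# could feed a Gatys-style multi-style transfer loss.

N_STYLES = 72
ANCHORS = ["Rembrandt", "Braque", "Monet", "Kandinsky"]
STEPS_PER_SEGMENT = N_STYLES // len(ANCHORS)  # 18 steps per anchor-to-anchor segment


def style_weights(index: int) -> dict:
    """Relative style weights for position `index` on the 72-step style wheel."""
    segment, step = divmod(index % N_STYLES, STEPS_PER_SEGMENT)
    a = ANCHORS[segment]
    b = ANCHORS[(segment + 1) % len(ANCHORS)]
    t = step / STEPS_PER_SEGMENT  # 0 at anchor `a`, approaching 1 near anchor `b`
    weights = {name: 0.0 for name in ANCHORS}
    weights[a] = 1.0 - t
    weights[b] = t
    return weights


def adjustment_error(response: int, target: int, n: int = N_STYLES) -> int:
    """Signed shortest distance on the style wheel, as in circular method-of-adjustment analyses."""
    d = (response - target) % n
    return d - n if d > n // 2 else d


if __name__ == "__main__":
    print(style_weights(9))         # halfway between Rembrandt and Braque
    print(adjustment_error(70, 2))  # -4: response four steps short of the target
```

Under this kind of parameterization, "near-prototypical" styles would correspond to indices close to an anchor (one blend weight near 1), and "mixed" styles to indices near the midpoint between two anchors, which is how the response-frequency contrast in the abstract could be operationalized.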
