Abstract
The generation of genuinely creative works of art could be considered the final frontier in artificial intelligence (AI). Several AI research groups are pursuing this goal by programming algorithms that generate works in various media and styles. The ultimate test of success for such a machine artist would be to convince onlookers that its works were created by a human being: an artistic Turing test. Previous research has shown that observers can distinguish artworks created by a skilled human artist from those of a child or an animal through the perception of intentionality, that is, the appearance of a planned final product (Hawley-Dolan & Winner, 2011). However, there is little research on whether observers can differentiate between computer-generated and human-made art, and if so, whether these judgments are driven by impressions of intentionality or by surface characteristics that identify the mode of production. Furthermore, the decision that an artwork is computer-generated may reflect a negative aesthetic preference, as research has suggested that believing an artwork or musical composition to be computer-generated negatively affects its aesthetic appraisal (Kirk et al., 2009; Moffat & Kelly, 2006). The current study examined whether individuals were able to differentiate works of art generated by computer algorithms from matched artworks created by human artists. Participants classified artworks as computer- or human-generated and indicated their aesthetic preference on a 7-point Likert scale. Results show that participants successfully determined the provenance of the artworks. Perception of intentionality and surface characteristics, as well as a subset of image statistics (Pyramid of Histograms of Orientation Gradients (PHOG), luminance spectra), were also investigated in relation to source decision criteria.
The results have implications for the way in which AI algorithmic art is created, and provide insights into the aesthetic perception of such works.
Meeting abstract presented at VSS 2015