Jennifer Bittner; Classification images of multispectral and fused natural scenes . Journal of Vision 2016;16(12):318. https://doi.org/10.1167/16.12.318.
Image fusion is a type of image manipulation that aims to enhance human visual perception and performance. By combining images captured in different spectral bands (e.g., visible, thermal, night vision), image fusion attempts to provide a "best of both worlds" presentation within a single output image. Past studies using ideal observer analysis to examine the impact of image fusion on visual perception (i.e., Bittner et al., VSS 2014) have found that, contrary to the goals of fusion, simple Landolt C images captured in single spectral bands can affect human efficiency as much as, or more than, their corresponding fused images. The current work expands this examination in technique and stimulus complexity, using response classification to measure the proportion of information (i.e., stimulus features) that humans use in natural scenes as those scenes are manipulated between single-band and fused presentations. Classification image results from a simple one-of-two choice task, in which an actor holds a shovel on one of two opposing sides of his body, reveal that observers use different areas of the stimulus depending on the type of imagery presented. Specifically, the areas used in the fused images parallel those used in the component thermal imagery but not those used in the component visible imagery. Initial results of a second study indicate that these patterns shift when the task and stimulus content change. By applying response classification to image fusion, this work not only examines the stimulus components the visual system uses in natural scenes, but also bridges applied and vision science research, expanding the general framework for analyzing the influence of image enhancement in direct relation to the human visual system.
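The response classification technique referenced above can be illustrated with a generic, Ahumada-style classification image computation. The sketch below is a minimal simulation under assumed parameters (a 16x16 stimulus, a template observer, and made-up trial counts); it is not the authors' actual stimuli, observers, or procedure, only the standard logic of averaging the noise fields that accompanied each stimulus/response pairing.

```python
import numpy as np

rng = np.random.default_rng(0)
size = 16          # stimulus side length in pixels (illustrative)
n_trials = 4000    # made-up trial count
contrast = 0.05    # weak signal so responses remain partly noise-driven

# Hypothetical signals: "A" fills the left half, "B" the right half.
signal_a = np.zeros((size, size)); signal_a[:, :size // 2] = 1.0
signal_b = np.zeros((size, size)); signal_b[:, size // 2:] = 1.0
template = signal_a - signal_b     # simulated observer's decision template

noises = np.empty((n_trials, size, size))
stims = rng.integers(0, 2, n_trials)          # 0 = A shown, 1 = B shown
resps = np.empty(n_trials, dtype=int)
for t in range(n_trials):
    noises[t] = rng.normal(0.0, 1.0, (size, size))
    image = contrast * (signal_a if stims[t] == 0 else signal_b) + noises[t]
    resps[t] = 0 if np.sum(image * template) > 0 else 1   # 0 = "A", 1 = "B"

def mean_noise(s, r):
    """Mean noise field over trials where stimulus s drew response r."""
    return noises[(stims == s) & (resps == r)].mean(axis=0)

# Ahumada-style combination: noise fields that accompanied "A" responses
# minus those that accompanied "B" responses, pooled over both stimuli.
ci = (mean_noise(0, 0) + mean_noise(1, 0)) - (mean_noise(0, 1) + mean_noise(1, 1))
# The classification image recovers the regions the observer used:
# here, positive on the left half and negative on the right half.
```

The resulting map highlights which stimulus regions drove the observer's choices, which is how the study above compares the areas used across visible, thermal, and fused presentations.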
Meeting abstract presented at VSS 2016