Vision Sciences Society Annual Meeting Abstract | May 2008
Modeling interactions between visually-responsive and movement-related neurons in FEF during saccade visual search
Author Affiliations
  • Braden A. Purcell
    Department of Psychology, Vanderbilt University
  • Richard P. Heitz
    Department of Psychology, Vanderbilt University
  • Jeremiah Y. Cohen
    Department of Psychology, Vanderbilt University, and Vanderbilt Brain Institute
  • Gordon D. Logan
    Department of Psychology, Vanderbilt University, and Center for Integrative & Cognitive Neuroscience
  • Jeffrey D. Schall
    Department of Psychology, Vanderbilt University, and Center for Integrative & Cognitive Neuroscience
  • Thomas J. Palmeri
    Department of Psychology, Vanderbilt University, and Center for Integrative & Cognitive Neuroscience
Journal of Vision May 2008, Vol.8, 1080. doi:https://doi.org/10.1167/8.6.1080
Abstract

Neural activity in the frontal eye field (FEF) has been implicated in mapping visual information about object importance to explicit decisions about where to move the eyes (Schall, 2001). These components seem to be represented by two classes of neurons: visual neurons respond differentially according to the relevance of visual stimuli, and movement neurons exhibit premotor activity before saccadic eye movements. Boucher et al. (2007) accounted for the behavior of movement cells as accumulators rising to a threshold, but ignored the activity of visual cells. We evaluated the hypothesis that movement cell activity can be accounted for in terms of input from visual cells. Single-unit neurophysiological data were recorded from visual neurons in the FEF of awake, behaving macaque monkeys performing a visual search task. For a given cell, trials were classified by whether the target or a distractor appeared in the cell's receptive field. Simulations sampled recorded activity from these two populations of trials. This activity served as input to algorithms designed to account for the temporal dynamics of movement cell activity, as well as for reaction time distributions and accuracy. At present, two models have been evaluated. The first model instantiates the hypothesis that movement cells compute the difference in activity between visual cells with the target in their receptive field and cells with a distractor in their receptive field. The second model instantiates the hypothesis that movement cells integrate this difference over time. In both cases, reaction time is defined as the time at which this measure derived from visual activity crosses a threshold. Preliminary analyses indicate that the latter model provides a better quantitative account of reaction time distributions. However, further model exploration is necessary to determine whether either of these models provides the best account of not only the behavioral data but the neurophysiological data as well.
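To make the distinction between the two models concrete, the sketch below illustrates the general idea under stated assumptions: movement-cell drive is taken either as the instantaneous difference between target-in-receptive-field and distractor-in-receptive-field visual activity (model 1) or as that difference integrated over time (model 2), with predicted reaction time defined by the first threshold crossing. The function name, the fabricated ramping traces, and the threshold values are hypothetical illustrations and are not taken from the authors' implementation.

```python
import numpy as np

def predict_rt(v_target, v_distractor, threshold, dt=1.0, integrate=True):
    """Hypothetical sketch of the two model classes described in the abstract.

    v_target / v_distractor: arrays of visually-responsive FEF activity
    (e.g., spike density, arbitrary units) on trials with the target vs.
    a distractor in the cell's receptive field.
    integrate=False -> model 1: movement activity is the instantaneous
    difference in visual activity.
    integrate=True  -> model 2: movement activity is that difference
    integrated (accumulated) over time.
    Returns the first time the movement signal reaches threshold,
    or None if it never does.
    """
    diff = np.asarray(v_target) - np.asarray(v_distractor)
    drive = np.cumsum(diff) * dt if integrate else diff
    crossings = np.nonzero(drive >= threshold)[0]
    return crossings[0] * dt if crossings.size else None

# Toy usage with fabricated ramping traces (illustration only).
t = np.arange(400.0)                           # ms after search-array onset
v_tgt = 0.05 * np.clip(t - 50.0, 0.0, None)    # target-in-RF activity ramps steeply
v_dst = 0.02 * np.clip(t - 50.0, 0.0, None)    # distractor-in-RF activity ramps less
rt_integrated = predict_rt(v_tgt, v_dst, threshold=300.0, integrate=True)
rt_difference = predict_rt(v_tgt, v_dst, threshold=5.0, integrate=False)
```

In this toy example both models predict a reaction time, but they differ in how sensitive that prediction is to momentary fluctuations in visual activity; integration smooths the input, which is consistent with the preliminary finding that the integrated-difference model better captures reaction time distributions.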

Purcell, B. A., Heitz, R. P., Cohen, J. Y., Logan, G. D., Schall, J. D., & Palmeri, T. J. (2008). Modeling interactions between visually-responsive and movement-related neurons in FEF during saccade visual search [Abstract]. Journal of Vision, 8(6):1080, 1080a, http://journalofvision.org/8/6/1080/, doi:10.1167/8.6.1080.
Footnotes
 Supported by AFOSR, NSF SBE-0542013, NEI R01-EY08890, P30-EY08126, VU ACCRE and Ingram Chair of Neuroscience.