Bret Fortenberry, Anatoli Gorchetchnikov, Stephen Grossberg; Computing head direction from interacting visual and vestibular cues during visually-based navigation in the rat. Journal of Vision 2009;9(8):1119. doi: https://doi.org/10.1167/9.8.1119.
Visually-based navigation depends upon reliable estimates of head direction (HD). Visual and path integration cues combine for this purpose in a brain system that includes the dorsal tegmental nucleus, lateral mammillary nuclei, anterior dorsal thalamic nucleus, and postsubiculum. Learning is needed to combine such different cues into the most reliable estimates of HD. A neural model is developed to explain how these cues combine adaptively to generate a consistent and reliable HD estimate, in both light and darkness, that explains the following types of data: Each HD cell is tuned to a preferred head orientation (Taube et al. 1990; Sharp et al. 1995; Taube 1995). The cell's firing rate is maximal at the preferred direction and decreases as the head turns away from it. The preferred direction is controlled by the vestibular system when visual cues are not available. However, a well-established visual cue will control the preferred direction when the cue is in the animal's field of view (Taube 1995; Zugaro et al. 2001; Zugaro et al. 2004). Distal visual cues are more effective than proximal cues for controlling the preferred direction (Zugaro et al. 2001). Novel cues, introduced in either a novel or a familiar environment, can gain control over a cell's preferred direction within minutes (Goodridge et al. 1998). Turning out the lights or removing all familiar cues does not change the cell's firing activity, but it may cause the cell's preferred direction to drift (Taube et al. 1990; Goodridge et al. 1998).
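As a rough illustration of the tuning property described above (firing maximal at the preferred direction and falling off as the head turns away), HD cell tuning curves are commonly approximated by a circular bell-shaped function such as a von Mises curve. The sketch below is a minimal model of that shape only; the peak rate and tuning width chosen here are illustrative assumptions, not parameters from the cited studies or from the model in this abstract.

```python
import math

def hd_tuning(theta, preferred, peak_rate=40.0, kappa=4.0):
    """Von Mises-style HD tuning curve (angles in radians).

    Firing rate is maximal (peak_rate) when theta equals the preferred
    direction and decreases smoothly as the head turns away from it.
    peak_rate and kappa (tuning sharpness) are illustrative values.
    """
    return peak_rate * math.exp(kappa * (math.cos(theta - preferred) - 1.0))

# Rate at the preferred direction equals the peak rate;
# rates away from it are strictly lower.
at_preferred = hd_tuning(math.pi / 2, math.pi / 2)
away = hd_tuning(math.pi, math.pi / 2)
```

Because the curve depends only on the angular difference `theta - preferred` through a cosine, it wraps correctly around the circle, which matches the circular nature of head direction.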