Abstract
While primates are primarily visual animals, how visual information is processed on its way to memory structures and contributes to the generation of visuospatial behaviors is poorly understood. Recent imaging data demonstrate the existence of scene-sensitive areas in the dorsal visual path that are likely to combine visual information from successive egocentric views, while behavioral evidence indicates that surrounding visual space is remembered in extraretinal coordinates. The present work focuses on the computational nature of a panoramic representation that is proposed to link visual and mnemonic functions during natural behavior. In a spiking neural network model of the dorsal visual path, it is shown how time-integration of spatial views can give rise to such a representation and how it can subsequently be used to perform memory-based spatial reorientation and visual search. More generally, the model predicts a common role of view-based allocentric memory storage in spatial and non-spatial mnemonic behaviors.
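The core idea, that successive egocentric views can be time-integrated into a single allocentric panoramic memory and later read out for reorientation, can be illustrated with a minimal rate-based sketch. This is not the paper's spiking model; the bin resolution, field of view, leaky update rule, and readout by circular matching are illustrative assumptions.

```python
"""Illustrative sketch (assumptions, not the paper's spiking network):
egocentric views tagged with a heading estimate are leak-integrated into a
world-anchored panoramic buffer; reorientation matches a single view
against the stored panorama."""
import numpy as np

N_BINS = 360   # assumed 1-degree bins around the full panorama
FOV = 90       # assumed field of view of one egocentric view, in degrees
DECAY = 0.99   # assumed leak so stale views slowly fade

panorama = np.zeros(N_BINS)  # allocentric (world-centred) visual memory

def integrate_view(view: np.ndarray, heading_deg: float) -> None:
    """Rotate the current egocentric view by the heading estimate and
    leak-integrate it into the panoramic buffer."""
    global panorama
    panorama *= DECAY                               # forget slowly
    start = int(round(heading_deg - FOV / 2))       # where the view lands in world angles
    idx = np.arange(start, start + FOV) % N_BINS
    panorama[idx] += view                           # accumulate evidence allocentrically

def reorient(current_view: np.ndarray) -> float:
    """Memory-based reorientation: return the heading whose stored panorama
    segment best matches the current egocentric view."""
    scores = [np.dot(current_view,
                     panorama[np.arange(h - FOV // 2, h + FOV // 2) % N_BINS])
              for h in range(N_BINS)]
    return float(np.argmax(scores))

# Example: sweep the heading to build the panorama, then recover an unknown heading.
rng = np.random.default_rng(0)
world = rng.random(N_BINS)                          # fixed surrounding scene
for h in range(0, 360, 10):                         # agent turns in place, sampling views
    integrate_view(world[np.arange(h - 45, h + 45) % N_BINS], heading_deg=h)
print("recovered heading:", reorient(world[np.arange(75 - 45, 75 + 45) % N_BINS]))
```

Under these assumptions, the recovered heading is close to the true 75 degrees, showing how a view-based allocentric store can support reorientation from a single glimpse.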
Publisher
Cold Spring Harbor Laboratory