Affiliation:
1. University of Sussex, Brighton, UK
Abstract
Many insects use view-based navigation, or snapshot matching, to return to familiar locations or to navigate routes. This relies on egocentric memories being matched to current views of the world. Previous snapshot navigation algorithms have used full panoramic vision to compare memorised images with query images and establish a measure of familiarity, from which the heading direction at the time the snapshot was taken can be recovered. Many aspects of insect sensory systems are lateralised, with steering derived from the comparison of left and right signals, as in a classic Braitenberg vehicle. Here, we investigate whether view-based route navigation can be implemented using bilateral visual familiarity comparisons. We found that the difference in familiarity between estimates from the left and right fields of view can be used as a steering signal to recover the original heading direction. This finding holds across a wide range of field-of-view sizes and visual resolutions. In insects, steering computations are implemented in a brain region called the Lateral Accessory Lobe (LAL), closely associated with the Central Complex. In a simple simulation, we use a spiking neural network (SNN) model of the LAL to provide an existence proof of how bilateral visual familiarity could drive a search for a visually defined goal.
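The core idea, the left/right familiarity difference acting as a Braitenberg-style steering signal, can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' implementation: the familiarity measure (negative root-mean-square image difference to the nearest stored snapshot), the way the left and right fields of view are windowed out of the panorama, and all function and parameter names are hypothetical choices made for the sketch.

```python
import numpy as np

def familiarity(view, snapshots):
    """Assumed familiarity measure: negative of the lowest RMS pixel
    difference between the current view and any stored snapshot region."""
    errors = [np.sqrt(np.mean((view - s) ** 2)) for s in snapshots]
    return -min(errors)

def bilateral_steering(panorama, snapshots, fov_deg=90, gain=1.0):
    """Braitenberg-style steering signal from left/right familiarity.

    `panorama` is a 2-D array (rows x columns) whose columns span 360
    degrees of azimuth, with the current heading at the centre column.
    The left and right fields of view are `fov_deg`-wide windows taken
    either side of the heading. Returns a turn command whose sign
    (positive = turn towards the left hemifield) is a convention
    assumed here for illustration.
    """
    n_cols = panorama.shape[1]
    centre = n_cols // 2
    half = int(n_cols * fov_deg / 360)
    left = panorama[:, centre - half:centre]
    right = panorama[:, centre:centre + half]
    f_left = familiarity(left, [s[:, centre - half:centre] for s in snapshots])
    f_right = familiarity(right, [s[:, centre:centre + half] for s in snapshots])
    # Steer towards the more familiar hemifield; equal familiarity -> hold course.
    return gain * (f_left - f_right)
```

Read as a Braitenberg vehicle, each hemifield's familiarity plays the role of a sensor driving the contralateral turn tendency, so the agent rotates until the two signals balance, i.e. until it faces the heading from which the snapshots were stored.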
Funder
EPSRC and activeAI
European Union’s Horizon 2020 Research and Innovation Program
Cited by
1 article.