Authors:
Sezener, Eren; Grabska-Barwińska, Agnieszka; Kostadinov, Dimitar; Beau, Maxime; Krishnagopal, Sanjukta; Budden, David; Hutter, Marcus; Veness, Joel; Botvinick, Matthew; Clopath, Claudia; Häusser, Michael; Latham, Peter E.
Abstract
The dominant view in neuroscience is that changes in synaptic weights underlie learning. It is unclear, however, how the brain is able to determine which synapses should change, and by how much. This uncertainty stands in sharp contrast to deep learning, where changes in weights are explicitly engineered to optimize performance. However, the main tool for doing that, backpropagation, is not biologically plausible, and networks trained with this rule tend to forget old tasks when learning new ones. Here we introduce the Dendritic Gated Network (DGN), a variant of the Gated Linear Network [1, 2], which offers a biologically plausible alternative to backpropagation. DGNs combine dendritic “gating” (whereby interneurons target dendrites to shape neuronal response) with local learning rules to yield provably efficient performance. They are significantly more data efficient than conventional artificial networks, are highly resistant to forgetting, and perform well on a variety of tasks, in some cases better than backpropagation. The DGN bears similarities to the cerebellum, where there is evidence for shaping of Purkinje cell responses by interneurons. It also makes several experimental predictions, one of which we validate with in vivo cerebellar imaging of mice performing a motor task.
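Since the abstract describes the mechanism only at a high level, the following is a minimal NumPy sketch of the gated-linear-network idea underlying the DGN, for intuition only: each neuron keeps one weight vector per gating context, a random halfspace test on the raw input selects the context (a stand-in for interneuron-driven dendritic gating), and every neuron updates its own weights locally against the shared target, with no backpropagated error. The class name GLNLayer, the hyperparameters, and the toy XOR task are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def logit(p):
    p = np.clip(p, 1e-4, 1 - 1e-4)   # keep logits finite
    return np.log(p / (1 - p))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GLNLayer:
    """One layer of gated linear neurons (illustrative sketch, not the paper's code)."""

    def __init__(self, n_in, n_out, n_gates, d_side):
        # Random hyperplanes implement the gating: the sign pattern of
        # n_gates halfspace tests on the raw input selects one of
        # 2**n_gates weight vectors per neuron.
        self.planes = rng.standard_normal((n_out, n_gates, d_side))
        self.weights = np.full((n_out, 2 ** n_gates, n_in), 1.0 / n_in)

    def forward(self, prev, side):
        bits = (self.planes @ side > 0).astype(int)      # (n_out, n_gates)
        ctx = bits @ (1 << np.arange(bits.shape[1]))     # context id per neuron
        w = self.weights[np.arange(len(ctx)), ctx]       # active weight vectors
        return sigmoid(w @ logit(prev)), ctx

    def update(self, prev, probs, ctx, target, lr=0.05):
        # Local rule: each neuron descends its own log loss against the
        # shared target; no error signal crosses layers (no backprop).
        grad = np.outer(probs - target, logit(prev))
        self.weights[np.arange(len(ctx)), ctx] -= lr * grad

# Toy usage: online learning of XOR-like labels from 2-D inputs.
X = rng.standard_normal((2000, 2))
y = ((X[:, 0] > 0) ^ (X[:, 1] > 0)).astype(float)

layers = [GLNLayer(2, 16, 4, 2), GLNLayer(16, 1, 4, 2)]
for x, t in zip(X, y):
    prev = sigmoid(x)                # squash raw features into (0, 1)
    for layer in layers:
        probs, ctx = layer.forward(prev, x)
        layer.update(prev, probs, ctx, t)
        prev = probs                 # final prev holds the output probability
```

The resistance to forgetting claimed in the abstract comes from the gating: different inputs activate different weight vectors, so updates driven by one task tend not to overwrite the weights another task relies on.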
Publisher:
Cold Spring Harbor Laboratory
References: 98 articles.
1. Veness, J. et al. Online learning with gated linear networks. arXiv preprint (2017).
2. Veness, J. et al. Gated linear networks. Proceedings of the AAAI Conference on Artificial Intelligence (2021).
3. Layered reward signalling through octopamine and dopamine in Drosophila.
4. A subset of dopamine neurons signals reward for odour memory in Drosophila.
5. Deep learning in neural networks: An overview.
Cited by: 14 articles.