Authors:
Nimrod Shaham, Jay Chandra, Gabriel Kreiman, Haim Sompolinsky
Abstract
Humans have the remarkable ability to continually store new memories while maintaining old memories for a lifetime. How the brain avoids catastrophic forgetting due to interference between encoded memories is an open problem in computational neuroscience. Here we present a model for continual learning in a recurrent neural network that combines Hebbian learning, synaptic decay, and a novel memory consolidation mechanism. Memories undergo stochastic rehearsals with rates proportional to the memory's basin of attraction, causing self-amplified consolidation. This gives rise to memory lifetimes that extend far beyond the synaptic decay time and to a capacity proportional to a power of the number of neurons. Perturbations to the circuit model cause temporally graded retrograde and anterograde deficits, mimicking observed memory impairments following neurological trauma.
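The abstract names three interacting ingredients: Hebbian storage in a recurrent network, synaptic decay, and stochastic rehearsal at rates tied to basin-of-attraction size. Below is a minimal, hypothetical sketch of how these ingredients could be combined in a Hopfield-like toy model; it is not the authors' implementation, and all parameter values, function names, and the crude basin estimator are assumptions made purely for illustration.

```python
# Illustrative sketch only: Hebbian encoding + synaptic decay + stochastic
# rehearsal whose probability scales with an estimated basin of attraction.
# All parameters and the basin estimator are hypothetical choices.
import numpy as np

rng = np.random.default_rng(0)
N = 100          # number of neurons (assumed)
decay = 0.98     # per-step synaptic decay factor (assumed)
eta = 1.0 / N    # Hebbian learning rate (assumed)
steps = 200      # simulation length (assumed)

W = np.zeros((N, N))
memories = []    # stored +/-1 patterns


def hebbian_update(W, xi, eta):
    """Outer-product Hebbian increment, no self-connections."""
    dW = eta * np.outer(xi, xi)
    np.fill_diagonal(dW, 0.0)
    return W + dW


def recall(W, x, iters=10):
    """Iterated sign updates from a cue (synchronous, for simplicity)."""
    for _ in range(iters):
        x = np.sign(W @ x)
        x[x == 0] = 1.0
    return x


def basin_size_estimate(W, xi, trials=4, flip_frac=0.2):
    """Fraction of noisy cues that converge back to the memory:
    a crude stand-in for the basin-of-attraction size."""
    hits = 0
    for _ in range(trials):
        cue = xi.copy()
        cue[rng.random(N) < flip_frac] *= -1.0
        if np.mean(recall(W, cue) == xi) > 0.95:
            hits += 1
    return hits / trials


for t in range(steps):
    # 1) Synaptic decay acts on every connection at every step.
    W *= decay

    # 2) Occasionally encode a new random memory via Hebbian learning.
    if t % 40 == 0:
        xi = rng.choice([-1.0, 1.0], size=N)
        memories.append(xi)
        W = hebbian_update(W, xi, eta)

    # 3) Stochastic rehearsal: each stored memory is re-encoded with a
    #    probability proportional to its current basin size, so strong
    #    memories self-amplify and outlive the raw decay timescale.
    for xi in memories:
        if rng.random() < 0.1 * basin_size_estimate(W, xi):
            W = hebbian_update(W, xi, eta)
```

In this toy version, the positive feedback loop (larger basin, more frequent rehearsal, larger basin) is what lets consolidated memories persist much longer than the decay constant alone would allow, which is the qualitative effect the abstract describes.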
Publisher
Cold Spring Harbor Laboratory
Cited by
3 articles.