Abstract
The retrieval capabilities of associative neural networks are known to be impaired by fast noise, which endows neuron behavior with some degree of stochasticity, and by slow noise, due to interference among stored memories. Here, we allow for another source of noise, referred to as "synaptic noise," which may stem from (i) corrupted information provided during learning, (ii) shortcomings occurring in the learning stage, or (iii) flaws occurring in the storing stage, and which accordingly affects the couplings among neurons. Indeed, we prove that this kind of noise can also lead to a breakdown of retrieval and that, just like for the slow noise, its effect can be softened by density, namely by allowing p-body interactions among neurons.
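As a rough illustration of the setting described in the abstract (not the authors' construction), the sketch below simulates retrieval in a dense, p-body Hopfield-style network whose Hebbian signal is corrupted by additive Gaussian noise standing in for the "synaptic noise." The parameters N, K, p, sigma_J and the placement of the noise at the level of the local field (rather than on each individual p-body coupling) are illustrative assumptions, not the paper's model.

```python
# Minimal sketch (assumed parameters, not the authors' code): a dense p-body
# Hopfield-style network whose Hebbian signal is corrupted by Gaussian
# "synaptic noise"; retrieval is gauged by the Mattis overlap with the cue.
import numpy as np

rng = np.random.default_rng(0)

N, K, p = 200, 5, 4          # neurons, stored patterns, interaction order (p-body)
sigma_J = 0.3                # strength of the synaptic noise (assumed value)
steps = 20                   # asynchronous sweeps

# Random binary patterns xi[mu, i] in {-1, +1}
xi = rng.choice([-1, 1], size=(K, N))

def local_field(s, i):
    """Field on neuron i from the dense Hebbian rule plus synaptic noise.

    Up to constant factors, the p-body Hebbian field can be written as
    h_i = sum_mu xi[mu, i] * m_mu**(p-1), with m_mu the overlap of the
    current state s with pattern mu; Gaussian noise mimics corrupted couplings.
    """
    m = xi @ s / N                       # overlaps with all stored patterns
    h = np.sum(xi[:, i] * m ** (p - 1))  # dense (p-body) Hebbian contribution
    return h + sigma_J * rng.standard_normal()

# Cue: a corrupted copy of pattern 0 (20% of spins flipped)
s = xi[0].copy()
flip = rng.random(N) < 0.2
s[flip] *= -1

for _ in range(steps):
    for i in rng.permutation(N):         # zero-temperature asynchronous dynamics
        s[i] = 1 if local_field(s, i) >= 0 else -1

overlap = xi[0] @ s / N
print(f"Mattis overlap with the cued pattern after dynamics: {overlap:.3f}")
```

An overlap close to 1 indicates successful retrieval; increasing sigma_J in this toy setting degrades the overlap, while raising p strengthens the signal term, in the spirit of the density argument made above.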
Funder: Sapienza Università di Roma
Publisher: Springer Science and Business Media LLC
Subject: General Physics and Astronomy
Cited by: 16 articles