Affiliations:
1. Department of Physiology, University of Bern
2. Kirchhoff-Institute for Physics, Heidelberg University
Abstract
In many normative theories of synaptic plasticity, weight updates implicitly depend on the chosen parametrization of the weights. This problem relates, for example, to neuronal morphology: synapses that are functionally equivalent in terms of their impact on somatic firing can differ substantially in spine size due to their different positions along the dendritic tree. Classical theories based on Euclidean-gradient descent can easily lead to inconsistencies due to such parametrization dependence. These issues are resolved in the framework of Riemannian geometry, in which we propose that plasticity instead follows natural-gradient descent. Under this hypothesis, we derive a synaptic learning rule for spiking neurons that combines functional efficiency with explanations of several well-documented biological phenomena such as dendritic democracy, multiplicative scaling, and heterosynaptic plasticity. We therefore suggest that in its search for functional synaptic plasticity, evolution might have come up with its own version of natural-gradient descent.
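The abstract contrasts Euclidean-gradient descent with natural-gradient descent, in which the raw gradient is preconditioned by the inverse Fisher information metric. As a minimal sketch of that distinction only (not the paper's spiking-neuron learning rule), the snippet below compares the two update rules on a toy linear-Gaussian neuron, where the Fisher matrix has a closed form; the model, variable names, and parameter values are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian "neuron": y ~ N(w . x, sigma^2).
# For this model the Fisher information matrix is F = E[x x^T] / sigma^2,
# so the natural gradient F^{-1} grad(L) can be computed exactly.
sigma = 1.0
n_samples, n_dim = 500, 3
# Anisotropic inputs: one strongly driven and one weakly driven direction,
# mimicking a parametrization in which coordinates are scaled unevenly.
X = rng.normal(size=(n_samples, n_dim)) * np.array([1.0, 5.0, 0.2])
w_true = np.array([0.5, -1.0, 2.0])
y = X @ w_true + sigma * rng.normal(size=n_samples)

def loss_grad(w):
    """Euclidean gradient of the mean squared error (the negative
    log-likelihood of the Gaussian model, up to a constant factor)."""
    return X.T @ (X @ w - y) / len(y)

# Empirical Fisher information of the linear-Gaussian model.
F = X.T @ X / (len(y) * sigma**2)

w_euc = np.zeros(n_dim)  # weights under Euclidean-gradient descent
w_nat = np.zeros(n_dim)  # weights under natural-gradient descent
eta = 0.05
for _ in range(200):
    w_euc -= eta * loss_grad(w_euc)                      # Euclidean update
    w_nat -= eta * np.linalg.solve(F, loss_grad(w_nat))  # natural update

print("Euclidean GD:", w_euc)
print("Natural GD:  ", w_nat)
print("True weights:", w_true)
```

Because the natural-gradient update rescales the gradient by the inverse Fisher metric, it converges at the same rate along all input directions and is invariant to how the weights are parametrized, whereas plain gradient descent stalls along the weakly driven direction; this invariance is the property the abstract appeals to.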
Funder
European Union 7th Framework Programme
Horizon 2020 Framework Programme
Swiss National Science Foundation
Manfred Stärk Foundation
Publisher
eLife Sciences Publications, Ltd
Subject
General Immunology and Microbiology, General Biochemistry, Genetics and Molecular Biology, General Medicine, General Neuroscience
Cited by
3 articles.