Abstract
In theory, neurons modelled as single-layer perceptrons can implement all linearly separable computations. In practice, however, these computations may require arbitrarily precise synaptic weights. This is a strong constraint, since both biological neurons and their artificial counterparts have to cope with limited precision. Here, we explore how non-linear processing in dendrites helps to overcome this constraint. We start by identifying a class of computations that requires increasing weight precision with the number of inputs in a perceptron, and show that a neuron with sub-linear dendritic subunits can implement it without this constraint. We then complement this analytical study with simulations of a biophysical neuron model with two passive dendrites and a soma, and show that it can implement this computation. This work demonstrates a new role of dendrites in neural computation: by distributing the computation across independent subunits, the same computation can be performed more efficiently with less precise tuning of the synaptic weights. We hope that this work not only offers new insight into the importance of dendrites for biological neurons, but also paves the way for new, more efficient architectures of artificial neuromorphic chips.

Author Summary

In theory, we know how much neurons can compute; in practice, the number of possible synaptic weight values limits their computational capacity. This limitation holds for both biological and artificial neurons. We introduce here a computation whose required weight precision grows significantly with the number of inputs, which poses a problem because neurons receive many thousands of inputs. We study how the neurons' receptive elements, called dendrites, can mitigate this problem. We show that, without dendrites, the largest synaptic weight needs to be multiple orders of magnitude larger than the smallest to implement the computation. Yet a neuron with dendrites implements the same computation with constant synaptic weights, whatever the number of inputs. This study paves the way for the use of dendritic neurons in a new generation of artificial neural networks and neuromorphic chips with a considerably better cost-benefit balance.
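The central contrast above (a perceptron whose weight range must grow with the number of inputs, versus sub-linear dendritic subunits with constant unit weights) can be made concrete with a short sketch. The abstract does not spell out the paper's actual construction, so the code below is only an illustration under our own assumptions: the perceptron half implements the classic comparison of two n-bit numbers, whose textbook weight assignment spans a factor of 2^(n-1) between the smallest and the largest weight, and the dendritic half models sub-linear subunits as saturating sums with unit synaptic weights. All names (perceptron, sublinear_neuron, theta_sat, theta_soma) are hypothetical.

```python
import itertools

def perceptron(x, w, theta):
    """Single-layer perceptron: fire iff the weighted input sum reaches theta."""
    return int(sum(wi * xi for wi, xi in zip(w, x)) >= theta)

def sublinear_neuron(x, groups, theta_sat, theta_soma):
    """Two-stage model: each dendritic subunit sums its inputs with unit
    weights and saturates at theta_sat (sub-linear integration); the soma
    fires iff the summed subunit outputs reach theta_soma."""
    subunit_outputs = [min(sum(x[i] for i in g), theta_sat) for g in groups]
    return int(sum(subunit_outputs) >= theta_soma)

# 1) A linearly separable computation that forces a wide weight range:
#    "is the n-bit number X at least as large as the n-bit number Y?"
#    The standard weight assignment is +/- 2^i, so the largest weight is
#    2^(n-1) times the smallest; the ratio grows exponentially with n.
n = 4
w = [2**i for i in range(n)] + [-(2**i) for i in range(n)]
for bits in itertools.product([0, 1], repeat=2 * n):
    x_bits, y_bits = bits[:n], bits[n:]
    x_val = sum(2**i * b for i, b in enumerate(x_bits))
    y_val = sum(2**i * b for i, b in enumerate(y_bits))
    assert perceptron(bits, w, 0) == int(x_val >= y_val)
print(f"comparing two {n}-bit numbers: largest weight = {max(w)}x the smallest")

# 2) Saturating subunits with constant unit weights: this neuron responds
#    to inputs scattered across both dendrites, but not to the same number
#    of inputs clustered on one dendrite, whatever the total input count.
groups = [range(0, 4), range(4, 8)]       # two dendrites, 4 synapses each
scattered = [1, 1, 0, 0, 1, 1, 0, 0]      # two active inputs per dendrite
clustered = [1, 1, 1, 1, 0, 0, 0, 0]      # four active inputs, one dendrite
assert sublinear_neuron(scattered, groups, theta_sat=2, theta_soma=4) == 1
assert sublinear_neuron(clustered, groups, theta_sat=2, theta_soma=4) == 0
```

Note that the dendritic model here only demonstrates the mechanism, saturation with constant weights, on a scatter-sensitive toy computation; reproducing the paper's specific linearly separable computation would require its full construction, which the abstract does not give.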
Publisher
Cold Spring Harbor Laboratory