Abstract
Mixed-signal neuromorphic computers often emulate some variant of the LIF neuron model. While, in theory, two-layer networks of these neurons are universal function approximators, single-layer networks consisting of slightly more complex neurons can, at the cost of universality, be more efficient. In this paper, we discuss a family of LIF neurons with passive dendrites. We provide rules that describe how input channels targeting different dendritic compartments interact, and test to what extent these interactions can be harnessed in a spiking neural network context. We find that a single layer of two-compartment neurons approximates some functions with smaller errors than similarly sized hidden-layer networks. Single-layer networks with three-compartment neurons can approximate functions such as XOR and four-quadrant multiplication well; adding more compartments only offers small improvements in accuracy. From the perspective of mixed-signal neuromorphic systems, our results suggest that only small modifications to the neuron circuit are necessary to construct more computationally powerful and energy-efficient systems that move more computation into the dendritic, analogue domain.
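To make the two-compartment case concrete, the sketch below simulates a LIF soma coupled to a single passive dendritic compartment with forward Euler integration. It is an illustrative assumption, not the authors' model or code: the function name, coupling scheme, and all parameter values (tau, g_couple, v_th, v_reset) are hypothetical placeholders chosen only to show how a passive compartment feeds current into a spiking soma.

```python
# Minimal sketch (assumed, not from the paper): a two-compartment LIF neuron
# with one passive dendrite, integrated with forward Euler. Parameter values
# and names are illustrative placeholders.
import numpy as np

def simulate_two_compartment_lif(i_dend, i_soma, dt=1e-4, tau=20e-3,
                                 g_couple=0.5, v_th=1.0, v_reset=0.0):
    """Return spike times given dendritic and somatic input current arrays."""
    v_d = 0.0  # passive dendritic compartment (no threshold, no spiking)
    v_s = 0.0  # somatic compartment (leaky integrate-and-fire)
    spikes = []
    for k in range(len(i_soma)):
        # Passive dendrite: leak, local input, and coupling toward the soma.
        dv_d = (-v_d + i_dend[k] + g_couple * (v_s - v_d)) / tau
        # Soma: leak, direct input, and current flowing in from the dendrite.
        dv_s = (-v_s + i_soma[k] + g_couple * (v_d - v_s)) / tau
        v_d += dt * dv_d
        v_s += dt * dv_s
        if v_s >= v_th:            # threshold crossing: emit spike and reset
            spikes.append(k * dt)
            v_s = v_reset
    return spikes

# Usage example: one second of constant dendritic drive with weak somatic drive.
n_steps = int(1.0 / 1e-4)
spike_times = simulate_two_compartment_lif(np.full(n_steps, 1.5),
                                           np.full(n_steps, 0.2))
print(f"{len(spike_times)} spikes")
```

Because the dendritic compartment has no threshold of its own, its interaction with the soma is purely through the coupling conductance; this is the kind of passive interaction between input channels that the paper characterises and exploits.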
Funder
Natural Sciences and Engineering Research Council of Canada
Air Force Office of Scientific Research
Cited by 7 articles.