Author:
Deckers Lucas, Van Damme Laurens, Van Leekwijck Werner, Tsang Ing Jyh, Latré Steven
Abstract
Spiking neural networks (SNNs) distinguish themselves from artificial neural networks (ANNs) by their inherent temporal processing and spike-based computations, enabling a power-efficient implementation in neuromorphic hardware. In this study, we demonstrate that data processing with spiking neurons can be enhanced by co-learning the synaptic weights with two other biologically inspired neuronal features: (1) a set of parameters describing neuronal adaptation processes and (2) synaptic propagation delays. The former allows a spiking neuron to learn how to specifically react to incoming spikes based on its past. The trained adaptation parameters result in neuronal heterogeneity, which leads to a greater variety of available spike patterns and is also found in the brain. The latter enables the network to learn to explicitly correlate spike trains that are temporally distanced. Synaptic delays reflect the time an action potential requires to travel from one neuron to another. We show that each of the co-learned features separately leads to an improvement over the baseline SNN and that the combination of both leads to state-of-the-art SNN results on all speech recognition datasets investigated with a simple 2-hidden-layer feed-forward network. Our SNN outperforms the benchmark ANN on the neuromorphic datasets (Spiking Heidelberg Digits and Spiking Speech Commands), even with fewer trainable parameters. On the 35-class Google Speech Commands dataset, our SNN also outperforms a GRU of similar size. Our study presents brain-inspired improvements to SNNs that enable them to excel over an equivalent ANN of similar size on tasks with rich temporal dynamics.
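To make the two co-learned mechanisms concrete, the following is a minimal NumPy sketch, not the authors' implementation: (1) an adaptive leaky integrate-and-fire neuron whose per-neuron adaptation parameters shape how it responds to its own recent spiking, and (2) per-synapse integer propagation delays applied to incoming spike trains. All names, values, and the discrete-time formulation are illustrative assumptions; in the paper these quantities are learned jointly with the weights, whereas here they are fixed random values.

# Minimal illustrative sketch (assumed formulation, not the paper's code) of
# (1) adaptive LIF dynamics with per-neuron adaptation parameters and
# (2) per-synapse propagation delays on incoming spike trains.
import numpy as np

rng = np.random.default_rng(0)

T, n_in, n_out = 100, 8, 4                                # time steps, input / output neurons
spikes_in = (rng.random((T, n_in)) < 0.1).astype(float)   # random input spike trains

W = 0.5 * rng.standard_normal((n_in, n_out))              # synaptic weights (learned in the paper)
delays = rng.integers(0, 10, size=(n_in, n_out))          # per-synapse delays in time steps (learned)
alpha = np.exp(-1.0 / rng.uniform(10, 30, n_out))         # membrane decay, per neuron
rho = np.exp(-1.0 / rng.uniform(30, 100, n_out))          # adaptation decay, per neuron (learned)
beta = rng.uniform(0.5, 2.0, n_out)                       # adaptation strength, per neuron (learned)

v = np.zeros(n_out)        # membrane potentials
a = np.zeros(n_out)        # adaptation variables (raise the effective threshold)
theta0 = 1.0               # baseline firing threshold
spikes_out = np.zeros((T, n_out))

for t in range(T):
    # (2) synaptic delays: a spike from input neuron i reaches output neuron j
    # only delays[i, j] steps after it was emitted
    delayed = np.zeros((n_in, n_out))
    for i in range(n_in):
        for j in range(n_out):
            t_src = t - delays[i, j]
            if t_src >= 0:
                delayed[i, j] = spikes_in[t_src, i]
    current = np.sum(delayed * W, axis=0)

    # (1) adaptive LIF dynamics: the effective threshold grows with recent spiking,
    # so each neuron reacts to new input based on its own past
    v = alpha * v + current
    threshold = theta0 + beta * a
    s = (v >= threshold).astype(float)
    v = v - s * threshold          # reset by subtraction on spike
    a = rho * a + s                # adaptation trace decays and accumulates spikes
    spikes_out[t] = s

print("output spike counts per neuron:", spikes_out.sum(axis=0))

Heterogeneity arises here because alpha, rho, and beta differ per neuron; training those parameters, together with the delays and weights, is what the abstract refers to as co-learning.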
Funder
Fonds Wetenschappelijk Onderzoek
Cited by
1 article.