Abstract
Networks of spiking neurons promise to combine energy efficiency with high performance. However, spiking models that match the performance of current state-of-the-art networks while requiring only moderate computational resources are still lacking. Here we present an alternative framework to deep convolutional networks (CNNs), the "Spike by Spike" (SbS) network, together with an efficient backpropagation algorithm. SbS implements networks based on non-negative matrix factorisation (NNMF) but uses discrete events as signals instead of real values. On clean data, both NNMF-based networks and SbS match the performance of CNNs. SbS proves most robust when the data is corrupted by noise, especially by noise not seen during training.
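The abstract is not specific about the NNMF machinery that SbS builds on, but the classical non-negative matrix factorisation it refers to can be sketched with the standard multiplicative update rules. The sketch below is illustrative only, assuming the usual V ≈ WH factorisation with random non-negative data; the matrix sizes, rank, and iteration count are arbitrary choices, not values from the paper.

```python
import numpy as np

# Minimal NNMF sketch via multiplicative updates: factor a non-negative
# data matrix V into non-negative factors W (basis "parts") and H
# (activations) so that V is approximated by W @ H.
rng = np.random.default_rng(0)

V = rng.random((20, 30))      # non-negative data matrix (illustrative)
rank = 5                      # number of latent components (assumption)
W = rng.random((20, rank))    # non-negative basis vectors
H = rng.random((rank, 30))    # non-negative activations

eps = 1e-9                    # guards against division by zero
for _ in range(200):
    # Multiplicative updates keep W and H non-negative by construction.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

reconstruction_error = np.linalg.norm(V - W @ H)
```

In the SbS framework described by the abstract, the real-valued signals of such a factorisation are replaced by discrete spike events; this sketch only shows the real-valued NNMF baseline that SbS and CNNs are compared against.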
Publisher
Cold Spring Harbor Laboratory