Abstract
Mixed-signal and fully digital neuromorphic systems have attracted significant interest for deploying spiking neural networks in an energy-efficient manner. However, many of these systems impose constraints on fan-in, memory, or synaptic weight precision that have to be considered during network design and training. In this paper, we present quantized rewiring (Q-rewiring), an algorithm that can train both spiking and non-spiking neural networks while meeting hardware constraints during the entire training process. To demonstrate our approach, we train both feedforward and recurrent neural networks with a combined fan-in/weight precision limit, a constraint that is present, for example, in the DYNAP-SE mixed-signal analog/digital neuromorphic processor. Q-rewiring simultaneously quantizes synaptic weights and rewires synapses by alternating gradient descent updates with a projection of the trainable parameters onto a constraint-compliant region. Using our algorithm, we find trade-offs between the number of incoming connections to neurons and network performance for a number of common benchmark datasets.
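As a rough illustration of the projected-gradient idea described above (a minimal sketch, not the authors' published implementation), one training step could update dense parameters by gradient descent and then project them onto the constraint set. The function name project_to_constraints, the uniform quantization grid, and the keep-top-k-by-magnitude fan-in rule are all our assumptions for illustration.

```python
import numpy as np

def project_to_constraints(theta, fan_in_limit, num_levels, w_max):
    """Hypothetical projection step: quantize weights to a uniform grid
    and enforce a per-neuron fan-in limit by keeping only the
    largest-magnitude incoming connections (assumed rule, for illustration).

    theta: (n_pre, n_post) dense parameter matrix after a gradient update.
    """
    # Quantize to a uniform grid with spacing `step`, clipped to [-w_max, w_max].
    step = 2.0 * w_max / (num_levels - 1)
    w = np.clip(np.round(theta / step) * step, -w_max, w_max)

    # Enforce the fan-in limit: for each postsynaptic neuron (column),
    # keep the `fan_in_limit` largest-magnitude weights; prune the rest,
    # i.e. "rewire out" the weakest connections.
    for j in range(w.shape[1]):
        col = np.abs(w[:, j])
        if np.count_nonzero(col) > fan_in_limit:
            keep = np.argsort(col)[-fan_in_limit:]
            mask = np.zeros_like(col, dtype=bool)
            mask[keep] = True
            w[~mask, j] = 0.0
    return w

# One projected-gradient training step (sketch with stand-in data):
rng = np.random.default_rng(0)
theta = rng.normal(0.0, 0.1, size=(128, 64))    # dense trainable parameters
grad = rng.normal(0.0, 0.01, size=theta.shape)  # stand-in for a real gradient
theta -= 0.05 * grad                            # plain SGD update
w = project_to_constraints(theta, fan_in_limit=64, num_levels=16, w_max=1.0)
```

In this reading, the network always runs with the projected weights w, so the combined fan-in/weight-precision constraint holds throughout training, while the dense parameters theta accumulate gradient information between projections.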
Funder
Austrian Science Fund
H2020 Future and Emerging Technologies
Cited by
1 article.