Abstract
Neuromorphic computing provides a promising energy-efficient alternative to von-Neumann-type computing and learning architectures. However, even the best neuromorphic hardware is of little use without suitable inference and learning algorithms that can fully exploit its advantages. Such algorithms often have to deal with challenging constraints posed by neuromorphic hardware, such as massive parallelism, sparse asynchronous communication, and analog and/or unreliable computing elements. This Focus Issue presents advances in various aspects of algorithms for neuromorphic computing. The collection of articles ranges from fundamental questions about the computational properties of the basic computing elements in neuromorphic systems, through algorithms for continual learning, semantic segmentation, and novel efficient learning paradigms, to algorithms for specific application domains.