Affiliation:
1. Institute of Computer Science, Academy of Sciences of the Czech Republic, P. O. Box 5, 18207 Prague 8, Czech Republic
Abstract
Recently, a new so-called energy complexity measure has been introduced and studied for feedforward perceptron networks. This measure is inspired by the fact that biological neurons require more energy to transmit a spike than not to fire, and the activity of neurons in the brain is quite sparse, with only about 1% of neurons firing. In this letter, we investigate the energy complexity of recurrent networks, which counts the number of active neurons at any time instant of a computation. We prove that any deterministic finite automaton with m states can be simulated by a neural network of optimal size s = Θ(√m) with a time overhead of τ = O(s/e) per one input bit, using energy O(e), for any e such that e = Ω(log s) and e = O(s), which shows a time-energy trade-off in recurrent networks. In addition, for a time overhead τ satisfying τ^τ = o(s), we obtain a lower bound of s^(c/τ) on the energy of such a simulation, for some constant c > 0 and for infinitely many s.
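To make the energy measure concrete, the following is a minimal sketch (not the paper's optimal-size construction) of the standard one-hot simulation of a DFA by a recurrent threshold network: it uses O(m) units rather than Θ(√m), but it illustrates how "energy" counts the number of units active at each time step. The names delta, build_network, run_network, and the parity automaton are illustrative assumptions, not taken from the paper.

```python
# Sketch: naive one-hot DFA simulation by a recurrent threshold network.
# Energy = number of active units per time step (here O(1)); size is O(m),
# in contrast to the paper's optimal Theta(sqrt(m)) construction.
import numpy as np

def build_network(delta, m):
    """Threshold-gate weights for a DFA over the input alphabet {0, 1}.

    Units 0..m-1 carry the one-hot current state; unit m is the input bit.
    Layer 1: one AND gate per (state, symbol) pair, firing iff that state is
    active and the input bit equals that symbol.
    Layer 2: one OR gate per state, firing iff some AND gate whose transition
    leads to it fired; the OR layer is the new one-hot state.
    """
    and_gates = [(q, a) for q in range(m) for a in (0, 1)]
    W_and = np.zeros((len(and_gates), m + 1))
    b_and = np.zeros(len(and_gates))
    for g, (q, a) in enumerate(and_gates):
        W_and[g, q] = 1.0
        W_and[g, m] = 1.0 if a == 1 else -1.0
        b_and[g] = -2.0 if a == 1 else -1.0   # fires iff both conditions hold
    W_or = np.zeros((m, len(and_gates)))
    for g, (q, a) in enumerate(and_gates):
        W_or[delta[(q, a)], g] = 1.0          # route gate to its target state
    b_or = -np.ones(m)
    return W_and, b_and, W_or, b_or

def run_network(delta, m, q0, word):
    W_and, b_and, W_or, b_or = build_network(delta, m)
    state = np.zeros(m)
    state[q0] = 1.0                            # one-hot initial state
    energy = []                                # active units per time step
    for bit in word:
        x = np.append(state, bit)
        and_out = (W_and @ x + b_and >= 0).astype(float)
        energy.append(int(state.sum() + and_out.sum()))
        state = (W_or @ and_out + b_or >= 0).astype(float)
        energy.append(int(state.sum()))
    return int(np.argmax(state)), energy

# Example: a 2-state DFA computing the parity of the number of 1s in the input.
delta = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
final_state, energy = run_network(delta, m=2, q0=0, word=[1, 0, 1, 1])
print(final_state, energy)   # ends in state 1 (odd parity); energy stays O(1) per step
```

The energy here is constant because exactly one state unit and one AND gate are active at any instant; the paper's result concerns how small the network can be made while keeping both the energy e and the time overhead per input bit under control.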
Subject
Cognitive Neuroscience, Arts and Humanities (miscellaneous)
Cited by
21 articles.