Author:
Sugiura Shuhei, Ariizumi Ryo, Asai Toru, Azuma Shun-ichi
Abstract
In this paper, we prove the existence of a reservoir that has a finite-dimensional output and makes the reservoir computing model universal. Reservoir computing is a method for dynamical system approximation that trains the static part of a model but fixes the dynamical part, called the reservoir. Hence, reservoir computing has the advantage of training models at a low computational cost. Moreover, fixed reservoirs can be implemented as physical systems; such reservoirs have attracted attention for their computation speed and energy consumption. The universality of a reservoir computing model is its ability to approximate an arbitrary system with arbitrary accuracy. Two sufficient conditions on the reservoir that make the model universal have been proposed. The first is the combination of fading memory and the separation property. The second is the neighborhood separation property, which we proposed recently. To date, it has been unknown whether a reservoir with a finite-dimensional output can satisfy these conditions. In this study, we prove that no reservoir with a finite-dimensional output satisfies the former condition. By contrast, we propose a single-output reservoir that satisfies the latter condition. This implies that, for any specified dimension, there exists a reservoir with an output of that dimension that makes the model universal. These results clarify the practical importance of our proposed conditions.
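To make concrete the division of labor described above (fixed dynamical reservoir, trained static readout), the following minimal sketch implements an echo state network in Python with NumPy. All specifics here are illustrative assumptions: the network sizes, the spectral-radius scaling heuristic, the sine-prediction toy task, and the ridge-regression readout are not the construction or the universality conditions studied in the paper.

import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 200                      # illustrative input and reservoir sizes
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
# Common heuristic: scale the spectral radius below 1 so the reservoir forgets
# its initial state (a proxy for the echo state / fading memory property).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the FIXED (untrained) reservoir with inputs u (T x n_in); return states (T x n_res)."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = np.tanh(W_in @ u_t + W @ x)   # dynamical part: never trained
        states[t] = x
    return states

# Toy task (an assumption for illustration): predict the next sample of a sine wave.
u = np.sin(0.1 * np.arange(1000))[:, None]
y = np.roll(u, -1, axis=0)                # target y[t] = u[t+1]

X = run_reservoir(u)[100:-1]              # discard a washout period and the last step
Y = y[100:-1]

# Static part: train only the linear readout, here by ridge regression.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ Y)

print("training MSE:", np.mean((X @ W_out - Y) ** 2))

Because only the final linear solve involves training, the computational cost is low compared with backpropagating through the recurrent dynamics, which is the practical advantage the abstract refers to.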
Funder
Japan Society for the Promotion of Science
JST FOREST Program
Publisher
Springer Science and Business Media LLC