Affiliation:
1. Laboratory of Computational Neurophysics, Brain Science Institute, Korea Institute of Science and Technology, Seoul 02792, Republic of Korea
2. Department of Physics and Astronomy, Seoul National University, Seoul 08826, Republic of Korea
Abstract
The echo state property (ESP) is a key concept for understanding the working principle of the most widely used reservoir computing model, the echo state network (ESN). The ESP holds during most of the operation time under general conditions, yet the property is lost when a combination of driving input signals and intrinsic reservoir dynamics creates conditions unfavorable for forgetting the initial transient state. A widely used remedy, setting the spectral radius of the recurrent weight matrix below unity, is not sufficient, as it may not properly account for the nature of the driving inputs. Here, we characterize how noisy driving inputs affect the dynamical properties of an ESN and the empirical evaluation of the ESP. A standard ESN with a hyperbolic tangent activation function is tested on the MNIST handwritten digit dataset at different additive white Gaussian noise levels. The correlations among the neurons, the input mapping, and the memory capacity of the reservoir decrease nonlinearly with the noise level. These trends agree with the deterioration of MNIST classification accuracy under noise. In addition, an ESP index for noisy driving input is developed as a tool for easily assessing the ESP in practical applications. Bifurcation analysis explicates how noise destroys asymptotic convergence in an ESN and confirms that the proposed index successfully captures the ESP under noise. These results pave the way toward noise-robust reservoir computing systems, which may promote the validity and utility of reservoir computing for real-world machine learning applications.
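For orientation, the sketch below illustrates the setup the abstract describes: a standard tanh reservoir whose recurrent weights are rescaled to a spectral radius below unity, driven by an input corrupted with additive white Gaussian noise, together with a simple empirical ESP check (convergence of trajectories from two different initial states). This is an illustrative minimal example, not the authors' code; the sizes, noise level, and weight distributions are assumptions for demonstration only.

```python
import numpy as np

# Minimal ESN sketch (assumed parameters, not the paper's implementation).
rng = np.random.default_rng(0)

n_in, n_res = 28, 500          # hypothetical sizes, e.g. one MNIST image column per step
spectral_radius = 0.9          # common heuristic: rescale W so its spectral radius < 1
noise_std = 0.1                # additive white Gaussian noise level on the input

W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))   # spectral-radius scaling

def run_reservoir(u_seq, x0):
    """Drive the tanh reservoir with input sequence u_seq from initial state x0."""
    x = x0.copy()
    states = []
    for u in u_seq:
        x = np.tanh(W @ x + W_in @ u)
        states.append(x.copy())
    return np.array(states)

# Noisy driving input: T steps of a clean signal plus white Gaussian noise.
T = 200
u_clean = rng.uniform(0, 1, size=(T, n_in))
u_noisy = u_clean + noise_std * rng.normal(size=(T, n_in))

# Empirical ESP check: under the same input, trajectories started from two
# different initial states should converge (their distance should shrink to ~0).
xa = rng.uniform(-1, 1, n_res)
xb = rng.uniform(-1, 1, n_res)
d = np.linalg.norm(run_reservoir(u_noisy, xa) - run_reservoir(u_noisy, xb), axis=1)
print("state distance, first vs. last step:", d[0], d[-1])
```

If the final distance does not decay toward zero for a given noise level, the reservoir has effectively lost the ESP for that driving input, which is the kind of behavior the proposed ESP index is meant to flag.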
Funder
Korea Institute of Science and Technology