Author:
Ma Haochun, Prosperino Davide, Räth Christoph
Abstract
Reservoir computers are powerful machine learning algorithms for predicting nonlinear systems. Unlike traditional feedforward neural networks, they work on small training data sets, operate with linear optimization, and therefore require minimal computational resources. However, the traditional reservoir computer uses random matrices to define the underlying recurrent neural network and has a large number of hyperparameters that must be optimized. Recent approaches show that the randomness can be removed by running regressions on a large library of linear and nonlinear combinations constructed from the input data, their time lags, and polynomials thereof. For high-dimensional and nonlinear data, however, the number of these combinations explodes. Here, we show that a few simple changes to the traditional reservoir computer architecture, which further reduce computational resources, lead to significant and robust improvements in short- and long-term predictive performance compared to similar models, while requiring only minimal training data sets.
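The traditional architecture referenced in the abstract can be sketched as follows. This is a minimal echo state network in NumPy, not the authors' proposed modification: a fixed random recurrent reservoir is driven by the input, and only a linear readout is trained, via ridge regression. All sizes and scalings here (reservoir dimension, spectral radius, regularization strength) are illustrative assumptions.

```python
import numpy as np

# Minimal echo-state reservoir computer sketch (illustrative setup,
# not the paper's modified architecture).
rng = np.random.default_rng(0)
n_res, n_in = 100, 1          # reservoir size, input dimension (assumed)
ridge = 1e-6                  # Tikhonov regularization strength (assumed)

# Random reservoir and input weights; rescale W to spectral radius 0.9
# so the driven network forgets initial conditions (echo-state property).
W = rng.uniform(-1, 1, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

def run_reservoir(u):
    """Drive the reservoir with an input sequence u of shape (T, n_in)."""
    r = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        r = np.tanh(W @ r + W_in @ u_t)   # recurrent state update
        states[t] = r
    return states

# Toy one-step-ahead prediction task on a sine wave.
t = np.linspace(0, 8 * np.pi, 400)
u = np.sin(t)[:, None]
R = run_reservoir(u[:-1])
y = u[1:]                     # target: next value of the series

# Linear ridge-regression readout -- the only trained part of the model.
W_out = np.linalg.solve(R.T @ R + ridge * np.eye(n_res), R.T @ y)
pred = R @ W_out
print(f"train RMSE: {np.sqrt(np.mean((pred - y) ** 2)):.4f}")
```

The randomness the abstract refers to enters through `W` and `W_in`, and the hyperparameters (spectral radius, input scaling, reservoir size, regularization) are exactly the quantities that the next-generation, library-based approaches attempt to eliminate.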
Funder
Deutsches Zentrum für Luft- und Raumfahrt e. V. (DLR)
Publisher
Springer Science and Business Media LLC
Cited by
8 articles.