Affiliation:
1. Chiba Institute of Technology
2. Oita University
3. The University of Tokyo
Abstract
Reservoir computing (RC) can efficiently process time-series data by transferring the input signal to a randomly connected recurrent neural network (RNN), referred to as a reservoir. The high-dimensional representation of time-series data in the reservoir significantly simplifies subsequent learning tasks.
Although this simple architecture allows fast learning and facile physical implementation, its learning performance is inferior to that of other state-of-the-art RNN models. In this study, to improve the learning ability of RC, we propose self-modulated RC (SM-RC), which extends RC by adding a self-modulation mechanism.
We demonstrated that SM-RC can perform attention tasks where input information is retained or discarded depending on the input signal. We also found that a chaotic state emerged as a result of learning in SM-RC. Furthermore, SM-RC significantly outperformed RC in NARMA and Lorenz model tasks. Because the SM-RC architecture requires only two additional gates, it can be physically implemented as easily as RC, thereby providing a new direction for realizing edge AI.
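To make the gating idea concrete, the following is a minimal sketch of a leaky echo-state reservoir with two state-dependent gates, one modulating the input gain and one modulating the reservoir time scale. The gate parameterization (w_a, w_b), the tanh gate functions, and all hyperparameters are illustrative assumptions; the abstract states only that SM-RC adds two gates to RC, not their exact form.

```python
import numpy as np

rng = np.random.default_rng(0)
N, leak = 100, 0.3                             # reservoir size, base leak rate
W_in = rng.uniform(-0.5, 0.5, N)               # input weights (scalar input)
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # random recurrent weights
w_a = rng.uniform(-0.1, 0.1, N)                # hypothetical gate 1: input-gain modulation
w_b = rng.uniform(-0.1, 0.1, N)                # hypothetical gate 2: time-scale modulation

def step(x, u):
    """One reservoir update with state-dependent (self-modulated) gates."""
    a = 1.0 + np.tanh(w_a @ x)                   # input gain, driven by state x
    b = leak * (1.0 + np.tanh(w_b @ x)) / 2.0    # effective leak rate in (0, leak)
    return (1.0 - b) * x + b * np.tanh(W @ x + a * W_in * u)

x = np.zeros(N)
states = []
for u in np.sin(0.1 * np.arange(500)):         # toy scalar input signal
    x = step(x, u)
    states.append(x.copy())

# As in standard RC, only a linear readout is trained on the collected
# states, e.g. by ridge regression on the matrix of states.
```

In this sketch the gate parameters are fixed at random for brevity; in SM-RC they are presumably the quantities adjusted by learning, which is consistent with the abstract's observation that a chaotic state emerged as a result of learning.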
Publisher
Research Square Platform LLC
Cited by
1 article.
1. A Hardware Chaotic Neural Network with Gap Junction Models. IEEJ Transactions on Electronics, Information and Systems, 2024-07-01.