Authors:
Samantha J. Fournier, Pierfrancesco Urbani
Abstract
In many complex systems, elementary units live in a chaotic environment and must adapt their strategies to perform a task, extracting information from the environment and controlling their feedback on it. Recurrent neural networks provide one of the main examples of systems of this kind: recurrent connections between neurons drive chaotic behavior, and when learning takes place, the response of the system to a perturbation must also account for the feedback of that perturbation on the network dynamics itself. In this work, we consider an abstract model of a high-dimensional chaotic system as a paradigmatic model and study its dynamics. We study the model in two settings: Hebbian driving and FORCE training. In the first case, we show that Hebbian driving can be used to tune the level of chaos in the dynamics, reproducing results recently obtained in the study of more biologically realistic models of recurrent neural networks. In the second case, we show that the dynamical system can be trained to reproduce simple periodic functions. To do this, we consider the FORCE algorithm, originally developed to train recurrent neural networks, and adapt it to our high-dimensional chaotic system. We show that the longer the training time, the closer this algorithm drives the dynamics to an asymptotic attractor. All our results hold in the thermodynamic limit, where the dynamics can be analyzed exactly through dynamical mean-field theory.
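As an illustration of the idea behind FORCE training referred to in the abstract, the sketch below applies standard FORCE (a recursive least-squares update of a feedback readout, in the spirit of Sussillo and Abbott) to a generic chaotic rate network. It is a minimal, hedged example: the network equations, parameters (N, g, dt, alpha) and the sinusoidal target are illustrative assumptions, not the abstract high-dimensional model analyzed in the paper.

```python
import numpy as np

# Minimal sketch of FORCE training on a generic chaotic rate network.
# The model, parameters and target below are illustrative assumptions,
# not the specific system studied in the paper.

rng = np.random.default_rng(0)
N, g, dt, alpha = 500, 1.5, 0.1, 1.0   # size, gain (chaotic for g > 1), time step, RLS regularizer
T_train, T_test = 2000, 500            # training / free-running steps

J = g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # random recurrent couplings
w_fb = rng.uniform(-1.0, 1.0, N)                   # feedback weights from readout to network
w = np.zeros(N)                                    # readout weights (trained)
P = np.eye(N) / alpha                              # running inverse correlation matrix for RLS

x = 0.5 * rng.normal(size=N)   # network state
r = np.tanh(x)
z = w @ r                      # readout

def target(t):
    # simple periodic function the readout should learn to reproduce
    return np.sin(2.0 * np.pi * t / 50.0)

for step in range(T_train + T_test):
    t = step * dt
    # leaky rate dynamics with the readout fed back into the network
    x += dt * (-x + J @ r + w_fb * z)
    r = np.tanh(x)
    z = w @ r

    if step < T_train:
        # recursive least-squares (FORCE) update of the readout weights
        k = P @ r
        c = 1.0 / (1.0 + r @ k)
        P -= c * np.outer(k, k)
        e = z - target(t)      # error before the weight update
        w -= c * e * k
        z = w @ r              # readout recomputed with the updated weights
```

The design choice behind FORCE is that the recursive least-squares step keeps the readout error small from the very first updates, so the fed-back signal never pushes the network far from the target trajectory while the weights converge.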
Subjects:
Statistics, Probability and Uncertainty; Statistics and Probability; Statistical and Nonlinear Physics