Abstract
We combine momentum from machine learning with evolutionary dynamics, where momentum can be viewed as a simple mechanism of intergenerational memory akin to epigenetic mechanisms. Using information divergences as Lyapunov functions, we show that momentum accelerates the convergence of evolutionary dynamics, including the continuous and discrete replicator equations and Euclidean gradient descent on populations. When evolutionarily stable states are present, these methods yield convergence proofs for small learning rates or small momentum, together with an analytic determination of the relative decrease in time to convergence that agrees well with numerical computations. The main results apply even when the evolutionary dynamic is not a gradient flow. We also show that momentum can alter the convergence properties of these dynamics, for example by breaking the cycling associated with the rock–paper–scissors landscape, leading either to convergence to the ordinarily non-absorbing equilibrium or to divergence, depending on the value and mechanism of momentum.
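For concreteness, one common way to couple momentum to a population update is the heavy-ball scheme sketched below. This is an illustrative sketch only: the fitness matrix $A$, momentum parameter $\mu$, and learning rate $\eta$ are generic symbols, and the paper's exact momentum mechanism and simplex normalization follow its own definitions, which the abstract does not state.

% Illustrative sketch, not the paper's exact scheme: heavy-ball momentum
% applied to the replicator vector field; \mu, \eta, and A are assumed symbols.
\begin{align*}
  g_i(x) &= x_i\bigl((Ax)_i - x^{\top} A x\bigr), && \text{replicator vector field for fitness matrix } A,\\
  v^{(t+1)} &= \mu\, v^{(t)} + \eta\, g\bigl(x^{(t)}\bigr), && \text{velocity acting as intergenerational memory},\\
  x^{(t+1)} &= x^{(t)} + v^{(t+1)}, && \text{population update (renormalized to the simplex in practice)}.
\end{align*}

Setting $\mu = 0$ recovers the plain discrete dynamic, so the velocity term is the only source of memory across generations.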