Authors:
So Nakashima, Tetsuya J. Kobayashi
Abstract
Natural selection is a general and powerful concept, not only for explaining the evolutionary processes of biological organisms but also for designing engineering systems such as genetic algorithms and particle filters. There is a surge of interest, from both biology and engineering, in the natural selection of intelligent agents that can learn individually. Learning of better behaviors for survival by individual agents may accelerate the evolutionary process driven by natural selection. Accumulating evidence indicates that organisms can transmit information to the next generation via epigenetic states or memes. Such an idea is also important for engineering applications, for example to improve genetic algorithms and particle filters. To accelerate the evolutionary process, an agent should change its strategy so that the population fitness increases the most; equivalently, it should update its strategy along the gradient (derivative) of the population fitness with respect to the strategy. However, it has not yet been clarified whether and how an agent can estimate this gradient and thereby accelerate the evolutionary process. We also lack a methodology to quantify the acceleration, which is needed to understand and predict the impact of learning. In this paper, we address these problems. We show that a learning agent can accelerate the evolutionary process by proposing ancestral learning, which uses the information transmitted from the agent's ancestors (ancestral information) via epigenetic states or memes. Numerical experiments show that ancestral learning indeed accelerates the evolutionary process. We next show that the ancestral information is sufficient to estimate the gradient; in particular, learning can accelerate the evolutionary process without communication between agents. Finally, to quantify the acceleration, we extend Fisher's fundamental theorem (FF-thm) of natural selection to ancestral learning. The conventional FF-thm relates the speed of evolution by natural selection to the variance of individual fitness in the population. Our extended FF-thm relates the acceleration of the evolutionary process to the variance of the individual fitness of the agent. With this theorem, we can quantitatively understand when and why learning is beneficial.
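To make the mechanism sketched in the abstract concrete, the following is a minimal, illustrative simulation of how an agent might exploit ancestral information. It assumes a toy model with two phenotypes and two environments; the fitness table, parameter values, and the specific update rule (mixing the current strategy with the empirical phenotype frequency along the ancestral lineage, used here as a lineage-based surrogate for the gradient of population fitness) are assumptions for illustration, not the authors' exact formulation or results.

import numpy as np

rng = np.random.default_rng(0)

N_AGENTS, N_GENERATIONS = 200, 300
LEARNING_RATE = 0.1                      # strength of the ancestral-learning update (assumed)
P_ENV = np.array([0.6, 0.4])             # i.i.d. environment distribution (assumed)
FITNESS = np.array([[2.0, 0.4],          # FITNESS[x, y]: fitness of phenotype x
                    [0.4, 2.0]])         # in environment y (assumed values)

# Each agent carries a strategy pi = P(phenotype) and counts of the phenotypes
# expressed along its ancestral lineage (the "ancestral information").
strategies = np.full((N_AGENTS, 2), 0.5)
ancestral_counts = np.zeros((N_AGENTS, 2))

for t in range(N_GENERATIONS):
    env = rng.choice(2, p=P_ENV)                       # environment of this generation
    phenotypes = np.array([rng.choice(2, p=s) for s in strategies])
    fitness = FITNESS[phenotypes, env]                 # individual fitness values

    # Natural selection: resample the next generation in proportion to fitness.
    parents = rng.choice(N_AGENTS, size=N_AGENTS, p=fitness / fitness.sum())
    strategies = strategies[parents].copy()
    ancestral_counts = ancestral_counts[parents].copy()
    ancestral_counts[np.arange(N_AGENTS), phenotypes[parents]] += 1

    # Ancestral learning: move the strategy toward the empirical phenotype
    # frequency of the lineage, a gradient-like signal that each agent can
    # compute from its own ancestors, without communication between agents.
    lineage_freq = ancestral_counts / ancestral_counts.sum(axis=1, keepdims=True)
    strategies = (1 - LEARNING_RATE) * strategies + LEARNING_RATE * lineage_freq

    if t % 50 == 0:
        print(f"generation {t:3d}: mean fitness = {fitness.mean():.3f}, "
              f"mean strategy = {strategies.mean(axis=0).round(3)}")

In this reading, the lineage frequency serves as the agent's estimate of the direction in which to change its strategy, computed from ancestral information alone, which is consistent with the abstract's claim that acceleration is possible without communication between agents.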
Publisher:
Cold Spring Harbor Laboratory