Author:
Czégel Dániel, Giaffar Hamza, Zachar István, Tenenbaum Joshua B., Szathmáry Eörs
Abstract
The Bayesian framework offers a flexible language for the consistent modular assembly of statistical models used by both minds and machines. Another algorithmic domain capable of adaptation in potentially high-dimensional and uncertain environments is Darwinian evolution. The equivalence of their fundamental dynamical equations, replicator dynamics and Bayesian update, hints at a deeper algorithmic analogy. Here we show, based on a unified mathematical discussion of evolutionary dynamics and statistical learning in terms of Bayesian graphical models, that this is indeed the case. Building blocks of Bayesian computations, such as inference in hierarchical models, filtering in hidden Markov models, gradient likelihood optimization, and expectation-maximization dynamics of mixture models, map naturally to fundamental concepts of evolution: multilevel selection, quasispecies dynamics, phenotypic adaptation and ecological competition, respectively. We believe that these correspondences point towards a more comprehensive understanding of the flavors of adaptive computation observed in Nature, as well as suggest new ways to combine insights from the two domains in engineering applications.
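
The equivalence of the fundamental dynamical equations mentioned in the abstract rests on identifying type frequencies with prior probabilities and fitness values with likelihoods: the discrete-time replicator update and Bayes' rule then have the same multiply-and-renormalize form. The following minimal numerical sketch illustrates this identification; the function names and example numbers are illustrative only and are not taken from the paper.

```python
import numpy as np

def replicator_step(freqs, fitness):
    # Discrete-time replicator dynamics: each type frequency x_i is
    # weighted by its fitness f_i and renormalized by the mean fitness.
    weighted = freqs * fitness
    return weighted / weighted.sum()

def bayes_update(prior, likelihood):
    # Bayes' rule: each prior probability P(h_i) is weighted by the
    # likelihood P(d | h_i) and renormalized by the evidence P(d).
    posterior = prior * likelihood
    return posterior / posterior.sum()

# Under the identification frequencies <-> priors, fitness <-> likelihoods,
# the two updates produce identical results.
x = np.array([0.5, 0.3, 0.2])   # type frequencies / prior
f = np.array([1.2, 0.8, 1.0])   # fitness values / likelihoods
assert np.allclose(replicator_step(x, f), bayes_update(x, f))
```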
Publisher
Cold Spring Harbor Laboratory
Cited by
2 articles.