Author:
Oladyshkin Sergey, Nowak Wolfgang
Abstract
We show a link between Bayesian inference and information theory that is useful for model selection, assessment of information entropy and experimental design. We align Bayesian model evidence (BME) with relative entropy and cross entropy in order to simplify computations using prior-based (Monte Carlo) or posterior-based (Markov chain Monte Carlo) BME estimates. On the one hand, we demonstrate how Bayesian model selection can profit from information theory to estimate BME values via posterior-based techniques; to this end, we employ various assumptions, including relations to several information criteria. On the other hand, we demonstrate how relative entropy can profit from BME to assess information entropy during Bayesian updating and to assess utility in Bayesian experimental design. Specifically, we emphasize that relative entropy can be computed without unnecessary multidimensional integration, using either prior-based or posterior-based sampling techniques. The prior-based computation requires no assumptions, whereas posterior-based estimates require at least one assumption. We illustrate the performance of the discussed estimates of BME, information entropy and experiment utility using a transparent, non-linear example. The multivariate Gaussian posterior estimate requires the fewest assumptions and shows the best performance for estimating BME, information entropy and experiment utility from posterior-based sampling.
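As a minimal illustration of the two central quantities in the abstract, the Python sketch below estimates BME by brute-force Monte Carlo over prior samples and then obtains relative entropy from the identity D_KL(posterior || prior) = E_posterior[ln likelihood] - ln BME, which avoids direct multidimensional integration over the parameter space. The one-dimensional model, observation value and noise level are illustrative assumptions, not taken from the paper; the posterior expectation is approximated here by likelihood-weighted prior samples as a stand-in for an MCMC posterior sample.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (assumed, not from the paper): standard Gaussian
# prior on theta, non-linear model y = theta**2, Gaussian measurement error.
def log_likelihood(theta, y_obs=1.5, sigma=0.2):
    return -0.5 * ((y_obs - theta**2) / sigma) ** 2 - np.log(sigma * np.sqrt(2.0 * np.pi))

# Prior-based (Monte Carlo) BME estimate: BME is the prior expectation of
# the likelihood, approximated by the sample mean over prior draws.
theta_prior = rng.normal(0.0, 1.0, size=200_000)
logL = log_likelihood(theta_prior)
like = np.exp(logL)
bme_prior = like.mean()

# Relative entropy without integrating over theta:
# D_KL(posterior || prior) = E_posterior[ln L(theta)] - ln BME,
# since posterior = prior * L / BME. Posterior expectation via
# self-normalized importance weights on the prior sample.
w = like / like.sum()
e_post_logL = np.sum(w * logL)
d_kl = e_post_logL - np.log(bme_prior)

print(f"ln BME ~ {np.log(bme_prior):.3f}")
print(f"D_KL(posterior || prior) ~ {d_kl:.3f} nats")

The same identity underlies the posterior-based route discussed in the abstract: given posterior samples (e.g., from MCMC) and an estimate of ln BME, relative entropy follows from a simple sample average of the log-likelihood.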
Subject
General Physics and Astronomy
References
61 articles.
Cited by
16 articles.