Affiliation:
1. Department of Psychological Methods, University of Amsterdam, Amsterdam, The Netherlands
Abstract
A staple of Bayesian model comparison and hypothesis testing, Bayes factors are often used to quantify the relative predictive performance of two rival hypotheses. The computation of Bayes factors can be challenging, however, and this has contributed to the popularity of convenient approximations such as the Bayesian information criterion (BIC). Unfortunately, these approximations can fail in the case of informed prior distributions. Here, we address this problem by outlining an approximation to informed Bayes factors for a focal parameter of interest. The approximation is computationally simple and requires only the maximum likelihood estimate of the focal parameter and its standard error. The approximation uses an estimated likelihood of the focal parameter and assumes that the posterior distribution for the focal parameter is unaffected by the choice of prior distribution for the nuisance parameters. The resulting Bayes factor for the null hypothesis versus the alternative hypothesis is then easily obtained using the Savage–Dickey density ratio. Three real-data examples highlight the speed and closeness of the approximation compared with bridge sampling and Laplace's method. The proposed approximation facilitates Bayesian reanalyses of standard frequentist results, encourages the application of Bayesian tests with informed priors, and alleviates the computational challenges that often frustrate both Bayesian sensitivity analyses and Bayes factor design analyses. The approximation is shown to suffer under small sample sizes and when the posterior distribution of the focal parameter is substantially influenced by the prior distributions on the nuisance parameters. The proposed methodology may also be used to approximate the posterior distribution of the focal parameter under the alternative hypothesis.
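For intuition, the sketch below is a minimal, hypothetical Python implementation (not the authors' code) of the kind of approximation described in the abstract: it treats the estimated likelihood of the focal parameter as normal with mean equal to the maximum likelihood estimate and standard deviation equal to its standard error, combines it with an informed prior under the alternative hypothesis by numerical integration, and evaluates the Savage–Dickey density ratio at the null value. The normal likelihood form, the function name, the normal informed prior, and all numerical values are illustrative assumptions.

```python
import numpy as np
from scipy import stats, integrate

def approx_informed_bf01(mle, se, prior_pdf, theta0=0.0, half_width=10.0):
    """Approximate BF01 via the Savage-Dickey density ratio, using a normal
    approximation N(mle, se^2) to the likelihood of the focal parameter and a
    user-supplied informed prior density under the alternative hypothesis."""
    # Grid wide enough to cover the approximate likelihood and the null value
    lo = min(theta0, mle) - half_width * se
    hi = max(theta0, mle) + half_width * se
    grid = np.linspace(lo, hi, 4001)
    # Estimated likelihood of the focal parameter, evaluated on the grid
    likelihood = stats.norm.pdf(mle, loc=grid, scale=se)
    # Approximate posterior under H1: prior times likelihood, renormalized
    unnormalized = likelihood * prior_pdf(grid)
    posterior = unnormalized / integrate.trapezoid(unnormalized, grid)
    # Savage-Dickey: BF01 = posterior density at theta0 / prior density at theta0
    posterior_at_null = np.interp(theta0, grid, posterior)
    return posterior_at_null / prior_pdf(theta0)

# Illustrative use: informed normal prior for the focal parameter under H1
prior = lambda theta: stats.norm.pdf(theta, loc=0.3, scale=0.2)
print(approx_informed_bf01(mle=0.25, se=0.10, prior_pdf=prior))
```

Because such a function needs only the maximum likelihood estimate, its standard error, and a prior density, re-running it over a range of prior settings is one way to carry out the kind of inexpensive sensitivity analysis the abstract mentions.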
Funder
Nederlandse Organisatie voor Wetenschappelijk Onderzoek
Subject
Statistics, Probability and Uncertainty; Statistics and Probability