Abstract
In Bayesian inverse problems, one aims to characterize the posterior distribution of a set of unknowns given indirect measurements. For non-linear/non-Gaussian problems, analytic solutions are seldom available; Sequential Monte Carlo (SMC) samplers offer a powerful tool for approximating complex posteriors by constructing an auxiliary sequence of densities that smoothly reaches the posterior. Often the posterior depends on a scalar hyper-parameter, for which limited prior information is available. In this work, we show that properly designed SMC samplers naturally provide an approximation of the marginal likelihood associated with this hyper-parameter for free, i.e. at negligible additional computational cost. The proposed method constructs the auxiliary sequence of distributions in such a way that each of them can be interpreted as a posterior distribution corresponding to a different value of the hyper-parameter. This can be exploited to select the hyper-parameter in Empirical Bayes (EB) approaches, as well as to average across values of the hyper-parameter according to some hyper-prior distribution in Fully Bayesian (FB) approaches. For FB approaches, the proposed method has the further benefit of allowing prior sensitivity analysis at negligible computational cost. In addition, the proposed method exploits particles at all the (relevant) iterations, thus alleviating one of the known limitations of SMC samplers, namely that all samples at intermediate iterations are typically discarded. We show numerical results for two distinct cases where the hyper-parameter affects only the likelihood: a toy example, where an SMC sampler is used to approximate the full posterior distribution, and a brain imaging example, where a Rao-Blackwellized SMC sampler is used to approximate the posterior distribution of a subset of parameters in a conditionally linear Gaussian model.
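To make the construction concrete, the following is a minimal, self-contained Python sketch (not the authors' implementation) of a likelihood-tempered SMC sampler on a toy Gaussian model. The model, the choices of `log_prior` and `log_lik`, the tempering grid `gammas`, and the random-walk rejuvenation kernel are all assumptions made for illustration. The point it demonstrates is the one stated in the abstract: the running normalizing-constant estimate at each intermediate temperature doubles as a marginal-likelihood estimate for the corresponding value of the (noise) hyper-parameter.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_prior(x):
    # Standard normal prior on the unknown x (toy assumption).
    return -0.5 * x**2 - 0.5 * np.log(2.0 * np.pi)

def log_lik(x, y=1.5, sigma=0.5):
    # Gaussian likelihood; sigma plays the role of the scalar hyper-parameter.
    return -0.5 * ((y - x) / sigma) ** 2 - np.log(sigma * np.sqrt(2.0 * np.pi))

def logsumexp(a):
    m = a.max()
    return m + np.log(np.exp(a - m).sum())

# Likelihood-tempering schedule: pi_t(x) ∝ prior(x) * lik(x)^gamma_t.
# Each intermediate density is (up to a known factor) the posterior under
# a rescaled noise level, i.e. a different value of the hyper-parameter.
gammas = np.linspace(0.0, 1.0, 21)

N = 2000
x = rng.standard_normal(N)   # particles drawn from the prior (gamma = 0)
log_Z = [0.0]                # running log normalizing-constant estimates

for g_prev, g in zip(gammas[:-1], gammas[1:]):
    # Incremental importance weights for the move gamma_prev -> gamma.
    inc = (g - g_prev) * log_lik(x)

    # Normalizing-constant ratio; particles are equally weighted after
    # the resampling step of the previous iteration.
    log_Z.append(log_Z[-1] + logsumexp(inc) - np.log(N))

    # Multinomial resampling proportional to the incremental weights.
    w = np.exp(inc - inc.max())
    w /= w.sum()
    x = x[rng.choice(N, size=N, p=w)]

    # One random-walk Metropolis rejuvenation step targeting pi_gamma.
    prop = x + 0.3 * rng.standard_normal(N)
    log_acc = (log_prior(prop) + g * log_lik(prop)) - (log_prior(x) + g * log_lik(x))
    x = np.where(np.log(rng.uniform(size=N)) < log_acc, prop, x)

# log_Z[t] approximates the log normalizing constant of the t-th tempered
# density, so a whole curve of marginal-likelihood estimates over the
# hyper-parameter is obtained from a single SMC run, essentially for free.
for g, lz in zip(gammas, log_Z):
    print(f"gamma = {g:4.2f}   log Z ≈ {lz:7.3f}")
```

In this sketch the EB step would simply pick the temperature (hyper-parameter value) maximizing `log_Z`, while an FB step would weight the intermediate particle populations by `exp(log_Z)` times a hyper-prior; both reuse the particles from intermediate iterations rather than discarding them.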
Funder
Università degli Studi di Genova
Publisher
Springer Science and Business Media LLC
Subject
Computational Theory and Mathematics; Statistics, Probability and Uncertainty; Statistics and Probability; Theoretical Computer Science
Cited by
1 article.