Abstract
Importance sampling is used to approximate Bayes’ rule in many computational approaches to Bayesian inverse problems, data assimilation and machine learning. This paper reviews and further investigates the required sample size for importance sampling in terms of the χ²-divergence between target and proposal. We illustrate through examples the roles that dimension, noise level and other model parameters play in approximating the Bayesian update with importance sampling. Our examples also facilitate a new direct comparison of standard and optimal proposals for particle filtering.
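As a concrete illustration of the self-normalized importance sampling approximation to the Bayesian update described above, the following is a minimal sketch, not taken from the paper: it assumes a one-dimensional Gaussian prior used as the proposal and a Gaussian likelihood, with all numerical values chosen for illustration. The empirical effective sample size 1/∑wᵢ² computed at the end is the standard proxy for N/(1 + χ²(target‖proposal)), the quantity the abstract relates to the required sample size.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate setup (illustrative only): proposal q = prior = N(0, 1),
# likelihood y | x ~ N(x, noise^2), target = posterior.
y, noise = 1.5, 0.5
N = 10_000

x = rng.standard_normal(N)           # draw N samples from the proposal

# Unnormalized log-weights: log-likelihood of the observation at each sample
log_w = -0.5 * ((y - x) / noise) ** 2
w = np.exp(log_w - log_w.max())      # subtract max for numerical stability
w /= w.sum()                         # self-normalize the weights

post_mean = np.sum(w * x)            # importance sampling posterior mean

# Effective sample size 1/sum(w_i^2), on average roughly
# N / (1 + chi^2(target || proposal))
ess = 1.0 / np.sum(w ** 2)
print(f"posterior mean ~ {post_mean:.3f}, ESS ~ {ess:.0f} of {N}")
```

For this conjugate model the exact posterior mean is 1.2, which the estimate should approach. Shrinking the noise level concentrates the likelihood, makes the weights more uneven and drives the effective sample size down, mirroring the role the abstract attributes to the noise level and other model parameters.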
Funder
National Science Foundation
Subject
General Physics and Astronomy
Cited by
5 articles.