Authors:
Alice Corbella, Simon E. F. Spencer, Gareth O. Roberts
Abstract
Novel Monte Carlo methods to generate samples from a target distribution, such as a posterior from a Bayesian analysis, have rapidly expanded in the past decade. Algorithms based on Piecewise Deterministic Markov Processes (PDMPs), non-reversible continuous-time processes, are developing into their own research branch thanks to their important properties (e.g., super-efficiency). Nevertheless, practice has not caught up with theory in this field, and the use of PDMPs to solve applied problems is not widespread. This may be due, first, to the several implementational challenges that PDMP-based samplers present and, second, to the lack of papers showcasing these methods and their implementations in applied settings. Here, we address both issues using one of the most promising PDMPs, the Zig-Zag sampler, as an archetypal example. After explaining the key elements of the Zig-Zag sampler, we expose and address its implementation challenges. Specifically, we formulate an algorithm that draws samples from a target distribution of interest. Notably, the only requirement of the algorithm is a closed-form, differentiable function to evaluate the log-target density; unlike previous implementations, no further information on the target is needed. The performance of the algorithm is evaluated against canonical Hamiltonian Monte Carlo and is shown to be competitive in both simulated and real-data settings. Lastly, we demonstrate that the super-efficiency property, i.e. the ability to draw one independent sample at a lesser cost than evaluating the likelihood of all the data, can be obtained in practice.
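The abstract's core idea can be illustrated with a minimal sketch. The Zig-Zag process moves a particle with velocity θ ∈ {−1, +1} and flips θ at rate λ(x, θ) = max(0, θ · (−d/dx log π(x))). The example below is a hypothetical one-dimensional toy, not the paper's automatic algorithm: it targets a standard normal, where −d/dx log π(x) = x, so the event times can be inverted in closed form and no thinning or numerical root-finding is needed.

```python
import numpy as np

def zigzag_gaussian(n_events, rng):
    """Simulate a 1-D Zig-Zag process targeting a standard normal.

    State is (x, theta) with velocity theta in {-1, +1}. The velocity
    flips at rate lambda(x, theta) = max(0, theta * x). Along a segment
    x(s) = x + theta * s the rate is linear in s, so the next event time
    solves int_0^tau max(0, theta*x + s) ds = E with E ~ Exp(1), which
    has the closed-form solution used below.
    """
    x, theta = 0.0, 1.0
    total_time = 0.0
    mean_acc = 0.0   # integral of x(s) along the trajectory
    var_acc = 0.0    # integral of x(s)^2 along the trajectory
    for _ in range(n_events):
        a = theta * x
        e = rng.exponential()
        # Closed-form inversion of the integrated switching rate.
        tau = -a + np.sqrt(max(a, 0.0) ** 2 + 2.0 * e)
        # Exact integrals of x(s) and x(s)^2 over the linear segment.
        mean_acc += x * tau + theta * tau**2 / 2.0
        var_acc += x**2 * tau + x * theta * tau**2 + tau**3 / 3.0
        x += theta * tau
        theta = -theta
        total_time += tau
    return mean_acc / total_time, var_acc / total_time

rng = np.random.default_rng(1)
mean_est, var_est = zigzag_gaussian(50_000, rng)
print(mean_est, var_est)  # ergodic averages, close to the true 0 and 1
```

For general targets the switching rate has no closed-form inverse, which is exactly the implementational challenge the paper addresses: the trajectory averages above must then be obtained by thinning with an upper bound on the rate, or numerically as in the authors' approach.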
Funder
Engineering and Physical Sciences Research Council
Publisher
Springer Science and Business Media LLC
Subject
Computational Theory and Mathematics; Statistics, Probability and Uncertainty; Statistics and Probability; Theoretical Computer Science
References (32 articles)
1. Andrieu, C., Livingstone, S.: Peskun-Tierney ordering for Markovian Monte Carlo: beyond the reversible scenario. Ann. Stat. 49(4), 1958–1981 (2021)
2. Baydin, A.G., Pearlmutter, B.A., Radul, A.A., Siskind, J.M.: Automatic differentiation in machine learning: a survey. J. Mach. Learn. Res. 18, 1–43 (2018)
3. Bertazzi, A., Bierkens, J.: Adaptive schemes for piecewise deterministic Monte Carlo algorithms. (2020). arXiv preprint arXiv:2012.13924
4. Bertazzi, A., Bierkens, J., Dobson, P.: Approximations of Piecewise Deterministic Markov Processes and their convergence properties. (2021). arXiv preprint arXiv:2109.11827
5. Bierkens, J., Fearnhead, P., Roberts, G.: The Zig-Zag process and super-efficient sampling for Bayesian analysis of big data. Ann. Stat. 47(3), 1288–1320 (2019). https://doi.org/10.1214/18-AOS1715
Cited by (6 articles)
1. Sampling Algorithms in Statistical Physics: A Guide for Statistics and Machine Learning;Statistical Science;2024-02-01
2. NuZZ: Numerical Zig-Zag for general models;Statistics and Computing;2024-01-05
3. Incorporating testing volume into estimation of effective reproduction number dynamics;Journal of the Royal Statistical Society Series A: Statistics in Society;2023-12-13
4. Speed up Zig-Zag;The Annals of Applied Probability;2023-12-01
5. Concave-Convex PDMP-based Sampling;Journal of Computational and Graphical Statistics;2023-05-30