Affiliations:
1. Department of Computer Science, Aalto University, Espoo 02150, Finland
2. Earth Institute, Columbia University, New York, New York 10025, USA
Abstract
Statistical models can involve implicitly defined quantities, such as solutions to nonlinear ordinary differential equations (ODEs), that unavoidably need to be numerically approximated in order to evaluate the model. The approximation error inherently biases statistical inference results, but the amount of this bias is generally unknown and often ignored in Bayesian parameter inference. We propose a computationally efficient method for verifying the reliability of posterior inference for such models when the inference is performed using Markov chain Monte Carlo methods. We validate the efficiency and reliability of our workflow in experiments using simulated and real data and different ODE solvers. We highlight problems that arise with commonly used adaptive ODE solvers and propose robust and effective alternatives which, accompanied by our workflow, can be adopted without compromising the reliability of the inferences.
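To make the solver-error issue concrete, below is a minimal, hypothetical Python sketch, not the authors' actual workflow: it evaluates a toy model's log-likelihood at a set of posterior draws with both a coarse-tolerance and a high-precision adaptive solver (SciPy's solve_ivp) and inspects the differences, which could also serve as log importance weights for a reliability check. The toy ODE, data, tolerances, and noise scale are all invented for illustration.

```python
# Hypothetical sketch: probe how ODE-solver tolerances affect likelihood values
# at posterior draws. All quantities below are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy one-parameter ODE dy/dt = -theta * y, observed with Gaussian noise.
t_obs = np.linspace(0.5, 5.0, 10)
y_true = np.exp(-0.7 * t_obs)
y_obs = y_true + rng.normal(scale=0.05, size=t_obs.size)

def log_lik(theta, rtol, atol):
    """Log-likelihood of the data given theta, using an adaptive RK45 solver."""
    sol = solve_ivp(lambda t, y: -theta * y, (0.0, t_obs[-1]), [1.0],
                    t_eval=t_obs, rtol=rtol, atol=atol)
    return norm.logpdf(y_obs, loc=sol.y[0], scale=0.05).sum()

# Stand-in for posterior draws of theta obtained via MCMC with the coarse solver.
theta_draws = rng.normal(0.7, 0.05, size=200)

# Difference between coarse- and high-precision-solver log-likelihoods per draw.
delta = np.array([log_lik(th, 1e-2, 1e-2) - log_lik(th, 1e-10, 1e-10)
                  for th in theta_draws])

# Large or highly variable differences suggest the coarse-solver posterior is
# unreliable; exp(-delta) could serve as importance weights for diagnosis or
# correction (the paper describes the actual workflow).
print("max |delta|:", np.abs(delta).max(), "sd(delta):", delta.std())
```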
Subject
Statistics, Probability and Uncertainty; Statistics and Probability