Affiliation:
1. Indiana University, USA
2. Tufts University, USA
Abstract
Bayesian inference, of posterior knowledge from prior knowledge and observed evidence, is typically defined by Bayes's rule, which says the posterior multiplied by the probability of an observation equals a joint probability. But the observation of a continuous quantity usually has probability zero, in which case Bayes's rule says only that the unknown times zero is zero. To infer a posterior distribution from a zero-probability observation, the statistical notion of disintegration tells us to specify the observation as an expression rather than a predicate, but does not tell us how to compute the posterior. We present the first method of computing a disintegration from a probabilistic program and an expression of a quantity to be observed, even when the observation has probability zero. Because the method produces an exact posterior term and preserves a semantics in which monadic terms denote measures, it composes with other inference methods in a modular way, without sacrificing accuracy or performance.
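To make the zero-probability issue concrete, here is a minimal, hypothetical sketch (not the paper's algorithm) in Python. With independent X, Y ~ Uniform(0, 1), the event X + Y = t has probability zero, so naive conditioning fails; disintegrating along the expression X + Y yields a conditional density for X that is uniform on the slice [max(0, t-1), min(1, t)], since the joint density is constant. The rejection sampler with a small tolerance band is only an approximation of that exact slice; the names `posterior_mean_x_given_sum` and the tolerance `eps` are illustrative choices, not from the paper.

```python
import random

def posterior_mean_x_given_sum(t):
    """Exact posterior mean of X given X + Y = t, obtained by
    disintegration: the conditional law of X is uniform on the slice
    [max(0, t-1), min(1, t)] because the joint density is constant."""
    lo, hi = max(0.0, t - 1.0), min(1.0, t)
    return 0.5 * (lo + hi)

# A naive approximation: reject samples outside a tolerance band around
# the observation. The band is needed precisely because the exact event
# {X + Y = t} has probability zero under the prior.
def approx_mean_x_given_sum(t, n=200_000, eps=1e-2, seed=0):
    rng = random.Random(seed)
    kept = []
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if abs(x + y - t) < eps:
            kept.append(x)
    return sum(kept) / len(kept)

exact = posterior_mean_x_given_sum(0.5)   # 0.25 by symmetry
approx = approx_mean_x_given_sum(0.5)     # close to 0.25, but inexact
```

The gap between `exact` and `approx` illustrates the paper's motivation: a symbolic disintegration produces the exact posterior term directly, with no tolerance parameter to tune.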
Funder
Lilly Endowment
National Science Foundation
Defense Advanced Research Projects Agency (DARPA)
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Graphics and Computer-Aided Design, Software
Cited by 21 articles.