Abstract
The Bayesian approach to solving inverse problems relies on the choice of a prior. This critical ingredient allows expert knowledge or physical constraints to be formulated in a probabilistic fashion and plays an important role in the success of the inference. Recently, Bayesian inverse problems have been solved using generative models as highly informative priors. Generative models are a popular tool in machine learning for generating data whose properties closely resemble those of a given database. Typically, the generated distribution of data is embedded in a low-dimensional manifold. For the inverse problem, a generative model is trained on a database that reflects the properties of the sought solution, such as typical structures of the tissue in the human brain in magnetic resonance imaging. The inference is then carried out in the low-dimensional manifold determined by the generative model, which strongly reduces the dimensionality of the inverse problem. However, this procedure produces a posterior that does not admit a Lebesgue density in the actual variables, and the accuracy attained can strongly depend on the quality of the generative model. For linear Gaussian models, we explore an alternative Bayesian inference based on probabilistic generative models; this inference is carried out in the original high-dimensional space. A Laplace approximation is employed to analytically derive the required prior probability density function, which is induced by the generative model. Properties of the resulting inference are investigated. Specifically, we show that the derived Bayes estimates are consistent, in contrast to the approach in which the low-dimensional manifold of the generative model is employed. The MNIST data set is used to design numerical experiments that confirm our theoretical findings. It is shown that the proposed approach can be advantageous when the information contained in the data is high, and a simple heuristic for detecting this case is considered. Finally, the pros and cons of both approaches are discussed.
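A minimal sketch of the construction described in the abstract, using assumed notation that does not appear there (probabilistic generator G with Jacobian J at a latent point z_0, latent dimension d, decoder noise level sigma^2, linear forward operator A, measurement noise covariance Sigma_eps): linearizing the generator, as in a Laplace approximation, induces an approximate Gaussian prior in the original high-dimensional space, which then combines with the linear Gaussian likelihood in the standard conjugate way.

\[
  z \sim \mathcal{N}(0, I_d), \qquad x = G(z) + \eta, \qquad \eta \sim \mathcal{N}(0, \sigma^2 I_n),
\]
\[
  G(z) \approx G(z_0) + J_{z_0}(z - z_0)
  \quad\Longrightarrow\quad
  p(x) \approx \mathcal{N}\bigl(x;\, \mu_x, \Sigma_x\bigr), \qquad
  \mu_x = G(z_0) - J_{z_0} z_0, \qquad
  \Sigma_x = J_{z_0} J_{z_0}^{\top} + \sigma^2 I_n .
\]
\[
  y = A x + \varepsilon, \quad \varepsilon \sim \mathcal{N}(0, \Sigma_\varepsilon)
  \quad\Longrightarrow\quad
  \Sigma_{\mathrm{post}} = \bigl(A^{\top} \Sigma_\varepsilon^{-1} A + \Sigma_x^{-1}\bigr)^{-1}, \qquad
  \mu_{\mathrm{post}} = \Sigma_{\mathrm{post}} \bigl(A^{\top} \Sigma_\varepsilon^{-1} y + \Sigma_x^{-1} \mu_x\bigr).
\]

Because Sigma_x has full rank (through the sigma^2 I_n term), this prior admits a Lebesgue density in the original variables, in contrast to a prior supported only on the low-dimensional manifold traced out by G(z).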
Funder
Physikalisch-Technische Bundesanstalt (PTB)
Publisher
Springer Science and Business Media LLC
Subject
Computational Mathematics; Statistics, Probability and Uncertainty; Statistics and Probability