Authors:
Bai Tianming, Teckentrup Aretha L., Zygalakis Konstantinos C.
Abstract
This work is concerned with the use of Gaussian surrogate models for Bayesian inverse problems associated with linear partial differential equations. A particular focus is on the regime where only a small amount of training data is available. In this regime, the type of Gaussian prior used is of critical importance for how well the surrogate model performs in Bayesian inversion. We extend the framework of Raissi et al. (2017) to construct PDE-informed Gaussian priors, which we then use to construct different approximate posteriors. A range of numerical experiments illustrates the superiority of the PDE-informed Gaussian priors over more traditional priors.
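To illustrate the general idea behind PDE-informed Gaussian priors of the kind introduced by Raissi et al. (2017), the minimal sketch below (an assumption for illustration, not code or a setup taken from the paper) builds a joint Gaussian prior over a solution u and a forcing term f = Lu for the toy operator Lu = -u'' and a squared-exponential kernel. Because L is linear, the cross-covariance and operator-operator covariance blocks follow from applying L to the base kernel in each argument, here computed by automatic differentiation in JAX.

# Minimal sketch (an assumption, not code from the paper): joint PDE-informed
# Gaussian prior over u and f = L u, in the spirit of Raissi et al. (2017).
# The operator L u = -u'' and the squared-exponential kernel are illustrative
# choices only.
import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)  # double precision for a stable Cholesky

def k(x, xp, lengthscale=0.3, variance=1.0):
    """Squared-exponential prior covariance k(x, x') for u."""
    return variance * jnp.exp(-0.5 * (x - xp) ** 2 / lengthscale ** 2)

# Since L is linear, covariances involving f = L u are obtained by applying L
# to the kernel in each argument, done here via automatic differentiation.
def Lxp_k(x, xp):        # L_{x'} k(x, x')     = Cov(u(x), f(x'))
    return -jax.grad(jax.grad(k, argnums=1), argnums=1)(x, xp)

def Lx_Lxp_k(x, xp):     # L_x L_{x'} k(x, x') = Cov(f(x), f(x'))
    return -jax.grad(jax.grad(Lxp_k, argnums=0), argnums=0)(x, xp)

def gram(fn, xs, xps):
    """Evaluate a bivariate covariance function on all point pairs."""
    return jax.vmap(lambda x: jax.vmap(lambda xp: fn(x, xp))(xps))(xs)

x_u = jnp.linspace(0.0, 1.0, 20)   # locations where u is observed
x_f = jnp.linspace(0.0, 1.0, 15)   # locations where f = L u is observed

K_uu = gram(k, x_u, x_u)
K_uf = gram(Lxp_k, x_u, x_f)
K_ff = gram(Lx_Lxp_k, x_f, x_f)

# Joint PDE-informed prior covariance over (u(x_u), f(x_f)); a small jitter
# keeps the Cholesky factorisation numerically stable.
K_joint = jnp.block([[K_uu, K_uf], [K_uf.T, K_ff]])
chol = jnp.linalg.cholesky(K_joint + 1e-8 * jnp.eye(K_joint.shape[0]))
print("joint covariance shape:", K_joint.shape)
print("Cholesky factor finite (positive definite):", bool(jnp.all(jnp.isfinite(chol))))

The point of the construction is that the prior encodes the linear PDE exactly through its covariance structure, so conditioning on a few forcing observations already constrains the surrogate for u; this is what makes such priors attractive in the small-data regime discussed in the abstract.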
Publisher
Springer Science and Business Media LLC
References (41 articles)
1. Alvarez, M.A., Rosasco, L., Lawrence, N.D.: Kernels for vector-valued functions: a review. Foundations and Trends® in Machine Learning 4(3), 195–266 (2012)
2. Babuška, I., Nobile, F., Tempone, R.: A stochastic collocation method for elliptic partial differential equations with random input data. SIAM J. Numer. Anal. 45, 1005–1034 (2007)
3. Bauschke, H.H., Burachik, R.S., Combettes, P.L., Elser, V., Luke, D.R., Wolkowicz, H.: Fixed-point Algorithms for Inverse Problems in Science and Engineering, vol. 49. Springer, New York, NY (2011)
4. Bonilla, E.V., Chai, K., Williams, C.: Multi-task Gaussian process prediction. Advances in Neural Information Processing Systems 20 (2007)
5. Brooks, S., Gelman, A., Jones, G., Meng, X.-L.: Handbook of Markov Chain Monte Carlo. CRC Press, Boca Raton, FL (2011)