Affiliation:
1. Statistical Sciences, Los Alamos National Laboratory, Los Alamos, New Mexico, USA
2. Statistical Sciences, Sandia National Laboratories, Albuquerque, New Mexico, USA
Abstract
Much attention has been given to the computational cost of fitting an emulator; substantially less attention has been given to the computational cost of using that emulator for prediction. This is primarily because the cost of fitting an emulator is usually far greater than that of obtaining a single prediction, and predictions can often be obtained in parallel. In many settings, however, especially those requiring Markov chain Monte Carlo, predictions arrive sequentially and parallelization is not possible. In this case, an emulator that produces accurate predictions efficiently can lead to substantial time savings in practice. In this paper, we propose a global-model approximate Gaussian process framework via extension of the popular local approximate Gaussian process (laGP) framework. The proposed emulator can be viewed as a treed Gaussian process whose leaf nodes are laGP models and whose tree structure is learned greedily as a function of the prediction stream. The suggested method (called leapGP) has interpretable tuning parameters which control the time-memory trade-off. One reasonable choice of settings leads to an emulator with a training cost of O(n²) and rapid predictions with an asymptotic amortized cost of O(1).
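The following is a minimal, illustrative Python sketch (not the authors' implementation) of the amortization idea described in the abstract: local approximate GP models are fitted greedily as the prediction stream arrives and cached for reuse, so that early predictions pay the fitting cost while later predictions near an existing local model become cheap. All names here (LocalGP, LeapEmulator, num_neighbors, radius) are hypothetical.

```python
import numpy as np

class LocalGP:
    """Exact GP fit to a small neighborhood of the training data."""
    def __init__(self, X, y, lengthscale=0.2, nugget=1e-6):
        self.X, self.ls = X, lengthscale
        K = self._kernel(X, X) + nugget * np.eye(len(X))
        self.L = np.linalg.cholesky(K)
        self.alpha = np.linalg.solve(self.L.T, np.linalg.solve(self.L, y))

    def _kernel(self, A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / self.ls ** 2)

    def predict(self, x):
        return self._kernel(x[None, :], self.X) @ self.alpha

class LeapEmulator:
    """Greedily grown collection of cached local GPs ("hubs")."""
    def __init__(self, X, y, num_neighbors=25, radius=0.15):
        self.X, self.y = X, y
        self.m, self.radius = num_neighbors, radius
        self.hubs, self.models = [], []

    def predict(self, x):
        # Reuse a cached local model if x is close to an existing hub.
        for hub, model in zip(self.hubs, self.models):
            if np.linalg.norm(x - hub) < self.radius:
                return model.predict(x)[0]
        # Otherwise fit a new local GP on the nearest neighbors and cache it.
        idx = np.argsort(np.linalg.norm(self.X - x, axis=1))[: self.m]
        model = LocalGP(self.X[idx], self.y[idx])
        self.hubs.append(x.copy())
        self.models.append(model)
        return model.predict(x)[0]

# Toy usage: a stream of sequential predictions, as in an MCMC loop.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 2))
y = np.sin(4 * X[:, 0]) + np.cos(3 * X[:, 1])
emu = LeapEmulator(X, y)
stream = rng.uniform(size=(200, 2))
preds = [emu.predict(x) for x in stream]  # later calls mostly reuse cached models
```

In this sketch the cache is a flat list searched linearly; the method in the paper instead organizes the local models in a tree learned from the prediction stream, which is what yields the stated asymptotic amortized prediction cost.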
Funder
Laboratory Directed Research and Development
Subject
Statistics, Probability and Uncertainty; Statistics and Probability
Cited by
1 article.