Abstract
Prior beliefs about the latent function, which shape a model's inductive biases, can be incorporated into a Gaussian process (GP) via the kernel. However, beyond the choice of kernel, the decision-making process of GP models remains poorly understood. In this work, we contribute an analysis of the loss landscape for GP models using methods from chemical physics. We demonstrate ν-continuity for Matérn kernels and outline aspects of catastrophe theory at critical points in the loss landscape. By directly including ν in the hyperparameter optimisation for Matérn kernels, we find that typical values of ν can be far from optimal in terms of performance. We also provide an a priori method for evaluating the effect of GP ensembles and discuss various voting approaches based on physical properties of the loss landscape. The utility of these approaches is demonstrated for various synthetic and real datasets. Our findings provide insight into hyperparameter optimisation for GPs and offer practical guidance for improving their performance and interpretability in a range of applications.
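The abstract's idea of treating the Matérn smoothness ν as a tunable hyperparameter can be sketched as follows. This is not the authors' method, only a minimal illustration using scikit-learn: since its optimiser tunes length-scale and noise but holds ν fixed, ν is scanned in an outer loop by comparing log marginal likelihoods. The grid of candidate ν values and the toy dataset are assumptions for illustration.

```python
# Sketch: selecting the Matern smoothness nu by maximising the log
# marginal likelihood over a candidate grid. scikit-learn optimises the
# length-scale and noise level internally, but nu stays fixed per fit,
# so we compare fitted models across nu values in an outer loop.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(40, 1))          # toy 1-D inputs
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)

best_nu, best_lml = None, -np.inf
for nu in [0.5, 1.5, 2.5, np.inf]:            # common choices; nu=inf is the RBF limit
    kernel = Matern(length_scale=1.0, nu=nu) + WhiteKernel(noise_level=0.1)
    gp = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=3).fit(X, y)
    lml = gp.log_marginal_likelihood_value_   # evidence of the fitted model
    if lml > best_lml:
        best_nu, best_lml = nu, lml

print(f"best nu = {best_nu}, log marginal likelihood = {best_lml:.2f}")
```

A continuous treatment of ν, as the paper pursues, would instead optimise ν jointly with the other hyperparameters; the grid scan above is only the simplest discrete approximation.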
Funder
International Chair at the Interdisciplinary Institute for Artificial Intelligence at 3IA Côte d'Azur
Engineering and Physical Sciences Research Council