Abstract

Efficient global optimization is a widely used method for optimizing expensive black-box functions. In this paper, we study the worst-case oracle complexity of the efficient global optimization problem. In contrast to existing kernel-specific results, we derive a unified lower bound for the oracle complexity of efficient global optimization in terms of the metric entropy of a ball in its corresponding reproducing kernel Hilbert space. Moreover, we show that this lower bound nearly matches the upper bound attained by non-adaptive search algorithms for the commonly used squared exponential kernel and the Matérn kernel with a large smoothness parameter $\nu$. The matching is up to a replacement of $d/2$ by $d$ and a logarithmic term $\log \frac{R}{\epsilon}$, where $d$ is the dimension of the input space, $R$ is the upper bound on the norm of the unknown black-box function, and $\epsilon$ is the desired accuracy. That is to say, our lower bound is nearly optimal for these kernels.
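For reference, the two kernel families named above have standard closed forms: the squared exponential kernel $k(x, x') = \exp(-\|x - x'\|^2 / (2\ell^2))$ and the Matérn kernel with smoothness $\nu$, which involves a modified Bessel function. The sketch below evaluates both as functions of the distance $r = \|x - x'\|$ using SciPy; the lengthscale parameter $\ell$ is illustrative and not part of the paper's notation.

```python
import numpy as np
from scipy.special import kv, gamma  # kv: modified Bessel function of the second kind

def se_kernel(r, lengthscale=1.0):
    """Squared exponential kernel evaluated at distance r."""
    return np.exp(-r**2 / (2.0 * lengthscale**2))

def matern_kernel(r, nu=2.5, lengthscale=1.0):
    """Matérn kernel with smoothness nu at distance r; returns 1.0 at r = 0
    (the kernel's limiting value, since kv diverges there)."""
    r = np.asarray(r, dtype=float)
    scaled = np.sqrt(2.0 * nu) * r / lengthscale
    # Guard the Bessel argument so the r = 0 branch never evaluates kv at 0.
    safe = np.where(scaled > 0, scaled, 1.0)
    value = (2.0 ** (1.0 - nu) / gamma(nu)) * safe**nu * kv(nu, safe)
    return np.where(scaled > 0, value, 1.0)
```

As a sanity check, the Matérn kernel with $\nu = 1/2$ reduces to the exponential kernel $\exp(-r/\ell)$, and both kernels equal 1 at $r = 0$; a larger $\nu$ yields a smoother sample path, which is the regime where the paper's bounds nearly match.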
Funder
Schweizerischer Nationalfonds zur Förderung der Wissenschaftlichen Forschung
nccr - on the move
Publisher
Springer Science and Business Media LLC