Affiliation:
1. Mathematical Institute, University of Oxford, United Kingdom
2. Numerical Algorithms Group, Wilkinson House, United Kingdom
Abstract
We present two software packages for derivative-free optimization (DFO): DFO-LS for nonlinear least-squares problems and Py-BOBYQA for general objectives, both with optional bound constraints. Inspired by the Gauss-Newton method, DFO-LS constructs simplified linear regression models for the residuals and allows flexible initialization for expensive problems, whereby it can begin making progress after as few as two objective evaluations. Numerical results show that DFO-LS can make reasonable progress on some medium-scale problems with fewer objective evaluations than are needed for a single gradient evaluation. DFO-LS also has improved robustness to noise, offering sample averaging, regression-based model construction, and multiple restart strategies with an auto-detection mechanism. Our extensive numerical experimentation shows that restarting the solver when stagnation is detected is a cheap and effective mechanism for achieving robustness, with superior performance to sampling and regression techniques. The package Py-BOBYQA is a Python implementation of BOBYQA (Powell, 2009), with novel features such as the implementation of strategies for robustness to noise. Our numerical experiments show that Py-BOBYQA is comparable to or better than existing general DFO solvers for noisy problems. In our comparisons, we introduce an adaptive accuracy measure for data profiles of noisy functions, which strikes a balance between measuring improvement in the true and the noisy objective.
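A minimal usage sketch of the two solvers described above, based on the packages' documented solve(objfun, x0, ...) entry points; the bounds tuple and the objfun_has_noise option follow the packages' documentation, but exact option names may vary between versions and the Rosenbrock test problem is chosen here purely for illustration.

    import numpy as np
    import dfols      # DFO-LS: nonlinear least-squares
    import pybobyqa   # Py-BOBYQA: general objectives

    # DFO-LS expects objfun to return the residual vector r(x);
    # it minimizes ||r(x)||^2 using Gauss-Newton-style linear models.
    def residuals(x):
        # Rosenbrock residuals (illustrative test problem)
        return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

    x0 = np.array([-1.2, 1.0])
    lower = np.array([-5.0, -5.0])
    upper = np.array([5.0, 5.0])

    # Optional bound constraints; set objfun_has_noise=True to switch on
    # the noise-robust settings (regression models, restarts, averaging).
    soln_ls = dfols.solve(residuals, x0, bounds=(lower, upper),
                          objfun_has_noise=False)
    print(soln_ls)

    # Py-BOBYQA expects objfun to return a scalar objective value.
    def objective(x):
        return np.sum((x - 1.0) ** 2)

    soln_gen = pybobyqa.solve(objective, x0, bounds=(lower, upper),
                              objfun_has_noise=False)
    print(soln_gen)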
Funder
EPSRC Centre For Doctoral Training in Industrially Focused Mathematical Modelling
Numerical Algorithms Group Ltd
Publisher
Association for Computing Machinery (ACM)
Subject
Applied Mathematics, Software
Cited by
85 articles.