Abstract
Methods based on Gaussian stochastic process (GSP) models and expected improvement (EI) functions have been promising for box-constrained expensive optimization problems. These include robust design problems with environmental variables having set-type constraints. However, methods that combine GSP modeling with EI sub-optimizations suffer from a problem that limits their computational performance: efficient global optimization (EGO) methods often repeat the same or nearly the same experimental points. We present a novel EGO-type constraint-handling method that maintains a so-called tabu list to avoid past points. Our method includes two types of penalties for the key "infill" optimization, which selects the next test runs. We benchmark our tabu EGO algorithm against five alternative approaches, including DIRECT methods, using nine test problems and two engineering examples. The engineering examples are based on additive manufacturing process parameter optimization informed by point-based thermal simulations and robust-type quality constraints. Our test problems span unconstrained, simply constrained, and robust constrained problems. The comparative results imply that tabu EGO offers very promising computational performance for all types of black-box optimization in terms of convergence speed and the quality of the final solution.
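The core mechanism described above, penalizing infill candidates that fall too close to previously evaluated (tabu) points, can be sketched as follows. This is a minimal illustration, not the authors' exact method: the function name, the single distance-based penalty, and the `radius` and `penalty` parameters are assumptions for exposition (the paper uses two penalty types within the EI infill sub-optimization).

```python
import numpy as np

def tabu_penalized_acquisition(acq_values, candidates, tabu_list,
                               radius=0.1, penalty=1e6):
    """Subtract a large penalty from the acquisition (e.g., EI) value of
    any candidate within `radius` of a previously evaluated tabu point,
    discouraging the infill optimizer from repeating past runs."""
    penalized = acq_values.astype(float).copy()
    for i, x in enumerate(candidates):
        if any(np.linalg.norm(x - t) < radius for t in tabu_list):
            penalized[i] -= penalty
    return penalized

# Toy usage: the middle candidate sits near an already-evaluated point,
# so its penalized acquisition value drops far below the others.
acq = np.array([1.0, 2.0, 3.0])
cands = np.array([[0.0], [0.5], [1.0]])
tabu = [np.array([0.52])]
out = tabu_penalized_acquisition(acq, cands, tabu)
```

Maximizing `out` instead of `acq` then steers the next test run away from near-duplicate points while leaving distant candidates untouched.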
Funder
National Science Foundation
Publisher
Springer Science and Business Media LLC
Subject
Control and Optimization, Computer Graphics and Computer-Aided Design, Computer Science Applications, Control and Systems Engineering, Software
Cited by
3 articles.