OPT-GAN: A Broad-Spectrum Global Optimizer for Black-Box Problems by Learning Distribution
Published: 2023-06-26
Issue: 10
Volume: 37
Page: 12462-12472
ISSN: 2374-3468
Container-title: Proceedings of the AAAI Conference on Artificial Intelligence
Short-container-title: AAAI
Author: Lu Minfang, Ning Shuai, Liu Shuangrong, Sun Fengyang, Zhang Bo, Yang Bo, Wang Lin
Abstract
Black-box optimization (BBO) algorithms are concerned with finding the best solutions for problems whose analytical details are unavailable. Most classical methods for such problems rest on strong, fixed a priori assumptions, such as Gaussianity. However, complex real-world problems, especially when the global optimum is sought, can deviate substantially from these assumptions because of their diversity, causing unexpected obstacles. In this study, we propose a generative adversarial net-based broad-spectrum global optimizer (OPT-GAN) that gradually estimates the distribution of the optimum, with strategies to balance the exploration-exploitation trade-off. It has the potential to adapt better to the regularity and structure of diverse landscapes than methods with a fixed prior, e.g., a Gaussian or separability assumption.
Experiments on diverse BBO benchmarks and high-dimensional real-world applications show that OPT-GAN outperforms other traditional and neural net-based BBO algorithms. The code and appendix are available at https://github.com/NBICLAB/OPT-GAN.
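To make the idea described in the abstract concrete, below is a minimal sketch (not the authors' released implementation) of a GAN-driven black-box optimization loop in PyTorch: a generator is trained adversarially to mimic the distribution of the current elite solutions, and new candidates are drawn partly from the generator (exploitation) and partly uniformly at random (exploration). The objective function sphere, the network sizes, the search box [-5, 5]^d, and the population/elite sizes are illustrative assumptions, not values taken from the paper.

import torch
import torch.nn as nn

DIM, LATENT, POP, ELITE = 10, 16, 64, 32

def sphere(x):
    # Assumed black-box objective for this demo; any f: R^DIM -> R works.
    return (x ** 2).sum(dim=1)

# Small MLP generator and discriminator (illustrative architectures).
gen = nn.Sequential(nn.Linear(LATENT, 64), nn.ReLU(), nn.Linear(64, DIM))
disc = nn.Sequential(nn.Linear(DIM, 64), nn.ReLU(), nn.Linear(64, 1))
g_opt = torch.optim.Adam(gen.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(disc.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

# Start from uniform random samples in [-5, 5]^DIM.
archive = torch.rand(POP, DIM) * 10 - 5
fitness = sphere(archive)

for it in range(200):
    # "Real" data for the GAN = current elite solutions; the generator
    # learns to approximate their distribution.
    elite = archive[fitness.argsort()[:ELITE]]

    for _ in range(10):  # a few adversarial updates per outer iteration
        fake = gen(torch.randn(ELITE, LATENT))
        d_loss = (bce(disc(elite), torch.ones(ELITE, 1))
                  + bce(disc(fake.detach()), torch.zeros(ELITE, 1)))
        d_opt.zero_grad(); d_loss.backward(); d_opt.step()

        g_loss = bce(disc(gen(torch.randn(ELITE, LATENT))), torch.ones(ELITE, 1))
        g_opt.zero_grad(); g_loss.backward(); g_opt.step()

    # Exploration-exploitation mix: half generator samples, half random restarts.
    with torch.no_grad():
        cand = torch.cat([gen(torch.randn(POP // 2, LATENT)),
                          torch.rand(POP // 2, DIM) * 10 - 5])

    # Evaluate candidates and keep the best POP solutions.
    archive = torch.cat([archive, cand])
    fitness = sphere(archive)
    keep = fitness.argsort()[:POP]
    archive, fitness = archive[keep], fitness[keep]

print("best value found:", fitness.min().item())

The 50/50 split between generated and random candidates is only one possible exploration strategy; the paper's actual balancing mechanism and network details are described in the publication and the linked repository.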
Publisher
Association for the Advancement of Artificial Intelligence (AAAI)
Cited by
1 article.
1. Improving OPT-GAN by Smooth Scale Mapping and Adaptive Exploration; 2024 International Joint Conference on Neural Networks (IJCNN); 2024-06-30