Benchmarking Parameter-Free AMaLGaM on Functions With and Without Noise

Authors:

Peter A. N. Bosman¹, Jörn Grahl², Dirk Thierens³

Affiliation:

1. Centrum Wiskunde & Informatica, P.O. Box 94079, 1090 GB Amsterdam, The Netherlands

2. Johannes Gutenberg University Mainz, Department of Information Systems & Business Administration, Jakob Welder-Weg 9, D-55128 Mainz, Germany

3. Department of Information and Computing Sciences, Utrecht University, 3508 TB Utrecht, The Netherlands

Abstract

We describe a parameter-free estimation-of-distribution algorithm (EDA) for numerical optimization called the adapted maximum-likelihood Gaussian model iterated density-estimation evolutionary algorithm (AMaLGaM-IDEA, or AMaLGaM for short). AMaLGaM is benchmarked within the 2009 black-box optimization benchmarking (BBOB) framework and compared to a variant with incremental model building (iAMaLGaM). We study the implications of factorizing the covariance matrix of the Gaussian distribution so that only a few covariances, or none, are maintained. AMaLGaM and iAMaLGaM are further evaluated on the noisy BBOB problems, where we assess how well averaging over multiple evaluations per solution cancels out noise. Experimental evidence suggests that parameter-free AMaLGaM can solve a wide range of problems efficiently, with perceived polynomial scalability, including multimodal problems. It obtains the best or near-best results among all algorithms tested in 2009 on functions such as the step ellipsoid and Katsuuras, but fails to locate the optimum within the time limit on the skew Rastrigin-Bueche separable and Lunacek bi-Rastrigin functions in higher dimensions. AMaLGaM is found to be more robust to noise than iAMaLGaM, owing to its larger required population size. Using few or no covariances hinders the EDA in dealing with rotations of the search space. Finally, noise averaging is found to be less efficient than direct application of the EDA unless the noise is uniformly distributed. AMaLGaM was among the best-performing algorithms submitted to the BBOB workshop in 2009.
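To illustrate the core idea behind the algorithm family described in the abstract, the sketch below shows a plain maximum-likelihood Gaussian EDA: truncation selection, maximum-likelihood estimation of the mean and full covariance, and resampling from the fitted Gaussian. This is a generic illustration only, not the authors' AMaLGaM implementation, which additionally uses mechanisms such as adaptive variance scaling and an anticipated mean shift, and supports factorized (few or no covariances) models; all function and parameter names here are our own.

```python
import numpy as np

def gaussian_eda(f, dim, pop_size=100, tau=0.35, generations=200, seed=0):
    """Minimal maximum-likelihood Gaussian EDA sketch (not AMaLGaM itself)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))  # BBOB-style init box
    n_sel = max(2, int(tau * pop_size))                 # truncation selection size
    best_x, best_f = None, np.inf
    for _ in range(generations):
        fitness = np.array([f(x) for x in pop])
        order = np.argsort(fitness)
        if fitness[order[0]] < best_f:
            best_f, best_x = float(fitness[order[0]]), pop[order[0]].copy()
        sel = pop[order[:n_sel]]
        mean = sel.mean(axis=0)                         # ML estimate of the mean
        cov = np.cov(sel, rowvar=False, bias=True)      # ML (biased) full covariance
        cov += 1e-12 * np.eye(dim)                      # keep covariance positive definite
        pop = rng.multivariate_normal(mean, cov, size=pop_size)
    return best_x, best_f

# Usage: minimize the 5-D sphere function
x, fx = gaussian_eda(lambda x: float(np.sum(x ** 2)), dim=5)
```

For noisy problems, the noise-averaging strategy discussed in the abstract simply replaces `f` with a wrapper that evaluates each solution several times and returns the mean. Using a factorized model as studied in the paper would amount to keeping only the diagonal (or a few entries) of `cov`, which is cheaper but, as the abstract notes, hinders the EDA on rotated search spaces.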

Publisher

MIT Press - Journals

Subject

Computational Mathematics

Cited by 39 articles.

1. Evolutionary Deep Reinforcement Learning via Hybridizing Estimation-of-Distribution Algorithms with Policy Gradients. 2024 IEEE Congress on Evolutionary Computation (CEC), 2024-06-30.

2. Quality-diversity driven robust evolutionary optimization of optical designs. Computational Optics 2024, 2024-06-17.

3. An enhanced Kalman filtering and historical learning mechanism driven estimation of distribution algorithm. Swarm and Evolutionary Computation, 2024-04.

4. evosax: JAX-Based Evolution Strategies. Proceedings of the Companion Conference on Genetic and Evolutionary Computation, 2023-07-15.

5. A novel ensemble estimation of distribution algorithm with distribution modification strategies. Complex & Intelligent Systems, 2023-03-20.
