Abstract
Real-world applications often involve “uncertain” objectives, i.e., settings where optimisation algorithms observe objective values as random variables with positive variance. In the past decade, several rigorous runtime analyses of evolutionary algorithms (EAs) on discrete problems have shown that EAs can cope with low levels of uncertainty, i.e., when the variance of the uncertain objective value is small, and can sometimes even benefit from uncertainty. Previous work showed that a large population combined with a non-elitist selection mechanism is a promising approach to handle high levels of uncertainty. However, the population size and the mutation rate can dramatically impact the performance of non-elitist EAs, and the optimal choices of these parameters depend on the level of uncertainty in the objective function. The performance and the required parameter settings for non-elitist EAs in some common objective-uncertainty scenarios are still unknown. We analyse the runtime of non-elitist EAs on two classical benchmark problems, OneMax and LeadingOnes, in the one-bit, bitwise, Gaussian, and symmetric noise models, as well as on the dynamic binary value problem (DynBV). Our analyses are more extensive and precise than previous analyses of non-elitist EAs. In several settings, we prove that non-elitist EAs outperform the current state of the art. Furthermore, we provide more precise guidance on how to choose the mutation rate, the selective pressure, and the population size as a function of the level of uncertainty.
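To make the setting concrete, the following is a minimal sketch of a non-elitist (μ,λ)-style EA on OneMax under the one-bit noise model: with some probability, the algorithm evaluates a copy of the search point with one uniformly chosen bit flipped. All parameter values here (noise probability `q`, population sizes, mutation-rate parameter `chi`) are illustrative placeholders, not the tuned settings derived in the paper.

```python
import random

def onemax(x):
    """True (noise-free) objective: the number of 1-bits."""
    return sum(x)

def one_bit_noisy(x, q=0.3, rng=random):
    """One-bit noise model: with probability q, evaluate a copy of x
    with one uniformly chosen bit flipped (q chosen for illustration)."""
    if rng.random() < q:
        i = rng.randrange(len(x))
        x = x[:i] + [1 - x[i]] + x[i + 1:]
    return onemax(x)

def non_elitist_ea(n=50, lam=100, mu=25, chi=0.5, budget=200_000, seed=1):
    """Sketch of a non-elitist EA with (mu, lambda)-selection and bitwise
    mutation at rate chi/n. Parents are never carried over to the next
    generation, which is what makes the scheme non-elitist."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(lam)]
    evals = 0
    while evals < budget:
        # Rank the population by the *noisy* fitness and keep the top mu.
        ranked = sorted(pop, key=lambda x: one_bit_noisy(x, rng=rng),
                        reverse=True)
        evals += lam
        parents = ranked[:mu]
        # Produce lambda offspring by bitwise mutation with rate chi/n.
        pop = [[b ^ (rng.random() < chi / n) for b in rng.choice(parents)]
               for _ in range(lam)]
        if max(onemax(x) for x in pop) == n:
            return evals  # optimum found (checked noise-free for clarity)
    return evals
```

The abstract's point is visible in this toy: with too small a population or too high a mutation rate relative to the noise level, the noisy ranking frequently promotes inferior individuals, and the run stalls; larger λ and a moderate χ restore progress.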
Publisher
Springer Science and Business Media LLC
Subject
Applied Mathematics, Computer Science Applications, General Computer Science
Cited by
4 articles.
1. Already Moderate Population Sizes Provably Yield Strong Robustness to Noise;Proceedings of the Genetic and Evolutionary Computation Conference;2024-07-14
2. Self-adaptation Can Help Evolutionary Algorithms Track Dynamic Optima;Proceedings of the Genetic and Evolutionary Computation Conference;2023-07-12
3. Comma Selection Outperforms Plus Selection on OneMax with Randomly Planted Optima;Proceedings of the Genetic and Evolutionary Computation Conference;2023-07-12
4. OneMax Is Not the Easiest Function for Fitness Improvements;Evolutionary Computation in Combinatorial Optimization;2023