1. Bergstra, J., Bardenet, R., Bengio, Y., Kégl, B.: Algorithms for hyper-parameter optimization. In: Advances in Neural Information Processing Systems, vol. 24 (2011)
2. Bergstra, J., Yamins, D., Cox, D.: Making a science of model search: hyperparameter optimization in hundreds of dimensions for vision architectures. In: International Conference on Machine Learning, pp. 115–123. PMLR (2013)
3. Dietrich, K., Mersmann, O.: Increasing the diversity of benchmark function sets through affine recombination. In: Parallel Problem Solving from Nature–PPSN XVII: 17th International Conference, PPSN 2022, Dortmund, Germany, 10–14 September 2022, Proceedings, Part I, pp. 590–602. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-14714-2_41
4. Doerr, C., Wang, H., Ye, F., van Rijn, S., Bäck, T.: IOHprofiler: a benchmarking and profiling tool for iterative optimization heuristics. arXiv e-prints arXiv:1810.05281 (2018). https://arxiv.org/abs/1810.05281
5. Feurer, M., Eggensperger, K., Falkner, S., Lindauer, M., Hutter, F.: Auto-Sklearn 2.0: Hands-free AutoML via meta-learning. J. Mach. Learn. Res. 23(261), 1–61 (2022)