Author:
Kyle Pretorius, Nelishia Pillay
Abstract
The use of genetic algorithms (GAs) to evolve neural network (NN) weights has risen in popularity in recent years, particularly when used together with gradient descent as a mutation operator. However, crossover operators are often omitted from such GAs, as they are seen as highly destructive and detrimental to the performance of the GA. Designing crossover operators that can be applied effectively to NNs has been an active area of research, with success limited to specific problem domains. The focus of this study is to use genetic programming (GP) to automatically evolve crossover operators that can be applied to NN weights and used in GAs. A novel GP is proposed and used to evolve both reusable and disposable crossover operators in order to compare their efficiency. Experiments are conducted to compare the performance of GAs using no crossover operator, or a commonly used human-designed crossover operator, against GAs using GP-evolved crossover operators. The results show that using GP to evolve disposable crossover operators leads to highly effective crossover operators that significantly improve the results obtained from the GA.
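To make the setting concrete, the sketch below shows what crossover applied directly to flattened NN weight vectors can look like. This is not the authors' GP-evolved operator; it is a minimal illustration, assuming NumPy and hypothetical function names (uniform_crossover, arithmetic_crossover), of the kind of human-designed baseline operators such a GP would compete with.

```python
# Minimal sketch (illustrative only, not the paper's implementation):
# GA-style crossover operators applied to flattened NN weight vectors.
import numpy as np

def uniform_crossover(parent_a, parent_b, rng, p_keep=0.5):
    """Per-weight uniform crossover: each offspring weight is taken from
    parent A with probability p_keep, otherwise from parent B."""
    mask = rng.random(parent_a.shape) < p_keep
    return np.where(mask, parent_a, parent_b)

def arithmetic_crossover(parent_a, parent_b, rng):
    """Blend crossover: a random convex combination of the two parents,
    one example of an operator composable from simple arithmetic
    primitives over the parent weights."""
    alpha = rng.random(parent_a.shape)
    return alpha * parent_a + (1.0 - alpha) * parent_b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.normal(size=10)  # flattened weights of parent network A
    b = rng.normal(size=10)  # flattened weights of parent network B
    print(uniform_crossover(a, b, rng))
    print(arithmetic_crossover(a, b, rng))
```

Operators of this form are often considered destructive because corresponding weights in two parent networks need not play the same functional role (the permutation problem), which motivates searching for better-suited operators automatically.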
Funder
National Research Foundation of South Africa
Multichoice Research Chair in Machine Learning
University of Pretoria
Publisher
Springer Science and Business Media LLC