Affiliation:
1. School of Mechanical and Electrical Engineering, Guangzhou University, Guangzhou, 510006, China
2. School of Computer Science and Engineering, South China University of Technology, Guangzhou, 510006, China
3. Graduate School of Business and Law, RMIT University, Melbourne, 3000, Australia
Abstract
Because of its strong performance, the convolutional neural network (CNN) has been widely used in many fields, such as image, speech, and text processing. However, CNN performance is highly sensitive to hyperparameters, and configuring them effectively within a reasonable time has long been a difficult problem. To solve this problem, this paper proposes a method to optimize CNN hyperparameters automatically based on the local autonomous competitive harmony search (LACHS) algorithm. To avoid the influence of complicated parameter tuning on the performance of the LACHS algorithm itself, a dynamic parameter adjustment strategy is adopted, so that the pitch adjustment probability PAR and the step factor BW adjust dynamically according to the actual search state. To strengthen the fine-grained search of the neighborhood space and reduce the chance of remaining trapped in a local optimum for a long time, an autonomous decision-making search strategy based on the optimal state is designed. To help the algorithm escape local optima, this paper proposes a local competition mechanism in which the newly generated harmony competes with the worst harmony in a locally selected subset of the harmony memory. In addition, an evaluation function that integrates the number of training epochs and the recognition accuracy is proposed; to save computational cost without affecting the search result, it makes the training time of each model depend on the learning rate and batch size. To demonstrate the feasibility of the LACHS algorithm for configuring CNN hyperparameters, classification experiments are conducted on the Fashion-MNIST and CIFAR-10 datasets, comparing CNNs with empirically configured hyperparameters against CNNs whose hyperparameters are optimized automatically by classical algorithms. The results show that the performance of the CNN optimized by the LACHS algorithm is effectively improved, so the algorithm has clear advantages in hyperparameter optimization.
In addition, this paper applies the LACHS algorithm to facial expression recognition. Experiments show that a CNN optimized by the LACHS algorithm outperforms manually designed CNNs of the same type. Therefore, the method proposed in this paper is feasible in practical applications.
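The abstract describes the core loop the LACHS algorithm builds on: harmony search with a dynamically adjusted pitch adjustment probability (PAR) and step factor (BW), plus a replace-the-worst competition step. The exact LACHS update formulas and the local competition mechanism are not given in the abstract, so the following is only a minimal sketch of a generic harmony search on a toy continuous minimization problem; the linear PAR schedule, exponential BW decay, and all names are placeholder assumptions from the improved-harmony-search literature, not the authors' method.

```python
import random

def harmony_search(objective, bounds, hms=10, iters=200,
                   hmcr=0.9, par_min=0.1, par_max=0.9,
                   bw_max=1.0, bw_min=0.01, seed=0):
    """Minimal harmony search with dynamically adjusted PAR and BW.

    Placeholder schedules (assumed, not from the paper): PAR grows
    linearly, BW decays exponentially over the iterations.
    """
    rng = random.Random(seed)
    # Initialize the harmony memory with random solutions.
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [objective(h) for h in memory]
    for t in range(iters):
        # Dynamic parameter adjustment (assumed schedules).
        par = par_min + (par_max - par_min) * t / iters
        bw = bw_max * (bw_min / bw_max) ** (t / iters)
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:
                # Memory consideration: reuse a stored pitch.
                x = memory[rng.randrange(hms)][d]
                if rng.random() < par:
                    # Pitch adjustment within the current bandwidth.
                    x += rng.uniform(-1.0, 1.0) * bw
            else:
                # Random re-sampling from the search range.
                x = rng.uniform(lo, hi)
            new.append(min(max(x, lo), hi))
        new_score = objective(new)
        # Competition: the new harmony replaces the worst one if better.
        worst = max(range(hms), key=lambda i: scores[i])
        if new_score < scores[worst]:
            memory[worst], scores[worst] = new, new_score
    best = min(range(hms), key=lambda i: scores[i])
    return memory[best], scores[best]
```

In the hyperparameter-optimization setting described in the abstract, `objective` would train a CNN for a budgeted number of epochs and return the combined training-cost/accuracy score, with each dimension of `bounds` encoding one hyperparameter such as the learning rate or batch size.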
Funder
National Natural Science Foundation of China
Natural Science Foundation of Guangdong Province
Guangzhou Science and Technology Plan
Publisher
Oxford University Press (OUP)
Subject
Computational Mathematics, Computer Graphics and Computer-Aided Design, Human-Computer Interaction, Engineering (miscellaneous), Modeling and Simulation, Computational Mechanics
Cited by: 4 articles.