Abstract
Single pruning algorithms compress a deep convolutional neural network with either channel pruning or filter pruning alone, which leaves many redundant parameters in the compressed model, while pruning filters directly can discard key information and reduce classification accuracy. To address these problems, a parallel pruning algorithm combined with image enhancement is proposed. First, a random erasing data augmentation method is introduced to improve the generalization ability of the model. Second, channels with small contributions are removed according to the trained batch normalization layer scaling factors, yielding an initially sparse model, and the filters are then pruned: the geometric median of the filters in each layer is computed, and redundant filters similar to it are identified and removed, with similarity measured by the distance between filters. Pruning experiments were conducted with VGG19 and DenseNet40 on the CIFAR-10 and CIFAR-100 data sets. The experimental results show that the algorithm can improve model accuracy while reducing the model's computation and parameter count to a certain extent. Finally, the method is applied in practice: combined with transfer learning, traffic objects are classified and detected on a mobile phone.
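To make the two pruning criteria in the abstract concrete, the following is a minimal sketch of the filter and channel selection steps, assuming filter weights and batch normalization scaling factors are available as NumPy arrays. The function names, array shapes, and pruning ratios are illustrative assumptions, not the paper's actual implementation.

import numpy as np

def select_filters_by_geometric_median(filters, prune_ratio=0.3):
    # filters: array of shape (num_filters, in_channels, k, k) for one conv layer.
    # Returns indices of filters considered redundant (closest to the
    # geometric-median-like filter of the layer).
    flat = filters.reshape(filters.shape[0], -1)
    # Pairwise Euclidean distances between all filters in the layer.
    dists = np.linalg.norm(flat[:, None, :] - flat[None, :, :], axis=-1)
    # The filter with the smallest summed distance to all others approximates
    # the geometric median of the filter set.
    median_idx = np.argmin(dists.sum(axis=1))
    # Filters nearest to the geometric median carry similar information,
    # so they are treated as redundant and selected for pruning.
    closeness = dists[median_idx]
    num_prune = int(prune_ratio * filters.shape[0])
    return np.argsort(closeness)[1:num_prune + 1]  # skip the median filter itself

def select_channels_by_bn_scale(gammas, prune_ratio=0.4):
    # gammas: trained batch normalization scaling factors for one layer.
    # Channels with the smallest |gamma| contribute least and are cut first.
    num_prune = int(prune_ratio * len(gammas))
    return np.argsort(np.abs(gammas))[:num_prune]

In this reading of the abstract, the BN-scale step thins channels first, and the geometric-median step then removes filters that duplicate the information of others, rather than simply dropping filters with small norms.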
Funder
Hubei Provincial Department of Education
Subject
General Mathematics, Engineering (miscellaneous), Computer Science (miscellaneous)
Cited by
2 articles.