Authors:
Augasta M., Kathirvalavakumar T.
Abstract
A neural network with an optimal architecture speeds up the learning process and generalizes the problem well for further knowledge extraction. As a result, researchers have developed various techniques for pruning neural networks. This paper provides a survey of existing pruning techniques that optimize the architecture of neural networks and discusses their advantages and limitations. The paper also evaluates the effectiveness of various pruning techniques by comparing the performance of some traditional and recent pruning algorithms based on sensitivity analysis, mutual information and significance on four real datasets, namely Iris, Wisconsin Breast Cancer, Hepatitis Domain and Pima Indian Diabetes.
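To illustrate the general idea behind sensitivity-analysis pruning referred to in the abstract, the sketch below scores each hidden unit of a single-hidden-layer network by how much the validation error rises when that unit is removed, then prunes the least sensitive units. This is a minimal, generic illustration using NumPy; the network, data, and pruning count are illustrative assumptions and do not reproduce any specific algorithm compared in the paper.

```python
# Minimal sketch of sensitivity-based unit pruning (illustrative only,
# not the authors' algorithm). A unit's sensitivity is taken as the
# increase in validation error when the unit is zeroed out.
import numpy as np

def forward(X, W1, b1, W2, b2, mask):
    # mask zeroes out pruned hidden units
    H = np.tanh(X @ W1 + b1) * mask
    return H @ W2 + b2

def prune_by_sensitivity(X, y, W1, b1, W2, b2, n_prune=1):
    """Prune the hidden units whose removal increases validation error the least."""
    n_hidden = W1.shape[1]
    mask = np.ones(n_hidden)
    base_err = np.mean((forward(X, W1, b1, W2, b2, mask) - y) ** 2)
    sensitivity = np.empty(n_hidden)
    for j in range(n_hidden):
        trial = mask.copy()
        trial[j] = 0.0                      # temporarily remove unit j
        err = np.mean((forward(X, W1, b1, W2, b2, trial) - y) ** 2)
        sensitivity[j] = err - base_err     # error increase caused by removal
    for j in np.argsort(sensitivity)[:n_prune]:
        mask[j] = 0.0                       # permanently prune least sensitive units
    return mask, sensitivity

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 4))           # e.g. four inputs, as in Iris
    y = np.sin(X[:, :1])                    # toy regression target
    W1 = rng.normal(size=(4, 8)); b1 = np.zeros(8)
    W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)
    mask, s = prune_by_sensitivity(X, y, W1, b1, W2, b2, n_prune=3)
    print("kept units:", int(mask.sum()), "sensitivities:", np.round(s, 3))
```

In practice such a step would be interleaved with retraining, and mutual-information- or significance-based criteria would replace the error-increase score; the structure of the loop stays the same.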