Abstract
In this paper, we analyze the convergence of a back-propagation (BP) neural network with a momentum term and multiple hidden layers. When the learning rate is constant and the momentum coefficient is adaptively adjusted under certain conditions, we establish both weak and strong convergence results for the algorithm and provide theoretical proofs of both.
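The abstract describes BP training with a constant learning rate and an adaptively chosen momentum coefficient. As a rough illustration of the general update w_{k+1} = w_k - eta * grad(w_k) + mu_k * (w_k - w_{k-1}), a minimal Python sketch follows; the specific adaptation rule for mu_k (tied to the gradient norm and capped) is an illustrative assumption, not the authors' scheme.

import numpy as np

def momentum_step(w, w_prev, grad_fn, eta=0.1, mu_max=0.9):
    """One gradient step with momentum:
        w_{k+1} = w_k - eta * grad(w_k) + mu_k * (w_k - w_{k-1})
    The adaptive choice of mu_k below (proportional to the gradient
    norm, capped at mu_max) is a hypothetical rule for illustration.
    """
    g = grad_fn(w)
    mu_k = min(mu_max, float(np.linalg.norm(g)))  # assumed adaptation rule
    return w - eta * g + mu_k * (w - w_prev), w

# Usage: minimize the quadratic f(w) = 0.5 * ||w||^2, whose gradient is w.
grad = lambda w: w
w = w_prev = np.array([2.0, -1.0])
for _ in range(100):
    w, w_prev = momentum_step(w, w_prev, grad)
print(w)  # approaches the minimizer [0, 0]

Tying mu_k to the gradient norm makes the momentum term shrink near a stationary point, the kind of condition under which convergence analyses of momentum BP are typically carried out.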
Subject
General Physics and Astronomy
Cited by 3 articles.