Affiliation:
1. Department of Electrical and Computer Engineering, University of Manitoba, Winnipeg, Manitoba R3T 5V6, Canada
Abstract
Simulations indicate that the deterministic Boltzmann machine, unlike the stochastic Boltzmann machine from which it is derived, exhibits unstable behavior during contrastive Hebbian learning of nonlinear problems, including oscillation in the learning algorithm and extreme sensitivity to small weight perturbations. Although careful choice of the initial weight magnitudes, the learning rate, and the annealing schedule will produce convergence in most cases, the stability of the resulting solution depends on the parameters in a complex and generally indiscernible way. We show that this unstable behavior is the result of overparameterization (excessive freedom in the weights), which leads to continuous rather than isolated optimal weight solution sets. This allows the weights to drift without correction by the learning algorithm until the free energy landscape changes in such a way that the settling procedure finds a different minimum of the free energy function than it did previously, and a gross output error occurs. Because all the weight sets in a continuous optimal solution set produce exactly the same network outputs, we define reliability, a measure of the robustness of the network, as a new performance criterion.
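The learning procedure the abstract refers to can be sketched as follows. This is a minimal, hypothetical illustration (not the authors' implementation): a deterministic Boltzmann machine settles to a mean-field fixed point via the update s_i = tanh(Σ_j W_ij s_j / T), once with both inputs and outputs clamped (the "+" phase) and once with only inputs clamped (the "−" phase), and the contrastive Hebbian rule then moves each weight by the difference of co-activations between the two phases. The function names, learning rate, and settling schedule here are assumptions for illustration.

```python
import numpy as np

def settle(W, clamp_idx, clamp_vals, n_units, T=1.0, iters=200):
    """Mean-field settling of a deterministic Boltzmann machine:
    iterate s_i = tanh((1/T) * sum_j W_ij s_j) for the free units,
    holding the clamped units fixed. (Illustrative sketch; the paper's
    annealing schedule over T is omitted here.)"""
    s = np.zeros(n_units)
    s[clamp_idx] = clamp_vals
    free = [i for i in range(n_units) if i not in clamp_idx]
    for _ in range(iters):
        for i in free:
            s[i] = np.tanh(W[i] @ s / T)
    return s

def chl_step(W, x_idx, x, y_idx, y, n_units, lr=0.05):
    """One contrastive Hebbian update: co-activations in the clamped (+)
    phase minus co-activations in the free (-) phase."""
    s_plus = settle(W, x_idx + y_idx, np.concatenate([x, y]), n_units)
    s_minus = settle(W, x_idx, x, n_units)
    dW = lr * (np.outer(s_plus, s_plus) - np.outer(s_minus, s_minus))
    np.fill_diagonal(dW, 0.0)  # no self-connections
    return W + dW  # dW is symmetric, so W stays symmetric
```

Because the outer-product difference vanishes whenever the two phases agree, any weight set that reproduces the targets exactly receives zero correction; along a continuous optimum this permits the uncorrected drift, and eventual jump to a different free-energy minimum, that the abstract describes.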
Publisher
World Scientific Pub Co Pte Lt
Subject
Computer Networks and Communications, General Medicine
Cited by
1 article.