Authors:
T. Nguyen Son, M. Pham Tu, Hoang Anh, V. Trieu Linh, T. Cao Trung
Abstract
Traditional neural network training is usually based on maximum likelihood estimation to obtain appropriate network parameters, namely weights and biases, given the training data. However, when the available data are finite and noisy, maximum likelihood-based training can cause the trained network to overfit the noise. This problem has been overcome in various applications by applying Bayesian inference to neural network training. Bayesian inference allows the values of regularization parameters to be determined using only the training data. In addition, the Bayesian approach allows different models (e.g., neural networks with different numbers of hidden units) to be compared using only the training data. Neural networks trained with Bayesian inference are known as Bayesian neural networks (BNNs). This chapter focuses on BNNs for classification problems, with the model complexity of BNNs handled conveniently by a method known as the evidence framework.
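To make the evidence framework concrete, the sketch below (not taken from the chapter) applies MacKay's evidence approximation to a deliberately simple classifier: Bayesian logistic regression with a Gaussian prior of precision alpha, standing in for a full BNN. The prior precision, which plays the role of the regularization parameter, is re-estimated from the training data alone, and the resulting log evidence can be compared across candidate models. All function and variable names are illustrative assumptions.

```python
# Minimal sketch of the evidence framework for a simple classification model
# (Bayesian logistic regression with prior precision alpha). Illustrative only.
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def fit_map(X, t, alpha, n_iter=50):
    """Find the MAP weights by Newton's method for a given alpha."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        y = sigmoid(X @ w)
        g = X.T @ (y - t) + alpha * w                       # gradient of -log posterior
        R = y * (1.0 - y)
        H = X.T @ (X * R[:, None]) + alpha * np.eye(X.shape[1])  # Hessian
        w -= np.linalg.solve(H, g)
    return w, H

def evidence_framework(X, t, alpha=1.0, n_outer=10):
    """Alternate MAP estimation with evidence-based re-estimation of alpha."""
    for _ in range(n_outer):
        w_map, H = fit_map(X, t, alpha)
        # Eigenvalues of the data term of the Hessian (H minus the prior term).
        lam = np.linalg.eigvalsh(H - alpha * np.eye(X.shape[1]))
        gamma = np.sum(lam / (lam + alpha))                 # effective number of parameters
        alpha = gamma / (w_map @ w_map)                     # MacKay's re-estimation formula
    # Laplace-approximated log evidence, used to compare different models.
    y = sigmoid(X @ w_map)
    log_lik = np.sum(t * np.log(y) + (1 - t) * np.log(1 - y))
    log_ev = (log_lik - 0.5 * alpha * w_map @ w_map
              + 0.5 * X.shape[1] * np.log(alpha)
              - 0.5 * np.linalg.slogdet(H)[1])
    return w_map, alpha, log_ev

# Toy usage: two overlapping Gaussian classes with a bias column.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
X = np.hstack([X, np.ones((100, 1))])
t = np.r_[np.zeros(50), np.ones(50)]
w, alpha, log_ev = evidence_framework(X, t)
print("alpha =", alpha, "log evidence =", log_ev)
```

The key step is the re-estimation alpha = gamma / (w.w), where gamma is the effective number of well-determined parameters; for a full BNN the same update is applied using the Hessian of the network error function, as implemented for example in the Netlab toolbox cited in the references.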
References
1. MacKay DJC. The evidence framework applied to classification networks. Neural Computation. 1992;4(5):720-736
2. Bishop CM. Neural Networks for Pattern Recognition. New York: Oxford University Press; 1995
3. Nabney IT. Netlab: Algorithms for Pattern Recognition (Advances in Pattern Recognition). London: Springer; 2002
4. Thanh SN, Goetz S. Dissolved gas analysis of insulating oil for power transformer fault diagnosis with Bayesian neural network. JST: Smart Systems and Devices. 2022;(3):61-68
5. Thanh SN, Johnson CG. Protein secondary structure prediction using an optimized Bayesian classification neural network. In: The 5th International Conference on Neural Computing Theory and Application, Vilamoura, Algarve, Portugal. 20–22 Sep 2013