Abstract
The choice of weight initialization in an artificial neural network is a key design decision that affects learning speed, convergence, and classification accuracy. In this paper, we investigate the effects of weight initialization in an artificial neural network: Nguyen-Widrow initialization, random initialization, and Xavier initialization are each paired with five different activation functions. We consider a feedforward neural network consisting of an input layer, one hidden layer, and an output layer. Each combination of weight initialization method and activation function is trained, tested, and compared on the basis of its best achieved training loss. This work aims to better understand how weight initialization methods, in combination with activation functions, affect learning speed after a fixed number of training epochs.
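The two non-random schemes named in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names and the seeded generator are assumptions, and the Nguyen-Widrow sketch uses the common formulation in which each hidden unit's weight vector is rescaled to norm beta = 0.7 * h^(1/n) for h hidden units and n inputs.

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng=None):
    """Xavier/Glorot uniform initialization:
    W ~ U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out))."""
    rng = rng or np.random.default_rng(0)  # seeded generator (illustrative)
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def nguyen_widrow_init(fan_in, fan_out, rng=None):
    """Nguyen-Widrow initialization (common formulation): draw small random
    weights, then rescale each hidden unit's incoming weight vector to
    norm beta = 0.7 * fan_out ** (1 / fan_in)."""
    rng = rng or np.random.default_rng(0)
    w = rng.uniform(-0.5, 0.5, size=(fan_in, fan_out))
    beta = 0.7 * fan_out ** (1.0 / fan_in)
    norms = np.linalg.norm(w, axis=0, keepdims=True)  # one norm per hidden unit
    return beta * w / norms
```

Either matrix can serve as the input-to-hidden weights of the one-hidden-layer feedforward network studied in the paper; plain random initialization corresponds to using the uniform draw without any rescaling.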
Funder
FHNW University of Applied Sciences and Arts Northwestern Switzerland
Publisher
Springer Science and Business Media LLC
Subject
Artificial Intelligence, Cognitive Neuroscience, Computer Vision and Pattern Recognition, Mathematics (miscellaneous)
Cited by
7 articles.