Affiliation:
1. Nanjing University of Posts and Telecommunications
Abstract
This paper presents a theoretical analysis of the algorithmic complexity of a neural network that uses rational cubic spline weight functions with a linear denominator. It concludes that the time complexity of the training algorithm is linear in the neural network's input dimension, output dimension, and number of training samples.
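Stated compactly, and writing n, m, and N for the input dimension, output dimension, and number of samples (symbols introduced here for illustration, not taken from the abstract), the claim amounts to linearity in each quantity with the other two held fixed:

% Sketch of the stated complexity result; n, m, N are assumed notation.
T_{\text{train}}(n, m, N) = O(n) \ \text{for fixed } m, N, \qquad
T_{\text{train}}(n, m, N) = O(m) \ \text{for fixed } n, N, \qquad
T_{\text{train}}(n, m, N) = O(N) \ \text{for fixed } n, m.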
Publisher
Trans Tech Publications, Ltd.