A note on the applications of one primary function in deep neural networks
-
Published: 2021-12-04
-
ISSN: 0219-6913
-
Container-title: International Journal of Wavelets, Multiresolution and Information Processing
-
Language: en
-
Short-container-title: Int. J. Wavelets Multiresolut. Inf. Process.
Affiliation:
1. Department of Mathematical Sciences, Zhejiang Sci-Tech University, Hangzhou 310018, P. R. China
Abstract
Using elementary mathematical tools, this paper proves that the function x^s (s an integer no less than 2) has the following property if and only if s = 2: the difference between the function value at the midpoint of any two adjacent equidistant nodes on [Formula: see text] and the mean of the function values at these two nodes is a constant depending only on the number of nodes. Building on this, we establish an important result about deep neural networks: the function x^2 can be interpolated by a deep Rectified Linear Unit (ReLU) network of depth [Formula: see text] on the equidistant nodes in the interval [Formula: see text], with an approximation error of [Formula: see text]. Then, based on this result and the Chebyshev orthogonal polynomials, we construct a deep network and give error estimates for its approximation of polynomials and of continuous functions, respectively. In addition, the paper constructs a deep network with locally sparse connections, shared weights, and activation function [Formula: see text], and discusses its density and complexity.
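The placeholders hide the paper's concrete formulas, but the midpoint property can be checked directly. Reading the first claim, as the surviving text suggests, as being about x^s with integer s >= 2: for adjacent nodes a and a + h,

    (a + h/2)^2 - (a^2 + (a + h)^2)/2 = -h^2/4,

which depends only on the spacing h, hence only on the number of equidistant nodes once the interval is fixed; for s = 3 the same difference is -(3/4)a h^2 - (3/8)h^3, which varies with the node position a, and likewise for every s > 2.

For the interpolation claim, the standard construction of this kind is Yarotsky's "sawtooth" network. The Python sketch below assumes the interval is [0, 1] and the nodes are j/2^m; the depth parameter m and the error bound 4^(-(m+1)) are properties of this sketch, not the constants elided in the abstract.

    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    def tooth(x):
        # Hat function g: 2x on [0, 1/2], 2(1 - x) on [1/2, 1], 0 outside,
        # realized by a single ReLU layer with three units.
        return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

    def relu_square(x, m):
        # f_m(x) = x - sum_{k=1..m} g^(k)(x) / 4^k, where g^(k) is the k-fold
        # composition of the hat: a ReLU network of depth proportional to m
        # that is exactly the piecewise-linear interpolant of x^2 at the
        # 2^m + 1 equidistant nodes j/2^m, with |f_m(x) - x^2| <= 4^-(m+1).
        g, out = x, x
        for k in range(1, m + 1):
            g = tooth(g)
            out = out - g / 4.0**k
        return out

    m = 4  # illustrative depth parameter, not a value taken from the paper
    nodes = np.linspace(0.0, 1.0, 2**m + 1)
    mids = (nodes[:-1] + nodes[1:]) / 2.0
    print(np.max(np.abs(relu_square(nodes, m) - nodes**2)))  # ~0: exact at the nodes
    print(np.max(np.abs(relu_square(mids, m) - mids**2)), 4.0**-(m + 1))  # worst case = 4^-(m+1)

The constant h^2/4 from the midpoint identity reappears in the bound: f_m is exactly the piecewise-linear interpolant of x^2, so its worst error on [0, 1] is the midpoint gap h^2/4 = 4^-(m+1) with h = 2^-m, which depends only on the number of nodes.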
Funder
National Natural Science Foundation of China
Zhejiang Provincial Natural Science Foundation of China
Publisher
World Scientific Pub Co Pte Ltd
Subject
Applied Mathematics, Information Systems, Signal Processing