Abstract
In this paper, we study the rate of pointwise approximation for neural network operators of the Kantorovich type. This result is obtained by proving a certain asymptotic expansion for the above operators and then establishing a Voronovskaja-type formula. A central role in the above results is played by the truncated algebraic moments of the density functions generated by suitable sigmoidal functions. Furthermore, to improve the rate of convergence, we consider finite linear combinations of the above neural network type operators, and also in the latter case we obtain a Voronovskaja-type theorem. Finally, concrete examples of sigmoidal activation functions are discussed in depth, together with the case of the rectified linear unit (ReLU) activation function, which is widely used in connection with deep neural networks.
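For illustration only, the following is a minimal numerical sketch (not taken from the paper) of a Kantorovich-type neural network operator built from the density function generated by the logistic sigmoid, i.e. φ(x) = (σ(x+1) − σ(x−1))/2. The function names, the truncation bound `K`, and the midpoint approximation of the inner mean values n∫_{k/n}^{(k+1)/n} f(u) du are assumptions of this sketch, not the authors' construction.

```python
import numpy as np

def phi_sigma(x):
    """Density generated by the logistic sigmoid:
    phi(x) = (sigma(x + 1) - sigma(x - 1)) / 2."""
    sigma = lambda t: 1.0 / (1.0 + np.exp(-t))
    return 0.5 * (sigma(x + 1) - sigma(x - 1))

def kantorovich_nn_operator(f, x, n, K=200):
    """Evaluate a Kantorovich-type NN operator at x.

    The sum over k is truncated to |k| <= K, and the mean value of f
    on [k/n, (k+1)/n] is approximated by the midpoint rule (an
    assumption of this sketch; the true operator uses the exact mean).
    """
    k = np.arange(-K, K + 1)
    means = f((k + 0.5) / n)          # midpoint value on [k/n, (k+1)/n]
    weights = phi_sigma(n * x - k)    # shifted density at scale n
    return np.sum(means * weights) / np.sum(weights)
```

For smooth f, increasing n makes `kantorovich_nn_operator(f, x, n)` approach f(x); for example, for f(u) = u at x = 0.5 the value is already close to 0.5 at n = 100, consistent with the pointwise convergence the paper quantifies.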
Funder
Fondazione Cassa di Risparmio di Perugia
Gruppo Nazionale per l’Analisi Matematica, la Probabilità e le loro Applicazioni
Università degli Studi di Perugia
Publisher
Springer Science and Business Media LLC
Cited by
3 articles.