Abstract
In the present paper we study the best approximation order and inverse approximation theorems for families of neural network (NN) operators. Both the cases of classical and Kantorovich-type NN operators are considered. As a remarkable achievement, we provide a characterization of the well-known Lipschitz classes in terms of the order of approximation of the considered NN operators. The latter result has inspired a conjecture concerning the saturation order of the considered families of approximation operators. Finally, several noteworthy examples are discussed in detail.
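For orientation, a minimal LaTeX sketch of the objects typically involved, assuming the standard constructions from the NN-operator literature; the density function \phi, the interval [a, b], and the operators F_n, K_n below are illustrative assumptions, not necessarily the paper's exact definitions:

% Classical NN operator on [a, b], generated by a density function \phi
% (e.g., built from a sigmoidal function) -- an assumed, standard form:
F_n(f, x) = \frac{\sum_{k=\lceil n a \rceil}^{\lfloor n b \rfloor} f(k/n)\, \phi(n x - k)}{\sum_{k=\lceil n a \rceil}^{\lfloor n b \rfloor} \phi(n x - k)}, \qquad x \in [a, b].

% Kantorovich-type variant: the sample f(k/n) is replaced by a local mean,
% which makes the operator suitable also for merely integrable functions:
K_n(f, x) = \frac{\sum_{k=\lceil n a \rceil}^{\lfloor n b \rfloor - 1} \Big( n \int_{k/n}^{(k+1)/n} f(u)\, \mathrm{d}u \Big)\, \phi(n x - k)}{\sum_{k=\lceil n a \rceil}^{\lfloor n b \rfloor - 1} \phi(n x - k)}.

% Shape of a direct-plus-inverse (characterization) result of the kind the
% abstract refers to, for 0 < \alpha < 1 -- hedged, not the paper's statement:
f \in \operatorname{Lip}(\alpha) \iff \| F_n f - f \|_{\infty} = \mathcal{O}(n^{-\alpha}) \quad (n \to \infty).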
Funder
Gruppo Nazionale per l’Analisi Matematica, la Probabilità e le loro Applicazioni
European Union - NextGenerationEU under the Italian Ministry of University and Research
Università degli Studi di Perugia
Publisher
Springer Science and Business Media LLC
Cited by
1 article.