Abstract
<abstract><p>A recent line of work established approximation complexity estimates for deep ReLU networks on bandlimited functions in the MSE (mean squared error) sense. In this note, we significantly strengthen that result: we estimate the approximation complexity in the $ L_{\infty} $ sense. The key step in the proof is a frequency decomposition lemma, which may be of independent interest.</p></abstract>
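As an illustrative sketch only (not the paper's construction): a one-hidden-layer ReLU network can exactly represent any continuous piecewise-linear function, so interpolating a bandlimited target such as $\sin(2\pi x)$ on a uniform grid yields an explicit ReLU approximant whose $L_{\infty}$ (sup-norm) error can be measured directly. The function names below (`relu_interpolant`) are hypothetical, for demonstration.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def relu_interpolant(f, n, x):
    """Evaluate the one-hidden-layer ReLU network realizing the
    piecewise-linear interpolant of f on n+1 uniform knots in [0, 1]."""
    knots = np.linspace(0.0, 1.0, n + 1)
    vals = f(knots)
    slopes = np.diff(vals) * n  # slope on each subinterval of width 1/n
    # Telescoping sum: each hidden unit relu(x - knot) contributes the
    # change in slope at that knot.
    coeffs = np.concatenate(([slopes[0]], np.diff(slopes)))
    return vals[0] + sum(c * relu(x - k) for c, k in zip(coeffs, knots[:-1]))

f = lambda x: np.sin(2 * np.pi * x)  # a bandlimited target
xs = np.linspace(0.0, 1.0, 2001)
for n in (8, 16, 32):
    sup_err = np.max(np.abs(relu_interpolant(f, n, xs) - f(xs)))
    print(n, sup_err)
```

The sup-norm error shrinks roughly like $n^{-2}$ here, consistent with classical piecewise-linear interpolation bounds; the note's contribution is the much finer dependence of network size on the target's bandwidth, which this toy sketch does not capture.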
Publisher
American Institute of Mathematical Sciences (AIMS)
Cited by 1 article.