Abstract
In this survey we discuss various approximation-theoretic problems that arise in the multilayer feedforward perceptron (MLP) model in neural networks. The MLP model is one of the more popular and practical of the many neural network models. Mathematically it is also one of the simpler models. Nonetheless the mathematics of this model is not well understood, and many of these problems are approximation-theoretic in character. Most of the research we will discuss is of very recent vintage. We will report on what has been done and on various unanswered questions. We will not be presenting practical (algorithmic) methods. We will, however, be exploring the capabilities and limitations of this model.
Publisher
Cambridge University Press (CUP)
Subject
General Mathematics, Numerical Analysis
References
139 articles.
Cited by
790 articles.