Abstract
We consider best approximation problems in a nonlinear subset ℳ of a Banach space of functions (𝒱, ∥•∥). The norm is assumed to be a generalization of the L²-norm for which only a weighted Monte Carlo estimate ∥•∥ₙ can be computed. The objective is to obtain an approximation v ∈ ℳ of an unknown function u ∈ 𝒱 by minimizing the empirical norm ∥u − v∥ₙ. We consider this problem for general nonlinear subsets and establish error bounds for the empirical best approximation error. Our results are based on a restricted isometry property (RIP) which holds in probability and is independent of the specified nonlinear least squares setting. Several model classes are examined, and the analytical statements about the RIP are compared to existing sample complexity bounds from the literature. We find that for well-studied model classes our general bound is weaker but exhibits many of the same properties as these specialized bounds. Notably, we demonstrate the advantage of an optimal sampling density (as known for linear spaces) for sets of functions with sparse representations.
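For orientation, the weighted Monte Carlo estimate referred to above typically has the following standard form (the precise sampling density and weight function are specified in the paper; this display is only a sketch of that setting):

∥v∥ₙ² = (1/n) ∑_{i=1}^{n} w(xᵢ) v(xᵢ)²,  with the empirical best approximation uₙ ∈ argmin_{v ∈ ℳ} ∥u − v∥ₙ,

where x₁, …, xₙ are samples drawn from the chosen sampling density and w is a weight function compensating for that density.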
Funder
Deutsche Forschungsgemeinschaft
Einstein Stiftung Berlin
Berlin International Graduate School in Model and Simulation based Research
Subject
Applied Mathematics, Modeling and Simulation, Numerical Analysis, Analysis, Computational Mathematics
Cited by
4 articles.