Abstract
We consider a general regularised interpolation problem for learning a parameter vector from data. The well-known representer theorem says that under certain conditions on the regulariser there exists a solution in the linear span of the data points. This is at the core of kernel methods in machine learning as it makes the problem computationally tractable. Most literature deals only with sufficient conditions for representer theorems in Hilbert spaces and shows that the regulariser being norm-based is sufficient for the existence of a representer theorem. We prove necessary and sufficient conditions for the existence of representer theorems in reflexive Banach spaces and show that any regulariser has to be essentially norm-based for a representer theorem to exist. Moreover, we illustrate why, in a sense, reflexivity is the minimal requirement on the function space. We further show that if the learning relies on the linear representer theorem, then the solution is independent of the regulariser and is in fact determined by the function space alone. This in particular shows the value of generalising Hilbert space learning theory to Banach spaces.
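To make the setting concrete, the following is a standard formulation of the regularised interpolation problem and of the linear representer theorem in the Hilbert-space (RKHS) case; the notation here is illustrative and not taken from the paper itself.

```latex
% Regularised interpolation: given data (x_1, y_1), \dots, (x_m, y_m),
% a reproducing kernel Hilbert space H with kernel K, and a regulariser
% \Omega : H \to \mathbb{R}, solve
\min_{f \in H} \; \Omega(f)
\quad \text{subject to} \quad
f(x_i) = y_i, \qquad i = 1, \dots, m.

% Linear representer theorem: if \Omega is norm-based, e.g.
% \Omega(f) = h(\|f\|_H) for a nondecreasing h, then some minimiser
% lies in the span of the representers of the data points,
f^{*}(\cdot) = \sum_{i=1}^{m} c_i \, K(x_i, \cdot),
\qquad c_1, \dots, c_m \in \mathbb{R},
% reducing the infinite-dimensional problem to finding m coefficients.
```

The paper's contribution is the converse direction in reflexive Banach spaces: not only is a norm-based regulariser sufficient, but any regulariser admitting such a theorem must be essentially norm-based.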
Publisher
Springer Science and Business Media LLC
Subject
Applied Mathematics, Computational Mathematics
References (22 articles)
1. Argyriou, A., Micchelli, C.A., Pontil, M.: When is there a representer theorem? vector versus matrix regularizers. J. Mach. Learn. Res. 10, 2507–2529 (2009)
2. Asplund, E.: Positivity of duality mappings. Bull. Amer. Math. Soc. 73(2), 200–203 (1967)
3. Blažek, J.: Some remarks on the duality mapping. Acta Univ. Carolinae Math. Phys. 23(2), 15–19 (1982)
4. Brezis, H.: Functional Analysis, Sobolev Spaces and Partial Differential Equations. Springer, New York (2011). https://doi.org/10.1007/978-0-387-70914-7
5. Browder, F.E.: Multi-valued monotone nonlinear mappings and duality mappings in Banach spaces. Trans. Amer. Math. Soc. 118, 338–351 (1965)
Cited by 2 articles.