Affiliation:
1. Department of Mathematics, Emory University, Atlanta, Georgia, USA
2. School of Computational Science and Engineering, Georgia Institute of Technology, Atlanta, Georgia, USA
Abstract
Summary: A general, rectangular kernel matrix may be defined as $K_{ij} = \kappa(x_i, y_j)$, where $\kappa(x, y)$ is a kernel function and where $X = \{x_i\}_{i=1}^{m}$ and $Y = \{y_j\}_{j=1}^{n}$ are two sets of points. In this paper, we seek a low-rank approximation to a kernel matrix where the sets of points $X$ and $Y$ are large and are arbitrarily distributed, such as away from each other, “intermingled”, identical, and so forth. Such rectangular kernel matrices may arise, for example, in Gaussian process regression where $X$ corresponds to the training data and $Y$ corresponds to the test data. In this case, the points are often high-dimensional. Since the point sets are large, we must exploit the fact that the matrix arises from a kernel function and avoid forming the matrix explicitly, which rules out most algebraic techniques. In particular, we seek methods that can scale linearly or nearly linearly with respect to the size of the data for a fixed approximation rank. The main idea in this paper is to geometrically select appropriate subsets of points to construct a low-rank approximation. An analysis in this paper guides how this selection should be performed.
Subject
Applied Mathematics, Algebra and Number Theory