Abstract
We revisit the classical kernel method of approximation/interpolation theory in a specific context, from the point of view of partial differential equations. The goal is to highlight the role of regularization by casting it in terms of the actual smoothness of the interpolant obtained by the procedure: the latter is merely continuous on the data set but smooth otherwise. While the resulting method fits into the category of RKHS methods and hence shares their main features, it explicitly uses smoothness, via a dimension-dependent (pseudo-)differential operator, to obtain a flexible and robust interpolant that adapts to the shape of the data while transitioning quickly away from it and depending continuously on it. The latter means that a small perturbation or pollution of the data set leads to comparable results in classification applications. The method is applied both to low-dimensional examples and to a standard high-dimensional benchmark problem (MNIST digit classification).
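As a rough illustration of the general RKHS interpolation framework the abstract refers to, the following is a minimal sketch of kernel interpolation with a Sobolev-type (Matérn) kernel. The kernel choice, length scale, and regularization weight are illustrative assumptions for the sketch, not the paper's dimension-dependent (pseudo-)differential operator.

```python
# Minimal sketch of RKHS kernel interpolation with a Sobolev-type (Matérn) kernel.
# The kernel, length scale, and regularization weight are illustrative assumptions,
# not the specific operator used in the paper.
import numpy as np

def matern32_kernel(X, Y, length_scale=0.5):
    """Matérn-3/2 kernel, a reproducing kernel of a Sobolev space."""
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    r = np.sqrt(3.0) * d / length_scale
    return (1.0 + r) * np.exp(-r)

def fit_interpolant(X, y, reg=1e-8):
    """Solve (K + reg*I) c = y for the kernel expansion coefficients."""
    K = matern32_kernel(X, X)
    return np.linalg.solve(K + reg * np.eye(len(X)), y)

def evaluate(X_train, coeffs, X_new):
    """Evaluate the interpolant s(x) = sum_i c_i k(x, x_i) at new points."""
    return matern32_kernel(X_new, X_train) @ coeffs

# Toy 1-D example: noisy samples of a steep transition.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(20, 1))
y = np.tanh(8 * X[:, 0]) + 0.01 * rng.standard_normal(20)
c = fit_interpolant(X, y)
X_grid = np.linspace(-1, 1, 200)[:, None]
s = evaluate(X, c, X_grid)  # close to the data at the nodes, smooth away from them
```

The small regularization term `reg` stands in, loosely, for the smoothness penalty discussed in the abstract: it controls how strictly the interpolant matches the data versus how smoothly it behaves between and away from the data points.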