Conditional mean embedding and optimal feature selection via positive definite kernels
Published: 2024
Issue: 1
Volume: 44
Pages: 79-103
ISSN: 1232-9274
Container-title: Opuscula Mathematica
Language: en
Short-container-title: Opuscula Math.
Authors: Palle E.T. Jorgensen, Myung-Sin Song, James Tian
Abstract
Motivated by applications, we consider new operator-theoretic approaches to conditional mean embedding (CME). Our present results combine a spectral analysis-based optimization scheme with the use of kernels, stochastic processes, and constructive learning algorithms. For initially given non-linear data, we consider optimization-based feature selections. This entails the use of convex sets of kernels in a construction of optimal feature selection via regression algorithms from learning models. Thus, with initial inputs of training data (for a suitable learning algorithm), each choice of a kernel \(K\) in turn yields a variety of Hilbert spaces and realizations of features. A novel aspect of our work is the inclusion of a secondary optimization process over a specified convex set of positive definite kernels, resulting in the determination of "optimal" feature representations.
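To make the two ingredients of the abstract concrete, the following is a minimal numerical sketch (not the authors' construction): an empirical conditional mean embedding built from a convex combination of Gaussian kernels, followed by a simple secondary optimization over the convex kernel weights scored by held-out regression error. The bandwidths, the grid-search weight selection, and all function names here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth):
    """Gram matrix K[i, j] = exp(-||X[i] - Y[j]||^2 / (2 * bandwidth^2))."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * bandwidth**2))

def mixed_kernel(X, Y, weights, bandwidths):
    """Convex combination sum_j w_j K_j of positive definite kernels."""
    return sum(w * gaussian_kernel(X, Y, b) for w, b in zip(weights, bandwidths))

def cme_predict(X_train, Y_train, X_test, weights, bandwidths, reg=1e-3):
    """Empirical CME estimate of E[Y | X = x]: kernel ridge coefficients
    (K + n*reg*I)^{-1} k(x), then averaging the training outputs with them."""
    n = len(X_train)
    K = mixed_kernel(X_train, X_train, weights, bandwidths)
    alpha = np.linalg.solve(K + n * reg * np.eye(n),
                            mixed_kernel(X_train, X_test, weights, bandwidths))
    return alpha.T @ Y_train  # predicted conditional means at X_test

# Toy data (assumed for illustration): Y = sin(X) + noise.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (200, 1))
Y = np.sin(X) + 0.1 * rng.standard_normal((200, 1))
X_tr, Y_tr, X_val, Y_val = X[:150], Y[:150], X[150:], Y[150:]

# Secondary optimization over the convex set of kernels: grid search over
# weights (w1, w2, w3) with w_j >= 0 and w1 + w2 + w3 = 1, scoring each
# candidate by the held-out error of the induced CME regression.
bandwidths = [0.3, 1.0, 3.0]
best = min(
    ((w1, w2, 1 - w1 - w2)
     for w1 in np.linspace(0, 1, 11)
     for w2 in np.linspace(0, 1 - w1, 11)),
    key=lambda w: np.mean((cme_predict(X_tr, Y_tr, X_val, w, bandwidths) - Y_val) ** 2),
)
print("selected kernel weights:", np.round(best, 2))
```

In this sketch each weight vector corresponds to one positive definite kernel in the convex set, hence one reproducing kernel Hilbert space and one feature realization; the grid search stands in for the paper's spectral/optimization scheme only as a rough numerical analogue.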
Publisher
AGH University of Science and Technology Press
Subject
General Mathematics