Affiliation:
1. School of Mathematics, Cardiff University, Cardiff CF24 4AG, UK
Abstract
In this work, we propose a sparse version of the Support Vector Regression (SVR) algorithm that uses regularization to achieve sparsity in function estimation. To this end, we use an adaptive L0 penalty that has a ridge structure and therefore introduces no additional computational complexity to the algorithm. We also consider an alternative approach based on a similar proposal in the Support Vector Machine (SVM) literature. Numerical studies demonstrate the effectiveness of our proposals. To the best of our knowledge, this is the first discussion of a sparse version of Support Vector Regression in the sense of variable selection rather than support vector selection.
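The key idea of an adaptive L0 penalty with a ridge structure can be sketched as an iteratively reweighted ridge problem: each feature j receives a ridge weight proportional to 1/(w_j^2 + eps), so small coefficients are driven toward zero while each iteration remains an ordinary ridge solve. The following is a minimal illustrative sketch on a linear regression surrogate, not the authors' SVR formulation; the function name, the parameter values, and the final thresholding step are assumptions for the example.

```python
import numpy as np

def adaptive_l0_ridge(X, y, lam=1.0, eps=1e-4, n_iter=20):
    """Sketch of an adaptive L0 penalty with ridge structure.

    Minimizes ||y - Xw||^2 + lam * sum_j w_j^2 / (w_prev_j^2 + eps),
    which approximates an L0 penalty while each step is a ridge solve,
    adding no complexity beyond repeated linear systems.
    """
    w = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary least-squares start
    for _ in range(n_iter):
        # Adaptive per-feature ridge weights: tiny coefficients get a
        # huge penalty, large coefficients an almost-constant one.
        D = np.diag(lam / (w**2 + eps))
        w = np.linalg.solve(X.T @ X + D, X.T @ y)
    w[np.abs(w) < np.sqrt(eps)] = 0.0          # zero out numerically dead weights
    return w
```

On data where only a few features are active, the reweighting typically recovers the sparse support in a handful of iterations, since each pass is dominated by the single linear solve.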
Subject
General Mathematics, Engineering (miscellaneous), Computer Science (miscellaneous)