Abstract
Support vector machines (SVMs) form one of the most important classes of machine learning models and algorithms, and have been successfully applied in various fields. Nonlinear optimization plays a crucial role in SVM methodology, both in defining the machine learning models and in designing convergent and efficient algorithms for large-scale training problems. In this paper we present the convex programming problems underlying SVMs, focusing on supervised binary classification. We analyze the most important and widely used optimization methods for SVM training problems, and we discuss how the properties of these problems can be exploited in the design of useful algorithms.
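For orientation, a standard form of the convex program the abstract refers to is the soft-margin primal problem of binary SVM classification; the notation below (training pairs (x_i, y_i) with y_i in {-1, +1}, slack variables xi_i, penalty parameter C > 0) is ours, not taken from the paper:

    \min_{w,\,b,\,\xi}\; \frac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\xi_{i}
    \quad \text{s.t.} \quad y_{i}\left(w^{\top}x_{i} + b\right) \ge 1 - \xi_{i},
    \quad \xi_{i} \ge 0, \quad i = 1,\dots,n.

Most large-scale training algorithms operate instead on the Wolfe dual, a concave quadratic program in the multipliers alpha_i:

    \max_{\alpha}\; \sum_{i=1}^{n}\alpha_{i} - \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n} \alpha_{i}\alpha_{j}\, y_{i} y_{j}\, x_{i}^{\top}x_{j}
    \quad \text{s.t.} \quad \sum_{i=1}^{n}\alpha_{i} y_{i} = 0,
    \quad 0 \le \alpha_{i} \le C, \quad i = 1,\dots,n,

whose single equality constraint and box constraints are the structural properties that decomposition methods for SVM training exploit.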
Publisher
Springer Science and Business Media LLC
Subject
Management Science and Operations Research, General Decision Sciences
Cited by
13 articles.