Affiliation:
1. College of Business, University of Texas at San Antonio, USA
Abstract
Support vector machines (SVMs) are machine learning techniques formulated as quadratic programming models and represent a recent, revolutionary development in classification analysis. Primal and dual formulations of SVM models for both two-class and multi-class classification are discussed. Emphasis is placed on the dual formulations in high-dimensional feature spaces using inner-product kernels. Nonlinear classification (discriminant) functions in high-dimensional feature spaces can be constructed through inner-product kernels without actually mapping the data from the input space into those feature spaces. Furthermore, the size of the dual formulation is independent of the dimension of the input space and of the kernel used. Two illustrative examples, one for two-class and one for multi-class classification, demonstrate the formulations of these SVM models.
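The following minimal Python sketch (not code from the paper) illustrates the ideas above using scikit-learn's SVC, which solves the dual quadratic program and applies inner-product kernels, so a nonlinear decision function is obtained without explicitly mapping the data into the high-dimensional feature space. The dataset, kernel, and parameter values are illustrative assumptions.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Two-class case: restrict the data to the first two classes.
mask = y < 2
X2_tr, X2_te, y2_tr, y2_te = train_test_split(X[mask], y[mask], random_state=0)
two_class = SVC(kernel="rbf", C=1.0)   # dual QP with an RBF inner-product kernel
two_class.fit(X2_tr, y2_tr)
print("Two-class test accuracy:", two_class.score(X2_te, y2_te))

# Multi-class case: all three classes; the multi-class problem is handled by
# combining binary SVMs (a one-vs-one decomposition inside SVC).
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
multi_class = SVC(kernel="rbf", C=1.0)
multi_class.fit(X_tr, y_tr)
print("Multi-class test accuracy:", multi_class.score(X_te, y_te))

Note that the size of each dual problem solved here depends on the number of training points, not on the dimension of the (implicit) feature space induced by the RBF kernel.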
Cited by 2 articles.
1. Artificial Intelligence. In Handbook of Research on Manufacturing Process Modeling and Optimization Strategies (2017).
2. Mathematical Programming Models for Classification. In Encyclopedia of Business Analytics and Optimization (2014).