Affiliation:
1. School of Information Science and Engineering, Yunnan University, Kunming 650500, China
Abstract
The sequential minimal optimization (SMO) method is an algorithm for solving the optimization problem that arises when training support vector machines (SVMs). Although SMO achieves high accuracy on this problem, its optimization accuracy can still be improved. Fractional-order calculus extends integer-order calculus and can describe real systems more precisely, yielding more accurate results. In this paper, the fractional-order sequential minimal optimization (FOSMO) method is proposed for classification, combining the SMO method with fractional-order calculus. First, the objective function is expressed as a fractional-order function under the FOSMO method, and the representation and meaning of its fractional-order terms are studied. Then the fractional derivatives of the Lagrange multipliers are obtained according to fractional-order calculus. Finally, the objective function is optimized based on the fractional-order Lagrange multipliers. Experiments on linear and nonlinear cases, covering both two-class and multi-class classification, show that the FOSMO method obtains better accuracy than the standard SMO method.
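The optimization step the abstract refers to can be illustrated with a minimal sketch. This is not the authors' FOSMO implementation: it shows (1) one classical SMO step that jointly updates a pair of Lagrange multipliers in the SVM dual problem (the standard Platt update, with conventional names such as `E1`, `E2`, `eta`, `C`), and (2) a fractional-derivative helper for power terms, since, per the abstract, FOSMO replaces the integer-order derivative in such an update with a fractional-order one. The exact fractional definition used in the paper is not specified here and is assumed to act term-by-term on power functions.

```python
from math import gamma

def frac_deriv_power(n, order, x):
    """Fractional derivative of x**n of the given order, evaluated at x,
    using the power-function rule
        D^a x^n = Gamma(n+1) / Gamma(n - a + 1) * x**(n - a),
    which reduces to the ordinary derivative when `order` is an integer."""
    return gamma(n + 1) / gamma(n - order + 1) * x ** (n - order)

def smo_pair_step(a1, a2, y1, y2, E1, E2, K11, K22, K12, C):
    """Classical SMO update of multipliers (a1, a2) with labels y1, y2 in
    {-1, +1}, prediction errors E1, E2, kernel entries Kij, and box
    constraint 0 <= a <= C, keeping y1*a1 + y2*a2 constant."""
    # Feasible interval [L, H] for the new a2 along the equality constraint.
    if y1 != y2:
        L, H = max(0.0, a2 - a1), min(C, C + a2 - a1)
    else:
        L, H = max(0.0, a1 + a2 - C), min(C, a1 + a2)
    eta = K11 + K22 - 2.0 * K12            # curvature along the constraint
    a2_new = a2 + y2 * (E1 - E2) / eta     # unconstrained optimum for a2
    a2_new = min(H, max(L, a2_new))        # clip to the box
    a1_new = a1 + y1 * y2 * (a2 - a2_new)  # restore the equality constraint
    return a1_new, a2_new
```

For example, `frac_deriv_power(2, 1.0, 3.0)` returns 6.0, matching the ordinary derivative of x² at x = 3, while a non-integer `order` interpolates between integer-order derivatives.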
Funder
National Natural Science Foundation of China
Subject
Statistics and Probability, Statistical and Nonlinear Physics, Analysis
Cited by 3 articles.