Linear feature-weighted support vector machine

Authors: Hong-jie Xing, Ming-hu Ha, Bao-gang Hu, Da-zeng Tian

Affiliations:
1. College of Mathematics and Computer Science, Hebei University, Baoding, Hebei, 071002, P.R. China
2. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing, 100091, P.R. China
3. College of Physics Science and Technology, Hebei University, Baoding, Hebei, 071002, P.R. China

Abstract: Existing support vector machines (SVMs) assume that all features of the training samples contribute equally to the construction of the optimal separating hyperplane. However, in a given real-world data set, some features may be more relevant to the classification information than others. In this paper, the linear feature-weighted support vector machine (LFWSVM) is proposed to address this problem. The proposed model is constructed in two phases. First, a mutual information (MI) based approach is used to assign an appropriate weight to each feature of the given data set. Second, the model is trained on samples whose features have been weighted by the obtained feature weight vector. Through detailed theoretical derivation, the feature weights are embedded in the quadratic programming problem to obtain the dual solution to the original optimization problem. Although computing the feature weights adds an extra computational cost, the proposed model generally exhibits better generalization performance than the traditional SVM with a linear kernel. Experimental results on one synthetic data set and several benchmark data sets confirm the benefits of the proposed method. Moreover, the experiments also show that the proposed MI based approach to determining feature weights is superior to two other commonly used methods.
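The two-phase procedure described in the abstract can be approximated as follows. This is an illustrative sketch using scikit-learn, not the paper's own implementation: the paper embeds the weights in the quadratic program itself, whereas here each feature is simply rescaled by its MI-based weight before training a standard linear SVM, which has the same effect on the inner products a linear kernel computes. All function and variable names are illustrative.

```python
# Sketch of a feature-weighted linear SVM: (1) derive per-feature weights
# from mutual information with the class label, (2) train a linear SVM on
# the weighted features.  Assumes scikit-learn; not the paper's exact QP.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic data: 2 informative features plus 4 pure-noise features.
X, y = make_classification(n_samples=400, n_features=6, n_informative=2,
                           n_redundant=0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5,
                                          random_state=0)

# Phase 1: mutual-information-based feature weights, normalized to sum to 1.
mi = mutual_info_classif(X_tr, y_tr, random_state=0)
d = mi / mi.sum()

# Phase 2: train a linear SVM on the feature-weighted samples.
clf = SVC(kernel="linear").fit(X_tr * d, y_tr)
acc_weighted = clf.score(X_te * d, y_te)

# Baseline for comparison: ordinary linear SVM on unweighted features.
acc_plain = SVC(kernel="linear").fit(X_tr, y_tr).score(X_te, y_te)
print(f"weighted={acc_weighted:.3f}  plain={acc_plain:.3f}")
```

On data like this, where most features are noise, down-weighting low-MI features typically helps the linear SVM; the paper reports the analogous comparison on synthetic and benchmark data sets.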
Keywords:

Indexed in SpringerLink and other databases.