Adjusted support vector machines based on a new loss function |
| |
Authors: | Shuchun Wang, Wei Jiang, Kwok-Leung Tsui |
| |
Affiliation: | 1. Golden Arc Capital, Inc., New York, USA; 2. Department of Systems Engineering & Engineering Management, Stevens Institute of Technology, Hoboken, USA; 3. School of Industrial & Systems Engineering, Georgia Institute of Technology, Atlanta, USA |
| |
Abstract: | The support vector machine (SVM) has attracted considerable attention recently due to its successful applications in various domains. However, by maximizing the margin of separation between the two classes in a binary classification problem, SVM solutions often suffer from two serious drawbacks. First, the SVM separating hyperplane is usually very sensitive to the training samples, since it depends strongly on the support vectors, which are only a few points located on the wrong side of the corresponding margin boundaries. Second, the separating hyperplane is equidistant from the two classes, which are treated as equally important when optimizing the hyperplane's location, regardless of the number of training samples and their dispersion in each class. In this paper, we propose a new SVM solution, the adjusted support vector machine (ASVM), based on a new loss function that adjusts the SVM solution to account for the sample sizes and dispersions of the two classes. Numerical experiments show that ASVM outperforms the conventional SVM, especially when the two classes differ greatly in sample size and dispersion. |
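To make the second drawback concrete, the sketch below contrasts the standard SVM hinge loss with a hypothetical class-weighted hinge that up-weights the smaller class. This is only an illustration of the general idea of adjusting the loss for sample-size imbalance; the paper's actual ASVM loss function is not given in the abstract, and the weighting scheme here (inverse class frequency) is an assumption chosen for simplicity.

```python
def hinge(y, f):
    """Standard SVM hinge loss max(0, 1 - y*f), with label y in {-1, +1}
    and decision value f. A point on the wrong side of its margin
    boundary (y*f < 1) contributes a positive loss."""
    return max(0.0, 1.0 - y * f)

def weighted_hinge(y, f, n_pos, n_neg):
    """Hypothetical class-weighted hinge (NOT the paper's ASVM loss):
    each class's loss is scaled inversely to its sample size, so
    misclassifying a minority-class point costs more and the separating
    hyperplane is pushed away from the smaller class."""
    w = (n_pos + n_neg) / (2.0 * (n_pos if y > 0 else n_neg))
    return w * hinge(y, f)

# A misclassified minority-class point (y = +1, f = -0.5) with
# 10 positive and 90 negative training samples:
print(hinge(+1, -0.5))                   # 1.5
print(weighted_hinge(+1, -0.5, 10, 90))  # 7.5  (weight = 100/20 = 5)
```

Under the plain hinge, the optimizer treats both classes symmetrically; under the weighted variant, the same margin violation by the 10-sample class is five times as costly, which shifts the learned hyperplane toward the larger class, in the spirit of the adjustment ASVM makes.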
| |
Keywords: | |
This article is indexed in SpringerLink and other databases.
|