Similar Literature
20 similar documents found.
1.
Multicategory Classification by Support Vector Machines   (cited 8 times: 0 self, 8 other)
We examine the problem of discriminating between objects of three or more classes. Specifically, we investigate how two-class discrimination methods can be extended to the multiclass case. We show how the linear programming (LP) approaches based on the work of Mangasarian and the quadratic programming (QP) approaches based on Vapnik's Support Vector Machine (SVM) can be combined to yield two new approaches to the multiclass problem. In LP multiclass discrimination, a single linear program is used to construct a piecewise-linear classification function. In our proposed multiclass SVM method, a single quadratic program is used to construct a piecewise-nonlinear classification function. Each piece of this function can take the form of a polynomial, a radial basis function, or even a neural network. For k-class problems with k > 2, the SVM method as originally proposed requires constructing a two-class SVM to separate each class from the remaining classes. Similarly, k two-class linear programs can be used for the multiclass problem. We performed an empirical study of the original LP method, the proposed k-LP method, the proposed single-QP method and the original k-QP method. We discuss the advantages and disadvantages of each approach.
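The k-classifier scheme described above (one two-class machine separating each class from the rest) can be sketched in a few lines. This is a hedged illustration, not the paper's QP formulation: each binary machine is trained by subgradient descent on the regularized hinge loss (a linear-SVM surrogate) rather than by solving a quadratic program, and the blob data, function names and hyperparameters are invented for the demo.

```python
import numpy as np

def train_binary_svm(X, y, lam=0.01, lr=0.1, epochs=300):
    """Subgradient descent on the regularized hinge loss; y must be in {-1, +1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        active = y * (X @ w + b) < 1              # margin violators
        gw = lam * w - (y[active][:, None] * X[active]).sum(axis=0) / n
        gb = -y[active].sum() / n
        w -= lr * gw
        b -= lr * gb
    return w, b

def train_one_vs_rest(X, y):
    """One two-class machine per class, as in the k-classifier scheme."""
    return {c: train_binary_svm(X, np.where(y == c, 1.0, -1.0)) for c in np.unique(y)}

def predict(models, X):
    labels = np.array(list(models))
    scores = np.column_stack([X @ w + b for w, b in models.values()])
    return labels[scores.argmax(axis=1)]          # most confident machine wins

# Demo on three well-separated Gaussian blobs.
rng = np.random.default_rng(0)
centers = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
X = np.vstack([c + 0.4 * rng.standard_normal((40, 2)) for c in centers])
y = np.repeat([0, 1, 2], 40)
acc = (predict(train_one_vs_rest(X, y), X) == y).mean()
```

Ties between the k machines are resolved by taking the largest decision value, which is the usual convention for one-against-rest schemes.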

2.
Wear Particle Recognition Based on Support Vector Machines   (cited 1 time: 0 self, 1 other)
Owing to the limitations of neural networks, support vector machines were proposed and developed at the end of the last century. They have broad application prospects in pattern recognition and have grown from the original binary classification to today's multiclass classification. Following the latest developments in support vector machines, this paper applies the least-squares support vector machine to wear particle recognition and obtains good results.

3.
This paper introduces the support vector classification machine and builds an SVM credit-card classification model with the KMOD kernel function, which has better discrimination ability. Numerical experiments on Australian and German credit-card data show that the model outperforms the RBF-based SVM model in classification accuracy and in the number of support vectors.

4.
The standard support vector machine (SVM) has weak resistance to noise: noisy samples or outliers in the training set distort the optimal separating surface and ultimately bias the classification results. To address this problem, a weighted support vector machine (WSVM) based on the minimum enclosing ball is proposed, which assigns each sample point a different weight so as to reduce the influence of noise and outliers on the classification result. Cross-validation was carried out on well-log data from the three wells oilsk81, oilsk83 and oilsk85 in a block of the Jianghan oil field, using linear, exponential and RBF kernel functions. The tests show that the RBF kernel gives the highest recognition rate for both SVM and WSVM, that the proposed WSVM is insensitive to the choice of kernel and recognizes stably, and that its recognition rate reaches 100% in cross-validation.
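A minimal sketch of the weighting idea follows. Assumptions flagged: the weights here come from distance to the class centroid, a rough stand-in for the minimum-enclosing-ball construction in the paper; the training loop is a hinge-loss subgradient method rather than a QP solver; and the data and names are invented.

```python
import numpy as np

def sample_weights(X, y):
    """Down-weight samples far from their class centroid, so suspected
    noise/outliers contribute less to the separating surface."""
    s = np.empty(len(y))
    for c in np.unique(y):
        idx = y == c
        d = np.linalg.norm(X[idx] - X[idx].mean(axis=0), axis=1)
        s[idx] = 1.0 / (1.0 + d / (d.mean() + 1e-12))
    return s

def train_weighted_svm(X, y, s, lam=0.01, lr=0.1, epochs=300):
    """Hinge-loss training where sample i's loss is scaled by weight s[i]."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        active = y * (X @ w + b) < 1
        sy = s * y
        gw = lam * w - (sy[active][:, None] * X[active]).sum(axis=0) / n
        gb = -sy[active].sum() / n
        w -= lr * gw
        b -= lr * gb
    return w, b

# Two clean blobs plus one mislabeled outlier sitting inside the +1 class.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal((2, 2), 0.4, (40, 2)),
               rng.normal((-2, -2), 0.4, (40, 2)),
               [[2.5, 2.5]]])
y = np.concatenate([np.ones(40), -np.ones(40), [-1.0]])
s = sample_weights(X, y)
w, b = train_weighted_svm(X, y, s)
acc_clean = (np.sign(X[:80] @ w + b) == y[:80]).mean()
```

The mislabeled point ends up with a much smaller weight than the clean samples, so it barely shifts the separating surface.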

5.
The performance of kernel-based methods such as the support vector machine (SVM) is greatly affected by the choice of kernel function. Multiple kernel learning (MKL) is a promising family of machine learning algorithms that has attracted much attention in recent years. MKL combines multiple sub-kernels to seek better results than single kernel learning. To improve the efficiency of SVM and MKL, in this paper the Kullback–Leibler kernel function is derived to develop the SVM. The proposed method employs an improved ensemble learning framework, named KLMKB, which applies AdaBoost to learn a multiple kernel-based classifier. In the experiment on hyperspectral remote sensing image classification, features selected through the Optimum Index Factor (OIF) are used to classify the satellite image. We extensively examine the performance of our approach in comparison with relevant state-of-the-art algorithms on a number of benchmark classification data sets and a hyperspectral remote sensing image data set. Experimental results show that our method behaves stably and achieves noticeable accuracy across different data sets.

6.
Optimal kernel selection in twin support vector machines   (cited 2 times: 0 self, 2 other)
In twin support vector machines (TWSVMs), a pair of non-parallel planes is determined by solving two related SVM-type problems, each of which is smaller than the problem in a conventional SVM. However, as with other classification methods, the performance of the TWSVM classifier depends on the choice of kernel. In this paper we treat kernel selection for the TWSVM as an optimization problem over convex combinations of finitely many basic kernels, and formulate it as an iterative alternating optimization problem. The efficacy of the proposed classification algorithm is demonstrated on several UCI machine learning benchmark datasets.
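For intuition, the two smaller SVM-type problems of the TWSVM can be illustrated with its least-squares variant, where each problem collapses to a small linear system. This is a hedged sketch with a linear kernel only; `c1`, `c2` and the toy data are invented, and it is not the kernel-selection procedure the paper proposes.

```python
import numpy as np

def lstwsvm_fit(A, B, c1=1.0, c2=1.0):
    """Least-squares twin SVM, linear kernel: plane 1 hugs class A while
    pushing class B to the -1 side; plane 2 is the mirror image."""
    e1, e2 = np.ones((len(A), 1)), np.ones((len(B), 1))
    G, H = np.hstack([A, e1]), np.hstack([B, e2])
    # Stationarity of (1/2)||G u||^2 + (c1/2)||H u + e2||^2, and symmetrically.
    u1 = -c1 * np.linalg.solve(G.T @ G + c1 * (H.T @ H), H.T @ e2)
    u2 = c2 * np.linalg.solve(H.T @ H + c2 * (G.T @ G), G.T @ e1)
    return u1.ravel(), u2.ravel()

def lstwsvm_predict(u1, u2, X):
    """Assign each point to the class of its nearer hyperplane."""
    d1 = np.abs(X @ u1[:-1] + u1[-1]) / np.linalg.norm(u1[:-1])
    d2 = np.abs(X @ u2[:-1] + u2[-1]) / np.linalg.norm(u2[:-1])
    return np.where(d1 <= d2, 1, -1)

# Demo: two Gaussian blobs.
rng = np.random.default_rng(2)
A = rng.normal((2, 2), 0.3, (40, 2))
B = rng.normal((-2, -2), 0.3, (40, 2))
u1, u2 = lstwsvm_fit(A, B)
X = np.vstack([A, B])
y = np.concatenate([np.ones(40), -np.ones(40)])
acc = (lstwsvm_predict(u1, u2, X) == y).mean()
```

Each linear system is only (d+1)x(d+1), which is why the twin formulation's subproblems are cheaper than one conventional SVM over all the data.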

7.
A Gaussian-RBF-kernel support vector machine is used to predict the spread series given by the cointegration relation between the dominant and sub-dominant cotton commodity futures contracts; the optimal SVM parameters and suitable open/close-position thresholds are determined, and intra-commodity calendar-spread arbitrage is carried out. Compared with a polynomial-kernel SVM, the Gaussian-RBF-kernel SVM achieves a clearly higher arbitrage return at every open/close threshold.

8.
§ 1 Introduction. If you search for the word "SVM" in the SCI index tool on the Internet, you immediately get thousands of records. This shows its great effect on our world. SVMs, namely support vector machines, have been successfully applied to a number of applications ranging from particle identification and text categorization to engine knock detection, bioinformatics and database marketing [1–6]. The approach is systematic and properly motivated by statistical learning theory [7]. …

9.
Nonparallel support vector machines are an extension of support vector machines and have received wide attention. Their construction allows non-parallel supporting hyperplanes, which can describe differences in the data distributions of different classes, so they apply to a wider range of problems. However, the relationship between nonparallel SVM models and the standard SVM model has been little studied, and no nonparallel SVM model equivalent to the standard SVM has been available. Starting from the SVM, this paper constructs a new nonparallel SVM model which not only degenerates to the standard SVM, retaining the SVM's sparsity and kernel extensibility, but also describes the distribution differences between classes, making it suitable for more general nonparallel data. Experiments give preliminary verification of the proposed model's effectiveness.

10.
A Clifford support vector machine (CSVM) learns the decision surface for multiple distinct classes of multivector input points using Clifford geometric algebra. In many applications, an input point may not be fully assigned to one of these classes. In this paper, we attach a fuzzy membership to each input point and reformulate the CSVM for multiclass classification so that different input points make different contributions to learning the decision surface. We call the proposed method the Clifford fuzzy SVM.

11.
In this work, we create a quality map of a slate deposit, using the results of an investigation based on surface geology and continuous core borehole sampling. Once the quality of the slate and the location of the sampling points have been defined, different kinds of support vector machines (SVMs)—SVM classification (multiclass one-against-all), ordinal SVM and SVM regression—are used to draw up the quality map. The results are also compared with those for kriging.

12.
Credit classification is an important step in credit risk management; its main purpose is to separate creditworthy applicants from defaulting applicants on the basis of the information they provide, so as to give credit decision makers a basis for their decisions. To distinguish different credit customers correctly, especially defaulting customers, a least-squares fuzzy support vector machine model with a variable penalty factor, based on kernel principal component analysis (KPCA), is constructed for classifying credit data. In this model, the sample data are first preprocessed; KPCA is then used to reduce the dimensionality of the data in a nonlinear way; finally, the least-squares fuzzy SVM with a variable penalty factor classifies the dimension-reduced data. For validation, two public credit data sets are analyzed empirically. The results show that the model achieves good classification results and can provide an important reference for credit decision makers.

13.
To carry out robot navigation tasks more accurately, a method is proposed that uses an improved particle swarm optimization (PSO) algorithm to tune the parameters of a support vector machine. First, principal component analysis is used to reduce the dimensionality of the data; then the improved PSO algorithm optimizes the SVM penalty parameter c and kernel parameter g; finally the optimized parameters are used in the SVM, which classifies and recognizes the robot's navigation tasks. Compared with other algorithms, the support vector machine tuned by the improved PSO clearly achieves good results. This kind of recognition and classification can support robot navigation well and has great application value for future robot research.
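The parameter-search step can be sketched with a bare-bones particle swarm optimizer. This is a standard PSO, not the paper's improved variant; the quadratic toy objective below merely stands in for the cross-validation error over the SVM's (c, g), and all names and constants are invented.

```python
import numpy as np

def pso(f, lo, hi, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm minimizer over the box [lo, hi]."""
    rng = np.random.default_rng(seed)
    dim = len(lo)
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros_like(x)                          # velocities
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()            # swarm-best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Toy objective standing in for CV error over (c, g); optimum at (3, 0.5).
best, best_f = pso(lambda p: (p[0] - 3.0) ** 2 + (p[1] - 0.5) ** 2,
                   lo=np.array([0.1, 0.01]), hi=np.array([10.0, 2.0]))
```

In the real pipeline, `f` would train an SVM with parameters `p` and return its cross-validation error instead of this analytic surrogate.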

14.
The goal of classification (or pattern recognition) is to construct a classifier with small misclassification error. The notions of consistency and universal consistency are important in the construction of classification rules. A consistent rule guarantees that taking more samples essentially suffices to roughly reconstruct the unknown distribution. The support vector machine (SVM) algorithm is one of the most important rules in two-category classification. How to effectively extend the SVM to multicategory classification is still an ongoing research issue. Different versions of multicategory support vector machines (MSVMs) have been proposed and used in practice. We study the one designed by Lee, Lin and Wahba with the hinge loss functional. The consistency of MSVMs is established under a mild condition. As a corollary, universal consistency holds if the reproducing kernel Hilbert space is dense in the C norm. In addition, an example is given to demonstrate the main results. Dedicated to Charlie Micchelli on the occasion of his 60th birthday. Supported in part by NSF of China under Grants 10571010 and 10171007.

15.
The extreme learning machine (ELM) is not only an effective classifier in supervised learning but can also be applied to unsupervised and semi-supervised learning. The model structures of the unsupervised extreme learning machine (US-ELM) and the semi-supervised extreme learning machine (SS-ELM) are the same as that of the ELM; the difference between them lies in the cost function. We introduce a kernel function into US-ELM and propose the unsupervised extreme learning machine with kernel (US-KELM); SS-KELM is proposed analogously. Wavelet analysis has the characteristics of multivariate interpolation and sparse approximation, and wavelet kernel functions have been widely used in support vector machines. Therefore, to combine the wavelet kernel function with US-ELM and SS-ELM, the unsupervised extreme learning machine with wavelet kernel function (US-WKELM) and the semi-supervised extreme learning machine with wavelet kernel function (SS-WKELM) are proposed in this paper. The experimental results show the feasibility and validity of US-WKELM and SS-WKELM in clustering and classification.
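As a concrete example of a wavelet kernel, the Morlet-type translation-invariant kernel often paired with SVMs is K(x, z) = prod_i cos(1.75 u_i) exp(-u_i^2/2) with u = (x - z)/a. That this is the exact kernel used in the paper is an assumption; the sketch below only computes the Gram matrix and checks basic kernel properties.

```python
import numpy as np

def wavelet_kernel(X, Z, a=1.0):
    """Gram matrix of the Morlet-type wavelet kernel
    K(x, z) = prod_i cos(1.75 u_i) * exp(-u_i**2 / 2), u = (x - z) / a."""
    U = (X[:, None, :] - Z[None, :, :]) / a       # pairwise differences
    return np.prod(np.cos(1.75 * U) * np.exp(-0.5 * U ** 2), axis=2)

rng = np.random.default_rng(3)
X = rng.standard_normal((25, 4))
K = wavelet_kernel(X, X)
```

Because the per-coordinate factor is an even function with value 1 at 0, the Gram matrix is symmetric with a unit diagonal, as a translation-invariant kernel should be.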

16.
Unsupervised classification is a highly important task in machine learning. Although it has achieved great success in supervised classification, the support vector machine (SVM) is much less used to classify unlabeled data points, and SVM-based clustering has many drawbacks, including sensitivity to nonlinear kernels and random initializations, high computational cost, and unsuitability for imbalanced datasets. In this paper, to exploit the advantages of the SVM and overcome the drawbacks of SVM-based clustering methods, we propose a completely new two-stage unsupervised classification method that needs no initialization: a new unsupervised kernel-free quadratic surface SVM (QSSVM) model is proposed to avoid selecting kernels and related kernel parameters, and a golden-section algorithm is designed to generate an appropriate classifier for balanced and imbalanced data. By studying certain properties of the proposed model, a convergent decomposition algorithm is developed to implement this non-convex QSSVM model effectively and efficiently (in terms of computational cost). Numerical tests on artificial and public benchmark data indicate that the proposed unsupervised QSSVM method outperforms well-known clustering methods (including SVM-based and other state-of-the-art methods), particularly in classification accuracy. Moreover, we extend and apply the proposed method to credit risk assessment by incorporating T-test-based feature weights. Promising numerical results on benchmark personal credit data and real-world corporate credit data strongly demonstrate the effectiveness, efficiency and interpretability of the proposed method, and indicate its significant potential in real-world applications.

17.
We improve the twin support vector machine (TWSVM) into a novel nonparallel-hyperplane classifier, termed ITSVM (improved twin support vector machine), for binary classification. By introducing different Lagrangian functions for the primal problems in the TWSVM, we obtain an improved dual formulation of the TWSVM; the resulting ITSVM algorithm overcomes the common drawbacks of TWSVMs and inherits the essence of the standard SVM. First, ITSVM does not need to compute large inverse matrices before training, which is inevitable for TWSVMs. Second, unlike TWSVMs, the kernel trick can be applied directly to ITSVM in the nonlinear case, so nonlinear ITSVM is theoretically superior to nonlinear TWSVM. Third, ITSVM can be solved efficiently by the successive overrelaxation (SOR) technique or the sequential minimal optimization (SMO) method, which makes it more suitable for large-scale problems. We also prove that the standard SVM is a special case of ITSVM. Experimental results show the efficiency of our method in both computation time and classification accuracy.

18.
Support vector machine (SVM) training may be posed as a large quadratic program (QP) with bound constraints and a single linear equality constraint. We propose a (block) coordinate gradient descent method for solving this problem and, more generally, linearly constrained smooth optimization. Our method is closely related to decomposition methods currently popular for SVM training. We establish global convergence and, under a local error bound assumption (which is satisfied by the SVM QP), a linear rate of convergence when the coordinate block is chosen by a Gauss-Southwell-type rule to ensure sufficient descent. We show that, for the SVM QP with n variables, this rule can be implemented in O(n) operations using Rockafellar's notion of conformal realization. Thus, for SVM training, our method requires only O(n) operations per iteration and, in contrast to existing decomposition methods, achieves linear convergence without additional assumptions. We report our numerical experience with the method on some large SVM QPs arising from two-class data classification. Our experience suggests that the method can be efficient for SVM training with nonlinear kernels.
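The smallest coordinate block that can keep the equality constraint sum(alpha_i y_i) = 0 feasible has two variables; the classic two-variable decomposition (SMO) below illustrates this structure, though it uses random working-set selection rather than the Gauss-Southwell rule analyzed in the paper. A hedged sketch: data, constants and stopping rule are invented.

```python
import numpy as np

def smo_train(X, y, C=1.0, tol=1e-4, passes=30):
    """Minimal SMO-style solver for the SVM dual QP: each step optimizes a
    two-variable block while preserving sum(alpha * y) = 0 and 0 <= alpha <= C."""
    n = len(y)
    K = X @ X.T                                  # linear-kernel Gram matrix
    alpha, b = np.zeros(n), 0.0
    rng = np.random.default_rng(0)
    for _ in range(passes):
        for i in range(n):
            Ei = (alpha * y) @ K[:, i] + b - y[i]
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = rng.integers(n - 1)
                j += j >= i                      # random j != i
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                eta = K[i, i] + K[j, j] - 2 * K[i, j]
                if L == H or eta <= 0:
                    continue
                alpha[j] = np.clip(aj_old + y[j] * (Ei - Ej) / eta, L, H)
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # Keep b consistent with the KKT conditions at i and j.
                b1 = b - Ei - y[i]*(alpha[i]-ai_old)*K[i, i] - y[j]*(alpha[j]-aj_old)*K[i, j]
                b2 = b - Ej - y[i]*(alpha[i]-ai_old)*K[i, j] - y[j]*(alpha[j]-aj_old)*K[j, j]
                b = b1 if 0 < alpha[i] < C else (b2 if 0 < alpha[j] < C else (b1 + b2) / 2)
    w = (alpha * y) @ X
    return w, b

# Demo on two separable blobs.
rng_data = np.random.default_rng(4)
X = np.vstack([rng_data.normal((2, 2), 0.5, (30, 2)),
               rng_data.normal((-2, -2), 0.5, (30, 2))])
y = np.concatenate([np.ones(30), -np.ones(30)])
w, b = smo_train(X, y)
acc = (np.sign(X @ w + b) == y).mean()
```

The clipping interval [L, H] is exactly the feasible segment the equality and box constraints leave for the pair, which is why two variables is the minimum block size.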

19.
In this paper, we consider a scale adjusted-type distance-based classifier for high-dimensional data. We first give such a classifier that can ensure high accuracy in misclassification rates for two-class classification. We show that the classifier is not only consistent but also asymptotically normal for high-dimensional data. We provide sample size determination so that misclassification rates are no more than a prespecified value. We propose a classification procedure called the misclassification rate adjusted classifier. We further develop the classifier to multiclass classification. We show that the classifier can still enjoy asymptotic properties and ensure high accuracy in misclassification rates for multiclass classification. Finally, we demonstrate the proposed classifier in actual data analyses by using a microarray data set.
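A minimal distance-based classifier in this spirit compares squared Euclidean distances to class centroids with a simple bias correction. This is a generic sketch, not the paper's scale-adjusted procedure: the correction term tr(S_c)/n_c is one standard choice assumed here, and the high-dimensional toy data are invented.

```python
import numpy as np

def fit_centroids(X, y):
    stats = {}
    for c in np.unique(y):
        Xc = X[y == c]
        # The trace of the sample covariance de-biases ||x - mean||^2,
        # which matters when dimension is large relative to sample size.
        stats[c] = (Xc.mean(axis=0), Xc.var(axis=0, ddof=1).sum(), len(Xc))
    return stats

def predict_centroids(stats, X):
    labels = np.array(list(stats))
    scores = np.column_stack([((X - m) ** 2).sum(axis=1) - t / n
                              for m, t, n in stats.values()])
    return labels[scores.argmin(axis=1)]          # smallest corrected distance

# High-dimension, low-sample-size demo: d = 200, 60 samples per class.
rng = np.random.default_rng(5)
d, n = 200, 60
mu = np.zeros(d)
mu[:25] = 1.0                                     # classes differ in 25 coordinates
X = np.vstack([rng.standard_normal((n, d)),
               mu + rng.standard_normal((n, d))])
y = np.repeat([0, 1], n)
acc = (predict_centroids(fit_centroids(X, y), X) == y).mean()
```

Without the -t/n correction the centroid a point was averaged into looks artificially close, a bias that grows with the dimension.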

20.
In this research, a robust optimization approach to multiclass support vector machines (SVMs) is investigated. Two new kernel-based methods are developed to address data with input uncertainty, where each data point lies inside a sphere of uncertainty. The models are called robust SVM (Robust-SVM) and the robust feasibility approach model (Robust-FA), respectively. The two models are compared in terms of robustness and generalization error, and are compared with the robust minimax probability machine (MPM) in terms of generalization behavior on several data sets. It is shown that Robust-SVM performs better than the robust MPM.

