Similar Literature
Found 20 similar documents (search time: 218 ms)
1.
The hybrid FCM/PCM model overcomes the drawbacks of FCM and PCM used separately and considerably improves clustering quality, but it still performs poorly on samples with indistinct features. To overcome this weakness, this paper introduces a Mercer kernel and proposes a new kernel-based hybrid c-means clustering model (KIPCM), in which the kernel function makes data points that are inseparable in the original space separable in the kernel space. Numerical experiments yield reasonable cluster centers and a high correct-classification rate, confirming the feasibility and effectiveness of the proposed algorithm.
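The abstract does not give KIPCM's update equations, so the following is only a minimal sketch of a kernelized fuzzy c-means in the spirit it describes: an RBF kernel replaces the Euclidean distance via d²(x, v) = 2(1 − K(x, v)). The function names, parameter values, and initialization scheme are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def rbf(X, V, gamma):
    # K[i, k] = exp(-gamma * ||x_k - v_i||^2), shape (c, n)
    return np.exp(-gamma * ((X[None, :, :] - V[:, None, :]) ** 2).sum(-1))

def kernel_fcm(X, V0, m=2.0, gamma=0.5, n_iter=50):
    """Kernelized fuzzy c-means with input-space prototypes.
    Distances are kernel-induced: d^2(x, v) = 2 * (1 - K(x, v))."""
    V = V0.astype(float).copy()
    for _ in range(n_iter):
        K = rbf(X, V, gamma)                       # (c, n) kernel values
        d2 = np.maximum(2.0 * (1.0 - K), 1e-12)    # kernel-induced distances
        inv = d2 ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=0)                  # fuzzy memberships, columns sum to 1
        W = (U ** m) * K
        V = W @ X / W.sum(axis=1, keepdims=True)   # prototype update
    return U, V
```

With two well-separated blobs and prototypes seeded near each blob, the memberships recover the grouping.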

2.
Urban air temperature is an important indicator for evaluating urban climate characteristics. This paper proposes a kernel probabilistic clustering algorithm and applies it to pattern classification of urban temperatures in order to identify commonalities in urban development. The algorithm introduces kernel learning into probabilistic clustering, handles noise and outliers well, and achieves more accurate clustering. Experimental results show that, compared with related clustering algorithms, the kernel probabilistic clustering algorithm clusters better and converges quickly.

3.
Intelligent forecasting methods for regional economic development   Cited by: 2 (self-citations: 0, citations by others: 2)
肖健华 (Xiao Jianhua), 《经济数学》 (Mathematics in Economics), 2005, 22(1): 57-63
This paper analyzes the various factors affecting regional economic development and points out that, because these factors constrain and influence one another, traditional forecasting methods are increasingly inadequate for predicting regional economic development. It discusses the advantages of kernel methods in handling nonlinear, uncertain, and imprecise data, builds three kernel-based economic forecasting models, and combines them with two other forecasting methods for combined forecasting of regional economic development. Finally, a data fusion method integrates the predictions of the individual models into the final output. Practical results show that the kernel-based combined forecasting technique achieves fairly good prediction performance.
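The abstract does not specify the fusion rule used to integrate the individual models' predictions, so the sketch below uses one standard choice, inverse-historical-error weighting; the function name and weighting scheme are assumptions for illustration only.

```python
import numpy as np

def combine_forecasts(preds, errors):
    """Fuse individual model predictions into one output by weighting each
    model inversely to its historical error (a simple, common fusion rule)."""
    w = 1.0 / np.asarray(errors, dtype=float)  # smaller past error => larger weight
    w /= w.sum()                               # normalize weights to sum to 1
    return float(w @ np.asarray(preds, dtype=float))
```

Two models with equal past error contribute equally; a model with three times the error gets a quarter of the weight.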

4.
This paper proposes a new kernel estimator and bandwidth-selection method for nonparametric regression curves, together with bias-corrected confidence bands. The bands are constructed using only the estimator of the regression curve and the data-driven bandwidth. It is proved that, in the large-sample sense, the bias-corrected bands and Bonferroni-type bands have asymptotically correct coverage probability. Small-sample behavior is studied through Monte Carlo experiments. The simulation study shows that the bias-corrected confidence band method is very effective: even with sample size n = 100 it comes close to the nominal coverage probability.

5.
This paper studies the problem of obtaining transitive kernels of fuzzy similarity relations. First, some basic properties of transitive kernels are given. Then, using these properties, three algorithms are constructed to obtain fuzzy equivalence relations that may serve as transitive kernels. Finally, the three algorithms are compared experimentally and their ability to obtain transitive kernels is analyzed.
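The paper's three algorithms are not given in the abstract. As context, here is a sketch of the standard max–min transitive *closure* of a fuzzy relation (the smallest transitive relation containing R); the transitive kernel studied in the paper is the dual, contained-in-R notion, for which this closure is the usual starting point.

```python
import numpy as np

def maxmin_compose(R, S):
    # (R o S)[i, j] = max_k min(R[i, k], S[k, j])
    return np.max(np.minimum(R[:, :, None], S[None, :, :]), axis=1)

def transitive_closure(R):
    """Iterate T <- T v (T o T) to a fixpoint; the result satisfies
    T >= T o T elementwise, i.e. max-min transitivity."""
    T = R.copy()
    while True:
        T2 = np.maximum(T, maxmin_compose(T, T))
        if np.array_equal(T2, T):
            return T
        T = T2
```

For a reflexive symmetric fuzzy relation, the closure is a fuzzy equivalence relation.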

6.
This paper gives classifications of some special p-kernel p-groups (p核p群) and presents examples of irregular p-kernel p-groups.

7.
Building on an analysis of the kernel method, a new approach to nonlinear data processing, this paper studies the principle of multi-class classification based on one-class classification methods and proposes a confidence function for multi-class classification, making the classification results more credible. Finally, taking a company's supplier-relationship survey data as an example, the method is applied to business-relationship analysis; the results demonstrate its effectiveness and provide a new approach to nonlinear data classification.

8.
Based on the kernel approximation principle of Smoothed Particle Hydrodynamics (SPH), this paper uses a Taylor series expansion to correct the previously proposed FODF-SPH (First Order Derivative Free) method for computing function derivatives, and derives corrected formulas for both univariate and multivariate functions. Derivatives of univariate functions and partial derivatives of multivariate functions are computed with different particle spacings and smoothing lengths, and the errors of the corrected formulas are compared with those of the FODF-SPH method. The results show that the proposed correction substantially improves accuracy, reduces error, and accelerates convergence.
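The FODF correction formulas themselves are not reproduced in the abstract. For context, the sketch below shows the standard 1-D SPH derivative approximation, f'(x_i) ≈ Σ_j Δx (f_j − f_i) ∂W/∂x_i, with the cubic spline kernel; the spacing, smoothing length, and function names are illustrative.

```python
import numpy as np

def cubic_spline_dwdx(x, h):
    """Spatial derivative of the 1-D cubic spline SPH kernel (sigma = 2/(3h))."""
    sigma = 2.0 / (3.0 * h)
    q = np.abs(x) / h
    dwdq = np.where(q <= 1.0, -3.0 * q + 2.25 * q**2,
           np.where(q <= 2.0, -0.75 * (2.0 - q)**2, 0.0))
    return sigma * dwdq * np.sign(x) / h

def sph_derivative(xs, fs, i, h):
    """Standard SPH estimate of f'(xs[i]) on uniformly spaced particles."""
    dx = xs[1] - xs[0]  # uniform particle spacing assumed (volume element)
    return float(np.sum(dx * (fs - fs[i]) * cubic_spline_dwdx(xs[i] - xs, h)))
```

On a uniform grid with h = 2Δx this form reproduces the derivative of linear functions exactly and is second-order accurate for smooth functions at interior particles.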

9.
In support vector machine (SVM) predictive modeling, the kernel function maps a nonlinear problem in the low-dimensional feature space to a linear problem in a high-dimensional feature space, and the kernel's characteristics strongly affect both learning and prediction. Considering the fitting and generalization properties of two typical kernels, the global polynomial kernel and the local RBF kernel, an SVM based on a mixed kernel is adopted for predictive modeling. To evaluate different kernels and obtain better prediction performance, a genetic algorithm adaptively evolves the SVM parameters, and the model is applied to the practical problem of equipment cost prediction. Computation shows that the mixed-kernel SVM predicts better than single-kernel SVMs and can serve as an effective predictive modeling method in equipment management.
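A minimal sketch of the mixed-kernel idea, K_mix = λ·poly + (1 − λ)·RBF. To stay self-contained it uses kernel ridge regression rather than an SVM, and fixed parameters rather than the paper's genetic-algorithm tuning; λ, degree, and γ below are illustrative assumptions.

```python
import numpy as np

def mixed_kernel(A, B, lam=0.5, degree=2, gamma=1.0):
    """Convex combination of a global polynomial kernel and a local RBF kernel."""
    poly = (A @ B.T + 1.0) ** degree
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    rbf = np.exp(-gamma * sq)
    return lam * poly + (1.0 - lam) * rbf

def kernel_ridge_fit_predict(Xtr, ytr, Xte, alpha=1e-6, **kw):
    """Fit kernel ridge regression with the mixed kernel and predict at Xte."""
    K = mixed_kernel(Xtr, Xtr, **kw)
    coef = np.linalg.solve(K + alpha * np.eye(len(Xtr)), ytr)
    return mixed_kernel(Xte, Xtr, **kw) @ coef
```

The polynomial component captures the global trend while the RBF component fits local structure; on a smooth target like y = x² the mixed kernel interpolates accurately.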

10.
This paper introduces support vector classification machines and builds an SVM credit-card classification model using the KMOD kernel, which has better discrimination ability. Numerical experiments on the Australian and German credit-card datasets show that the model outperforms the RBF-based SVM model in classification accuracy and number of support vectors.
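The KMOD kernel is commonly written as K(x, y) = a·(exp(γ/(‖x − y‖² + σ²)) − 1), with a chosen so that K(x, x) = 1; the sketch below computes its Gram matrix under that form (the exact parametrization used in the paper is an assumption here, as are the default parameter values).

```python
import numpy as np

def kmod_gram(A, B, gamma=1.0, sigma=1.0):
    """Gram matrix of the KMOD kernel,
    K(x, y) = a * (exp(gamma / (||x - y||^2 + sigma^2)) - 1),
    normalized so that K(x, x) = 1."""
    a = 1.0 / (np.exp(gamma / sigma**2) - 1.0)      # normalization constant
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return a * (np.exp(gamma / (sq + sigma**2)) - 1.0)
```

Unlike the RBF kernel, KMOD decays slowly at large distances while remaining sharply peaked near zero, which is the discrimination property the abstract alludes to.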

11.
We introduce and develop the notion of spherical polyharmonics, which are a natural generalisation of spherical harmonics. In particular we study the theory of zonal polyharmonics, which allows us, analogously to zonal harmonics, to construct Poisson kernels for polyharmonic functions on the union of rotated balls. We find the representation of Poisson kernels and zonal polyharmonics in terms of the Gegenbauer polynomials. We show the connection between the classical Poisson kernel for harmonic functions on the ball, Poisson kernels for polyharmonic functions on the union of rotated balls, and the Cauchy-Hua kernel for holomorphic functions on the Lie ball.
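For reference, the classical Poisson kernel mentioned in the abstract — for harmonic functions on the unit ball $B^n \subset \mathbb{R}^n$ — is, up to the surface-measure normalization,

```latex
P(x,\zeta) = \frac{1 - |x|^2}{|x - \zeta|^{n}}, \qquad x \in B^n,\ \zeta \in \partial B^n,
```

so that a harmonic function $u$ with boundary data $f$ is recovered as $u(x) = \int_{\partial B^n} P(x,\zeta)\, f(\zeta)\, d\sigma(\zeta)$; the paper generalises this construction to polyharmonic functions.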

12.
Kernel smoothing provides a simple way of finding a structure in data. One of the most popular settings where kernel smoothing ideas can be applied is the simple regression model. In the context of kernel estimates of a regression function, the choice of a kernel can be investigated from different points of view. The aim of this paper is to present constructions of minimum variance kernels and smooth kernels by means of the Legendre polynomials and the Gegenbauer polynomials as well. Some of these kernels have been introduced, e.g., in [2], [3], and [5], but here another approach using the variational calculus is presented.
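As background for the kernel-choice discussion, here is a minimal Nadaraya–Watson regression smoother with the Epanechnikov kernel, one classical optimal kernel (the paper constructs its kernels via orthogonal polynomials; this sketch only illustrates how a chosen kernel enters the estimator).

```python
import numpy as np

def epanechnikov(u):
    # Second-order kernel with compact support [-1, 1]
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

def nadaraya_watson(x, X, Y, h):
    """Kernel regression estimate at x: weighted average of Y with
    weights K((x - X_i) / h)."""
    w = epanechnikov((x - X) / h)
    return float((w * Y).sum() / w.sum())
```

Swapping in a different kernel function changes only the weight profile, which is exactly the design freedom the paper studies.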

13.
In this paper, we discuss the conditions on polynomial projections that allow special integral representations for their kernels. These representations are based on a new biorthonormality criterion. As an application, we obtain new integral representations for the Legendre and Gegenbauer kernels.

14.
Kernel Fisher discriminant analysis (KFDA) is a popular classification technique which requires the user to predefine an appropriate kernel. Since the performance of KFDA depends on the choice of the kernel, the problem of kernel selection becomes very important. In this paper we treat the kernel selection problem as an optimization problem over the convex set of finitely many basic kernels, and formulate it as a second order cone programming (SOCP) problem. This formulation seems to be promising because the resulting SOCP can be efficiently solved by employing interior point methods. The efficacy of the optimal kernel, selected from a given convex set of basic kernels, is demonstrated on UCI machine learning benchmark datasets.

15.
In this paper extensions of the classical Fourier, fractional Fourier and Radon transforms to superspace are studied. Previously, a Fourier transform in superspace was already studied, but with a different kernel. In this work, the fermionic part of the Fourier kernel has a natural symplectic structure, derived using a Clifford analysis approach. Several basic properties of these three transforms are studied. Using suitable generalizations of the Hermite polynomials to superspace (see [H. De Bie, F. Sommen, Hermite and Gegenbauer polynomials in superspace using Clifford analysis, J. Phys. A 40 (2007) 10441-10456]) an eigenfunction basis for the Fourier transform is constructed.

16.
We propose a method for support vector machine classification using indefinite kernels. Instead of directly minimizing or stabilizing a nonconvex loss function, our algorithm simultaneously computes support vectors and a proxy kernel matrix used in forming the loss. This can be interpreted as a penalized kernel learning problem where indefinite kernel matrices are treated as noisy observations of a true Mercer kernel. Our formulation keeps the problem convex and relatively large problems can be solved efficiently using the projected gradient or analytic center cutting plane methods. We compare the performance of our technique with other methods on several standard data sets.
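The paper learns its proxy kernel jointly with the SVM; the simplest proxy construction it improves upon is shown here as a sketch: project the indefinite Gram matrix onto the PSD cone by clipping negative eigenvalues.

```python
import numpy as np

def nearest_psd_kernel(K):
    """Project a symmetric (possibly indefinite) Gram matrix onto the PSD
    cone by zeroing its negative eigenvalues (spectrum clipping)."""
    Ks = 0.5 * (K + K.T)                 # enforce exact symmetry first
    vals, vecs = np.linalg.eigh(Ks)
    return (vecs * np.clip(vals, 0.0, None)) @ vecs.T
```

The clipped matrix is the nearest PSD matrix in Frobenius norm, so it can be dropped into any Mercer-kernel method; the paper's contribution is to optimize this proxy together with the classifier rather than fixing it up front.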

17.
Optimal kernel selection in twin support vector machines   Cited by: 2 (self-citations: 0, citations by others: 2)
In twin support vector machines (TWSVMs), we determine a pair of non-parallel planes by solving two related SVM-type problems, each of which is smaller than the one in a conventional SVM. However, as with other classification methods, the performance of the TWSVM classifier depends on the choice of the kernel. In this paper we treat the kernel selection problem for TWSVM as an optimization problem over the convex set of finitely many basic kernels, and formulate it as an iterative alternating optimization problem. The efficacy of the proposed classification algorithm is demonstrated on several UCI machine learning benchmark datasets.

18.
Selecting important features in nonlinear kernel spaces is a difficult challenge in both classification and regression problems. This article proposes to achieve feature selection by optimizing a simple criterion: a feature-regularized loss function. Features within the kernel are weighted, and a lasso penalty is placed on these weights to encourage sparsity. This feature-regularized loss function is minimized by estimating the weights in conjunction with the coefficients of the original classification or regression problem, thereby automatically procuring a subset of important features. The algorithm, KerNel Iterative Feature Extraction (KNIFE), is applicable to a wide variety of kernels and high-dimensional kernel problems. In addition, a modification of KNIFE gives a computationally attractive method for graphically depicting nonlinear relationships between features by estimating their feature weights over a range of regularization parameters. The utility of KNIFE in selecting features is demonstrated through simulations and examples for both kernel regression and support vector machines. Feature path realizations also give graphical representations of important features and the nonlinear relationships among variables. Supplementary materials with computer code and an appendix on convergence analysis are available online.
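A sketch of the kernel parametrization KNIFE's lasso penalty acts on: an RBF kernel with nonnegative per-feature weights η, so that driving a weight to zero removes that feature's influence entirely. Only the parametrization is shown; the joint optimization of weights and coefficients is omitted, and the function name is an assumption.

```python
import numpy as np

def weighted_rbf_gram(X, eta):
    """Feature-weighted RBF Gram matrix:
    K(x, y) = exp(-sum_d eta[d] * (x[d] - y[d])^2), eta >= 0.
    A zero weight eliminates that feature from the kernel."""
    D = X[:, None, :] - X[None, :, :]          # pairwise per-feature differences
    return np.exp(-(eta * D**2).sum(-1))
```

Placing an L1 penalty on η during training then yields a sparse weight vector, i.e. automatic feature selection inside the kernel.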

19.
This paper presents an error analysis for classification algorithms generated by regularization schemes with polynomial kernels. Explicit convergence rates are provided for support vector machine (SVM) soft margin classifiers. The misclassification error can be estimated by the sum of sample error and regularization error. The main difficulty in studying algorithms with polynomial kernels is the regularization error, which depends deeply on the degrees of the kernel polynomials. Here we overcome this difficulty by bounding the reproducing kernel Hilbert space norm of Durrmeyer operators, and estimating the rate of approximation by Durrmeyer operators in a weighted L1 space (the weight is a probability distribution). Our study shows that the regularization parameter should decrease exponentially fast with the sample size, which is a special feature of polynomial kernels. Dedicated to Charlie Micchelli on the occasion of his 60th birthday. Mathematics subject classifications (2000): 68T05, 62J02. Ding-Xuan Zhou: The first author is supported partially by the Research Grants Council of Hong Kong (Project No. CityU 103704).

20.
This paper considers using asymmetric kernels in local linear smoothing to estimate a regression curve with bounded support. The asymmetric kernels are either beta kernels, if the curve has a compact support, or gamma kernels, if the curve is bounded from one end only. While possessing the standard benefits of local linear smoothing, the local linear smoother using the beta or gamma kernels offers extra advantages: finite variance and resistance to sparse design. These are due to the flexible kernel shape and the fact that the support of the kernel matches the support of the regression curve.
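A sketch of the beta-kernel idea for support [0, 1]: at evaluation point x, the weight given to observation t is the Beta(x/b + 1, (1 − x)/b + 1) density at t, so the kernel's support automatically matches the curve's. For brevity this uses a Nadaraya–Watson average rather than the paper's local linear smoother, and the bandwidth value is illustrative.

```python
import numpy as np
from math import lgamma

def beta_pdf(t, p, q):
    # Beta(p, q) density, written out via lgamma to avoid a SciPy dependency
    logB = lgamma(p) + lgamma(q) - lgamma(p + q)
    return np.exp((p - 1.0) * np.log(t) + (q - 1.0) * np.log(1.0 - t) - logB)

def beta_kernel_nw(x, T, Y, b=0.05):
    """Nadaraya-Watson estimate at x in (0, 1) with a beta kernel of
    bandwidth b; T must lie strictly inside (0, 1)."""
    w = beta_pdf(T, x / b + 1.0, (1.0 - x) / b + 1.0)
    return float((w * Y).sum() / w.sum())
```

Because the beta density never puts weight outside [0, 1], there is no boundary spill-over, which is the main advantage over a symmetric kernel near the endpoints.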


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号