Similar Documents
Found 20 similar documents (search time: 140 ms)
1.
Linear Low-Rank Approximation and Nonlinear Dimension Reduction
This survey reviews recent work on both linear and nonlinear data reduction. For the linear case, it discusses structure analysis of the SVD of column-partitioned matrices and methods and algorithms for sparse low-rank approximation; for the nonlinear case, it studies methods for nonlinear dimensionality reduction and manifold learning. All of these problems are active research topics in data mining and machine learning.

2.
This paper presents the theoretical basis of the Singular Value Decomposition (SVD) of the full matrix, the anomaly (mean-deviation) matrix, and the normalized matrix, and derives the SVD of an arbitrary matrix. Building on that decomposition, two Proper Orthogonal Decomposition (POD) algorithms are given. Combining POD with Galerkin projection, the high- or infinite-dimensional solution of a partial differential equation can be projected onto the complete space spanned by the POD modes for reduced-order simulation, yielding a highly accurate low-dimensional solution. The stability and accuracy of the solution before and after reduction with different numbers of POD modes are compared. Finally, numerical examples analyze the strengths, weaknesses, and applicability of the two POD algorithms.
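As a minimal illustration of the POD-via-SVD construction described above (synthetic snapshot data; all sizes and names are made up for this sketch), the POD modes are the left singular vectors of the mean-deviation snapshot matrix, and a Galerkin-style projection onto them gives the low-dimensional reconstruction:

```python
import numpy as np

# Synthetic snapshot matrix X: columns are solution snapshots u(x, t_k).
n_space, n_time = 200, 40
x = np.linspace(0.0, 1.0, n_space)
t = np.linspace(0.0, 1.0, n_time)
X = (np.outer(np.sin(np.pi * x), np.cos(2 * np.pi * t))
     + 0.1 * np.outer(np.sin(3 * np.pi * x), np.sin(4 * np.pi * t)))

# Subtracting the time-mean gives the "anomaly matrix" variant.
X_anom = X - X.mean(axis=1, keepdims=True)

# POD modes are the left singular vectors; modal energies are sigma_i^2.
U, s, Vt = np.linalg.svd(X_anom, full_matrices=False)
r = 2                          # number of retained modes
modes = U[:, :r]

# Galerkin-style projection: reduced coordinates and reconstruction.
a = modes.T @ X_anom           # r x n_time reduced coefficients
X_rec = modes @ a              # rank-r reduced-order reconstruction

rel_err = np.linalg.norm(X_anom - X_rec) / np.linalg.norm(X_anom)
print(rel_err)                 # tiny: the data have rank 2 by construction
```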

3.
In nonnegative matrix factorization (NMF), the choice of initial values strongly affects the algorithm's performance. Several SVD-based initialization methods have been proposed [7,8], but when the matrix dimension is large, computing the SVD of the original matrix directly is time-consuming. This paper proposes a faster initialization method (KFV-NMF); numerical experiments show that it largely preserves computational accuracy while saving computation time.
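The abstract does not spell out KFV-NMF itself; the sketch below shows the general idea of SVD-based NMF initialization (an NNDSVD-style scheme, in the spirit of the cited methods but not claimed to be any of them; every name and parameter here is illustrative):

```python
import numpy as np

def svd_nmf_init(A, r):
    """NNDSVD-style nonnegative initialization for NMF from a truncated SVD
    (a sketch of the general idea, not the paper's KFV-NMF algorithm)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    W = np.zeros((A.shape[0], r))
    H = np.zeros((r, A.shape[1]))
    # Leading singular vectors of a nonnegative matrix can be taken
    # nonnegative (Perron-Frobenius), so use their absolute values.
    W[:, 0] = np.sqrt(s[0]) * np.abs(U[:, 0])
    H[0, :] = np.sqrt(s[0]) * np.abs(Vt[0, :])
    for j in range(1, r):
        u, v = U[:, j], Vt[j, :]
        up, un = np.maximum(u, 0), np.maximum(-u, 0)
        vp, vn = np.maximum(v, 0), np.maximum(-v, 0)
        # Keep whichever sign pattern carries more energy.
        if np.linalg.norm(up) * np.linalg.norm(vp) >= np.linalg.norm(un) * np.linalg.norm(vn):
            u_, v_ = up, vp
        else:
            u_, v_ = un, vn
        scale = np.linalg.norm(u_) * np.linalg.norm(v_)
        W[:, j] = np.sqrt(s[j] * scale) * u_ / (np.linalg.norm(u_) + 1e-12)
        H[j, :] = np.sqrt(s[j] * scale) * v_ / (np.linalg.norm(v_) + 1e-12)
    return W, H

rng = np.random.default_rng(1)
A = rng.random((30, 20))
W, H = svd_nmf_init(A, 5)
print(W.min() >= 0 and H.min() >= 0)   # the initialization is nonnegative
```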

4.
Singular Value Decomposition of O-Symmetric Matrices and Its Algorithms
This paper studies the singular value decomposition of matrices with an axisymmetric structure and establishes quantitative relations between the SVD of such a matrix and the SVDs of its submatrices. Using these relations, algorithms are given for the SVD and the Moore-Penrose inverse of this class of matrices, greatly reducing the computation and storage required.

5.
Using an improved truncated-and-transformed matrix SVD algorithm, a Chinese text classifier based on character-frequency features is designed and implemented. Theoretical analysis and experimental results show that the method improves numerical accuracy, reduces the dimension of the feature space of the text collection, simplifies the time complexity of the classification algorithm, and improves classification accuracy.
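A hedged sketch of the underlying truncated-SVD (LSI-style) idea for frequency-feature text classification, with a random stand-in for the term-document matrix; this is not the paper's improved algorithm:

```python
import numpy as np

# Stand-in term-document count matrix and class labels (illustrative only).
rng = np.random.default_rng(2)
n_terms, n_docs, k = 100, 12, 3
A = rng.poisson(1.0, size=(n_terms, n_docs)).astype(float)
labels = np.array([0, 0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 2])

# Truncated SVD: keep the top-k left singular vectors of the
# term-document matrix and represent each document in that k-dim space.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Uk = U[:, :k]
docs_k = Uk.T @ A              # k x n_docs reduced representations

# Nearest-centroid classification in the reduced space.
centroids = np.stack(
    [docs_k[:, labels == c].mean(axis=1) for c in range(3)], axis=1)
query = A[:, 0]                # classify a training document, for illustration
q_k = Uk.T @ query
pred = int(np.argmin(np.linalg.norm(centroids - q_k[:, None], axis=0)))
print(pred)
```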

6.
Singular Value Decomposition and Generalized Inverses of Unitarily Extended Matrices
Starting from the ordinary SVD, quantitative relations are derived between the singular values and singular vectors of a unitarily extended matrix and those of its parent matrix. The full-rank factorization and the g-inverse, reflexive g-inverse, least-squares g-inverse, and minimum-norm g-inverse of the unitarily extended matrix are also analyzed quantitatively, giving the relation between the full-rank factors F* and G* of the extended matrix and the factors F and G of the parent matrix A. Finally, a corresponding fast algorithm is given, and an example shows that it greatly reduces the computation and storage of the decomposition and improves efficiency.

7.
Solving Matrix Recovery Problems with a Randomized Singular Value Decomposition Algorithm
许雪敏, 向华. 《数学杂志》 2017, 37(5): 969-976
This paper studies the recovery of large low-rank matrices, using a randomized SVD (RSVD) algorithm to compute the SVD of a sparse matrix. Compared with the Lanczos method, the algorithm achieves the same accuracy in far less time, and it remains effective for matrices of relatively low rank.
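A basic randomized SVD can be sketched along the usual prototype (range sketching, power iteration, SVD of a small matrix); the parameter names are illustrative, and this is not claimed to be the paper's exact algorithm:

```python
import numpy as np

def rsvd(A, r, oversample=10, n_iter=2):
    """Randomized SVD sketch: sample the range of A, then do a small SVD."""
    rng = np.random.default_rng(0)
    k = r + oversample
    Omega = rng.standard_normal((A.shape[1], k))
    Y = A @ Omega                       # range sketch
    for _ in range(n_iter):             # power iterations sharpen the sketch
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)              # orthonormal basis for the sketch
    B = Q.T @ A                         # small k x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :r], s[:r], Vt[:r, :]

# Exactly low-rank test matrix (rank <= 40).
rng = np.random.default_rng(3)
A = rng.standard_normal((500, 40)) @ rng.standard_normal((40, 400))
U, s, Vt = rsvd(A, 40)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
print(err)                              # near machine precision here
```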

8.

9.
Using the SVD of quaternion matrices, this paper gives a characterization of quaternion EP matrices and obtains characterization and property theorems for the minus order, left (right) star order, and star order of quaternion EP matrices.

10.
Taking water-quality monitoring data from an ecological monitoring area in Liaodong Bay as an example, a classification method for such data is given using the matrix SVD and the K-means algorithm as tools. The method has the following features: the SVD simplifies and accelerates the comparison process; setting the number of classes K dynamically avoids K-means's drawback of having to fix the class count in advance; and the classification of small amounts of newly added monitoring data is also discussed. The method is broadly applicable to classifying coastal seawater quality monitoring data.
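The SVD-then-K-means pipeline can be sketched as follows, on a synthetic stand-in for the monitoring table; the initialization is deterministic and K is fixed for reproducibility, whereas the paper sets K dynamically:

```python
import numpy as np

def kmeans(X, centers, n_iter=50):
    """Plain Lloyd's K-means on the rows of X from given initial centers
    (a minimal sketch; no dynamic choice of K)."""
    centers = centers.copy()
    k = len(centers)
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        for j in range(k):
            if np.any(assign == j):
                centers[j] = X[assign == j].mean(axis=0)
    return assign, centers

# Hypothetical monitoring table: rows = water samples, cols = indicators.
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0.0, 0.1, (20, 6)), rng.normal(3.0, 0.1, (20, 6))])

# SVD step: project samples onto the top-2 right singular vectors before
# clustering, which simplifies and speeds up the comparisons.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
X2 = Xc @ Vt[:2].T

assign, _ = kmeans(X2, X2[[0, -1]])   # deterministic init for this toy example
print(assign[:3], assign[-3:])
```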

11.
Dimension reduction in today's vector space based information retrieval systems is essential for improving computational efficiency in handling massive amounts of data. A mathematical framework for lower dimensional representation of text data in vector space based information retrieval is proposed using minimization and a matrix rank reduction formula. We illustrate how the commonly used Latent Semantic Indexing based on the Singular Value Decomposition (LSI/SVD) can be derived as a dimension reduction method from our mathematical framework. Two new dimension reduction methods based on the centroids of data clusters are then proposed and shown to be more efficient and effective than LSI/SVD when a priori information on the cluster structure of the data is available. Several advantages of the new methods in terms of computational efficiency and data representation in the reduced space, as well as their mathematical properties, are discussed. Experimental results are presented to illustrate the effectiveness of our methods on certain classification problems in a reduced dimensional space. The results indicate that for a successful lower dimensional representation of the data, it is important to incorporate a priori knowledge in the dimension reduction algorithms.
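One way the centroid idea can be realized (a sketch of the general approach, not necessarily the authors' exact formulation): with class labels known a priori, project document vectors onto an orthonormal basis of the k class centroids, reducing the dimension from m to k:

```python
import numpy as np

# Synthetic labeled "documents": k classes of per samples in m dimensions.
rng = np.random.default_rng(8)
m, k, per = 50, 3, 30
means = 3.0 * rng.standard_normal((k, m))
X = np.vstack([means[c] + rng.standard_normal((per, m)) for c in range(k)])
labels = np.repeat(np.arange(k), per)

# Centroid matrix (m x k), one column per class.
C = np.stack([X[labels == c].mean(axis=0) for c in range(k)], axis=1)

# Orthonormal basis of the centroid space, then project the data onto it.
Q, _ = np.linalg.qr(C)
Z = X @ Q                     # reduced representation: (k*per) x k
print(Z.shape)
```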

12.
Singular value decomposition (SVD) is a useful tool in functional data analysis (FDA). Compared to principal component analysis (PCA), SVD is more fundamental, because SVD simultaneously provides the PCAs in both row and column spaces. We compare SVD and PCA from the FDA viewpoint, and extend the usual SVD to variations by considering different centerings. A generalized scree plot is proposed to select an appropriate centering in practice. Several useful matrix views of the SVD components are introduced to explore different features in data, including SVD surface plots, image plots, curve movies, and rotation movies. These methods visualize both column and row information of a two-way matrix simultaneously, relate the matrix to relevant curves, show local variations, and highlight interactions between columns and rows. Several toy examples are designed to compare the different variations of SVD, and real data examples are used to illustrate the usefulness of the visualization methods.
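The centering variations can be compared directly on synthetic data; this is a sketch of the idea (raw, column-centered, and double-centered SVD), not the paper's generalized scree plot:

```python
import numpy as np

# Synthetic two-way data with a strong overall mean.
rng = np.random.default_rng(7)
X = rng.standard_normal((50, 8)) + 5.0

s_raw = np.linalg.svd(X, compute_uv=False)          # no centering
Xc = X - X.mean(axis=0)                             # column centering
s_col = np.linalg.svd(Xc, compute_uv=False)
Xd = Xc - Xc.mean(axis=1, keepdims=True)            # double centering
s_dbl = np.linalg.svd(Xd, compute_uv=False)

# Column-centered SVD reproduces ordinary PCA: squared singular values
# equal the eigenvalues of the (unnormalized) covariance matrix Xc^T Xc.
eig = np.sort(np.linalg.eigvalsh(Xc.T @ Xc))[::-1]
print(np.allclose(np.sort(s_col ** 2)[::-1], eig))
# Without centering, the first singular value mostly reflects the mean.
print(s_raw[0] > 2 * s_col[0])
```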

13.
Beginning from the first-kind integral equation, two kinds of methods suitable for inverting NMR multi-relaxation data from rock, namely Singular Value Decomposition (SVD) and a transform method, are derived, and the mathematical processing is discussed in detail. The advantages and disadvantages of the two methods are compared in theory and in application. From the viewpoint of the degrees of freedom of the inversion solution, we discuss the resolution of NMR relaxation inversion and the selection of an optimal inversion model. Results show that the SVD method suits NMR relaxation data with a higher signal-to-noise ratio, while the transform inversion method is more flexible and can be used on NMR data with a lower signal-to-noise ratio. The transform inversion method is therefore advisable for the multi-relaxation inversion of rock NMR data; the SVD method can be used when the signal-to-noise ratio is better than 80. To ensure accuracy, the number of T2 points should be between 30 and 50. These results are useful for NMR core analysis and the interpretation of NMR logging data.
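The SVD branch of such an inversion can be sketched for a discretized first-kind equation d = K f; the exponential kernel, noise level, and truncation threshold below are all illustrative, not the paper's settings:

```python
import numpy as np

# Discretized first-kind equation d = K f for multi-exponential decay:
# K[i, j] = exp(-t_i / T2_j), the classic ill-posed NMR kernel.
rng = np.random.default_rng(10)
t = np.linspace(0.01, 1.0, 60)[:, None]        # measurement times
T2 = np.logspace(-2, 0, 40)[None, :]           # relaxation-time grid
K = np.exp(-t / T2)

# Smooth "true" T2 distribution and noisy synthetic data.
f_true = np.exp(-0.5 * ((np.log10(T2[0]) + 1.0) / 0.2) ** 2)
d = K @ f_true + 1e-3 * rng.standard_normal(60)

# Truncated-SVD inversion: drop singular values near the noise floor
# to stabilize the solution (illustrative truncation level).
U, s, Vt = np.linalg.svd(K, full_matrices=False)
keep = s > 1e-3 * s[0]
f_est = Vt[keep].T @ ((U[:, keep].T @ d) / s[keep])
print(np.linalg.norm(K @ f_est - d))           # residual near the noise level
```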

14.
We consider the general problem of analysing and modelling call centre arrival data. A method is described for analysing such data using singular value decomposition (SVD). We illustrate that the outcome from the SVD can be used for data visualization, detection of anomalies (outliers), and extraction of significant features from noisy data. The SVD can also be employed as a data reduction tool. Its application usually results in a parsimonious representation of the original data without losing much information. We describe how one can use the reduced data for some further, more formal statistical analysis. For example, a short‐term forecasting model for call volumes is developed, which is multiplicative with a time series component that depends on day of the week. We report empirical results from applying the proposed method to some real data collected at a call centre of a large‐scale U.S. financial organization. Some issues about forecasting call volumes are also discussed. Copyright © 2005 John Wiley & Sons, Ltd.

15.
We present our recent work on both linear and nonlinear data reduction methods and algorithms: for the linear case we discuss results on structure analysis of the SVD of column-partitioned matrices and sparse low-rank approximation; for the nonlinear case we investigate methods for nonlinear dimensionality reduction and manifold learning. The problems we address have attracted a great deal of interest in data mining and machine learning.
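The linear case rests on the classical truncated-SVD low-rank approximation, which the following sketch illustrates, including the Eckart-Young optimality of the rank-r truncation in the Frobenius norm:

```python
import numpy as np

# The rank-r truncation A_r = U_r diag(s_r) V_r^T is the best rank-r
# approximation of A in the Frobenius norm (Eckart-Young), with error
# sqrt(sum_{i > r} s_i^2). Data here are synthetic.
rng = np.random.default_rng(5)
A = rng.standard_normal((80, 60))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

r = 10
A_r = (U[:, :r] * s[:r]) @ Vt[:r, :]
err = np.linalg.norm(A - A_r)
print(np.isclose(err, np.sqrt(np.sum(s[r:] ** 2))))
```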

16.
Principal component analysis (PCA) is a widely used tool for data analysis and dimension reduction in applications throughout science and engineering. However, the principal components (PCs) can sometimes be difficult to interpret, because they are linear combinations of all the original variables. To facilitate interpretation, sparse PCA produces modified PCs with sparse loadings, i.e. loadings with very few non-zero elements. In this paper, we propose a new sparse PCA method, namely sparse PCA via regularized SVD (sPCA-rSVD). We use the connection of PCA with singular value decomposition (SVD) of the data matrix and extract the PCs through solving a low rank matrix approximation problem. Regularization penalties are introduced to the corresponding minimization problem to promote sparsity in PC loadings. An efficient iterative algorithm is proposed for computation. Two tuning parameter selection methods are discussed. Some theoretical results are established to justify the use of sPCA-rSVD when only the data covariance matrix is available. In addition, we give a modified definition of variance explained by the sparse PCs. The sPCA-rSVD provides a uniform treatment of both classical multivariate data and high-dimension-low-sample-size (HDLSS) data. Further understanding of sPCA-rSVD and some existing alternatives is gained through simulation studies and real data examples, which suggests that sPCA-rSVD provides competitive results.
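A rank-one sketch of the sPCA-rSVD idea: alternate between a unit-norm left vector and a soft-thresholded (lasso-penalized) loading vector. The penalty level and data below are illustrative; the full method includes tuning-parameter selection and further refinements:

```python
import numpy as np

def soft(x, lam):
    """Soft-thresholding operator for the lasso penalty."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def spca_rank1(X, lam, n_iter=100):
    """Rank-one sparse PCA via regularized SVD: alternate the unit left
    vector u and a soft-thresholded loading v (a sketch of the idea)."""
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    u = U[:, 0]                      # warm start from the ordinary SVD
    v = np.zeros(X.shape[1])
    for _ in range(n_iter):
        v = soft(X.T @ u, lam)       # sparse loading update
        norm = np.linalg.norm(X @ v)
        if norm == 0:
            break
        u = (X @ v) / norm           # unit-norm left vector update
    return u, v

# Data whose first PC truly involves only the first 2 of 10 variables.
rng = np.random.default_rng(6)
scores = rng.standard_normal(200)
X = (np.outer(scores, np.r_[3.0, 2.0, np.zeros(8)])
     + 0.05 * rng.standard_normal((200, 10)))
u, v = spca_rank1(X - X.mean(axis=0), lam=5.0)
print(np.nonzero(np.abs(v) > 1e-8)[0])   # only the informative variables survive
```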

17.
18.
The tensor SVD (t‐SVD) for third‐order tensors, previously proposed in the literature, has been applied successfully in many fields, such as computed tomography, facial recognition, and video completion. In this paper, we propose a method that extends a well‐known randomized matrix method to the t‐SVD. This method can produce a factorization with similar properties to the t‐SVD, but it is more computationally efficient on very large data sets. We present details of the algorithms and theoretical results, and provide numerical results that show the promise of our approach for compressing and analyzing image‐based data sets. We also present an improved analysis of randomized and simultaneous iteration for matrices, which may be of independent interest to the scientific community, and we use these new results to address the convergence properties of the new randomized tensor method.

19.
Linear dimension reduction plays an important role in classification problems, and a variety of techniques have been developed for applying it prior to classification. However, no single method works best under all circumstances; rather, the best method depends on various characteristics of the data. We develop a two-step adaptive procedure in which a best dimension reduction method is first selected based on those data characteristics and then applied to the data at hand. Using both simulated and real-life data, we show that such a procedure can significantly reduce the misclassification rate.

20.
An approximation theory by bandlimited functions (≡ Paley-Wiener functions) on Riemannian manifolds of bounded geometry is developed. Based on this theory, multiscale approximations to smooth functions in Sobolev and Besov spaces on manifolds are obtained. The results have immediate applications to the filtering, denoising, approximation, and compression of functions on manifolds. There are also applications to problems arising in data dimension reduction, image processing, computer graphics, visualization, and learning theory.
