Lower Rank Approximation of Matrices Based on Fast and Robust Alternating Regression
Abstract: We introduce fast and robust algorithms for lower rank approximation to given matrices based on robust alternating regression. The alternating least squares regression, also called criss-cross regression, was used for lower rank approximation of matrices, but it lacks robustness against outliers in these matrices. We use robust regression estimators and address some of the complications arising from this approach. We find it helpful to use high breakdown estimators in the initial iterations, followed by M estimators with monotone score functions in later iterations towards convergence. In addition to robustness, the computational speed is another important consideration in the development of our proposed algorithm, because alternating robust regression can be computationally intensive for large matrices. Based on a mix of the least trimmed squares (LTS) and Huber's M estimators, we demonstrate that fast and robust lower rank approximations are possible for modestly large matrices.
Keywords: Criss-cross regression; L1 estimator; LTS estimator; M estimator; Singular value decomposition
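The abstract describes alternating robust regression: given a rank-k factorization X ≈ UVᵀ, the rows of U and the rows of V are updated in turn by robust regressions, with high breakdown (LTS-style) fits in the first iterations and Huber M estimation afterwards. The sketch below is a minimal, illustrative NumPy implementation of that idea under simplifying assumptions (a crude trimmed refit stands in for LTS, and IRLS with Huber weights for the M step); all function names and parameters are hypothetical, and this is not the authors' code.

```python
# Minimal sketch of robust alternating regression for rank-k approximation
# X ~ U @ V.T, mixing a trimmed-refit (LTS-like) step in early iterations
# with Huber-weighted IRLS (M estimation) in later iterations.
import numpy as np

def huber_weights(r, c=1.345):
    """Huber weights: 1 inside [-c, c], c/|r| outside."""
    a = np.abs(r)
    w = np.ones_like(a)
    mask = a > c
    w[mask] = c / a[mask]
    return w

def robust_fit(A, y, mode="lts", trim=0.25, n_irls=5):
    """Fit y ~ A b robustly.
    mode='lts'   : refit OLS on the (1 - trim) fraction of observations
                   with the smallest residuals (a crude stand-in for LTS).
    mode='huber' : iteratively reweighted least squares with Huber weights."""
    b, *_ = np.linalg.lstsq(A, y, rcond=None)
    if mode == "lts":
        r = y - A @ b
        keep = np.argsort(np.abs(r))[: int(np.ceil((1 - trim) * len(y)))]
        b, *_ = np.linalg.lstsq(A[keep], y[keep], rcond=None)
    else:
        for _ in range(n_irls):
            r = y - A @ b
            s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale via MAD
            w = np.sqrt(huber_weights(r / s))
            b, *_ = np.linalg.lstsq(A * w[:, None], y * w, rcond=None)
    return b

def robust_lower_rank(X, k=2, n_lts=3, n_huber=7, seed=0):
    """Alternating robust regression: LTS-style fits first, Huber fits later."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    V = rng.standard_normal((p, k))
    for it in range(n_lts + n_huber):
        mode = "lts" if it < n_lts else "huber"
        # Regress each row of X on V to update the corresponding row of U,
        # then each column of X on U to update the corresponding row of V.
        U = np.vstack([robust_fit(V, X[i, :], mode) for i in range(n)])
        V = np.vstack([robust_fit(U, X[:, j], mode) for j in range(p)])
    return U, V

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((60, 2)) @ rng.standard_normal((2, 40))
    X[rng.random(X.shape) < 0.05] += 20.0   # sprinkle a few gross outliers
    U, V = robust_lower_rank(X, k=2)
    print("median absolute residual of robust rank-2 fit:",
          np.median(np.abs(X - U @ V.T)))
```

The switch from the trimmed fits to Huber IRLS mirrors the strategy stated in the abstract: high breakdown estimates to escape the influence of outliers early on, then a monotone-score M estimator for fast, stable convergence; the specific iteration counts and trimming fraction above are illustrative choices, not values from the paper.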