Fixed-rank matrix factorizations and Riemannian low-rank optimization
Authors: Bamdev Mishra, Gilles Meyer, Silvère Bonnabel, Rodolphe Sepulchre
Affiliations:
1. Department of Electrical Engineering and Computer Science, University of Liège, 4000 Liège, Belgium
2. Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, UK
3. Robotics Center, Mines ParisTech, 60 Boulevard Saint-Michel, 75272 Paris, France
Abstract:
Motivated by the problem of learning a linear regression model whose parameter is a large fixed-rank non-symmetric matrix, we consider the optimization of a smooth cost function defined on the set of fixed-rank matrices. We adopt the geometric framework of optimization on Riemannian quotient manifolds. We study the underlying geometries of several well-known fixed-rank matrix factorizations and then exploit the Riemannian quotient geometry of the search space in the design of a class of gradient descent and trust-region algorithms. The proposed algorithms generalize our previous results on fixed-rank symmetric positive semidefinite matrices, apply to a broad range of applications, scale to high-dimensional problems, and confer a geometric basis to recent contributions on the learning of fixed-rank non-symmetric matrices. We make connections with existing algorithms in the context of low-rank matrix completion and discuss the usefulness of the proposed framework. Numerical experiments suggest that the proposed algorithms compete with state-of-the-art algorithms and that manifold optimization offers an effective and versatile framework for the design of machine learning algorithms that learn a fixed-rank matrix.
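To make the search space concrete: the abstract considers cost functions over matrices of fixed rank r, parameterized through a factorization such as X = GHᵀ with G of size m×r and H of size n×r. The following is a minimal illustrative sketch, not the authors' Riemannian quotient-manifold algorithm: it runs plain (Euclidean) gradient descent on the factors for a toy matrix-completion cost, showing the factorized, rank-constrained search space that the paper equips with a Riemannian geometry. All sizes, step sizes, and iteration counts here are arbitrary choices for the example.

```python
import numpy as np

# Sketch only: Euclidean gradient descent on a fixed-rank factorization
# X = G @ H.T for matrix completion (not the paper's Riemannian method).
rng = np.random.default_rng(0)
m, n, r = 30, 20, 3  # arbitrary problem sizes for illustration

# Ground-truth rank-r matrix and a mask of observed entries.
X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.6  # ~60% of entries observed

# Small random factors; the rank-r constraint is enforced by construction.
G = 0.1 * rng.standard_normal((m, r))
H = 0.1 * rng.standard_normal((n, r))

lr = 0.01  # fixed step size (a Riemannian method would use retractions)
for _ in range(2000):
    R = mask * (G @ H.T - X_true)          # residual on observed entries
    G, H = G - lr * (R @ H), H - lr * (R.T @ G)  # simultaneous factor updates

rmse = np.sqrt(np.mean((G @ H.T - X_true) ** 2))
print(f"full-matrix RMSE: {rmse:.4f}")
```

The factorization X = GHᵀ is invariant to X = (GM)(HM⁻ᵀ)ᵀ for any invertible M, which is exactly the symmetry the paper's quotient-manifold framework quotients out; this sketch simply ignores that invariance.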
This article is indexed in SpringerLink and other databases.