Scaled memoryless symmetric rank one method for large-scale optimization
Authors: Wah June Leong (a), Malik Abu Hassan (b)
Affiliations: (a) Institute for Mathematical Research, University Putra Malaysia, 43400 Serdang, Selangor, Malaysia
(b) Department of Mathematics, University Putra Malaysia, 43400 Serdang, Selangor, Malaysia
Abstract: This paper concerns the memoryless quasi-Newton method, that is, the quasi-Newton method in which the approximation to the inverse Hessian is, at each step, updated from the identity matrix. Hence its search direction can be computed without storing any matrices. In this paper, a scaled memoryless symmetric rank one (SR1) method for solving large-scale unconstrained optimization problems is developed. The basic idea is to incorporate the SR1 update within the framework of the memoryless quasi-Newton method. However, it is well known that the SR1 update may not preserve positive definiteness even when updated from a positive definite matrix. We therefore propose a memoryless SR1 method that is updated from a positive scaling of the identity, where the scaling factor is derived in such a way that positive definiteness of the updating matrices is preserved while the condition of the scaled memoryless SR1 update is improved. Under very mild conditions it is shown that, for strictly convex objective functions, the method is globally convergent with a linear rate of convergence. Numerical results show that the optimally scaled memoryless SR1 method is very encouraging.
Keywords: Large-scale optimization; Memoryless quasi-Newton method; Symmetric rank one update; Optimal scaling
This article has been indexed by ScienceDirect and other databases.