The convergence of conjugate gradient method with nonmonotone line search
Authors: Zhen-Jun Shi, Shengquan Wang
Affiliations:
a. Department of Mathematics and Computer Science, Central State University, Wilberforce, Ohio 45384, USA
b. Department of Computer and Information Science, The University of Michigan, Dearborn, Michigan 48128, USA
Abstract: The conjugate gradient method is a useful and powerful approach for solving large-scale minimization problems. Liu and Storey developed a conjugate gradient method that performs well numerically but lacks a global convergence guarantee under traditional line searches such as the Armijo, Wolfe, and Goldstein line searches. In this paper we propose a new nonmonotone line search for the Liu-Storey (LS) conjugate gradient method. The new nonmonotone line search guarantees the global convergence of the LS method and performs well numerically. By estimating the Lipschitz constant of the derivative of the objective function within the new nonmonotone line search, we can find an adequate step size and substantially reduce the number of function evaluations at each iteration. Numerical results show that the new approach is effective in practical computation.
Keywords: Unconstrained optimization; Conjugate gradient method; Global convergence
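
For orientation, the sketch below shows the general shape of an LS conjugate gradient iteration combined with a nonmonotone Armijo-type (GLL-style) backtracking line search. It is a minimal illustration under common textbook assumptions, not the specific line search proposed in the paper, which additionally estimates the Lipschitz constant of the gradient to choose the initial step size; the function names and parameters (`memory`, `delta`, `tol`) are chosen here for illustration only.

```python
import numpy as np

def ls_cg_nonmonotone(f, grad, x0, max_iter=500, memory=5, delta=1e-4, tol=1e-6):
    """LS conjugate gradient with a GLL-type nonmonotone Armijo backtracking search.

    Generic illustration only; the paper's line search differs (it uses a
    Lipschitz-constant estimate to pick the trial step size).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    f_hist = [f(x)]                         # recent values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Nonmonotone Armijo condition: compare against the maximum of the
        # last `memory` function values instead of f(x_k) alone.
        f_ref = max(f_hist[-memory:])
        gd = g.dot(d)                       # directional derivative (negative for descent)
        alpha = 1.0
        while f(x + alpha * d) > f_ref + delta * alpha * gd:
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Liu-Storey parameter: beta = g_{k+1}^T (g_{k+1} - g_k) / (-d_k^T g_k)
        beta = g_new.dot(g_new - g) / (-d.dot(g))
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:               # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x
```

Relaxing the Armijo test to the maximum of the last few function values allows occasional increases in f, which is what lets the step size stay large and keeps the number of function evaluations per iteration low.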
This article is indexed in ScienceDirect and other databases.