Rate of Convergence of a Restarted CG-DESCENT Method
Authors:Aiping Qu  Min Li
Affiliation:1. School of Computer, Wuhan University, Wuhan, P. R. China; 2. Department of Mathematics, Huaihua University, Huaihua, P. R. China
Abstract:In this article, we investigate the convergence rate of the CG-DESCENT method proposed by Hager and Zhang [1 W. W. Hager and H. Zhang (2005). A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM Journal on Optimization 16:170–192.]. Under reasonable conditions, we show that the CG-DESCENT method with the Wolfe line search achieves n-step superlinear, and even n-step quadratic, convergence when a suitable restart technique is used. Numerical results are also reported to verify the theoretical findings.
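To illustrate the kind of scheme the abstract describes, the following is a minimal sketch (not the authors' implementation) of a nonlinear conjugate gradient iteration using the Hager–Zhang CG-DESCENT direction with a periodic restart to steepest descent, applied to a toy quadratic. The restart period, test function, and the backtracking Armijo line search (a simplification of the Wolfe search analysed in the paper) are all illustrative assumptions.

```python
# Sketch: restarted CG-DESCENT on f(x) = x0^2 + 10*x1^2 (toy problem, not
# from the paper). The Hager-Zhang beta is the published formula; the
# Armijo backtracking below stands in for the Wolfe line search.

def f(x):
    return x[0] ** 2 + 10.0 * x[1] ** 2

def grad(x):
    return [2.0 * x[0], 20.0 * x[1]]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def cg_descent_restarted(x, n_restart=2, tol=1e-8, max_iter=200):
    g = grad(x)
    d = [-gi for gi in g]                      # initial steepest-descent step
    for k in range(max_iter):
        if dot(g, g) < tol ** 2:
            break
        # Backtracking Armijo line search (simplified Wolfe conditions).
        alpha, fx = 1.0, f(x)
        while f([xi + alpha * di for xi, di in zip(x, d)]) \
                > fx + 1e-4 * alpha * dot(g, d):
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        if (k + 1) % n_restart == 0:
            # Restart technique: reset the direction to steepest descent.
            d = [-gi for gi in g_new]
        else:
            # Hager-Zhang beta: (y - 2 d |y|^2 / d'y)' g_new / d'y.
            y = [gn - gi for gn, gi in zip(g_new, g)]
            dy = dot(d, y)
            beta = (dot(y, g_new)
                    - 2.0 * dot(y, y) * dot(d, g_new) / dy) / dy
            d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x_star = cg_descent_restarted([3.0, 1.0])
```

For this strongly convex quadratic the iterates converge to the minimizer at the origin; in the paper, the restart (every n steps, where n is the problem dimension) is what yields the n-step superlinear and quadratic rates.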
Keywords:n-step quadratic convergence; restarted conjugate gradient method; unconstrained optimization