Rate of Convergence of a Restarted CG-DESCENT Method

Authors: Aiping Qu, Min Li

Affiliations: 1. School of Computer, Wuhan University, Wuhan, P. R. China; 2. Department of Mathematics, Huaihua University, Huaihua, P. R. China
Abstract: In this article, we investigate the convergence rate of the CG-DESCENT method proposed by Hager and Zhang [1] (W. W. Hager and H. Zhang (2005). A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM Journal on Optimization 16: 170–192). Under reasonable conditions, we show that the CG-DESCENT method with the Wolfe line search achieves n-step superlinear and even quadratic convergence if a suitable restart technique is used. Numerical results are also reported to verify the theoretical results.
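The restart idea discussed in the abstract can be illustrated with a minimal sketch: a nonlinear conjugate gradient iteration using the Hager–Zhang (CG-DESCENT) direction update, reset to the steepest-descent direction every n iterations. This is an illustrative assumption-laden simplification, not the article's exact algorithm: in particular, the backtracking Armijo line search below stands in for the Wolfe line search analyzed in the paper, and the function name and restart rule are hypothetical.

```python
import numpy as np

def cg_descent_restarted(f, grad, x0, n_restart=None, max_iter=200, tol=1e-8):
    """Sketch of nonlinear CG with the Hager-Zhang (CG-DESCENT) beta and a
    periodic steepest-descent restart every n iterations.  The Armijo
    backtracking search here is a simplified stand-in for the Wolfe line
    search used in the article."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    n = len(x) if n_restart is None else n_restart
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:          # safeguard: fall back to steepest descent
            d = -g
        # backtracking Armijo line search (assumption: replaces Wolfe search)
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        if (k + 1) % n == 0 or abs(dy) < 1e-16:
            d = -g_new          # restart: steepest-descent direction
        else:
            # Hager-Zhang update: beta = (y - 2 d ||y||^2 / d'y)' g_new / d'y
            beta = (y - 2.0 * d * (y @ y) / dy) @ g_new / dy
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a small strictly convex quadratic, for example f(x) = ½ xᵀAx − bᵀx with A = diag(1, 2, 3), the iterates converge to the solution of Ax = b, which is the setting in which n-step quadratic convergence results for restarted CG methods are typically stated.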
Keywords: n-step quadratic convergence; Restart conjugate gradient method; Unconstrained optimization