

Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications
Authors: Han Jiye, Liu Guanghui, Sun Defeng, Yin Hongxia
Affiliation: (1) Institute of Applied Mathematics, Academy of Mathematics and Systems Sciences, Beijing 100080, China; (2) Institute of Applied Mathematics, the Chinese Academy of Sciences, Beijing 100080, China; (3) Department of Industrial Engineering and Management Sciences, Northwestern University, Evanston, IL 60208, USA; (4) School of Mathematics, University of New South Wales, Sydney 2052, Australia; (5) Hua Loo-Keng Institute for Applied Mathematics and Information Science, the Chinese Academy of Sciences, Beijing 100039, China
Abstract: Two fundamental convergence theorems are given for nonlinear conjugate gradient methods under only the descent condition. As a consequence, methods related to the Fletcher-Reeves algorithm still converge for parameters in a slightly wider range, in particular for the parameter at its upper bound. For methods related to the Polak-Ribiere algorithm, it is shown that some negative values of the conjugate parameter do not prevent convergence. If the objective function is convex, some convergence results also hold for the Hestenes-Stiefel algorithm.
Keywords: conjugate gradient method; descent condition; global convergence
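
For readers who want a concrete picture of the methods named in the abstract, the sketch below implements a generic nonlinear conjugate gradient iteration with the Fletcher-Reeves, Polak-Ribiere, and Hestenes-Stiefel choices of the conjugate parameter, restarting with steepest descent whenever the descent condition g_k^T d_k < 0 would fail. This is only an illustrative sketch, not the algorithm or analysis of the paper; the function name nonlinear_cg, the Armijo backtracking constants, and the restart rule are assumptions made here for the example.

# Minimal illustrative sketch of nonlinear conjugate gradient methods
# (not the paper's algorithm; names and line-search constants are assumptions).
import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule="FR", max_iter=200, tol=1e-8):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                               # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # simple Armijo backtracking line search; the paper's theorems only
        # require that the search direction satisfies the descent condition
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        if beta_rule == "FR":            # Fletcher-Reeves
            beta = g_new.dot(g_new) / g.dot(g)
        elif beta_rule == "PR":          # Polak-Ribiere
            beta = g_new.dot(y) / g.dot(g)
        else:                            # Hestenes-Stiefel
            beta = g_new.dot(y) / d.dot(y)
        d_new = -g_new + beta * d
        # enforce the descent condition; restart with steepest descent otherwise
        if g_new.dot(d_new) >= 0:
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

# usage on a simple convex quadratic
if __name__ == "__main__":
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
    grad = lambda x: A.dot(x) - b
    print(nonlinear_cg(f, grad, np.zeros(2), beta_rule="PR"))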
This article has been indexed in CNKI, SpringerLink, and other databases.