Practical convergence conditions for unconstrained optimization
Authors: Melanie L Lenard
Institution: (1) University of Toledo, Toledo, Ohio, USA
Abstract: Convergence properties of descent methods are investigated for the case where the usual requirement that an exact line search be performed at each iteration is relaxed. The error at each iteration is measured by the relative decrease in the directional derivative in the search direction. The objective function is assumed to have continuous second derivatives, and the eigenvalues of the Hessian are assumed to be bounded above and below by positive constants. Sufficient conditions are given for establishing that a method converges, or that a method converges at a linear rate. These results are used to prove that the order of convergence for a specific conjugate gradient method is linear, provided the error at each iteration is suitably restricted.
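The inexact line search described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm: the method shown is plain steepest descent on a strongly convex quadratic, and the names (`eta`, `inexact_descent`, `line_search`) and the bisection strategy are assumptions made for illustration. The accepted step is one at which the directional derivative has been reduced, in magnitude, to a fraction `eta` of its initial value, which corresponds to the abstract's relative-decrease error measure.

```python
import numpy as np

def line_search(grad, x, d, eta):
    """Find a step a with |grad(x + a*d) . d| <= eta * |grad(x) . d|.

    Illustrative bisection scheme (an assumption, not the paper's):
    the directional derivative is negative at a = 0 and is driven
    toward zero, so we bracket its sign change and bisect.
    """
    dd0 = grad(x) @ d                 # initial directional derivative (< 0)
    lo, hi = 0.0, 1.0
    # Expand the bracket until the directional derivative turns nonnegative.
    while grad(x + hi * d) @ d < 0:
        hi *= 2.0
    a = hi
    for _ in range(100):
        a = 0.5 * (lo + hi)
        dd = grad(x + a * d) @ d
        if abs(dd) <= eta * abs(dd0):  # relative-decrease test satisfied
            return a
        if dd < 0:
            lo = a                     # still descending: step too short
        else:
            hi = a                     # derivative overshot zero: too long
    return a

def inexact_descent(grad, x0, eta=0.1, tol=1e-6, max_iter=1000):
    """Steepest descent with the inexact line search above."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g                         # steepest-descent direction
        x = x + line_search(grad, x, d, eta) * d
    return x

# Strongly convex quadratic: Hessian eigenvalues bounded in [1, 10],
# matching the abstract's assumption of bounded positive eigenvalues.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x_star = inexact_descent(grad, np.array([3.0, -2.0]))
```

On such a quadratic the iterates converge linearly to the minimizer at the origin, consistent with the linear-rate conclusion of the abstract; loosening `eta` toward 1 permits cruder line searches at the cost of slower progress.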
Indexed in SpringerLink and other databases.