Similar Documents
20 similar documents found.
1.
Although the study of the global convergence of the Polak–Ribière–Polyak (PRP), Hestenes–Stiefel (HS) and Liu–Storey (LS) conjugate gradient methods has made great progress, the convergence of these algorithms for general nonlinear functions remains unsettled, especially under weak conditions on the objective function and weak line search rules. It is also interesting to ask whether there exists a general method that converges under the standard Armijo line search for general nonconvex functions, since very few such results are available. In this paper, we present a new general form of conjugate gradient methods with attractive theoretical properties. For any formula β_k ≥ 0 and under weak conditions, the proposed method satisfies the sufficient descent condition independently of the line search used and of the convexity of the function, and its global convergence holds under the standard Wolfe line search, or even under the standard Armijo line search. Based on this new method, convergence results for the PRP, HS, LS, Dai–Yuan-type (DY) and Conjugate-Descent-type (CD) methods are established. Preliminary numerical results show the efficiency of the proposed methods.
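The two ingredients highlighted above, a nonnegative β_k and a sufficient descent safeguard, can be illustrated with a PRP+-style iteration (β_k = max(β_k^PRP, 0)) under Armijo backtracking. This is a minimal sketch on a small convex quadratic, not the paper's general family; the test problem, restart rule and tolerances are illustrative assumptions.

```python
import numpy as np

# Small strictly convex quadratic test problem (illustrative stand-in)
A = np.diag([1.0, 2.0, 3.0, 4.0, 5.0])
b = np.ones(5)
f = lambda x: 0.5 * x @ A @ x - b @ x
fg = lambda x: A @ x - b

def armijo(f, x, d, gd, c1=1e-4, tau=0.5):
    """Standard Armijo backtracking: accept alpha once
    f(x + alpha d) <= f(x) + c1 * alpha * g'd."""
    alpha, fx = 1.0, f(x)
    while f(x + alpha * d) > fx + c1 * alpha * gd:
        alpha *= tau
    return alpha

def cg_prp_plus(f, fg, x0, tol=1e-8, maxit=2000):
    x, g = x0.copy(), fg(x0)
    d = -g
    for _ in range(maxit):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo(f, x, d, g @ d)
        x_new = x + alpha * d
        g_new = fg(x_new)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)  # beta_k >= 0 (PRP+)
        d = -g_new + beta * d
        if g_new @ d > -1e-4 * (g_new @ g_new):         # sufficient descent check
            d = -g_new                                  # restart with steepest descent
        x, g = x_new, g_new
    return x

x_star = cg_prp_plus(f, fg, np.zeros(5))
```

The restart branch is one standard way to enforce descent for any β_k ≥ 0; the methods surveyed here achieve it structurally, without restarts.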

2.
Based on the well-known PRP conjugate gradient method and on the structure of the CG_DESCENT conjugate gradient method, this paper proposes a modified PRP conjugate gradient method for solving large-scale unconstrained optimization problems. At every iteration the method generates a sufficient descent search direction, independently of any line search condition. Under the standard Wolfe line search, the global convergence and linear convergence rate of the modified PRP method are proved. Numerical results show that the modified PRP method is very effective on the given test problems.

4.
Min Li, Optimization Letters, 2018, 12(8): 1911–1927
Based on the memoryless BFGS quasi-Newton method, a family of three-term nonlinear conjugate gradient methods is proposed. For any line search, the directions generated by the new methods are sufficient descent directions. Using some efficient techniques, global convergence results are established when the line search fulfills the Wolfe or the Armijo conditions. Moreover, the r-linear convergence rate of the methods is analyzed as well. Numerical comparisons show that the proposed methods are efficient for the unconstrained optimization problems in the CUTEr library.
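The link between the memoryless BFGS update and a three-term direction can be sketched directly: applying the BFGS update of the identity matrix to −g_{k+1} yields a direction in span{g_{k+1}, s_k, y_k}. The vectors below are random stand-ins, not the paper's formulas; descent follows because the memoryless BFGS matrix is positive definite whenever y_k^T s_k > 0.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """d = -H g with H the BFGS update of the identity:
    H = (I - rho s y^T)(I - rho y s^T) + rho s s^T, rho = 1/(y^T s).
    Expanding -H g gives a three-term direction in span{g, s, y}."""
    rho = 1.0 / (y @ s)
    v = g - rho * (s @ g) * y             # (I - rho y s^T) g
    Hg = v - rho * (y @ v) * s + rho * (s @ g) * s
    return -Hg

rng = np.random.default_rng(0)
g = rng.standard_normal(6)                # stand-in gradient g_{k+1}
s = rng.standard_normal(6)                # stand-in step s_k
y = s + 0.1 * rng.standard_normal(6)      # stand-in y_k with y's > 0
d = memoryless_bfgs_direction(g, s, y)    # descent, since H is positive definite
```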

5.

This paper considers sufficient descent Riemannian conjugate gradient methods with line search algorithms. We propose two kinds of sufficient descent nonlinear conjugate gradient methods and prove that they satisfy the sufficient descent condition on Riemannian manifolds. One is a hybrid method combining a Fletcher–Reeves-type method with a Polak–Ribière–Polyak-type method, and the other is a Hager–Zhang-type method; both generalize their counterparts in Euclidean space. Moreover, we prove that the hybrid method has a global convergence property under the strong Wolfe conditions and that the Hager–Zhang-type method has the sufficient descent property regardless of whether a line search is used. Further, we review two kinds of line search algorithms on Riemannian manifolds and numerically compare our generalized methods on several Riemannian optimization problems. The results show that the performance of the proposed hybrid method depends greatly on the type of line search used, whereas the Hager–Zhang-type method converges fast regardless of the type of line search used.


6.
An efficient descent method for unconstrained optimization problems is the line search method, in which a step size must be chosen at each iteration once a descent direction has been determined. There are many ways to choose the step size, such as the exact line search and the Armijo, Goldstein and Wolfe line searches. In this paper we propose a new inexact line search for general descent methods and establish some global convergence properties. The new line search has many advantages compared with other similar inexact line searches. Moreover, we analyze the global convergence and local convergence rate of some special descent methods under the new line search. Preliminary numerical results show that the new line search is practical and efficient in computation.
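The paper's new line search rule is not reproduced here; as the baseline it is compared against, the standard weak Wolfe conditions can be implemented with a simple bracketing/bisection scheme. The test function, the constants c1 and c2, and the iteration cap are illustrative choices.

```python
import numpy as np

def wolfe_line_search(f, fg, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Weak Wolfe step via bracketing and bisection:
    shrink when the Armijo (sufficient decrease) test fails,
    grow when the curvature test fails, else accept."""
    lo, hi, alpha = 0.0, np.inf, 1.0
    f0, g0d = f(x), fg(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * g0d:   # Armijo fails: shrink
            hi = alpha
        elif fg(x + alpha * d) @ d < c2 * g0d:         # curvature fails: grow
            lo = alpha
        else:
            return alpha
        alpha = (lo + hi) / 2.0 if np.isfinite(hi) else 2.0 * lo
    return alpha

f = lambda x: (x**4).sum() + (x**2).sum()   # simple convex test function
fg = lambda x: 4 * x**3 + 2 * x
x = np.array([1.5, -2.0])
d = -fg(x)                                   # steepest descent direction
a = wolfe_line_search(f, fg, x, d)
```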

7.
The strong Wolfe conditions cannot guarantee the global convergence of the standard CD conjugate gradient method. By constructing a new conjugate parameter, this paper proposes a new spectral conjugate gradient method for unconstrained optimization. The method is equivalent to the standard CD conjugate gradient method under the exact line search, and possesses the descent property and global convergence under the standard Wolfe line search. Preliminary numerical results show that the new method is effective and well suited to solving nonlinear unconstrained optimization problems.

8.
To guarantee the global convergence of the standard (unmodified) PRP nonlinear conjugate gradient method for unconstrained optimization, the exact line search or Armijo-type line searches that force the PRP method to generate descent directions have been adopted. In this short note, we take a different route and propose a non-descent PRP method. We prove that the unmodified PRP method converges globally, even for nonconvex minimization, by the use of an approximate descent inexact line search.

9.
Conjugate gradient methods are iterative methods of interest for solving large-scale unconstrained optimization problems, and much recent research has focused on developing more effective variants. In this paper, we propose another hybrid conjugate gradient method, formed as a linear combination of the Dai–Yuan (DY) and Hestenes–Stiefel (HS) methods. The sufficient descent condition and the global convergence of this method are established under the generalized Wolfe line search conditions. Compared with other conjugate gradient methods, the proposed method gives good numerical results and is effective.

10.
An effective conjugate gradient method with sufficient descent
For large-scale unconstrained optimization problems, this paper proposes a sufficient descent conjugate gradient formula and builds the corresponding algorithm. Independently of any line search condition, the algorithm produces a sufficient descent direction at every iteration. If the step size is computed by the standard Wolfe inexact line search, good global convergence is obtained under conventional assumptions. Finally, large-scale numerical experiments are carried out and profiled with the performance profiles of Dolan and Moré; the results show that the algorithm is effective.

11.
A modified HS conjugate gradient method and its convergence
It is well known that the direction generated by the Hestenes–Stiefel (HS) conjugate gradient method may fail to be a descent direction for the objective function. In this paper, we make a small modification to the HS method so that the generated direction always satisfies the sufficient descent condition. An advantage of the modified Hestenes–Stiefel (MHS) method is that the scalar β_k^HS remains nonnegative under the weak Wolfe–Powell line search. The global convergence of the MHS method is established under some mild conditions. Preliminary numerical results show that the MHS method is slightly more efficient than the PRP and HS methods.

12.
A three-term conjugate gradient algorithm for large-scale unconstrained optimization using a subspace minimization technique is presented. In this algorithm the search directions are computed by minimizing the quadratic approximation of the objective function in the subspace spanned by the vectors −g_{k+1}, s_k and y_k. The search direction is taken as d_{k+1} = −g_{k+1} + a_k s_k + b_k y_k, where the scalars a_k and b_k are determined by minimizing the quadratic approximation of the objective function. The step lengths are determined by the Wolfe line search conditions. We prove that the search directions are descent directions and satisfy the Dai–Liao conjugacy condition. The suggested algorithm is of three-term conjugate gradient type, for which both the descent and the conjugacy conditions are guaranteed. It is shown that, for uniformly convex functions, the directions generated by the algorithm are bounded above, i.e. the algorithm is convergent. Numerical experiments on a set of 750 unconstrained optimization test problems show that this new algorithm substantially outperforms the known Hestenes–Stiefel, Dai–Liao, Dai–Yuan and Polak–Ribière–Polyak conjugate gradient algorithms, as well as the limited-memory quasi-Newton method L-BFGS and the discrete truncated Newton method TN.
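The subspace step above can be sketched for the special case where the model Hessian B is available explicitly: minimizing q(d) = g^T d + d^T B d/2 over d = −g + a s + b y reduces to a symmetric 2×2 linear system in (a, b). The matrix B and the vectors below are illustrative stand-ins, not the paper's update; note that when y = B s, stationarity of the system implies the Dai–Liao condition d^T y = −g^T s (i.e. t = 1).

```python
import numpy as np

def subspace_direction(g, s, y, B):
    """Minimize q(d) = g'd + d'Bd/2 over d = -g + a*s + b*y:
    setting dq/da = dq/db = 0 gives a symmetric 2x2 linear system."""
    A2 = np.array([[s @ B @ s, s @ B @ y],
                   [y @ B @ s, y @ B @ y]])
    rhs = np.array([s @ B @ g - s @ g,
                    y @ B @ g - y @ g])
    a, b = np.linalg.solve(A2, rhs)
    return -g + a * s + b * y

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
B = M @ M.T + 5.0 * np.eye(5)      # SPD stand-in for the model Hessian
g = rng.standard_normal(5)         # stand-in gradient g_{k+1}
s = rng.standard_normal(5)         # stand-in step s_k
y = B @ s                          # for a quadratic model, y_k = B s_k
d = subspace_direction(g, s, y, B)
```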

13.
Liu Jinkui, Chinese Quarterly Journal of Mathematics, 2014(1): 142–150
Y. Liu and C. Storey (1992) proposed the famous LS conjugate gradient method, which has good numerical results. However, the LS method has very weak convergence under Wolfe-type line searches. In this paper, we give a new descent gradient method based on the LS method. It guarantees the sufficient descent property at each iteration and global convergence under the strong Wolfe line search. Finally, we present extensive preliminary numerical experiments that show the efficiency of the proposed method by comparison with the famous PRP+ method.

14.
In this paper, a new spectral PRP conjugate gradient algorithm is developed for solving unconstrained optimization problems, in which the search direction is a combination of the gradient and the previously obtained direction, and the step length is obtained by a Wolfe-type inexact line search. It is proved that the search direction at each iteration is a descent direction of the objective function. Under mild conditions, we establish the global convergence theorem of the proposed method. Numerical results show that the algorithm is promising, particularly in comparison with several existing mainstream methods.
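The paper's particular spectral combination is not reproduced here; a common device of this kind scales the gradient term by a spectral parameter θ_k chosen so that the sufficient descent identity g_k^T d_k = −‖g_k‖² holds for any β_k. The sketch below uses random stand-in vectors and a PRP-type β to illustrate that choice.

```python
import numpy as np

rng = np.random.default_rng(2)
g_prev = rng.standard_normal(4)     # previous gradient (stand-in)
g = rng.standard_normal(4)          # current gradient (stand-in)
d_prev = -g_prev                    # previous direction

beta = g @ (g - g_prev) / (g_prev @ g_prev)   # PRP conjugate parameter
theta = 1.0 + beta * (g @ d_prev) / (g @ g)   # spectral scaling for descent
d = -theta * g + beta * d_prev                # spectral PRP-type direction
```

By construction g·d = −θ‖g‖² + β g·d_prev = −‖g‖², independently of the line search and of β.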

15.
Conjugate gradient methods are important for large-scale unconstrained optimization. This paper proposes an acceleration of these methods based on a modification of the step length. The idea is to modify, in a multiplicative manner, the step length α_k computed by the Wolfe line search conditions, by means of a positive parameter η_k, in such a way as to improve the behavior of the classical conjugate gradient algorithms. It is shown that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in function values is significantly improved. Numerical comparisons with some conjugate gradient algorithms on a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that the accelerated computational scheme outperforms the corresponding conjugate gradient algorithms.
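One concrete way to realize such a multiplicative step modification (an illustrative sketch, not necessarily the η_k of the paper) is to fit a one-dimensional quadratic to φ(0), φ'(0) and φ(α_k) along the search direction and rescale α_k toward the model minimizer:

```python
import numpy as np

def accelerate_step(f, x, d, alpha, g_dot_d, eta_min=0.1, eta_max=10.0):
    """Fit phi(t) = f(x + t d) with a quadratic through phi(0), phi'(0)
    and phi(alpha), then rescale alpha by eta = alpha*/alpha toward the
    model minimizer; eta is clipped to [eta_min, eta_max] as a safeguard."""
    phi0, phia = f(x), f(x + alpha * d)
    denom = 2.0 * (phia - phi0 - alpha * g_dot_d)
    if denom <= 0.0:                     # model not convex: keep alpha
        return alpha
    eta = -alpha * g_dot_d / denom       # alpha* = eta * alpha
    return alpha * min(max(eta, eta_min), eta_max)

f = lambda z: float(z @ z)               # phi is exactly quadratic here
x, d = np.array([2.0]), np.array([-1.0])
alpha_new = accelerate_step(f, x, d, alpha=0.5, g_dot_d=-4.0)
```

On this exactly quadratic φ the rescaled step lands on the one-dimensional minimizer; on general functions it only improves the model-predicted decrease.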

16.
For unconstrained optimization problems, this paper presents two new classes of descent algorithms with variable parameters. Their global convergence is proved under the Wolfe line search without assuming a sufficient descent condition. Extensive numerical experiments show that they are very effective and stable and can be widely used in scientific computing.

17.
Optimization, 2012, 61(9): 1387–1400
Although the Hestenes–Stiefel (HS) method is well known, research on its convergence rate under inexact line searches is very scarce. Recently, Zhang, Zhou and Li [Some descent three-term conjugate gradient methods and their global convergence, Optim. Methods Softw. 22 (2007), pp. 697–711] proposed a three-term Hestenes–Stiefel method for unconstrained optimization problems. In this article, we investigate the convergence rate of this method. We show that the three-term HS method with the Wolfe line search is n-step superlinearly, and even quadratically, convergent if a restart technique is used under reasonable conditions. Some numerical results are reported to verify the theoretical results; moreover, the method is more efficient than previous ones.

18.
In most applications, image denoising is fundamental to subsequent image processing operations. This paper proposes a spectral conjugate gradient (CG) method for impulse noise removal, based on a two-phase scheme. The noise candidates are first identified by the adaptive (center-weighted) median filter; these candidates are then restored by minimizing an edge-preserving regularization functional, which is accomplished by the proposed spectral CG method. A favorable property of the proposed method is that the search direction generated at each iteration is a descent direction. Under the strong Wolfe line search conditions, its global convergence is established. Numerical experiments illustrate the efficiency of the spectral conjugate gradient method for impulse noise removal.
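The first (detection) phase of such a two-phase scheme can be sketched with a plain median-based detector; the adaptive center-weighted median filter of the paper, and the CG-based restoration phase, are omitted. The window size and the extreme-value test are simplifying assumptions for salt-and-pepper noise.

```python
import numpy as np

def detect_impulses(img, half=1):
    """Phase 1 of a two-phase scheme: flag salt-and-pepper candidates,
    i.e. pixels that take an extreme gray value AND differ from the
    median of their (2*half+1)^2 neighborhood."""
    H, W = img.shape
    mask = np.zeros((H, W), dtype=bool)
    lo, hi = img.min(), img.max()
    for i in range(H):
        for j in range(W):
            if img[i, j] not in (lo, hi):        # only extreme values qualify
                continue
            win = img[max(0, i - half):i + half + 1,
                      max(0, j - half):j + half + 1]
            if img[i, j] != np.median(win):      # outlier w.r.t. neighborhood
                mask[i, j] = True
    return mask

clean = np.full((8, 8), 128.0)
noisy = clean.copy()
noisy[2, 3], noisy[5, 5] = 255.0, 0.0            # inject two impulses
mask = detect_impulses(noisy)                    # flags exactly those pixels
```

Only the flagged pixels would then enter the edge-preserving restoration functional, which keeps the minimization problem small.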

19.
This article proposes a new conjugate gradient method for unconstrained optimization by applying the Powell symmetrization technique in a defined sense. Using the Wolfe line search conditions, the global convergence of the method is obtained from the spectral analysis of the conjugate gradient iteration matrix and the Zoutendijk condition for steepest descent methods. Preliminary numerical results on a set of 86 unconstrained optimization test problems verify the performance of the algorithm and show that the Generalized Descent Symmetrical Hestenes–Stiefel algorithm is competitive with the Fletcher–Reeves (FR) and Polak–Ribière–Polyak (PRP+) algorithms.

20.
A globally convergent hybrid conjugate gradient method under the Wolfe line search
Jiang Xianzhen, Han Lin, Jian Jinbao, Mathematica Numerica Sinica, 2012, 34(1): 103–112
For unconstrained optimization problems, this paper presents a new hybrid conjugate gradient formula. Under the standard Wolfe inexact line search, the algorithm generated by the new formula is shown to have the descent property and to be globally convergent. Numerical experiments indicate that the algorithm is effective.

