19 similar references found.
1.
A hybrid conjugate gradient algorithm for unconstrained optimization problems is proposed. The new algorithm combines the strengths of the DY and HS methods, and its global convergence is proved under fairly weak conditions using a nonmonotone line search technique. Numerical experiments show that the new algorithm is computationally effective.
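For orientation, a minimal sketch of the two update parameters being hybridized; the abstract does not give the paper's specific hybridization rule, so the combination shown in the comment is only one common choice.

```latex
% Conjugate gradient iteration: d_{k+1} = -g_{k+1} + \beta_k d_k, with y_k = g_{k+1} - g_k.
\beta_k^{DY} = \frac{\|g_{k+1}\|^{2}}{d_k^{\top} y_k},
\qquad
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}
% A typical hybrid takes, e.g., \beta_k = \max\{0, \min\{\beta_k^{HS}, \beta_k^{DY}\}\}.
```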
2.
Designing conjugate gradient methods by solving an optimization problem that involves a penalty parameter is a new line of thought. Building on Fatemi's optimization problem, a spectral three-term conjugate gradient method is constructed by estimating the step size and choosing a suitable penalty parameter, and the spectral parameter is modified in order to establish global convergence. The sufficient descent property and global convergence of the spectral three-term conjugate gradient algorithm are proved under the standard Wolfe line search. Finally, numerical tests against several algorithms on the same set of problems show that the new method performs well.
3.
A new class of nonmonotone memory gradient methods and their global convergence
Building on the nonmonotone Armijo line search, a new nonmonotone line search is proposed and a class of memory gradient methods based on it is studied; global convergence is proved under fairly weak conditions. Compared with the nonmonotone Armijo line search, the new line search can produce larger step sizes at each iteration, which yields a sufficient decrease in the objective function value and reduces the computational cost of the algorithm.
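As background, a minimal sketch of the Grippo-Lampariello-Lucidi style nonmonotone Armijo line search that this entry modifies; the function and parameter names are illustrative, and the paper's new line search itself is not reproduced here.

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, f_hist, delta=1e-4, rho=0.5, alpha0=1.0):
    """GLL-style nonmonotone Armijo backtracking (illustrative sketch).

    Accept alpha once f(x + alpha*d) <= max(recent f values) + delta*alpha*g'd,
    where d is a descent direction (g'd < 0) and f_hist holds the last few
    objective values.
    """
    f_ref = max(f_hist)                # nonmonotone reference value
    g_dot_d = float(np.dot(g, d))      # directional derivative, assumed negative
    alpha = alpha0
    while f(x + alpha * d) > f_ref + delta * alpha * g_dot_d:
        alpha *= rho                   # backtrack
    return alpha
```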
4.
To solve large-scale unconstrained optimization problems, this paper combines the WYL conjugate gradient method with the spectral conjugate gradient method and presents a WYL-type spectral conjugate gradient method. The search directions generated by the method satisfy the sufficient descent property without relying on any line search, and global convergence is proved under the strong Wolfe line search. Compared with the convergence result for the WYL conjugate gradient method, the WYL-type spectral conjugate gradient method enlarges the admissible range of the line search parameter σ. Finally, the corresponding numerical results show that the method is effective.
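For reference, the Wei-Yao-Liu (WYL) parameter that the method builds on is usually written as follows; the spectral scaling added in this entry is not reproduced here.

```latex
\beta_k^{WYL} = \frac{g_k^{\top}\bigl(g_k - \frac{\|g_k\|}{\|g_{k-1}\|}\, g_{k-1}\bigr)}{\|g_{k-1}\|^{2}}
```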
5.
《数学的实践与认识》2015,(18)
The spectral conjugate gradient algorithm is one of the effective algorithms for solving large-scale unconstrained optimization problems. Based on the Hestenes-Stiefel method and the spectral conjugate gradient method, a spectral Hestenes-Stiefel conjugate gradient algorithm is proposed. Under the Wolfe line search, the search directions generated by the algorithm have the descent property, and global convergence can also be proved. Experiments on several well-known problems from the CUTEr library, evaluated with the widely used Dolan-Moré performance profiles, demonstrate the effectiveness of the new algorithm.
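A spectral conjugate gradient direction generally takes the form below; the paper's particular choice of the spectral parameter θ_k is not given in the abstract.

```latex
d_{k+1} = -\theta_k\, g_{k+1} + \beta_k^{HS} d_k,
\qquad
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k},
\quad y_k = g_{k+1} - g_k
```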
6.
A new class of conjugate gradient algorithms with inexact line search
By an appropriate choice of the iteration parameter, this paper presents a new class of conjugate gradient algorithms. The search directions remain descent directions throughout the iterations, and global convergence of the algorithm is proved under general inexact line search conditions.
7.
8.
9.
This paper presents a new nonmonotone spectral projected gradient algorithm for bound-constrained optimization problems. The algorithm combines the spectral projected gradient method with the nonmonotone line search proposed by Zhang and Hager [SIAM Journal on Optimization, 2004, 14(4): 1043-1056]. Under reasonable assumptions, global convergence of the algorithm is proved. Numerical results show that, compared with existing spectral projected gradient methods for bound-constrained problems, the proposed algorithm is competitive.
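A minimal sketch of the two ingredients being combined, assuming box constraints l <= x <= u; the Zhang-Hager running average C_k replaces the usual maximum of recent function values, and all names and parameter values below are illustrative rather than the paper's exact algorithm.

```python
import numpy as np

def project_box(x, lower, upper):
    """Projection onto the box {x : lower <= x <= upper}."""
    return np.minimum(np.maximum(x, lower), upper)

def spg_step(f, grad, x, lower, upper, lam, C, Q, eta=0.85, delta=1e-4, rho=0.5):
    """One projected spectral gradient step with a Zhang-Hager nonmonotone test.

    lam  : current spectral (Barzilai-Borwein) steplength
    C, Q : Zhang-Hager running average of objective values and its weight
    """
    g = grad(x)
    d = project_box(x - lam * g, lower, upper) - x      # projected spectral direction
    g_dot_d = float(np.dot(g, d))
    alpha = 1.0
    while f(x + alpha * d) > C + delta * alpha * g_dot_d:  # nonmonotone Armijo test on C
        alpha *= rho
    x_new = x + alpha * d
    # Zhang-Hager update: Q_{k+1} = eta*Q_k + 1, C_{k+1} = (eta*Q_k*C_k + f(x_{k+1})) / Q_{k+1}
    Q_new = eta * Q + 1.0
    C_new = (eta * Q * C + f(x_new)) / Q_new
    # New Barzilai-Borwein steplength from s = x_new - x, y = grad(x_new) - g
    s, y = x_new - x, grad(x_new) - g
    sy = float(np.dot(s, y))
    lam_new = float(np.dot(s, s)) / sy if sy > 0 else 1.0
    return x_new, lam_new, C_new, Q_new
```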
10.
11.
Global convergence of the Hager-Zhang conjugate gradient method under the Armijo line search
Hager and Zhang [4] proposed a new nonlinear conjugate gradient method (the HZ method for short) and proved its global convergence for strongly convex problems under the Wolfe and Goldstein line searches. However, whether the HZ method converges globally for nonconvex problems under the standard Armijo line search remains unclear. This paper proposes a conservative HZ conjugate gradient method and proves its global convergence for nonconvex optimization problems under the Armijo line search. In addition, some numerical results are reported to verify the effectiveness of the method.
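For reference, the Hager-Zhang update parameter in question is shown below; the conservative modification analyzed in the paper is not reproduced.

```latex
\beta_k^{HZ} = \frac{1}{d_k^{\top} y_k}
\Bigl(y_k - 2\, d_k \,\frac{\|y_k\|^{2}}{d_k^{\top} y_k}\Bigr)^{\!\top} g_{k+1},
\qquad y_k = g_{k+1} - g_k
```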
12.
13.
14.
In this paper, by means of an active set strategy, we present a projected spectral gradient algorithm for solving large-scale bound constrained optimization problems. A nice property of the active set estimation technique is that it can identify the active set at the optimal point without requiring the strict complementarity condition, which makes it potentially useful for degenerate optimization problems. Under appropriate conditions, we show that the proposed method is globally convergent. We also report numerical experiments on bound constrained problems from the CUTEr library. Comparisons with SPG, TRON, and L-BFGS-B show that the proposed method is effective and promising.
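One common form of such an active-set estimate for bound constraints is sketched below; it only illustrates the general idea (components near a bound whose gradient pushes further into that bound are flagged active) and is not necessarily the estimation rule used in this paper.

```python
import numpy as np

def estimate_active_set(x, g, lower, upper, eps):
    """Illustrative active-set estimate for bounds lower <= x <= upper.

    Flags components that sit within eps of a bound while the gradient pushes
    them toward that bound; not necessarily the rule used in the paper above.
    """
    at_lower = (x - lower <= eps) & (g > 0)   # near lower bound, gradient points into it
    at_upper = (upper - x <= eps) & (g < 0)   # near upper bound, gradient points into it
    return np.where(at_lower | at_upper)[0]
```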
15.
Lenys Bello, Marcos Raydan 《计算数学(英文版)》2005,23(3):225-232
The spectral gradient method has proved to be effective for solving large-scale unconstrained optimization problems. It has been recently extended and combined with the projected gradient method for solving optimization problems on convex sets. This combination includes the use of nonmonotone line search techniques to preserve the fast local convergence. In this work we further extend the spectral choice of steplength to accept preconditioned directions when a good preconditioner is available. We present an algorithm that combines the spectral projected gradient method with preconditioning strategies to increase the local speed of convergence while keeping the global properties. We discuss implementation details for solving large-scale problems.
16.
A new super-memory gradient algorithm for unconstrained optimization is studied. At each iteration the algorithm makes full use of information from previous iterates to generate a descent direction, and the step size is obtained by the Wolfe line search; global convergence is proved under fairly weak conditions. The new algorithm requires neither matrix computation nor matrix storage at any iteration and is therefore suitable for solving large-scale optimization problems.
17.
Conjugate Gradient Methods with Armijo-type Line Searches
Yu-Hong Dai, State Key Laboratory of Scientific and Engineering Computing, Institute of Computational Mathematics, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, China 《应用数学学报(英文版)》2002,18(1):123-130
Two Armijo-type line searches are proposed in this paper for nonlinear conjugate gradient methods. Under these line searches, global convergence results are established for several famous conjugate gradient methods, including the Fletcher-Reeves method, the Polak-Ribiere-Polyak method, and the conjugate descent method.
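For reference, the update parameters of the three methods named in this abstract are (with y_k = g_{k+1} - g_k):

```latex
\beta_k^{FR} = \frac{\|g_{k+1}\|^{2}}{\|g_k\|^{2}},
\qquad
\beta_k^{PRP} = \frac{g_{k+1}^{\top} y_k}{\|g_k\|^{2}},
\qquad
\beta_k^{CD} = -\frac{\|g_{k+1}\|^{2}}{d_k^{\top} g_k}
```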
18.
The spectral gradient method has proved to be effective for solving large-scale unconstrained optimization problems. It has been recently extended and combined with the projected gradient method for solving optimization problems on convex sets. This combination includes the use of nonmonotone line search techniques to preserve the fast local convergence. In this work we further extend the spectral choice of steplength to accept preconditioned directions when a good preconditioner is available. We present an algorithm that combines the spectral projected gradient method with preconditioning strategies to increase the local speed of convergence while keeping the global properties. We discuss implementation details for solving large-scale problems.
19.
The conjugate gradient method is a useful and powerful approach for solving large-scale minimization problems. Liu and Storey developed a conjugate gradient method that has good numerical performance but no global convergence result under traditional line searches such as the Armijo, Wolfe, and Goldstein line searches. In this paper a convergent version of the Liu-Storey conjugate gradient method (LS for short) is proposed for minimizing functions that have Lipschitz continuous partial derivatives. By estimating the Lipschitz constant of the derivative of the objective function, we can find an adequate step size at each iteration so as to guarantee global convergence and improve the efficiency of the LS method in practical computation.
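For context, the Liu-Storey parameter is given below together with one illustrative way a Lipschitz estimate L_k can determine the step size; the paper's own estimation rule is not reproduced here.

```latex
\beta_k^{LS} = \frac{g_{k+1}^{\top}(g_{k+1}-g_k)}{-\,d_k^{\top} g_k},
\qquad
\alpha_k \approx -\,\frac{g_k^{\top} d_k}{L_k\,\|d_k\|^{2}}
```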