18 similar documents were retrieved.
1.
A New Class of Conjugate Gradient Algorithms under Inexact Line Search
By an appropriate choice of the iteration parameter, this paper presents a new class of conjugate gradient algorithms. The search direction remains a descent direction throughout the iterations, and global convergence of the algorithm is proved under general inexact line search conditions.
2.
3.
4.
This paper discusses a family of conjugate gradient methods that can be viewed as convex combinations of the FR method and the DY method. Two Armijo-type line searches are proposed, and global convergence of the family is analyzed under both line searches.
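For illustration only, the sketch below computes the standard FR and DY coefficients and blends them by a convex combination; the blending weight lam and all names are my own and need not match the paper's exact family.

```python
import numpy as np

def beta_fr_dy_convex(g_new, g_old, d_old, lam=0.5):
    """Convex combination of the FR and DY conjugate gradient coefficients.

    lam is a hypothetical blending weight in [0, 1]; the cited paper's exact
    parameterization is not reproduced here.
    """
    y = g_new - g_old                                 # gradient difference
    beta_fr = g_new.dot(g_new) / g_old.dot(g_old)     # Fletcher-Reeves
    beta_dy = g_new.dot(g_new) / d_old.dot(y)         # Dai-Yuan
    return (1.0 - lam) * beta_fr + lam * beta_dy

# Tiny illustrative usage with made-up vectors
g_old = np.array([1.0, -2.0])
g_new = np.array([0.5, -1.0])
d_old = -g_old
print(beta_fr_dy_convex(g_new, g_old, d_old, lam=0.3))
```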
5.
6.
7.
8.
The conjugate gradient method is an effective method for solving unconstrained optimization problems. Building on β_k^DY, this paper introduces a parameter into β_k and proposes a new class of conjugate gradient methods, which are shown to possess the sufficient descent property and global convergence under the strong Wolfe line search.
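For reference, a minimal check of the textbook strong Wolfe conditions mentioned in the abstract is sketched below; this is the standard formulation, not the paper's new method, and the constants c1 and c2 are illustrative defaults.

```python
def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the standard strong Wolfe conditions for a step length alpha.

    Sufficient decrease: f(x + a d) <= f(x) + c1 * a * g(x)^T d
    Strong curvature:    |g(x + a d)^T d| <= c2 * |g(x)^T d|
    """
    g0_d = grad(x).dot(d)
    x_new = x + alpha * d
    sufficient_decrease = f(x_new) <= f(x) + c1 * alpha * g0_d
    strong_curvature = abs(grad(x_new).dot(d)) <= c2 * abs(g0_d)
    return sufficient_decrease and strong_curvature
```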
9.
10.
11.
In this paper, a new steplength formula is proposed for unconstrained optimization, which determines the step size in a single step and avoids the line search. Global convergence of five well-known conjugate gradient methods with this formula is analyzed, with the following results: (1) the DY method converges globally for a strongly convex LC^1 objective function; (2) the CD, FR, PRP, and LS methods converge globally for a general, not necessarily convex, LC^1 objective function.
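For orientation, the textbook coefficient formulas of the five methods named above (FR, PRP, DY, CD, LS) are sketched below; the paper's new steplength formula itself is not reproduced here, and the function name is mine.

```python
def cg_betas(g_new, g_old, d_old):
    """Standard beta formulas for the five conjugate gradient methods cited.

    g_new, g_old: current and previous gradients; d_old: previous direction.
    """
    y = g_new - g_old
    return {
        "FR":  g_new.dot(g_new) / g_old.dot(g_old),     # Fletcher-Reeves
        "PRP": g_new.dot(y) / g_old.dot(g_old),         # Polak-Ribiere-Polyak
        "DY":  g_new.dot(g_new) / d_old.dot(y),         # Dai-Yuan
        "CD":  g_new.dot(g_new) / (-d_old.dot(g_old)),  # conjugate descent
        "LS":  g_new.dot(y) / (-d_old.dot(g_old)),      # Liu-Storey
    }
```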
12.
A new conjugate gradient method is proposed in this paper. For any (inexact) line search, the scheme satisfies the sufficient descent property. The method is proved to be globally convergent when the restricted Wolfe-Powell line search is used. Preliminary numerical results show that it is efficient.
13.
Conjugate Gradient Methods with Armijo-type Line Searches
Yu-Hong Dai, State Key Laboratory of Scientific and Engineering Computing, Institute of Computational Mathematics, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, China. Acta Mathematicae Applicatae Sinica (English Series), 2002, 18(1): 123-130.
Abstract: Two Armijo-type line searches are proposed in this paper for nonlinear conjugate gradient methods. Under these line searches, global convergence results are established for several famous conjugate gradient methods, including the Fletcher-Reeves method, the Polak-Ribiere-Polyak method, and the conjugate descent method.
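As background, a minimal backtracking implementation of the classical Armijo condition is sketched below; the two specific Armijo-type rules proposed in the paper are not reproduced, and the constants rho and c1 are illustrative defaults.

```python
def armijo_backtracking(f, x, d, f_x, g_dot_d, alpha0=1.0, rho=0.5, c1=1e-4, max_iter=50):
    """Classical Armijo backtracking line search.

    Shrinks alpha until f(x + alpha d) <= f(x) + c1 * alpha * g^T d,
    where g_dot_d = g(x)^T d must be negative (descent direction).
    """
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) <= f_x + c1 * alpha * g_dot_d:
            return alpha
        alpha *= rho                      # backtrack
    return alpha                          # fall back to the last trial step
```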
14.
The conjugate gradient method is an important method for solving unconstrained optimization problems. This paper proposes a new family of conjugate gradient methods and proves their global convergence under the generalized Wolfe inexact line search. Finally, numerical experiments are carried out, and the results confirm the effectiveness of the algorithm.
15.
Global Convergence of a Modified Spectral CD Conjugate Gradient Algorithm
In this paper, we present a new nonlinear modified spectral CD conjugate gradient method for solving large-scale unconstrained optimization problems. The direction generated by the method is a descent direction for the objective function, and this property depends neither on the line search rule nor on the convexity of the objective function. Moreover, the modified method reduces to the standard CD method if the line search is exact. Under some mild conditions, we prove that the modified method with line search is globally convergent even if the objective function is nonconvex. Preliminary numerical results show that the proposed method is very promising.
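To make the "spectral CD" structure concrete, a generic spectral conjugate-descent direction update is sketched below: a scaled steepest-descent term plus the standard CD coefficient times the previous direction. The spectral scaling theta is only a placeholder; the paper's specific modified formula is not given in this abstract.

```python
def spectral_cd_direction(g_new, g_old, d_old, theta=1.0):
    """Generic spectral CD-type update: d_new = -theta * g_new + beta_cd * d_old.

    beta_cd is the standard conjugate-descent coefficient; theta is a
    placeholder spectral parameter (the cited paper's choice may differ).
    """
    beta_cd = g_new.dot(g_new) / (-d_old.dot(g_old))
    return -theta * g_new + beta_cd * d_old
```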
16.
Li Can, Mathematics in Practice and Theory, 2016, (15): 245-250.
For conjugate gradient methods applied to unconstrained optimization, the descent property of the search direction often depends on the line search employed. A modified CD algorithm is proposed whose search direction d_k always satisfies 1 - 1/u ≤ (-g_k^T d_k)/‖g_k‖^2 ≤ 1 + 1/u (u > 1); that is, the algorithm produces a sufficient descent direction without relying on any line search. Moreover, under an exact line search the modified CD algorithm reduces to the standard CD conjugate gradient method. Under suitable conditions, the modified CD algorithm is also proved to be globally convergent under the strong Wolfe line search. Finally, numerical results are reported that show the algorithm is effective.
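The two-sided sufficient-descent bound quoted above translates directly into a small check; the sketch below is written from that inequality alone, and the function and variable names are mine.

```python
def within_descent_bounds(g, d, u):
    """Check 1 - 1/u <= (-g^T d) / ||g||^2 <= 1 + 1/u for some u > 1."""
    ratio = -g.dot(d) / g.dot(g)
    return (1.0 - 1.0 / u) <= ratio <= (1.0 + 1.0 / u)
```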
17.
A class of hybrid conjugate gradient algorithms for unconstrained optimization is proposed. The new algorithm combines the strengths of the DY and HS algorithms, and its global convergence is proved under fairly weak conditions using a nonmonotone line search technique. Numerical experiments show that the new algorithm performs well computationally.
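One widely used way to combine the DY and HS coefficients is the classical hybrid β_k = max(0, min(β_k^HS, β_k^DY)); the sketch below shows that rule for illustration, which need not coincide with the paper's exact combination.

```python
def beta_hybrid_dy_hs(g_new, g_old, d_old):
    """Classical DY-HS hybrid coefficient: max(0, min(beta_HS, beta_DY))."""
    y = g_new - g_old
    dty = d_old.dot(y)
    beta_hs = g_new.dot(y) / dty          # Hestenes-Stiefel
    beta_dy = g_new.dot(g_new) / dty      # Dai-Yuan
    return max(0.0, min(beta_hs, beta_dy))
```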
18.
In this paper, a modified formula for β_k^PRP is proposed for the conjugate gradient method for solving unconstrained optimization problems. The value of β_k^PRP remains nonnegative independently of the line search. Under mild conditions, the global convergence of the modified PRP method with the strong Wolfe-Powell line search is established. Preliminary numerical results show that the modified method is efficient.
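The best-known way to keep the PRP coefficient nonnegative is the PRP+ truncation β_k = max(0, β_k^PRP); the sketch below shows that standard truncation for illustration, which need not be the paper's modified formula.

```python
def beta_prp_plus(g_new, g_old):
    """PRP coefficient truncated at zero (the classical PRP+ rule)."""
    y = g_new - g_old
    beta_prp = g_new.dot(y) / g_old.dot(g_old)
    return max(0.0, beta_prp)
```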