18 related references found
1.
In this paper, a modified formula for β_k^{PRP} is proposed for the conjugate gradient method for solving unconstrained optimization problems. The value of β_k^{PRP} remains nonnegative independently of the line search. Under mild conditions, global convergence of the modified PRP method with the strong Wolfe-Powell line search is established. Preliminary numerical results show that the modified method is efficient.
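For background only (the paper's own modified formula is not reproduced in the abstract), the standard PRP parameter and its widely used nonnegative restriction, the PRP+ rule of Gilbert and Nocedal, are:

```latex
\beta_k^{PRP} = \frac{g_k^{\top}(g_k - g_{k-1})}{\|g_{k-1}\|^{2}},
\qquad
\beta_k^{PRP+} = \max\bigl\{\beta_k^{PRP},\, 0\bigr\}.
```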
2.
Global Convergence of the Conjugate Descent Method
The conjugate descent (CD) method was first proposed by Fletcher. This paper proves that a class of inexact line search conditions guarantees the convergence of the CD method, and constructs a counterexample showing that the CD method may fail to converge if the line search conditions are relaxed. In addition, conclusions are obtained for a class of methods related to the Fletcher-Reeves method.
3.
A Modified HS Conjugate Gradient Method and Its Global Convergence
1. Introduction. Consider the unconstrained minimization problem
min f(x), x ∈ R^n, (1)
where f(x) is continuously differentiable and its gradient is denoted by g(x). The common iterative scheme of conjugate gradient methods for solving (1) is
x_{k+1} = x_k + α_k d_k, (2)
d_k = -g_k + β_k d_{k-1}, with d_1 = -g_1, (3)
where g_k = ∇f(x_k), α_k ≥ 0 is the step length obtained by some line search, d_k is the search direction, and β_k is a scalar; different choices of β_k yield different conjugate gradient methods. Well-known β_k formulas include the following.
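In standard notation (writing y_{k-1} = g_k - g_{k-1}), the classical choices referred to across these abstracts are:

```latex
\beta_k^{FR} = \frac{\|g_k\|^{2}}{\|g_{k-1}\|^{2}}, \quad
\beta_k^{PRP} = \frac{g_k^{\top} y_{k-1}}{\|g_{k-1}\|^{2}}, \quad
\beta_k^{HS} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}, \quad
\beta_k^{CD} = \frac{\|g_k\|^{2}}{-d_{k-1}^{\top} g_{k-1}}, \quad
\beta_k^{DY} = \frac{\|g_k\|^{2}}{d_{k-1}^{\top} y_{k-1}}.
```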
4.
5.
Global Convergence of the Polak-Ribiere and Hestenes-Stiefel Conjugate Gradient Methods for Unconstrained Optimization
Under very weak conditions, this paper obtains new global convergence results for the Polak-Ribiere (PR) and Hestenes-Stiefel (HS) conjugate gradient methods for unconstrained optimization, in which the parameters β_k^{PR} and β_k^{HS} may take values in a certain negative region that depends on k. These new convergence results improve existing results in the literature. Numerical tests show that the new PR and HS methods given here are quite effective.
6.
An Improved HS Conjugate Gradient Algorithm and Its Global Convergence
1. Introduction. In 1952, M. Hestenes and E. Stiefel proposed the conjugate gradient method for solving positive definite linear systems [1]. In 1964, R. Fletcher and C. Reeves extended the method to the following unconstrained optimization problem:
min f(x), x ∈ R^n, (1)
where f: R^n → R^1 is continuously differentiable; write g_k = ∇f(x_k), x_k ∈ R^n. If the sequence {x_k} is generated by the scheme x_{k+1} = x_k + α_k d_k, d_k = -g_k + β_k d_{k-1}, where
β_k = [g_k^T (g_k - g_{k-1})] / [d_{k-1}^T (g_k - g_{k-1})] (Hestenes-Stiefel), (4)
then the algorithm is called the Hestenes-Stiefel conjugate gradient algorithm…
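As an illustration of the iteration described above, the following is a minimal sketch (not taken from the paper) of a nonlinear conjugate gradient loop using the Hestenes-Stiefel parameter; the backtracking Armijo step-length rule and the restart safeguard are simplifying assumptions standing in for the stronger line searches analyzed in these papers:

```python
import numpy as np

def hs_conjugate_gradient(f, grad, x0, max_iter=200, tol=1e-6):
    """Minimal nonlinear CG sketch with the Hestenes-Stiefel beta.

    f and grad are the objective and its gradient; x0 is the starting point.
    The backtracking (Armijo) line search below is a simplification of the
    Wolfe-type searches discussed in these abstracts.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) search for the step length alpha_k
        alpha, c, rho = 1.0, 1e-4, 0.5
        for _ in range(60):  # cap the number of backtracking steps
            if f(x + alpha * d) <= f(x) + c * alpha * g.dot(d):
                break
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d.dot(y)
        # Hestenes-Stiefel parameter; restart with steepest descent if the
        # denominator d_{k-1}^T y_{k-1} is too close to zero
        beta = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example usage: minimize a convex quadratic 0.5 * x^T A x
if __name__ == "__main__":
    A = np.array([[3.0, 0.5], [0.5, 2.0]])
    print(hs_conjugate_gradient(lambda x: 0.5 * x.dot(A).dot(x),
                                lambda x: A.dot(x),
                                np.array([1.0, 1.0])))
```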
7.
8.
An Improved Conjugate Gradient Method and Its Convergence
This paper proposes a class of improved conjugate gradient methods for unconstrained optimization problems. The algorithm adopts a class of inexact line searches that enlarges the admissible range of the iteration parameter, and global convergence of the algorithm is proved under the condition that the objective function is continuously differentiable.
9.
A Modified Hestenes-Stiefel Conjugate Gradient Algorithm
This paper investigates convergence conditions for the Hestenes-Stiefel (HS) conjugate gradient algorithm. Without a sufficient descent condition, global convergence of a modified HS conjugate gradient algorithm is proved.
10.
11.
A new conjugate gradient method is proposed in this paper. For any (inexact) line search, our scheme satisfies the sufficient descent property. The method is proved to be globally convergent if the restricted Wolfe-Powell line search is used. Preliminary numerical results show that it is efficient.
12.
Li Can, 《数学的实践与认识》 (Mathematics in Practice and Theory), 2016, (15): 245-250
For conjugate gradient methods for unconstrained optimization problems, the descent property of the search direction often depends on the line search employed. This paper proposes a modified CD algorithm whose search direction d_k always satisfies 1 - 1/u ≤ (-g_k^T d_k)/‖g_k‖^2 ≤ 1 + 1/u (u > 1); that is, the algorithm always produces a sufficient descent direction without relying on any line search. Moreover, under exact line search the modified CD algorithm reduces to the standard CD conjugate gradient method. Under suitable conditions, global convergence of the modified CD algorithm under the strong Wolfe line search is also proved. Finally, numerical results are given which show that the algorithm is effective.
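For context (background, not the paper's new formula), the standard CD parameter that this modification builds on, together with the sufficient descent bounds quoted above, can be written as:

```latex
\beta_k^{CD} = \frac{\|g_k\|^{2}}{-d_{k-1}^{\top} g_{k-1}},
\qquad
1 - \frac{1}{u} \;\le\; \frac{-\,g_k^{\top} d_k}{\|g_k\|^{2}} \;\le\; 1 + \frac{1}{u},
\quad u > 1.
```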
13.
Global Convergence of a Modified Spectral CD Conjugate Gradient Algorithm
In this paper, we present a new nonlinear modified spectral CD conjugate gradient method for solving large-scale unconstrained optimization problems. The direction generated by the method is a descent direction for the objective function, and this property depends neither on the line search rule nor on the convexity of the objective function. Moreover, the modified method reduces to the standard CD method if the line search is exact. Under some mild conditions, we prove that the modified method with the line search is globally convergent even if the objective function is nonconvex. Preliminary numerical results show that the proposed method is very promising.
14.
The strong Wolfe conditions cannot guarantee global convergence of the standard CD conjugate gradient method. By constructing a new conjugate parameter, this paper proposes a new spectral conjugate gradient method for unconstrained optimization problems. The method is equivalent to the standard CD conjugate gradient method under exact line search, and possesses the descent property and global convergence under the standard Wolfe line search. Preliminary numerical results show that the new method is effective and suitable for solving nonlinear unconstrained optimization problems.
15.
This paper proposes a class of new conjugate gradient methods related to the HS method. Under the strong Wolfe line search, the methods guarantee sufficient descent of the search direction, and global convergence is proved without assuming that the objective function is convex. A special instance of this class of new conjugate gradient methods is also given, and by adjusting the parameter ρ its effectiveness on the given test functions is verified.
16.
Global Convergence Properties of Nonlinear Conjugate Gradient Methods with Modified Secant Condition
Conjugate gradient methods are appealing for large-scale nonlinear optimization problems. Recently, aiming at fast convergence, Dai and Liao (2001) made use of the secant condition of quasi-Newton methods. In this paper, we use the modified secant condition given by Zhang et al. (1999) and Zhang and Xu (2001) and propose a new conjugate gradient method following Dai and Liao (2001). A new feature of this method is that it uses both the available gradient and function value information and achieves high-order accuracy in approximating the second-order curvature of the objective function. The method is shown to be globally convergent under some assumptions. Numerical results are reported.
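As background for the terms used above (an editor-supplied summary in its commonly cited form, not a restatement of this paper's exact formulas): with s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}, the Dai-Liao parameter and the ordinary versus modified secant conditions are usually written as

```latex
\beta_k^{DL} = \frac{g_k^{\top}\bigl(y_{k-1} - t\, s_{k-1}\bigr)}{d_{k-1}^{\top} y_{k-1}},
\quad t \ge 0,
\qquad
B_k s_{k-1} = y_{k-1}
\;\;\longrightarrow\;\;
B_k s_{k-1} = y_{k-1} + \frac{\theta_{k-1}}{s_{k-1}^{\top} u}\, u,
\quad
\theta_{k-1} = 6\bigl(f_{k-1} - f_k\bigr) + 3\bigl(g_{k-1} + g_k\bigr)^{\top} s_{k-1},
```

where u is any vector with s_{k-1}^T u ≠ 0; the right-hand condition is the modified secant condition, which injects function-value information into the curvature approximation.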
17.
Sindhu Narayanan & P. Kaelo 《高等学校计算数学学报(英文版)》2021,14(2):527-539
Conjugate gradient methods are appealing iterative methods for solving large-scale unconstrained optimization problems. Much recent research has therefore focused on developing conjugate gradient methods that are more effective. In this paper, we propose another hybrid conjugate gradient method, formed as a linear combination of the Dai-Yuan (DY) method and the Hestenes-Stiefel (HS) method. The sufficient descent condition and the global convergence of this method are established using generalized Wolfe line search conditions. Compared with other conjugate gradient methods, the proposed method gives good numerical results and is effective.
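For reference, the two parameters being combined are the DY and HS formulas below; a hybrid of this kind is typically formed as a convex combination with some coefficient θ_k ∈ [0, 1] (the paper's specific choice of coefficient is not reproduced in the abstract):

```latex
\beta_k^{DY} = \frac{\|g_k\|^{2}}{d_{k-1}^{\top} y_{k-1}},
\qquad
\beta_k^{HS} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}},
\qquad
\beta_k^{\mathrm{hyb}} = (1-\theta_k)\,\beta_k^{HS} + \theta_k\,\beta_k^{DY}.
```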