Similar Documents
1.
A Modified HS Conjugate Gradient Method and Its Global Convergence
1. Introduction. Consider the unconstrained minimization problem

min f(x), x ∈ R^n, (1)

where f(x) is continuously differentiable and its gradient is denoted g(x). The common conjugate gradient iteration for solving (1) is

x_{k+1} = x_k + α_k d_k, (2)

d_k = -g_k for k = 1; d_k = -g_k + β_k d_{k-1} for k ≥ 2, (3)

where g_k = ∇f(x_k), α_k ≥ 0 is a step size obtained by some line search, d_k is the search direction, and β_k is a scalar. Different choices of β_k yield different conjugate gradient methods; well-known formulas for β_k include:
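The iteration (2)-(3) can be sketched in a few lines of code. The following minimal Python sketch is an illustration of the generic scheme, not any particular paper's algorithm: the function name, the backtracking Armijo line search, and the steepest-descent restart safeguard are my own assumptions. Three classical choices of β_k (FR, PRP, HS) are shown.

```python
import numpy as np

def conjugate_gradient(f, grad, x0, beta_rule="HS", tol=1e-6, max_iter=1000):
    """Generic nonlinear CG sketch: x_{k+1} = x_k + alpha_k d_k,
    d_1 = -g_1, d_k = -g_k + beta_k d_{k-1} for k >= 2."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:          # safeguard: PRP/HS directions need not be descent
            d = -g
        alpha = 1.0             # simple backtracking Armijo line search
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g           # gradient difference used by PRP and HS
        if beta_rule == "FR":
            beta = (g_new @ g_new) / (g @ g)
        elif beta_rule == "PRP":
            beta = (g_new @ y) / (g @ g)
        else:                   # HS (Hestenes-Stiefel)
            beta = (g_new @ y) / (d @ y)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

With an exact line search on a strictly convex quadratic, all three β_k choices coincide with linear CG; the restart safeguard only matters for nonconvex objectives or inexact searches.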

2.
The conjugate gradient method is an important class of methods for large-scale unconstrained optimization. Since the search directions generated by conjugate gradient methods are not necessarily descent directions, this paper proposes a spectral conjugate gradient algorithm for unconstrained optimization in which every search direction is a descent direction. Under the assumptions that the objective function is uniformly convex, its gradient satisfies a Lipschitz condition, and the line search satisfies the Wolfe conditions, the global convergence of the proposed algorithm is discussed.

3.
The conjugate gradient method is one of the most effective methods for large-scale unconstrained optimization. This paper improves the parameter formula of the HS conjugate gradient method, obtains a new formula, and builds an algorithmic framework on it. Independently of any line search condition, the directions generated by the framework are shown to satisfy the sufficient descent condition, and global convergence is proved under the standard Wolfe line search. Finally, numerical tests show that the improved method is effective.

4.
For unconstrained optimization problems, two classes of modified WYL conjugate gradient methods are proposed by modifying the conjugate gradient parameter to construct new search directions. At every iteration, the search directions generated by both algorithms satisfy the sufficient descent property. Under suitable conditions, global convergence of the algorithms is proved. Numerical results show that the algorithms are feasible and effective.

5.
Starting from the formulas of the conjugate gradient method, this paper derives a similarity relation between a symmetric positive definite matrix A and a tridiagonal matrix B whose entries are determined by the iteration parameters of the conjugate gradient method. The computation of the condition number of a symmetric positive definite matrix can therefore be reduced to that of a tridiagonal matrix and carried out as a by-product of the conjugate gradient iteration. It adds only O(s) operations, where s is the number of iterations, which is negligible compared with the cost of the conjugate gradient method itself. When A is not symmetric positive definite, as long as A is nonsingular, the conjugate gradient method can be applied to A^T A to compute its extreme eigenvalues and condition number, from which the condition number of A follows. Computations on various examples show that this is a fast, effective, and simple method.
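The by-product estimate described above can be illustrated using the standard CG-Lanczos correspondence: the tridiagonal matrix built from the CG scalars has diagonal entries 1/α_k + β_{k-1}/α_{k-1} and off-diagonal entries √β_k/α_k, and its extreme eigenvalues approximate those of A. The function name and the specific stopping rule below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def cg_condition_estimate(A, b, tol=1e-10, max_iter=None):
    """Run standard linear CG on Ax = b (A SPD), record the scalars
    alpha_k, beta_k, form the Lanczos tridiagonal T, and return
    lambda_max(T) / lambda_min(T) as an estimate of cond(A)."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x
    p = r.copy()
    alphas, betas = [], []
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        alphas.append(alpha)
        betas.append(beta)
        r = r_new
        if np.linalg.norm(r) < tol:
            break
        p = r + beta * p
    # Lanczos tridiagonal recovered from the CG scalars
    m = len(alphas)
    T = np.zeros((m, m))
    for k in range(m):
        T[k, k] = 1.0 / alphas[k] + (betas[k - 1] / alphas[k - 1] if k > 0 else 0.0)
        if k < m - 1:
            T[k, k + 1] = T[k + 1, k] = np.sqrt(betas[k]) / alphas[k]
    eig = np.linalg.eigvalsh(T)
    return eig[-1] / eig[0]
```

On a matrix with s distinct eigenvalues and a right-hand side touching all eigenvectors, CG finishes in s steps and the estimate is exact in exact arithmetic.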

6.
Inexact Newton Methods for Symmetric Indefinite Problems
梁恒  白峰杉 《计算数学》2002,24(3):319-326
1. Introduction. For the numerical solution of a system of nonlinear equations F(x) = 0, the classical algorithm is the Newton iteration

x_{k+1} = x_k + s_k, k = 0, 1, 2, …, (1.1)

where s_k satisfies

F'(x_k) s_k = -F(x_k), k = 0, 1, 2, …. (1.2)

Here x_0 is the initial point, and {x_k} is called the Newton iteration sequence. When the number of variables is large, the cost of forming the Jacobian F'(x_k) and solving the linear system (1.2) at every Newton step is very high; in particular, when x_k is far from the solution x* of the system, solving the linear system (1.2) to high accuracy…
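The inexact Newton idea is to solve (1.2) only to a relative (forcing) tolerance η, i.e. accept any s_k with ||F'(x_k) s_k + F(x_k)|| ≤ η ||F(x_k)||. The following sketch illustrates that idea under my own assumptions: the inner solver is plain CG on the normal equations, an illustrative stand-in for whatever Krylov method a given paper actually uses, and the function name is hypothetical.

```python
import numpy as np

def inexact_newton(F, J, x0, eta=0.1, tol=1e-8, max_iter=50):
    """Inexact Newton sketch: at each outer step, solve J(x) s = -F(x)
    only until ||J s + F|| <= eta * ||F|| (the forcing condition)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        Jx = J(x)
        # inner solve: CG on the normal equations J^T J s = -J^T F,
        # stopped as soon as the forcing condition is satisfied
        s = np.zeros_like(x)
        r = -Jx.T @ Fx
        p = r.copy()
        while np.linalg.norm(Jx @ s + Fx) > eta * np.linalg.norm(Fx):
            Ap = Jx.T @ (Jx @ p)
            a = (r @ r) / (p @ Ap)
            s = s + a * p
            r_new = r - a * Ap
            p = r_new + ((r_new @ r_new) / (r @ r)) * p
            r = r_new
        x = x + s
    return x
```

Loose inner solves (large η) trade accuracy per step for cheaper steps; the outer iteration still converges, at a rate governed by η.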

7.
Building on the new class of conjugate gradient methods proposed in [1], this paper gives two new classes of nonlinear descent conjugate gradient methods for unconstrained optimization. Without any line search, both classes guarantee a descent direction at every iteration. For general nonconvex functions, global convergence of the two new methods is proved under the Wolfe line search conditions.

8.
Convergence Analysis of Three-Term Conjugate Gradient Methods
戴彧虹  袁亚湘 《计算数学》1999,21(3):355-362
1. Introduction. Consider line search methods for unconstrained smooth optimization,

x_{k+1} = x_k + α_k d_k,

where x_1 is given, d_k is the search direction, and α_k is the step size. In the classical conjugate gradient method, for k ≥ 2 the search direction d_k is composed of the negative gradient -g_k and the previous direction d_{k-1}:

d_k = -g_k + β_k d_{k-1},

with d_1 = -g_1 and β_k a parameter. Many formulas for β_k have been proposed; two famous ones are the FR and PRP formulas,

β_k^{FR} = ||g_k||² / ||g_{k-1}||², β_k^{PRP} = g_k^T (g_k - g_{k-1}) / ||g_{k-1}||²,

where here and below ||·|| denotes the Euclidean norm. Beale proposed a three-term restart conjugate gradient method with search directions of the form

d_k = -g_k + β_k d_{k-1} + γ_k d_t,

where d_t is the restart direction. Powell [1] introduced a suitable restart criterion for this method and obtained very good numerical results. In this paper, we study search directions…

9.
李梅霞  籍法俊 《应用数学》2008,21(1):213-218
In this paper, we propose a new three-term memory gradient hybrid projection algorithm with perturbations. The method uses a generalized Armijo line search, and its global convergence is proved under the sole assumption that the gradient function is uniformly continuous on an open convex set containing the iterate sequence. Several numerical examples are given.

10.
The Polak-Ribière-Polyak (PRP) method is one of the classical conjugate gradient methods with the best numerical performance. Combining the Wolfe inexact line search criterion, the PRP formula is improved to produce a new conjugate parameter; a new spectral parameter is designed on the basis of the new conjugate parameter; a restart condition is introduced and a new restart direction is constructed, yielding a spectral conjugate gradient algorithm with restart steps. Under standard assumptions and the strong Wolfe inexact line search step-size criterion, …

11.
In this paper, by the use of Gram-Schmidt orthogonalization, we propose a class of modified conjugate gradient methods. The methods are modifications of the well-known conjugate gradient methods, including the PRP, HS, FR and DY methods. A common property of the modified methods is that the direction generated by any member of the class satisfies g_k^T d_k = -||g_k||^2. Moreover, if the line search is exact, each modified method reduces to the corresponding standard conjugate gradient method. In particular, we study the modified YT and YT+ methods. Under suitable conditions, we prove the global convergence of these two methods. Extensive numerical experiments show that the proposed methods are efficient for the test problems from the CUTE library.

12.
In this paper, we are concerned with conjugate gradient methods for solving unconstrained optimization problems. It is well known that the direction generated by a conjugate gradient method may not be a descent direction of the objective function. In this paper, we make a small modification to the Fletcher-Reeves (FR) method such that the direction generated by the modified method is a descent direction for the objective function. This property depends neither on the line search used, nor on the convexity of the objective function. Moreover, the modified method reduces to the standard FR method if the line search is exact. Under mild conditions, we prove that the modified method with an Armijo-type line search is globally convergent even if the objective function is nonconvex. We also present some numerical results to show the efficiency of the proposed method. Supported by the 973 project (2004CB719402) and the NSF foundation (10471036) of China.
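The descent trick for FR-type methods can be illustrated with a direction d_k = -θ_k g_k + β_k^{FR} d_{k-1}, where the scaling θ_k is chosen so that g_k^T d_k = -||g_k||² regardless of the line search. This is a hedged reconstruction of the general idea, not the paper's exact formulas, and it uses plain Armijo backtracking in place of the Armijo-type search analyzed in the paper.

```python
import numpy as np

def modified_fr(f, grad, x0, tol=1e-6, max_iter=5000):
    """Sketch of a descent-modified FR method: theta_k is chosen so the
    direction satisfies g_k^T d_k = -||g_k||^2 for any line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = 1.0                      # Armijo backtracking; d is always descent
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x = x + alpha * d
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            g = g_new
            break
        beta = (g_new @ g_new) / (g @ g)                     # FR parameter
        theta = 1.0 + beta * (g_new @ d) / (g_new @ g_new)   # descent scaling
        d = -theta * g_new + beta * d    # now g_new^T d = -||g_new||^2 exactly
        g = g_new
    return x
```

Expanding g_new^T d = -θ||g_new||² + β g_new^T d_old and substituting θ shows the cross term cancels, so descent holds by construction, independent of convexity.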

13.
For the supermemory gradient algorithm for unconstrained optimization, a condition is imposed on the parameter in the line search direction, determining a new range of values for it that guarantees the search direction is a sufficient descent direction of the objective function; a new class of memory gradient algorithms is thereby proposed. Global convergence of the algorithms is discussed without assuming boundedness of the iterate sequence, under the Armijo step-size search, and modified versions incorporating the FR, PR, and HS conjugate gradient formulas are given. Numerical experiments show that the new algorithms are more stable and effective than the FR, PR, and HS conjugate gradient methods and the supermemory gradient method under the Armijo line search.

14.
Using a generalized projection matrix, conditions are imposed on the parameters of the three-term memory gradient algorithm for unconstrained optimization to determine their ranges of values, guaranteeing a three-term memory gradient generalized-projection descent direction of the objective function. A three-term memory gradient generalized projection algorithm for nonlinear optimization problems with equality and inequality constraints is established, and its convergence is proved. Versions incorporating the FR, PR, and HS conjugate gradient parameters are also given, thereby extending the classical conjugate gradient methods to constrained problems. Numerical examples show that the algorithm is effective.

15.
The conjugate gradient method is one of the most commonly used methods in optimization and is widely applied to large-scale problems; different choices of the parameter β_k yield different conjugate gradient methods. A class of conjugate gradient algorithms containing three parameters is presented; under given conditions, the selected β_k is shown to produce a descent direction at every step, and under the strong Wolfe line search the algorithm is globally convergent.

16.
In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the FR method, a class of new restarting three-term conjugate gradient methods is presented. Global convergence of the new methods under two kinds of common line searches is proved.

17.
孙清滢 《数学季刊》2003,18(2):154-162
Conjugate gradient optimization algorithms depend on the search directions, with different choices of the parameters in the search directions. In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the class of conjugate gradient methods presented by Hu and Storey (1991), a class of new restarting conjugate gradient methods is presented. Global convergence of the new method under two kinds of common line searches is proved. First, it is shown, using a reverse modulus of continuity function and a forcing function, that the new method for solving unconstrained optimization works for a continuously differentiable function with the Curry-Altman step-size rule and a bounded level set. Second, by a comparison technique, some general convergence properties of the new method with other kinds of step-size rules are established. Numerical experiments show that the new method is efficient in comparison with the FR conjugate gradient method.

18.
An Efficient Hybrid Conjugate Gradient Method for Unconstrained Optimization
Recently, we proposed a nonlinear conjugate gradient method that produces a descent search direction at every iteration and converges globally provided that the line search satisfies the weak Wolfe conditions. In this paper, we study methods related to this new nonlinear conjugate gradient method. Specifically, if the scalar β_k, relative to the one in the new method, belongs to some interval, then the corresponding methods are proved to be globally convergent; otherwise, we construct a convex quadratic example showing that the methods need not converge. Numerical experiments are made for two combinations of the new method and the Hestenes-Stiefel conjugate gradient method. The initial results show that one of the hybrid methods is especially efficient for the given test problems.

19.
In this paper, we present a new hybrid conjugate gradient algorithm for unconstrained optimization. The method is a convex combination of the Liu-Storey and Fletcher-Reeves conjugate gradient methods. We also prove that the search direction of any hybrid conjugate gradient method that is a convex combination of two conjugate gradient methods satisfies the well-known Dai-Liao (D-L) conjugacy condition and, under a suitable condition, at the same time agrees with the Newton direction; furthermore, this property does not depend on any line search. We then prove that, modulo the value of the parameter t, the Newton direction condition is equivalent to the Dai-Liao conjugacy condition. The strong Wolfe line search conditions are used, and global convergence of the new method is proved. Numerical comparisons show that the present hybrid conjugate gradient algorithm is efficient.
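A convex-combination hybrid of the LS and FR parameters can be sketched as follows. The rule below for choosing t (matching a Dai-Liao-type target value and then clipping to [0, 1]) is an illustrative assumption, not the paper's exact scheme; likewise, a simple Armijo backtracking search replaces the strong Wolfe search used in the analysis, and the function name and safeguard are hypothetical.

```python
import numpy as np

def hybrid_ls_fr(f, grad, x0, tau=0.1, tol=1e-6, max_iter=5000):
    """Illustrative hybrid CG: beta = (1-t)*beta_LS + t*beta_FR, with t
    picked from a Dai-Liao-type target and clipped so the mix stays convex."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                   # safeguard: restart with steepest descent
            d = -g
        alpha = 1.0                      # backtracking Armijo line search
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        s = alpha * d
        x = x + s
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            g = g_new
            break
        y = g_new - g
        beta_ls = (g_new @ y) / (-(d @ g))       # Liu-Storey
        beta_fr = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves
        denom = d @ y
        if abs(denom) > 1e-12 and beta_fr != beta_ls:
            # Dai-Liao-type target: d_new^T y = -tau * g_new^T s
            beta_dl = (g_new @ y - tau * (g_new @ s)) / denom
            t = (beta_dl - beta_ls) / (beta_fr - beta_ls)
            t = min(max(t, 0.0), 1.0)    # keep the combination convex
        else:
            t = 1.0                      # fall back to pure FR
        d = -g_new + ((1.0 - t) * beta_ls + t * beta_fr) * d
        g = g_new
    return x
```

Clipping t to [0, 1] keeps β between the LS and FR values, so the hybrid inherits a blend of their behaviours even when the Dai-Liao target is unattainable.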

20.
The main purpose of this paper is to provide a restarting direction for improving the standard conjugate gradient method. If drastically non-quadratic behaviour of the objective function is observed in the neighbourhood of x_k, then a restart should be done. The scaled symmetric rank-one update with Davidon's optimal criterion is applied to generate the restarting direction. It is proved that the conjugate gradient method with this strategy retains quadratic termination. Numerical experiments show that it is successful.
