Similar Documents
20 similar documents retrieved.
1.
The conjugate gradient method is one of the most commonly used methods in optimization and is widely applied to large-scale optimization problems; different choices of the parameter β_k yield different conjugate gradient methods. This paper presents a class of conjugate gradient algorithms containing three parameters and proves that, under the given conditions, the chosen β_k produces a descent direction at every step; moreover, the algorithm is globally convergent under the strong Wolfe line search.
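For reference, the classical choices of the conjugate parameter mentioned throughout this list have standard textbook forms (this is not the paper's three-parameter family, just the classical formulas), with g_k = ∇f(x_k) and y_{k-1} = g_k - g_{k-1}:

```latex
% CG direction: d_k = -g_k + \beta_k d_{k-1}, with d_0 = -g_0.
\beta_k^{\mathrm{FR}} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \quad
\beta_k^{\mathrm{PRP}} = \frac{g_k^{\top} y_{k-1}}{\|g_{k-1}\|^2}, \quad
\beta_k^{\mathrm{HS}} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}, \quad
\beta_k^{\mathrm{DY}} = \frac{\|g_k\|^2}{d_{k-1}^{\top} y_{k-1}}.
```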

2.
The conjugate gradient method is one of the most effective methods for solving large-scale unconstrained optimization problems. This paper improves the parameter formula of the HS conjugate gradient method to obtain a new formula and builds an algorithmic framework on it. Independently of any line search, the iteration directions generated by the framework are shown to satisfy the sufficient descent condition, and global convergence is proved under the standard Wolfe line search. Numerical tests show that the improved method is effective.
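The sufficient descent condition and the standard Wolfe conditions invoked in this and several later abstracts have the usual textbook forms, for some constant c > 0 and 0 < c₁ < c₂ < 1:

```latex
% Sufficient descent condition on the direction d_k:
g_k^{\top} d_k \le -c\,\|g_k\|^2 \quad \text{for all } k.
% Standard Wolfe conditions on the step size \alpha_k:
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k\, g_k^{\top} d_k, \qquad
\nabla f(x_k + \alpha_k d_k)^{\top} d_k \ge c_2\, g_k^{\top} d_k.
```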

3.
This paper considers unconstrained optimization problems and, building on the FR conjugate gradient method, proposes two modified spectral conjugate gradient methods (the ZFR1 and ZFR2 methods). Both new methods are proved to generate descent search directions and to be globally convergent under the standard Wolfe line search. Numerical results verify their effectiveness.

4.
The Polak-Ribière-Polyak (PRP) method is among the numerically best-performing classical conjugate gradient methods. This paper improves the PRP formula in combination with the Wolfe inexact line search criterion to produce a new conjugate parameter, designs a new spectral parameter based on it, and introduces a restart condition with a newly constructed restart direction, thereby establishing a spectral conjugate gradient algorithm with restart steps. Under standard assumptions and the strong Wolfe inexact line search step-size criterion, ...

5.
This paper proposes two Fletcher-Reeves (FR) conjugate gradient methods whose search directions carry a perturbation term. The iteration is x_{k+1} = x_k + α_k(s_k + ω_k), where s_k is determined by the conjugate gradient recursion, ω_k is the perturbation, and α_k is determined by a line search rather than being forced to tend to zero. Global convergence of both algorithms is proved under very general assumptions, without boundedness conditions such as the objective function being bounded below or the level set being bounded.
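A minimal runnable sketch of this perturbed iteration (not the paper's two specific algorithms; the perturbation rule `omega` and the fallback step are placeholders of mine), with the Wolfe-condition step delegated to SciPy:

```python
import numpy as np
from scipy.optimize import line_search

def perturbed_fr_cg(f, grad, x0, omega, tol=1e-6, max_iter=1000):
    """FR conjugate gradient with a perturbed search direction.

    Implements x_{k+1} = x_k + alpha_k * (s_k + omega_k), where s_k is the
    FR conjugate gradient direction and alpha_k comes from a Wolfe line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    s = -g                                     # initial CG direction
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = s + omega(k, x)                    # add the perturbation term
        alpha = line_search(f, grad, x, d)[0]  # Wolfe-condition step size
        if alpha is None:                      # line search failed: small safe step
            alpha = 1e-4
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)       # FR parameter ||g_{k+1}||^2 / ||g_k||^2
        s = -g_new + beta * s                  # FR recursion for the next direction
        g = g_new
    return x

# Usage: minimize f(x) = ||x||^2 with a vanishing (here zero) perturbation.
x_star = perturbed_fr_cg(lambda x: x @ x, lambda x: 2.0 * x,
                         x0=[3.0, -2.0], omega=lambda k, x: np.zeros_like(x))
```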

6.
A three-term conjugate gradient method with two parameters is proposed. The new algorithm has the following features: 1) it satisfies the conjugacy condition; 2) it automatically possesses sufficient descent; 3) its search direction achieves a larger amount of descent. Under suitable conditions, the algorithm is proved to be globally convergent under the strong Wolfe line search. Numerical experiments show that the algorithm is effective for unconstrained optimization problems.
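The abstract does not state the specific direction; the common three-term template in this literature, of which two-parameter methods like this one are instances, reads:

```latex
% Generic three-term CG direction, with y_{k-1} = g_k - g_{k-1}:
d_k = -g_k + \beta_k\, d_{k-1} + \theta_k\, y_{k-1}, \qquad d_0 = -g_0,
```

where the scalars β_k and θ_k are tuned so that d_k satisfies a conjugacy condition and the sufficient descent condition stated above.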

7.
The spectral conjugate gradient method is an effective algorithm for unconstrained optimization. This paper first applies a projection correction to the JJSL conjugate parameter [Jiang et al., Computational and Applied Mathematics, 2021, 40(174)], and then selects suitable spectral parameters to guarantee descent of the search directions, yielding two effective spectral conjugate gradient methods. Under general assumptions, with step sizes computed by conventional inexact line searches, global convergence of both new algorithms is obtained. Numerical results and the corresponding performance profiles further demonstrate their numerical effectiveness.

8.
Global convergence of a class of hybrid conjugate gradient methods under the Wolfe line search
This paper gives a new conjugate gradient formula, which is equivalent to the DY formula under exact line search, and establishes its relevant properties. Combining the new formula with the DY formula, a new hybrid conjugate gradient method is proposed; the new algorithm produces a descent direction under the Wolfe line search, its global convergence is proved, and numerical examples are given.

9.
This paper studies large-scale unconstrained optimization problems and proposes a conjugate gradient method based on an improved FR conjugate parameter formula. Independently of any line search criterion, the search directions generated by the algorithm are always sufficiently descending. Under the standard Wolfe line search criterion, global convergence of the new algorithm is obtained. Preliminary numerical experiments show that the improved method is effective.

10.
For unconstrained optimization problems, this paper gives two improved conjugate gradient formulas. Independently of any line search, the directions produced by the new formulas are sufficiently descending, and both algorithms are globally convergent under the standard Wolfe inexact line search. Extensive comparative experiments show that the proposed methods are effective.

11.
Sun Qingying. Advances in Mathematics, 2004, 33(5): 598-606
Using the Rosen projection matrix, a three-term memory gradient Rosen projection descent algorithm is established for optimization problems with linear or nonlinear inequality constraints, and its convergence is proved. Three-term memory gradient Rosen projection algorithms combining the FR, PR, and HS conjugate gradient parameters are also given, thereby extending the classical conjugate gradient methods to constrained programming problems. Numerical examples show that the algorithms are effective.
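Rosen's projection matrix for the currently active linear constraints (with A_k stacking the active constraint normals, assumed to have full row rank) has the standard form:

```latex
P_k = I - A_k^{\top}\,(A_k A_k^{\top})^{-1} A_k,
```

so the projected direction P_k d lies in the null space of A_k and preserves feasibility of the active constraints along the step.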

12.
In this paper, we propose a three-term conjugate gradient method based on secant conditions for unconstrained optimization problems. Specifically, we apply the idea of Dai and Liao (in Appl. Math. Optim. 43: 87–101, 2001) to the three-term conjugate gradient method proposed by Narushima et al. (in SIAM J. Optim. 21: 212–230, 2011). Moreover, we derive a special-purpose three-term conjugate gradient method for problems whose objective function has a special structure, and apply it to nonlinear least squares problems. We prove the global convergence properties of the proposed methods. Finally, some numerical results are given to show the performance of our methods.
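The secant condition and the Dai-Liao conjugacy condition referenced here are standard; with s_{k-1} = x_k - x_{k-1}, y_{k-1} = g_k - g_{k-1}, and a parameter t ≥ 0:

```latex
% Quasi-Newton secant equation for a Hessian approximation B_k:
B_k s_{k-1} = y_{k-1}.
% Dai-Liao conjugacy condition on the search direction:
d_k^{\top} y_{k-1} = -t\, g_k^{\top} s_{k-1}.
```

Setting t = 0 recovers the classical conjugacy condition d_k^{\top} y_{k-1} = 0.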

13.
Optimization, 2012, 61(10): 1631-1648

In this paper, we develop a three-term conjugate gradient method involving a spectral quotient, which always satisfies the famous Dai-Liao conjugacy condition and the quasi-Newton secant equation, independently of any line search. This new three-term conjugate gradient method can be regarded as a variant of the memoryless Broyden-Fletcher-Goldfarb-Shanno quasi-Newton method with respect to the spectral quotient. By combining this method with the projection technique proposed by Solodov and Svaiter in 1998, we establish a derivative-free three-term projection algorithm for large-scale nonlinear monotone systems of equations. We prove the global convergence of the algorithm and obtain an R-linear convergence rate under mild conditions. Numerical results show that our projection algorithm is effective and robust, and compares favourably with the TTDFP algorithm proposed by Liu and Li [A three-term derivative-free projection method for nonlinear monotone system of equations. Calcolo. 2016;53:427–450].
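The Solodov-Svaiter (1998) projection step used by such derivative-free algorithms has a standard form: given a trial point z_k = x_k + α_k d_k found by a line search with F(z_k)^⊤(x_k - z_k) > 0, the next iterate projects x_k onto the separating hyperplane:

```latex
x_{k+1} = x_k - \frac{F(z_k)^{\top}(x_k - z_k)}{\|F(z_k)\|^2}\, F(z_k).
```

Monotonicity of F guarantees that every solution lies on the far side of this hyperplane, which is what drives the global convergence argument.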

14.
In this paper, we propose a three-term conjugate gradient method via the symmetric rank-one (SR1) update. The basic idea is to exploit the good properties of the SR1 update in providing quality Hessian approximations in order to construct a conjugate gradient line search direction that requires no matrix storage and possesses the sufficient descent property. Numerical experiments on a set of standard unconstrained optimization problems show that the proposed method is superior to many well-known conjugate gradient methods in terms of efficiency and robustness.
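The SR1 update exploited here is standard; with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, the Hessian approximation is updated by:

```latex
B_{k+1} = B_k + \frac{(y_k - B_k s_k)(y_k - B_k s_k)^{\top}}{(y_k - B_k s_k)^{\top} s_k},
```

which is well defined whenever the denominator is safely bounded away from zero; a memoryless variant (taking B_k = I) yields a matrix-free direction of the kind described above.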

15.
Many constrained sets in problems such as signal processing and optimal control can be represented as the fixed point set of a certain nonexpansive mapping, and a number of iterative algorithms have been presented for solving a convex optimization problem over a fixed point set. This paper presents a novel gradient method with a three-term conjugate gradient direction, of the kind used to accelerate conjugate gradient methods for unconstrained optimization. The algorithm is guaranteed to converge strongly to the solution of the problem under standard assumptions. Numerical comparisons with existing gradient methods demonstrate the effectiveness and fast convergence of this algorithm.

16.
In this paper, a nonlinear conjugate structural first-order reliability method is proposed, using a three-term conjugate discrete map-based sensitivity analysis to improve convergence properties, namely stable results and a modest computational burden, on nonlinear reliability problems. The concept of a finite-step-length strategy is incorporated to stabilize the iterative formula for highly nonlinear limit state functions, while a three-term conjugate search direction combined with a finite step size improves the efficiency of the sensitivity vector in the proposed iterative reliability formula. The proposed three-term discrete conjugate search direction is developed from the sufficient descent condition to provide theoretically stable results. The efficiency and robustness of the proposed three-term conjugate formula are investigated on several nonlinear/complex structural examples and compared with several modified existing iterative formulas. Comparative results illustrate that the three-term conjugate-based finite-step-length formula provides superior efficiency and robustness over the other studied methods.

17.
Bojari, S., Eslahchi, M. R. Numerical Algorithms, 2020, 83(3): 901-933
In this paper, we present two families of modified three-term conjugate gradient methods for solving unconstrained large-scale smooth optimization problems. We show that our...

18.
Jiang, Xianzhen, Liao, Wei, Yin, Jianghua, Jian, Jinbao. Numerical Algorithms, 2022, 91(1): 161-191

In this paper, based on the hybrid conjugate gradient method and the convex combination technique, a new family of hybrid three-term conjugate gradient methods is proposed for solving unconstrained optimization. The conjugate parameter in the search direction is a hybrid of the Dai-Yuan conjugate parameter and an arbitrary one. The search direction is then the sum of the negative gradient direction and a convex combination involving the last search direction and the gradient at the previous iteration. Without choosing any specific conjugate parameter, we show that the search directions generated by the family always possess the descent property independently of the line search technique, and that the methods are globally convergent under usual assumptions and the weak Wolfe line search. To verify the effectiveness of the presented family, we further design a specific conjugate parameter and perform medium- and large-scale numerical experiments on smooth unconstrained optimization and image restoration problems. The numerical results show the encouraging efficiency and applicability of the proposed methods, even compared with state-of-the-art methods.


19.
In this paper, a subspace three-term conjugate gradient method is proposed. The search directions are generated by minimizing a quadratic approximation of the objective function over a subspace, and they satisfy the descent condition and the Dai-Liao conjugacy condition. At each iteration, the subspace is spanned by the current negative gradient and the latest two search directions, so its dimension is 2 or 3. Under appropriate assumptions, the global convergence of the proposed method is established. Numerical experiments show that the proposed method is competitive on a set of 80 unconstrained optimization test problems.
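A schematic of the subspace step (my reading of the construction; the paper's exact quadratic model may differ): the direction solves

```latex
d_k = \operatorname*{arg\,min}_{d \in \Omega_k}\; g_k^{\top} d + \tfrac{1}{2}\, d^{\top} B_k d,
\qquad
\Omega_k = \operatorname{span}\{-g_k,\; d_{k-1},\; d_{k-2}\},
```

with B_k some Hessian approximation; since Ω_k is spanned by at most three vectors, each step reduces to a 2- or 3-dimensional quadratic subproblem.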

20.
Optimization, 2012, 61(9): 1387-1400
Although the Hestenes-Stiefel (HS) method is well known, research on its convergence rate under an inexact line search is very scarce. Recently, Zhang, Zhou and Li [Some descent three-term conjugate gradient methods and their global convergence, Optim. Methods Softw. 22 (2007), pp. 697–711] proposed a three-term Hestenes-Stiefel method for unconstrained optimization problems. In this article, we investigate the convergence rate of this method. We show that, under reasonable conditions, the three-term HS method with the Wolfe line search is n-step superlinearly, and even quadratically, convergent if a restart technique is used. Some numerical results are reported to verify the theoretical results; moreover, the method is more efficient than the previous ones.
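As commonly stated in the literature (my paraphrase, so hedged accordingly), the Zhang-Zhou-Li three-term HS direction takes the form, with y_{k-1} = g_k - g_{k-1}:

```latex
d_k = -g_k + \beta_k^{\mathrm{HS}} d_{k-1} - \theta_k y_{k-1},
\qquad
\beta_k^{\mathrm{HS}} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}},
\quad
\theta_k = \frac{g_k^{\top} d_{k-1}}{d_{k-1}^{\top} y_{k-1}},
```

for which g_k^{\top} d_k = -\|g_k\|^2 holds identically, independent of the line search.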
