Similar Articles
17 similar records found
1.
A BFGS Algorithm for Minimizing Nonconvex Functions
For unconstrained minimization of nonconvex functions, this paper gives a class of modified BFGS algorithms. The idea is to modify the approximate Hessian matrix of the nonconvex function so as to obtain a descent direction while keeping the quasi-Newton condition satisfied. When the step size is chosen by a general line-search model, the local convergence of the algorithm is proved.
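The abstract does not spell out the paper's exact modification, but one standard way to keep the BFGS Hessian approximation positive definite on nonconvex functions is Powell's damping; the following is a minimal sketch of that technique (the function name and the parameter `theta_min` are illustrative, not from the paper):

```python
import numpy as np

def damped_bfgs_update(B, s, y, theta_min=0.2):
    """Powell-damped BFGS update: a sketch of one well-known Hessian
    modification for nonconvex objectives, not necessarily the paper's.

    The vector y is replaced by r = theta*y + (1-theta)*B@s, with theta
    chosen so that s @ r >= theta_min * (s @ B @ s) > 0; this keeps B
    positive definite, so the quasi-Newton direction remains a descent
    direction, while the secant (quasi-Newton) condition holds with r.
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    if sy >= theta_min * sBs:
        theta = 1.0                                   # curvature already fine
    else:
        theta = (1.0 - theta_min) * sBs / (sBs - sy)  # damp toward B@s
    r = theta * y + (1.0 - theta) * Bs
    # Standard BFGS update with y replaced by the damped vector r.
    return B - np.outer(Bs, Bs) / sBs + np.outer(r, r) / (s @ r)
```

Even when `s @ y < 0` (negative curvature along the step), the updated matrix stays positive definite.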

2.
万中  冯冬冬 《计算数学》2011,33(4):387-396
基于非单调线搜索在寻求优化问题最优解中的优越性,提出了一类新的非单调保守BFGS算法.同已有方法不同,该算法中用来控制非单调性程度的算法参数不是取固定值,而是利用已有目标函数和梯度函数的信息自动调整其取值,以改善算法的数值表现.在合适的假设条件下,建立了新的非单调保守BFGS算法的全局收敛性.用基准测试优化问题测试了算...  相似文献   

3.
Under the conditions that the objective function is uniformly convex and the Wolfe line search is used, this paper gives several sufficient conditions for the global convergence of the DFP algorithm for unconstrained optimization, and compares them with the conditions in [1].

4.
The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, to avoid possible large eigenvalues in the Hessian approximations of the objective function. It has been proved in the literature that this method is globally and superlinearly convergent when the objective function is convex (or even uniformly convex). We propose to solve unconstrained nonconvex optimization problems by a self-scaling BFGS algorithm with nonmonotone line search. Nonmonotone line search has been recognized in numerical practice as a competitive approach for solving large-scale nonlinear problems. We consider two different nonmonotone line-search forms and study the global convergence of the resulting nonmonotone self-scaling BFGS algorithms. We prove that, under a condition weaker than that in the literature, both forms of the self-scaling BFGS algorithm are globally convergent for unconstrained nonconvex optimization problems.
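The abstract does not state the paper's exact scaling rule; a common choice, assumed here for illustration, is the Oren-Luenberger factor tau = (y@s)/(s@B@s) applied to B before the usual BFGS update:

```python
import numpy as np

def self_scaling_bfgs_update(B, s, y):
    """Sketch of one self-scaling BFGS step (the scaling rule is an
    assumption; the paper may use a different factor).

    Scaling B by tau = (y @ s) / (s @ B @ s) shifts the spectrum of B
    toward the observed curvature along s, guarding against overly large
    eigenvalues, before the standard BFGS correction is applied.
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    tau = sy / sBs            # Oren-Luenberger scaling factor
    B, Bs, sBs = tau * B, tau * Bs, tau * sBs
    return B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / sy
```

Whatever the scaling, the update still satisfies the secant condition `B_new @ s == y`, and positive definiteness is preserved whenever `s @ y > 0`.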

5.
Based on Zhang H.C.'s nonmonotone line-search rule, this paper designs a new nonmonotone line-search BFGS algorithm for unconstrained optimization and, under certain conditions, proves its linear and superlinear convergence. Numerical examples show that the algorithm is effective.
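The Zhang-Hager nonmonotone rule compares the trial value against a weighted average C_k of past function values rather than against f(x_k) itself. A minimal backtracking sketch under that rule (the helper name and defaults are illustrative; the paper's BFGS machinery is omitted):

```python
def nonmonotone_step(f, x, g, d, C, Q, eta=0.85, delta=1e-4,
                     alpha=1.0, rho=0.5):
    """One backtracking step under the Zhang-Hager nonmonotone rule.

    Accept alpha once f(x + alpha*d) <= C + delta*alpha*(g @ d), where C
    is a running weighted average of past f-values; then update the
    averages: Q+ = eta*Q + 1 and C+ = (eta*Q*C + f_new) / Q+.
    """
    gd = sum(gi * di for gi, di in zip(g, d))   # directional derivative
    while True:
        x_new = [xi + alpha * di for xi, di in zip(x, d)]
        f_new = f(x_new)
        if f_new <= C + delta * alpha * gd:     # nonmonotone Armijo test
            break
        alpha *= rho                            # backtrack
    Q_new = eta * Q + 1.0
    C_new = (eta * Q * C + f_new) / Q_new
    return x_new, alpha, C_new, Q_new
```

Setting `eta=0` recovers the ordinary monotone Armijo backtracking, since C then always equals the latest f-value.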

6.
A New Class of Descent Algorithms for Unconstrained Optimization
This paper proposes a new class of descent algorithms for unconstrained optimization, together with two classes of hybrid algorithms that combine them with the HS method. Their global convergence is proved under the Wolfe line search without imposing a descent condition. Numerical experiments show that the new algorithms are very effective, especially for large-scale problems.

7.
A General Line-Search Model for Unconstrained Optimization and Global Convergence of the BFGS Method
This paper gives a general form of acceptable step-size rules for line searches in unconstrained optimization algorithms; it covers most existing step-size rules as special cases, and its basic properties are studied. Finally, the global convergence of the BFGS algorithm for unconstrained optimization combined with this general line-search model is proved.
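The standard Wolfe rule is one instance of such a general step-acceptance model; a minimal check, assuming callables `f` and `grad` (names and defaults illustrative):

```python
def wolfe_accept(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Standard Wolfe step-acceptance test: a sketch of one special case
    of a general step-size rule, with 0 < c1 < c2 < 1."""
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    x_new = [xi + alpha * di for xi, di in zip(x, d)]
    gd = dot(grad(x), d)                              # slope at x along d
    armijo = f(x_new) <= f(x) + c1 * alpha * gd       # sufficient decrease
    curvature = dot(grad(x_new), d) >= c2 * gd        # slope flattened enough
    return armijo and curvature
```

The Armijo part rejects steps that are too long, the curvature part rejects steps that are too short; together they bracket an acceptable interval of step sizes.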

8.
The spectral conjugate gradient method is one of the effective algorithms for solving large-scale unconstrained optimization problems. Based on the Hestenes-Stiefel method and the spectral conjugate gradient method, a spectral Hestenes-Stiefel conjugate gradient algorithm is proposed. Under the Wolfe line search, the search directions generated by the algorithm are descent directions, and global convergence can also be proved. Experiments on some well-known functions from the CUTEr test library, assessed with the widely used Dolan-Moré performance profiles, demonstrate the effectiveness of the new algorithm.
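A one-step sketch of a spectral Hestenes-Stiefel direction (the paper's exact spectral parameter is not given in the abstract; a Barzilai-Borwein-like theta = (s@s)/(s@y) is assumed here for illustration):

```python
import numpy as np

def spectral_hs_direction(g_new, g_old, d_old, s):
    """Sketch of a spectral HS search direction:
    d = -theta * g_new + beta_HS * d_old, where beta_HS is the
    Hestenes-Stiefel parameter and theta a spectral scaling of the
    gradient term (here the BB-like choice; the paper's may differ)."""
    y = g_new - g_old
    beta_hs = (g_new @ y) / (d_old @ y)   # Hestenes-Stiefel parameter
    theta = (s @ s) / (s @ y)             # spectral parameter (assumed form)
    return -theta * g_new + beta_hs * d_old
```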

9.
Convergence of Two Modified BFGS Algorithms
The rest of this paper is organized as follows: Section 2 presents a unified algorithmic model (GBFGS) and proves its global convergence and local superlinear convergence under the assumption that the objective function is uniformly convex; as an application of the theory of Section 2, Section 3 proves the convergence properties of Biggs' algorithm and Yuan's algorithm.

10.
Convergence of Variable Metric Algorithms with a Generalized Wolfe Line Search
This paper proposes a class of generalized Wolfe line-search models and combines it with the well-known BFGS method. For the resulting algorithm it is proved that, for convex functions, the algorithm is globally convergent with a superlinear convergence rate; this generalizes the results of reference [1].

11.
In this paper, a new nonmonotone BFGS algorithm for unconstrained optimization is introduced. Under mild conditions, the global convergence of this new algorithm on convex functions is proved. Some numerical experiments show that this new nonmonotone BFGS algorithm is competitive with the BFGS algorithm.

12.
This paper studies a new supermemory gradient algorithm for unconstrained optimization. At each iteration the algorithm makes full use of information from previous iterates to generate a descent direction, and uses the Wolfe line search to generate the step size; global convergence is proved under fairly weak conditions. The new algorithm requires no matrix computation or storage at any iteration, making it suitable for large-scale optimization problems.

13.
Conjugate gradient methods are probably the most famous iterative methods for solving large-scale optimization problems in scientific and engineering computation, characterized by the simplicity of their iteration and their low memory requirements. It is well known that the search direction plays a main role in a line-search method. In this article, we propose a new search direction combined with the Wolfe line-search technique for solving unconstrained optimization problems. Under this line search and some assumptions, the global convergence properties of the given methods are discussed. Numerical results and comparisons with other CG methods are given.

14.
In this paper, a new class of memoryless non-quasi-Newton methods for solving unconstrained optimization problems is proposed, and the global convergence of this method with inexact line search is proved. Furthermore, we propose a hybrid method that mixes the memoryless non-quasi-Newton method with the memoryless Perry-Shanno quasi-Newton method. The global convergence of this hybrid memoryless method is proved under mild assumptions. Initial results show that these new methods are efficient on the given test problems. In particular, the memoryless non-quasi-Newton method requires little storage and computation, so it can efficiently solve large-scale optimization problems.
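The paper's non-quasi-Newton update itself is not given in the abstract, but the memoryless principle it relies on (rebuild the metric at each step from only the latest pair (s, y), applied matrix-free) can be sketched with the classical memoryless BFGS direction:

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Direction d = -H @ g, where H is the BFGS inverse update of the
    identity using only the latest (s, y) pair; a sketch of the memoryless
    idea, not the paper's specific non-quasi-Newton formula.

    H = (I - s y^T/sy)(I - y s^T/sy) + s s^T/sy is applied to g without
    ever forming a matrix, so the cost per iteration is O(n).
    """
    sy = s @ y
    t = g - (s @ g) / sy * y            # (I - y s^T/sy) @ g
    Hg = t - (y @ t) / sy * s + (s @ g) / sy * s
    return -Hg
```

Since H satisfies the secant condition H @ y == s and is positive definite whenever `s @ y > 0`, the returned vector is always a descent direction.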

15.
This paper gives two new classes of variable-parameter descent algorithms for unconstrained optimization. Their global convergence is proved under the Wolfe line search without requiring a sufficient descent condition. Extensive numerical experiments show that they are very effective and stable, and can be widely applied in scientific computing.

16.
In this paper, we analyze the global convergence of the least-change secant method proposed by Dennis and Wolkowicz when applied to convex objective functions. One of the most distinguishing features of this method is that the Dennis-Wolkowicz update does not necessarily belong to the Broyden convex family and can be close to the DFP update, yet it still has the self-correcting property. We prove that, for convex objective functions, this method with the commonly used Wolfe line search is globally convergent. We also provide some numerical results.

17.
This paper proposes a class of three-term hybrid conjugate gradient algorithms for unconstrained optimization. The new algorithm combines the Hestenes-Stiefel method with the Dai-Yuan method, and its convergence under the Wolfe line search is proved without imposing a descent condition. Numerical experiments also show the advantage of this hybrid conjugate gradient algorithm over the HS and PRP methods.

