Similar Literature
20 similar documents found (search time: 406 ms)
1.
In this paper, we develop a memory gradient method for unconstrained optimization. The main characteristic of this method is that the next iterate is obtained without any line search. Under certain conditions, we establish strong global convergence of the proposed method.

2.
《Optimization》2012,61(2):163-179
In this article, we consider the global convergence of the Polak–Ribière–Polyak conjugate gradient method (abbreviated PRP method) for minimizing functions that have Lipschitz continuous partial derivatives. A novel form of non-monotone line search is proposed to guarantee the global convergence of the PRP method. It is also shown that the PRP method has a linear convergence rate under some mild conditions when the non-monotone line search reduces to a related monotone line search. The new non-monotone line search needs to estimate the Lipschitz constant of the gradients of objective functions, for which two practical estimations are proposed to help find a suitable initial step size for the PRP method. Numerical results show that the new line search approach is efficient in practical computation.
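For orientation, the sketch below pairs the standard PRP direction with a generic max-type non-monotone Armijo backtracking (the Grippo–Lampariello–Lucidi rule); it is an illustration only, not the article's Lipschitz-estimating line search, and the window size `M`, the constant `sigma`, the halving factor, and the restart safeguard are assumptions.

```python
import numpy as np

def prp_nonmonotone(f, grad, x0, M=10, sigma=1e-4, tol=1e-6, max_iter=1000):
    """PRP conjugate gradient with a generic max-type non-monotone Armijo
    backtracking (illustrative; the article estimates the gradient's Lipschitz
    constant to pick the initial step, which is not reproduced here)."""
    x, g = x0.copy(), grad(x0)
    d, f_hist = -g, [f(x0)]
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        f_ref = max(f_hist[-M:])          # non-monotone reference value
        t = 1.0
        for _ in range(60):               # Armijo backtracking against f_ref
            if f(x + t * d) <= f_ref + sigma * t * g.dot(d):
                break
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new - g) / g.dot(g)   # PRP conjugacy parameter
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:             # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x
```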

3.
Memory gradient methods are used for unconstrained optimization, especially large-scale problems. The first memory gradient methods were proposed by Miele and Cantrell (1969) and Cragg and Levy (1969). In this paper, we present a new memory gradient method which generates a descent search direction for the objective function at every iteration. We show that our method converges globally to the solution if the Wolfe conditions are satisfied within the framework of the line search strategy. Our numerical results show that the proposed method is efficient on standard test problems for a suitable choice of the parameter included in the method.
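A minimal sketch of a one-memory-term iteration under a Wolfe line search is given below; the damping rule for `beta` is a hypothetical choice used only to keep the direction descent, not the parameter rule of the paper.

```python
import numpy as np
from scipy.optimize import line_search

def memory_gradient(f, grad, x0, eta=0.4, tol=1e-6, max_iter=500):
    """One-term memory gradient iteration: mix the steepest descent direction
    with the previous search direction, keep it a descent direction, and take
    a Wolfe step via SciPy's line search. Illustrative sketch only."""
    x, g = x0.copy(), grad(x0)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]     # Wolfe conditions
        if alpha is None:                         # search failed: steepest descent
            d, alpha = -g, 1e-3
        x = x + alpha * d
        g_new = grad(x)
        beta = eta * g_new.dot(g_new) / max(g.dot(g), 1e-16)  # hypothetical rule
        d_new = -g_new + beta * d
        if g_new.dot(d_new) >= 0:                 # enforce descent by restarting
            d_new = -g_new
        d, g = d_new, g_new
    return x
```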

4.
A new class of memory gradient methods for solving unconstrained optimization problems
汤京永  贺国平  董丽 《数学杂志》2011,31(2):362-368
This paper studies unconstrained optimization problems. A new class of memory gradient methods is obtained by using information from the current and previous iterates to generate a descent direction and an Armijo line search to determine the step size. Global convergence and a linear convergence rate are proved under fairly weak conditions. Numerical experiments show that the algorithm is effective.
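As a reference point, the standard Armijo backtracking used by methods of this kind looks as follows; this is a generic sketch with assumed constants, not the exact rule of the paper.

```python
def armijo_step(f, x, d, g, sigma=1e-4, rho=0.5, t0=1.0, max_backtracks=50):
    """Classical Armijo rule: shrink t geometrically until
    f(x + t d) <= f(x) + sigma * t * g.dot(d). Constants are assumed."""
    fx, t = f(x), t0
    for _ in range(max_backtracks):
        if f(x + t * d) <= fx + sigma * t * g.dot(d):
            break
        t *= rho
    return t
```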

5.
In this article, a new method is proposed for solving symmetric nonlinear equations which can ensure that the search direction is a descent direction for the norm function without using any line search technique. Under mild conditions, the global convergence of the given method is established. Numerical results show that the proposed method is effective on the given test problems.

6.
In this paper, a new gradient-related algorithm for solving large-scale unconstrained optimization problems is proposed. The new algorithm is a line search method. The basic idea is to choose a combination of the current gradient and some previous search directions as a new search direction and to find a step size by using various inexact line searches. Using more information at the current iterate may improve the performance of the algorithm; this motivates us to find new gradient algorithms which may be more effective than standard conjugate gradient methods. The concept of uniformly gradient-related directions is useful for analyzing the global convergence of the new algorithm. The global convergence and linear convergence rate of the new algorithm are investigated under diverse weak conditions. Numerical experiments show that the new algorithm converges more stably and is superior to other similar methods in many situations.

7.
A new class of nonmonotone memory gradient methods and their global convergence
Building on the nonmonotone Armijo line search, a new nonmonotone line search is proposed, and a class of memory gradient methods under this line search is studied; global convergence is proved under fairly weak conditions. Compared with the nonmonotone Armijo line search, the new line search can produce a larger step size at each iteration, which makes the objective function value decrease sufficiently and reduces the computational cost of the algorithm.

8.
Based on a modified line search scheme, this paper presents a new derivative-free projection method for solving nonlinear monotone equations with convex constraints, which can be regarded as an extension of the scaled conjugate gradient method and the projection method. Under appropriate conditions, the global convergence and linear convergence rate of the proposed method are proven. Preliminary numerical results are also reported to show that this method is promising.
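The hyperplane projection framework behind such methods (Solodov–Svaiter type) can be sketched as follows; the direction here is plainly -F rather than the article's scaled conjugate gradient direction, and the constants are assumptions.

```python
import numpy as np

def df_projection_method(F, proj_C, x0, sigma=1e-4, rho=0.5, tol=1e-6, max_iter=1000):
    """Derivative-free hyperplane projection sketch for monotone F(x)=0 on a
    convex set C; proj_C is the Euclidean projector onto C. Illustrative only:
    the article uses a scaled conjugate-gradient search direction."""
    x = x0.copy()
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx                                   # placeholder direction
        t = 1.0
        for _ in range(60):                       # find z with -F(z)^T d >= sigma*t*||d||^2
            z = x + t * d
            if -F(z).dot(d) >= sigma * t * d.dot(d):
                break
            t *= rho
        Fz = F(z)
        # Project x, moved past the separating hyperplane {y : F(z)^T (y - z) = 0}, onto C.
        x = proj_C(x - (Fz.dot(x - z) / Fz.dot(Fz)) * Fz)
    return x
```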

9.
Building on the memory gradient method for unconstrained single-objective optimization, this paper proposes a memory gradient method for unconstrained multiobjective optimization problems and proves convergence of the algorithm under the Armijo line search. Numerical experiments verify the effectiveness of the algorithm.

10.
In this paper, we present a new memory gradient method such that the direction generated by the method is a sufficient descent direction for the objective function at every iteration. We then analyze its global convergence under mild conditions and its convergence rate for uniformly convex functions. Finally, we report some numerical results to show the efficiency of the proposed method.

11.
The Randomized Kaczmarz method (RK) is a stochastic iterative method for solving linear systems that has recently grown in popularity due to its speed and low memory requirement. Selectable Set Randomized Kaczmarz is a variant of RK that leverages existing information about the Kaczmarz iterate to identify an adaptive “selectable set” and thus yields an improved convergence guarantee. In this article, we propose a general perspective for selectable set approaches and prove a convergence result for that framework. In addition, we define two specific selectable set sampling strategies that have competitive convergence guarantees to those of other variants of RK. One selectable set sampling strategy leverages information about the previous iterate, while the other leverages the orthogonality structure of the problem via the Gramian matrix. We complement our theoretical results with numerical experiments that compare our proposed rules with those existing in the literature.
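For context, the base Randomized Kaczmarz iteration (Strohmer–Vershynin row sampling) is sketched below; the selectable-set variants of the article restrict sampling to an adaptively chosen subset of rows, which is not reproduced here.

```python
import numpy as np

def randomized_kaczmarz(A, b, x0, n_iter=10000, seed=0):
    """Plain RK: sample row i with probability ||a_i||^2 / ||A||_F^2 and
    project the iterate onto the hyperplane a_i^T x = b_i."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    row_norms2 = np.einsum('ij,ij->i', A, A)      # squared row norms
    probs = row_norms2 / row_norms2.sum()
    for _ in range(n_iter):
        i = rng.choice(len(b), p=probs)
        x += (b[i] - A[i].dot(x)) / row_norms2[i] * A[i]
    return x
```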

12.
A new subspace minimization conjugate gradient algorithm with a nonmonotone Wolfe line search is proposed and analyzed. In the scheme, we propose two choices of the search direction by minimizing a quadratic approximation of the objective function in special subspaces, and state criteria for how to choose the direction. Under given conditions, we show that each choice of the direction satisfies the sufficient descent property. Based on a measure of how close the objective function is to a quadratic, a new strategy for choosing the initial step size is presented for the line search. With the nonmonotone Wolfe line search, we prove the global convergence of the proposed method for general nonlinear functions under mild assumptions. Numerical comparisons with the well-known CGOPT and CG_DESCENT codes show that the proposed algorithm is very promising.

13.
李梅霞  籍法俊 《应用数学》2008,21(1):213-218
In this paper, we propose a new three-term memory gradient hybrid projection algorithm with perturbations. The method employs a generalized Armijo line search, and its global convergence is proved under the sole condition that the gradient is uniformly continuous on an open convex set containing the iterate sequence. Several numerical examples are given at the end.

14.
Although the study of the global convergence of the Polak–Ribière–Polyak (PRP), Hestenes–Stiefel (HS) and Liu–Storey (LS) conjugate gradient methods has made great progress, the convergence of these algorithms for general nonlinear functions is still erratic, not to mention under weak conditions on the objective function and weak line search rules. Besides, it is also interesting to investigate whether there exists a general method that converges under the standard Armijo line search for general nonconvex functions, since very few relevant results have been achieved. So in this paper, we present a new general form of conjugate gradient methods whose theoretical significance is attractive. With any formula β_k ≥ 0 and under weak conditions, the proposed method satisfies the sufficient descent condition independently of the line search used and the function convexity, and its global convergence can be achieved under the standard Wolfe line search or even under the standard Armijo line search. Based on this new method, convergence results for the PRP, HS, LS, Dai–Yuan-type (DY) and Conjugate–Descent-type (CD) methods are established. Preliminary numerical results show the efficiency of the proposed methods.
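One well-known way to obtain sufficient descent independently of β_k and of the line search is a three-term correction of the classical direction (cf. Zhang–Zhou–Li); the sketch below illustrates that mechanism and is not necessarily the general form proposed in the article.

```python
import numpy as np

def three_term_direction(g, d_prev, beta):
    """Three-term correction: for ANY beta, g.dot(d) = -||g||^2, so the
    sufficient descent condition holds regardless of the line search."""
    return -g + beta * d_prev - beta * (g.dot(d_prev) / g.dot(g)) * g

g, d_prev = np.array([1.0, -2.0]), np.array([0.3, 0.7])
d = three_term_direction(g, d_prev, beta=5.0)
assert np.isclose(g.dot(d), -g.dot(g))            # descent for any beta
```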

15.
For unconstrained optimization problems, a new class of three-term memory gradient algorithms is proposed. A range for the parameters is determined, under certain assumptions, which guarantees that the three-term memory gradient direction is a direction of sufficient descent for the objective function. Global convergence of the algorithm is discussed under a nonmonotone step-size search. To obtain an algorithm with better convergence properties, a new memory gradient projection algorithm is proposed by incorporating techniques from Solodov and Svaiter (2000), and it is shown to be globally convergent when the objective function is pseudoconvex.

16.
Finding global optima with a golden section method in the plane
A method is given for unconstrained global optimization. It generalizes the 0.618 (golden section) method for one-dimensional search: the method's applicability is extended from one dimension to the plane, and the original local search for unimodal functions is improved into a search for the global optimum of multimodal functions. A convergence proof is given. The outstanding advantages of the method are its wide applicability, its simplicity, and its ability to locate the optimum to arbitrary accuracy, while overcoming the large memory requirement shared by earlier direct methods. Simulation results show that the algorithm is effective.
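The one-dimensional 0.618 method that the article generalizes is sketched below for reference; the planar, multimodal extension itself is not reproduced.

```python
import numpy as np

def golden_section(f, a, b, tol=1e-8):
    """Classical golden section search for a minimum of f on [a, b]:
    keep the subinterval whose interior point has the smaller value."""
    phi = (np.sqrt(5.0) - 1.0) / 2.0              # 0.618...
    x1, x2 = b - phi * (b - a), a + phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                               # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - phi * (b - a)
            f1 = f(x1)
        else:                                     # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + phi * (b - a)
            f2 = f(x2)
    return 0.5 * (a + b)
```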

17.
刘金魁 《数学季刊》2014,(1):142-150
Y. Liu and C. Storey (1992) proposed the famous LS conjugate gradient method, which has good numerical results. However, the LS method has very weak convergence under the Wolfe-type line search. In this paper, we give a new descent gradient method based on the LS method. It guarantees the sufficient descent property at each iteration and global convergence under the strong Wolfe line search. Finally, we present extensive preliminary numerical experiments that show the efficiency of the proposed method by comparison with the famous PRP+ method.
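For reference, the Liu–Storey conjugacy parameter that the method modifies is β_k^LS = g_k^T (g_k − g_{k−1}) / (−d_{k−1}^T g_{k−1}); a direct transcription follows (the article's descent modification is not reproduced).

```python
def ls_beta(g_new, g_old, d_old):
    """Liu-Storey parameter: g_k^T (g_k - g_{k-1}) / (-d_{k-1}^T g_{k-1})."""
    return g_new.dot(g_new - g_old) / (-d_old.dot(g_old))
```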

18.
We discuss a filter-based pattern search method for unconstrained optimization in this paper. To broaden the search range, we use both the filter technique and frames, which are fragments of grids, to provide a new criterion for accepting iterates. Convergence can be ensured under some conditions. Numerical results show that this method is practical and efficient.

19.
In this paper, an adaptive nonmonotone line search method for unconstrained minimization problems is proposed. At every iteration, the new algorithm selects only one of two directions, a Newton-type direction or a negative curvature direction, to perform the line search. The nonmonotone technique is included in the backtracking line search when the Newton-type direction is the search direction. Furthermore, if the negative curvature direction is the search direction, we increase the step length under certain conditions. Global convergence to a stationary point satisfying the second-order optimality conditions is established. Some numerical results showing the efficiency of the new algorithm are reported.
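A generic sketch of how such a method might pick between the two directions is given below; the article's actual selection rule and its adaptive nonmonotone search are more elaborate, and the tolerance is an assumption.

```python
import numpy as np

def choose_direction(g, H, tol=1e-10):
    """Pick a Newton-type step when the Hessian is positive definite,
    otherwise a negative curvature direction (eigenvector of the most
    negative eigenvalue, oriented downhill). Illustrative sketch."""
    w, V = np.linalg.eigh(H)                      # eigenvalues in ascending order
    if w[0] > tol:
        return np.linalg.solve(H, -g)             # Newton direction
    d = V[:, 0]                                   # most negative curvature
    return -d if g.dot(d) > 0 else d
```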

20.
This paper discusses inequality-constrained optimization problems and gives a new feasible algorithm that combines a trust region method with the SQP method. A shrinking technique is adopted so that the search direction generated by the QP subproblem is, whenever possible, a feasible direction, and a higher-order correction is used to overcome the Maratos effect. Under suitable conditions, global convergence and superlinear convergence of the algorithm are proved. Numerical results show that the algorithm is effective.
