Similar Literature
17 similar references found.
1.
A new conjugate gradient method proposed by William W. Hager and Hongchao Zhang (the HZ method for short) has been shown to be an efficient method. This paper proves the global convergence of the HZ conjugate gradient method under an Armijo-type line search. Numerical experiments show that the HZ conjugate gradient method is more efficient under the Armijo-type line search than under the Wolfe line search.
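For reference, the HZ search direction is built from the standard Hager-Zhang formula below (notation g_k = ∇f(x_k) and y_{k-1} = g_k - g_{k-1} is assumed; the Armijo-type line search analyzed in the paper only changes how the step size along d_k is chosen):

\[
d_k = -g_k + \beta_k^{HZ} d_{k-1}, \qquad
\beta_k^{HZ} = \frac{1}{d_{k-1}^{T} y_{k-1}}
\left( y_{k-1} - 2\, d_{k-1}\, \frac{\|y_{k-1}\|^{2}}{d_{k-1}^{T} y_{k-1}} \right)^{T} g_k,
\qquad y_{k-1} = g_k - g_{k-1}.
\]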

2.
Global Convergence of the Conjugate Descent Method
This paper proposes an Armijo-type line search and discusses the global convergence of the conjugate descent method under this line search; moreover, the method generates a descent search direction at every iteration.
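As background for these Armijo-type searches, here is a minimal sketch of the classical Armijo backtracking rule (the parameter names sigma, beta, and alpha0 are illustrative defaults, not the paper's choices):

```python
import numpy as np

def armijo_backtracking(f, x, d, g, sigma=1e-4, beta=0.5, alpha0=1.0, max_backtracks=50):
    """Shrink alpha until the sufficient decrease condition
    f(x + alpha*d) <= f(x) + sigma * alpha * g^T d holds."""
    fx = f(x)
    slope = float(np.dot(g, d))   # directional derivative; negative for a descent direction
    alpha = alpha0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + sigma * alpha * slope:
            break
        alpha *= beta             # backtrack
    return alpha

# Tiny usage example on f(x) = ||x||^2 with the steepest descent direction.
x = np.array([1.0, 2.0])
g = 2.0 * x
alpha = armijo_backtracking(lambda z: float(np.dot(z, z)), x, -g, g)
```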

3.
A New Class of Nonmonotone Memory Gradient Methods and Their Global Convergence
Building on the nonmonotone Armijo line search, a new nonmonotone line search is proposed and a class of memory gradient methods using this line search is studied; global convergence is proved under rather weak conditions. Compared with the nonmonotone Armijo line search, the new line search can generate a larger step size at each iteration, which yields a sufficient decrease of the objective value and reduces the computational cost of the algorithm.
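The nonmonotone Armijo baseline that this entry modifies is, in its classical Grippo-Lampariello-Lucidi form, the rule sketched below; relative to the monotone rule above, only the reference value on the right-hand side changes (the memory handling and parameters here are illustrative, not the paper's new search):

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, recent_f_values, sigma=1e-4, beta=0.5,
                       alpha0=1.0, max_backtracks=50):
    """Accept alpha once f(x + alpha*d) <= max(recent f values) + sigma*alpha*g^T d,
    where recent_f_values holds the last M objective values (M = memory length)."""
    f_ref = max(recent_f_values)      # nonmonotone reference instead of f(x_k)
    slope = float(np.dot(g, d))
    alpha = alpha0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + sigma * alpha * slope:
            break
        alpha *= beta
    return alpha
```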

4.
Lian Shujun, Wang Changyu. Mathematica Applicata, 2007, 20(1): 120-127
In this paper we discuss a family of conjugate gradient methods which can be viewed as convex combinations of the FR method and the DY method. We propose two Armijo-type line searches and discuss the global convergence of this family of conjugate gradient methods under the two line searches.
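A convex combination of the FR and DY formulas can be written as below; the FR and DY expressions are the standard ones, while the weight λ_k ∈ [0,1] and the notation y_{k-1} = g_k - g_{k-1} are stated only for illustration and need not match the paper's exact parametrization:

\[
\beta_k = (1-\lambda_k)\,\beta_k^{FR} + \lambda_k\,\beta_k^{DY},\qquad
\beta_k^{FR} = \frac{\|g_k\|^{2}}{\|g_{k-1}\|^{2}},\qquad
\beta_k^{DY} = \frac{\|g_k\|^{2}}{d_{k-1}^{T} y_{k-1}},\qquad
d_k = -g_k + \beta_k d_{k-1}.
\]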

5.
Based on the memory gradient method for unconstrained single-objective optimization, this paper proposes a memory gradient method for unconstrained multiobjective optimization problems and proves the convergence of the algorithm under the Armijo line search. Numerical results verify the effectiveness of the algorithm.

6.
A BFGS-SQP algorithm is given for LC¹ linearly constrained optimization problems, in which the step size is computed by the Armijo rule. To extend the BFGS-SQP algorithm, this paper replaces its Armijo rule with the Wolfe line search rule; the analysis shows that the global convergence and superlinear convergence of the BFGS-SQP algorithm are preserved.

7.
This paper proves the closedness of the backtracking step-size search, the Curry-Altman step-size search and its modified forms, and the Goldstein-Armijo algorithm, and points out that the true Armijo step-size search is a special case of the backtracking search, thereby confirming the closedness of the true Armijo step-size search.

8.
Global Convergence of the Hager-Zhang Conjugate Gradient Method under the Armijo Line Search
Hager and Zhang [4] proposed a new nonlinear conjugate gradient method (the HZ method for short) and proved its global convergence for strongly convex problems under the Wolfe search and the Goldstein search. However, whether the HZ method is globally convergent for nonconvex problems under the standard Armijo search remained unclear. This paper proposes a conservative HZ conjugate gradient method and proves its global convergence for nonconvex optimization problems under the Armijo line search. Some numerical results are also reported to test the effectiveness of the method.

9.
For the parameter in the search direction of super-memory gradient algorithms for unconstrained optimization, an assumption is given that determines a new range of admissible values and guarantees that the search direction is a sufficient descent direction for the objective function; a new class of memory gradient algorithms is thus proposed. Global convergence is discussed under the Armijo step-size search without assuming boundedness of the iterate sequence, and modified forms of the memory gradient method combining the FR, PR, and HS conjugate gradient formulas are given. Numerical experiments show that the new algorithm is more stable and more efficient than the FR, PR, and HS conjugate gradient methods and the super-memory gradient method under the Armijo line search.

10.
Li Meixia, Ji Fajun. Mathematica Applicata, 2008, 21(1): 213-218
In this paper, we propose a new three-term memory gradient hybrid projection algorithm with perturbation terms. A generalized Armijo line search is used, and global convergence is proved under the sole condition that the gradient is uniformly continuous on an open convex set containing the iterate sequence. Several numerical examples are given at the end.

11.
The nonmonotone line search approach is a relatively new technique for solving optimization problems. It relaxes the line search range and finds a larger step size at each iteration, so as to possibly avoid local minimizers and escape narrow curved valleys; this is helpful for finding the global minimizer of optimization problems. In this paper we develop a new modification of the matrix-free nonmonotone Armijo line search and analyze the global convergence and convergence rate of the resulting method. We also address several approaches to estimating the Lipschitz constant of the gradient of the objective function for use in line search algorithms. Numerical results show that this new modification of the Armijo line search is efficient for solving large-scale unconstrained optimization problems.
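A common local estimate of the Lipschitz constant of the gradient, sketched below, uses successive iterates and gradients; it is a standard device and not necessarily one of the estimates proposed in this paper:

```python
import numpy as np

def lipschitz_estimate(x_prev, x_curr, g_prev, g_curr, eps=1e-12):
    """Local estimate L_k ~ ||g_k - g_{k-1}|| / ||x_k - x_{k-1}||, which can be
    used, for example, to seed the initial Armijo step size as alpha0 ~ 1/L_k."""
    s = x_curr - x_prev
    y = g_curr - g_prev
    return float(np.linalg.norm(y) / max(np.linalg.norm(s), eps))
```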

12.
In this paper, we propose a new nonmonotone Armijo-type line search and prove that the MBFGS method proposed by Li and Fukushima, equipped with this new line search, converges globally for nonconvex minimization. Numerical experiments show that this nonmonotone MBFGS method is efficient on the given test problems.

13.
Line search methods are efficient descent methods for unconstrained optimization problems in which, once a descent direction has been determined, a step size must be chosen at each iteration. There are many ways to choose the step size, such as the exact line search, the Armijo line search, the Goldstein line search, and the Wolfe line search. In this paper we propose a new inexact line search for general descent methods and establish some global convergence properties. The new line search has many advantages compared with other similar inexact line searches. Moreover, we analyze the global convergence and local convergence rate of some special descent methods with the new line search. Preliminary numerical results show that the new line search is practical and efficient in computation.
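For reference, the classical inexact line search conditions named here can be stated as follows, with g_k = ∇f(x_k) and d_k a descent direction:

\[
\begin{aligned}
&\text{Armijo: } && f(x_k+\alpha d_k) \le f(x_k) + \rho\,\alpha\, g_k^{T} d_k, && 0<\rho<1,\\
&\text{Goldstein: } && f(x_k) + (1-\rho)\,\alpha\, g_k^{T} d_k \le f(x_k+\alpha d_k) \le f(x_k) + \rho\,\alpha\, g_k^{T} d_k, && 0<\rho<\tfrac12,\\
&\text{Wolfe: } && f(x_k+\alpha d_k) \le f(x_k) + \rho\,\alpha\, g_k^{T} d_k \ \text{ and }\ \nabla f(x_k+\alpha d_k)^{T} d_k \ge \sigma\, g_k^{T} d_k, && 0<\rho<\sigma<1.
\end{aligned}
\]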

14.
In this paper, we develop a new nonmonotone line search for general line search methods and establish some global convergence theorems. The new nonmonotone line search is a novel form of the nonmonotone Armijo line search and allows a larger step size to be chosen at each iteration, which is useful for constructing new line search methods and can reduce the number of function evaluations per iteration. Moreover, we analyze the convergence rate of some special line search methods with the new line search. Preliminary numerical results show that some line search methods with the new nonmonotone line search are practical and efficient in computation.

15.
The conjugate gradient method is a useful and powerful approach for solving large-scale minimization problems. Liu and Storey developed a conjugate gradient method that has good numerical performance but lacks global convergence under traditional line searches such as the Armijo, Wolfe, and Goldstein line searches. In this paper we propose a new nonmonotone line search for the Liu-Storey conjugate gradient method (LS for short). The new nonmonotone line search guarantees the global convergence of the LS method and has good numerical performance. By estimating the Lipschitz constant of the derivative of the objective function within the new nonmonotone line search, we can find an adequate step size and substantially decrease the number of function evaluations at each iteration. Numerical results show that the new approach is effective in practical computation.
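For reference, the Liu-Storey method uses the standard update below (notation g_k = ∇f(x_k) assumed); the paper's contribution is the line search wrapped around it, not a change to this formula:

\[
d_0 = -g_0, \qquad
d_k = -g_k + \beta_k^{LS} d_{k-1}, \qquad
\beta_k^{LS} = \frac{g_k^{T}(g_k - g_{k-1})}{-\,d_{k-1}^{T} g_{k-1}}.
\]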

16.
In this paper, we propose a new regularized quasi-Newton method for unconstrained optimization. At each iteration, a regularized quasi-Newton equation is solved to obtain the search direction. The step size is determined by a nonmonotone Armijo backtracking line search. An adaptive regularization parameter, updated according to the step size of the line search, is used to compute the next search direction. The presented method is proved to be globally convergent. Numerical experiments show that the proposed method is effective for unconstrained optimization and outperforms the existing regularized Newton method.
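A minimal sketch of the kind of iteration described here, with hypothetical names and an illustrative regularization update (the actual update rule and parameter values in the paper may differ):

```python
import numpy as np

def regularized_qn_direction(B, g, mu):
    """Solve the regularized quasi-Newton equation (B + mu*I) d = -g."""
    return np.linalg.solve(B + mu * np.eye(g.size), -g)

def update_mu(mu, alpha, mu_min=1e-8, mu_max=1e8):
    """Illustrative adaptive rule: a heavily backtracked step suggests a poor
    local model, so enlarge mu; a (near-)full step allows mu to shrink."""
    if alpha < 0.1:
        return min(4.0 * mu, mu_max)
    if alpha >= 0.9:
        return max(0.25 * mu, mu_min)
    return mu
```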

17.
A New Super-Memory Gradient Algorithm for Unconstrained Optimization
Shi Zhenjun. Advances in Mathematics (China), 2006, 35(3): 265-274
This paper proposes a new super-memory gradient algorithm for unconstrained optimization. The algorithm uses a linear combination of the negative gradient at the current point and the negative gradient at the previous point as the search direction, and determines the step size by the exact line search or the Armijo search. Global convergence and a linear convergence rate are proved under very weak conditions. Since the algorithm avoids storing and computing any matrices related to the objective function, it is suitable for large-scale unconstrained optimization problems. Numerical experiments show that the algorithm is more effective than ordinary conjugate gradient algorithms.
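The search direction described in this abstract has the general form below; the coefficient β_k and its admissible range are not given in the abstract, so they are left unspecified here:

\[
d_k = -\nabla f(x_k) - \beta_k\,\nabla f(x_{k-1}), \qquad \beta_k \ge 0, \qquad
x_{k+1} = x_k + \alpha_k d_k,
\]
where the step size α_k is obtained by the exact line search or the Armijo search.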

