1.
2.
3.
4.
A nonmonotone multi-step curvilinear search method is proposed for unconstrained optimization problems. The method has the following features: (1) when generating the next iterate, it uses not only the information at the current iterate but possibly also that of the previous m iterates (the multi-step aspect); (2) the descent direction and the stepsize are determined simultaneously, rather than first fixing a direction and then finding a stepsize by line search (the curvilinear search technique); (3) a nonmonotone search strategy is adopted. Convergence of the method is proved under fairly weak conditions.
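The nonmonotone acceptance test that distinguishes such methods from ordinary Armijo backtracking can be sketched as follows. This is a minimal illustration in the Grippo-Lampariello-Lucidi style; the abstract gives no formulas, so the memory window, constants, and test problem are assumptions, not the paper's actual rule:

```python
import numpy as np

def nonmonotone_backtracking(f, x, g, d, f_hist, delta=1e-4, rho=0.5, max_iter=50):
    """Backtracking with a nonmonotone acceptance test: the trial value is
    compared against the maximum of the recent function values in f_hist
    instead of f(x) alone, which permits occasional increases in f."""
    f_ref = max(f_hist)            # reference value over the memory window
    alpha = 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) <= f_ref + delta * alpha * g.dot(d):
            return alpha           # sufficient-decrease test passed
        alpha *= rho               # shrink the step and retry
    return alpha

# Toy quadratic f(x) = 0.5 ||x||^2 with gradient g(x) = x.
f = lambda x: 0.5 * x.dot(x)
x = np.array([2.0, -1.0])
g = x.copy()
d = -g                             # steepest-descent direction
alpha = nonmonotone_backtracking(f, x, g, d, f_hist=[f(x)])
print(alpha)                       # the full step is accepted here
```

Because the reference value is a maximum over a window rather than the current value, a step that temporarily increases f can still be accepted, which is what lets nonmonotone methods escape narrow valleys.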
5.
6.
Based on a nonmonotone line search technique and the IMPBOT algorithm, an ODE-type hybrid method is proposed for unconstrained optimization problems. Its main features are: to obtain a trial step, the method solves only a system of linear equations at each iteration rather than a subproblem with a trust-region bound; when the trial step is not accepted, the method performs a modified Wolfe-type nonmonotone line search to obtain the next iterate, thereby avoiding repeated solution of the linear system. Under suitable conditions, the algorithm is shown to be globally and superlinearly convergent. Numerical results show that the method is effective.
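The abstract does not give the trial-step equation, but ODE-type methods typically obtain it by an implicit Euler step on the gradient flow, which amounts to one linear solve per iteration. The following sketch assumes that form; the matrix B (a Hessian stand-in) and the step parameter h are illustrative:

```python
import numpy as np

def ode_trial_step(B, g, h):
    """Trial step of an ODE-type method: instead of a trust-region
    subproblem, solve the linear system (I/h + B) d = -g, where h plays
    the role of a time step in the underlying gradient-flow ODE."""
    n = g.shape[0]
    return np.linalg.solve(np.eye(n) / h + B, -g)

B = np.array([[2.0, 0.0],
              [0.0, 4.0]])         # stand-in Hessian approximation
g = np.array([1.0, -2.0])
d = ode_trial_step(B, g, h=1.0)
print(g.dot(d) < 0)                # the trial step is a descent direction
```

For positive definite I/h + B this single solve replaces the iterative subproblem solution that a trust-region method would need.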
7.
An affine conjugate gradient path method for nonlinear optimization with bound constraints
This paper proposes an affine interior-point discrete conjugate gradient path method for nonlinear optimization problems with bound constraints. A preconditioned discrete conjugate gradient path is constructed to solve a quadratic model and obtain a candidate iterative direction, and the next iterate is then obtained by an interior-point backtracking line search. Under reasonable assumptions, global convergence and a local superlinear convergence rate of the algorithm are proved. Finally, numerical results demonstrate the effectiveness of the algorithm.
8.
9.
10.
11.
In this paper we present a new memory gradient method with trust region for unconstrained optimization problems. The method combines line search and trust region techniques to generate new iterates at each iteration and therefore enjoys the advantages of both. It makes full use of the iterative information from the previous multiple steps while avoiding the storage and computation of matrices associated with the Hessian of the objective function, so it is suitable for large-scale optimization problems. We also design an implementable version of this method and analyze its global convergence under weak conditions. Using more information from previous iterative steps makes it possible to design fast, effective, and robust algorithms. Numerical experiments show that the new method is effective, stable, and robust in practical computation compared with other similar methods.
12.
A new supermemory gradient algorithm for unconstrained optimization is studied. At each iteration the algorithm makes full use of the information at previous iterates to generate a descent direction, and uses a Wolfe line search to determine the stepsize. Global convergence of the algorithm is proved under fairly weak conditions. The new algorithm requires neither the computation nor the storage of matrices at each iteration, and is therefore suitable for large-scale optimization problems.
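The matrix-free direction such supermemory gradient methods build can be sketched as the negative gradient plus small multiples of the previous search directions. The fixed weight beta below is a simplified stand-in for the algorithm's actual coefficients, which the abstract does not give:

```python
import numpy as np

def supermemory_direction(g, prev_dirs, beta=0.1):
    """Illustrative supermemory-gradient direction: -g plus small multiples
    of the stored previous directions, with a steepest-descent safeguard.
    Only vectors are stored and combined; no matrix ever appears."""
    d = -g.copy()
    for dp in prev_dirs:
        d = d + beta * dp
    if g.dot(d) >= 0:              # safeguard: fall back to steepest descent
        d = -g
    return d

g = np.array([1.0, 2.0])
prev = [np.array([-0.5, -1.0]), np.array([-0.2, 0.1])]
d = supermemory_direction(g, prev)
print(g.dot(d) < 0)                # a descent direction, built from vectors only
```

The storage cost is a handful of n-vectors, which is why the abstract emphasizes suitability for large-scale problems.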
13.
We propose a new inexact line search rule and analyze the global convergence and convergence rate of the associated descent methods. The new line search rule is similar to the Armijo rule and contains it as a special case; it allows a larger stepsize in each line search procedure while maintaining the global convergence of the associated line search methods. This idea makes it possible to design new line search methods in a broader sense. In some special cases, the new descent method reduces to the Barzilai and Borwein method. Numerical results show that the new line search methods are efficient for solving unconstrained optimization problems.
The work was supported by NSF of China Grant 10171054, Postdoctoral Fund of China, and K. C. Wong Postdoctoral Fund of CAS
Grant 6765700.
The authors thank the anonymous referees for constructive comments and suggestions that greatly improved the paper.
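The Barzilai and Borwein method that the abstract's special case reduces to chooses its stepsize from the most recent step and gradient difference, with no line search at all. A minimal sketch on a diagonal quadratic; the matrix A and the two iterates are illustrative:

```python
import numpy as np

def bb_stepsize(x_prev, x, g_prev, g):
    """Barzilai-Borwein stepsize alpha = (s's)/(s'y) with s = x - x_prev
    and y = g - g_prev; it approximates an inverse-Hessian scaling using
    only two consecutive iterates and gradients."""
    s = x - x_prev
    y = g - g_prev
    return s.dot(s) / s.dot(y)

# Quadratic f(x) = 0.5 x'Ax with gradient A x; s'y > 0 is guaranteed
# whenever A is symmetric positive definite.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x_prev, x = np.array([1.0, 1.0]), np.array([0.9, 0.5])
alpha = bb_stepsize(x_prev, x, grad(x_prev), grad(x))
print(alpha)
```

Here s = (-0.1, -0.5) and y = As, so alpha = 0.26/2.51 ≈ 0.104, a value between the reciprocals of the largest and smallest eigenvalues of A.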
14.
Conjugate gradient methods are among the most popular iterative methods for solving large-scale optimization problems in scientific and engineering computation, owing to the simplicity of their iteration and their low memory requirements. It is well known that the search direction plays a central role in a line search method. In this article, we propose a new search direction combined with the Wolfe line search technique for solving unconstrained optimization problems. Under this line search and some mild assumptions, the global convergence properties of the given methods are discussed. Numerical results and comparisons with other CG methods are given.
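The abstract does not state the article's own direction formula, so the sketch below shows the classical template it builds on: the new direction is the negative gradient plus a multiple beta of the previous direction, here with the Polak-Ribière-plus choice of beta:

```python
import numpy as np

def cg_direction(g_new, g_old, d_old):
    """Classical conjugate-gradient update d+ = -g+ + beta * d, using the
    Polak-Ribiere-plus beta (negative values truncated to zero), which
    keeps the iteration matrix-free."""
    beta = max(g_new.dot(g_new - g_old) / g_old.dot(g_old), 0.0)
    return -g_new + beta * d_old

g_old = np.array([1.0, 0.0])
d_old = -g_old                      # previous direction: steepest descent
g_new = np.array([0.2, 0.6])
d = cg_direction(g_new, g_old, d_old)
print(g_new.dot(d) < 0)             # the updated direction is still descent
```

Under a Wolfe line search, guaranteeing that each such direction is a descent direction is exactly the property that the convergence analyses of CG variants revolve around.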
15.
A diagonal sparse quasi-Newton method for unconstrained optimization problems
A diagonal sparse quasi-Newton method is proposed for unconstrained optimization problems. The algorithm uses an inexact Armijo line search and, at each iteration, approximates the quasi-Newton correction matrix by a diagonal matrix, which significantly reduces the storage and work needed to compute the search direction and provides a new approach to solving large-scale unconstrained optimization problems. Under the usual assumptions, global convergence and a linear convergence rate are proved, and superlinear convergence properties are analyzed. Numerical experiments show that the algorithm is more effective than the conjugate gradient method and is suitable for large-scale unconstrained optimization problems.
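One way such a diagonal approximation can be formed is a componentwise secant estimate; the following sketch uses that form as an illustrative stand-in for the paper's actual updating formula, which the abstract does not give:

```python
import numpy as np

def diagonal_update(s, y, lo=1e-4, hi=1e4):
    """Componentwise secant estimate of a diagonal Hessian approximation,
    D_ii ~ y_i / s_i, clipped to [lo, hi] so the diagonal stays positive.
    Storing D costs only n numbers instead of an n-by-n matrix."""
    d = np.ones_like(s)
    mask = np.abs(s) > 1e-12       # skip components with a negligible step
    d[mask] = y[mask] / s[mask]
    return np.clip(d, lo, hi)

# On a quadratic f(x) = 0.5 x'Ax with diagonal A, the estimate recovers A.
A = np.diag([2.0, 5.0])
x0, x1 = np.array([1.0, 1.0]), np.array([0.5, 0.8])
s, y = x1 - x0, A @ x1 - A @ x0    # step and gradient difference
D = diagonal_update(s, y)
print(D)
```

The search direction is then simply -g / D, a componentwise division, which is why both the storage and the per-iteration work drop so sharply compared with a full quasi-Newton update.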