20 similar articles found (search time: 109 ms)
1.
《数学物理学报(A辑)》2010,(6)
Based on a simple quadratic-function model combined with an inexact large-stepsize Armijo line search, a new combined trust-region and line-search algorithm for unconstrained optimization is established. Global convergence of the algorithm is proved under the condition that the gradient ∇f(x) of the objective function is uniformly continuous on R^n. Numerical examples show that the algorithm is effective and suitable for large-scale problems.
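The inexact Armijo-type line search referred to above can be sketched as a standard backtracking loop. This is a generic illustration, not the paper's specific large-stepsize variant; the constants `c` and `rho` are conventional choices.

```python
import numpy as np

def armijo_line_search(f, grad, x, d, alpha0=1.0, rho=0.5, c=1e-4, max_iter=50):
    """Backtracking Armijo line search: shrink alpha until the sufficient-
    decrease condition f(x + alpha*d) <= f(x) + c*alpha*(g.d) holds."""
    fx = f(x)
    slope = np.dot(grad(x), d)   # directional derivative; d must be a descent direction
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c * alpha * slope:
            return alpha
        alpha *= rho
    return alpha

# Minimize f(x) = ||x||^2 from x0 along the steepest-descent direction.
f = lambda x: float(np.dot(x, x))
grad = lambda x: 2.0 * x
x0 = np.array([3.0, -4.0])
alpha = armijo_line_search(f, grad, x0, -grad(x0))
```

With these choices one backtracking step suffices, since the full step overshoots the minimizer only slightly.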
2.
3.
§1. Introduction. Some optimization problems arising in practice can be treated as unconstrained, and a large number of highly effective constrained optimization algorithms involve unconstrained optimization methods; unconstrained methods are therefore of great practical importance. Consider the unconstrained optimization problem with the quadratic objective function F(X).
4.
《数学进展》2016,(2)
Using an improvement function, a nonsmooth convex constrained optimization problem is transformed into an unconstrained one, and an infeasible proximal-type quasi-Newton bundle algorithm is constructed. Notably, the objective function of the unconstrained subproblem may change as the iterations proceed (it is unchanged after a null step and updated after a descent step), so necessary adjustments must be made to guarantee convergence. Adopting the idea of the infeasible bundle method of Sagastizábal and Solodov, it is shown that, even though the iterates are not necessarily primal feasible, every accumulation point of the sequence generated by the algorithm is an optimal solution of the original problem. Furthermore, for the BFGS quasi-Newton algorithm applied to a strongly convex objective, conditions guaranteeing boundedness of the quasi-Newton matrices in the global convergence result are obtained, together with R-linear convergence of the iterate sequence.
5.
6.
7.
A new class of memory gradient methods for unconstrained optimization is proposed, and its global convergence is proved. When the objective function is uniformly convex, the linear convergence rate is analyzed. The new algorithm needs no line search for the step size during iteration; only some parameters of the algorithm need to be estimated in advance, which reduces the number of function and gradient evaluations and lowers the computational and storage cost. Numerical experiments show that the algorithm is effective.
8.
A New Class of Memory Gradient Methods and Their Global Convergence (Total citations: 1; self-citations: 0; others: 1)
Memory gradient methods for unconstrained optimization are studied. Using information from the current and previous iterates to generate descent directions yields a new class of unconstrained optimization algorithms, whose global convergence is proved under the Wolfe line search. The new algorithms are simple in structure and require no matrix computation or storage, making them suitable for large-scale optimization problems. Numerical experiments show the algorithms are effective.
9.
Global Convergence of a Generalized Quasi-Newton Method for Nonconvex Unconstrained Optimization (Total citations: 3; self-citations: 0; others: 3)
A new class of generalized quasi-Newton methods for unconstrained optimization is proposed, and global convergence for minimizing general nonconvex objective functions is proved under a class of inexact line searches.
10.
11.
Efficient line search algorithm for unconstrained optimization (Total citations: 6; self-citations: 0; others: 6)
A new line search algorithm for smooth unconstrained optimization is presented that requires only one gradient evaluation with an inaccurate line search and at most two gradient evaluations with an accurate line search. It terminates in finitely many operations and shares the same theoretical properties as standard line search rules such as the Armijo-Goldstein-Wolfe-Powell rules. The algorithm is especially appropriate when gradient evaluations are very expensive relative to function evaluations. The authors would like to thank Margaret Wright and Jorge Moré for valuable comments on earlier versions of this paper.
12.
In this paper, a new gradient-related algorithm for solving large-scale unconstrained optimization problems is proposed. The new algorithm is a line search method whose basic idea is to choose a combination of the current gradient and some previous search directions as the new search direction and to find a step size by various inexact line searches. Using more information at the current iterate may improve the performance of the algorithm, which motivates the search for new gradient algorithms that may be more effective than standard conjugate gradient methods. The notion of uniformly gradient-related directions proves useful for analyzing the global convergence of the new algorithm. Global convergence and the linear convergence rate of the new algorithm are established under diverse weak conditions. Numerical experiments show that the new algorithm converges more stably and is superior to other similar methods in many situations.
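The combined direction described here can be illustrated by a small helper that mixes the negative gradient with stored previous directions and falls back to steepest descent whenever the mix fails the descent test. The weight `beta` and the choice of which directions to store are illustrative, not the paper's rule:

```python
import numpy as np

def gradient_related_direction(g, prev_dirs, beta=0.4):
    """d = -g + beta * sum(previous directions); reset to -g whenever
    the combination fails the descent test g.d < 0."""
    d = -g + beta * sum(prev_dirs) if prev_dirs else -g
    if np.dot(g, d) >= 0:    # not a descent direction -> steepest-descent reset
        d = -g
    return d

g = np.array([1.0, 2.0])
d_keep = gradient_related_direction(g, [np.array([-1.0, -1.0])])   # combination kept
d_reset = gradient_related_direction(g, [np.array([5.0, 5.0])])    # reset triggered
```

The safeguard is what makes the directions "uniformly gradient-related" in spirit: every accepted direction has a strictly negative inner product with the gradient.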
13.
A New Supermemory Gradient Algorithm for Unconstrained Optimization (Total citations: 3; self-citations: 0; others: 3)
This paper proposes a new supermemory gradient algorithm for unconstrained optimization. The algorithm takes a linear combination of the negative gradients at the current and previous points as the search direction, with the step size determined by exact line search or Armijo search. Global convergence and a linear convergence rate are proved under very weak conditions. Since the algorithm avoids storing and computing matrices associated with the objective function, it is suitable for large-scale unconstrained optimization problems. Numerical experiments show the algorithm is more effective than standard conjugate gradient algorithms.
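As an illustration of the direction described here (a linear combination of the current and previous negative gradients), the following sketch minimizes a simple quadratic. The weight `beta = 0.3` and the fixed step size are hypothetical choices for the sketch; the paper determines the step by exact or Armijo line search.

```python
import numpy as np

def supermemory_direction(g, g_prev, beta=0.3):
    """d_k = -g_k + beta * (-g_{k-1}); fall back to -g_k if not descent."""
    d = -g - beta * g_prev
    return d if np.dot(g, d) < 0 else -g

f = lambda x: float(np.dot(x, x))
grad = lambda x: 2.0 * x

x = np.array([4.0, 3.0])
f0 = f(x)
g_prev = grad(x)                 # initialize the memory with the starting gradient
for _ in range(30):
    g = grad(x)
    d = supermemory_direction(g, g_prev)
    x = x + 0.1 * d              # fixed step, for the sketch only
    g_prev = g
```

No matrices are formed or stored at any point, which is the property the abstract highlights for large-scale problems.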
14.
Snežana S. DJORDJEVIĆ 《数学物理学报(B辑英文版)》2019,(1)
In this paper, we present a new hybrid conjugate gradient algorithm for unconstrained optimization. The method is a convex combination of the Liu-Storey and Fletcher-Reeves conjugate gradient methods. We also prove that the search direction of any hybrid conjugate gradient method formed as a convex combination of two conjugate gradient methods satisfies the well-known Dai-Liao conjugacy condition and, under a suitable condition, coincides with the Newton direction; moreover, this property does not depend on the line search. We further prove that, by modulating the value of the parameter t, the Newton-direction condition is equivalent to the Dai-Liao conjugacy condition. The strong Wolfe line search conditions are used, and global convergence of the new method is proved. Numerical comparisons show that the present hybrid conjugate gradient algorithm is efficient.
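The convex combination described here can be sketched directly from the Liu-Storey and Fletcher-Reeves formulas. Here `t = 0.5` is a hypothetical fixed mixing parameter; the paper determines t from the conjugacy/Newton-direction conditions.

```python
import numpy as np

def beta_LS(g_new, g_old, d_old):
    """Liu-Storey coefficient: g_{k+1}^T (g_{k+1} - g_k) / (-d_k^T g_k)."""
    return np.dot(g_new, g_new - g_old) / (-np.dot(d_old, g_old))

def beta_FR(g_new, g_old):
    """Fletcher-Reeves coefficient: ||g_{k+1}||^2 / ||g_k||^2."""
    return np.dot(g_new, g_new) / np.dot(g_old, g_old)

def hybrid_direction(g_new, g_old, d_old, t=0.5):
    """d = -g_{k+1} + [(1-t)*beta_LS + t*beta_FR] * d_k, with t in [0, 1]."""
    beta = (1.0 - t) * beta_LS(g_new, g_old, d_old) + t * beta_FR(g_new, g_old)
    return -g_new + beta * d_old

g_old = np.array([1.0, 0.0])
d_old = -g_old                      # first step: steepest descent
g_new = np.array([0.0, 1.0])
d = hybrid_direction(g_new, g_old, d_old)
```

In this particular example both coefficients equal 1, so the hybrid direction is the same for every t; in general the two formulas differ and t controls the blend.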
15.
Neculai Andrei 《Journal of Computational and Applied Mathematics》2010,234(12):3397-3410
New accelerated nonlinear conjugate gradient algorithms, mainly modifications of Dai and Yuan's, are proposed for unconstrained optimization. With an exact line search, the algorithm reduces to the Dai-Yuan conjugate gradient computational scheme; with an inexact line search, it satisfies the sufficient descent condition. Since step lengths in conjugate gradient algorithms may differ from 1 by two orders of magnitude and tend to vary in a very unpredictable manner, the algorithms are equipped with an acceleration scheme able to improve their efficiency. Computational results on a set of 750 unconstrained optimization test problems show that these new conjugate gradient algorithms substantially outperform the Dai-Yuan conjugate gradient algorithm and its hybrid variants, the Hestenes-Stiefel, Polak-Ribière-Polyak, and CONMIN conjugate gradient algorithms, and the limited-memory quasi-Newton algorithm L-BFGS, and compare favorably with CG_DESCENT. Within this numerical study, the accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm ASCALCG proved the most robust.
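For reference, the Dai-Yuan coefficient underlying these algorithms is beta = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)). A minimal sketch (without the paper's acceleration scheme) applies it to a convex quadratic, where the exact line-search step is available in closed form:

```python
import numpy as np

def beta_DY(g_new, g_old, d_old):
    """Dai-Yuan coefficient: ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k))."""
    return np.dot(g_new, g_new) / np.dot(d_old, g_new - g_old)

# Minimize f(x) = 0.5 x^T A x - b^T x; with exact line search, conjugate
# gradient on an n-dimensional quadratic terminates in at most n steps.
A = np.diag([2.0, 10.0])
b = np.array([2.0, 10.0])           # exact solution is x* = (1, 1)
x = np.zeros(2)
g = A @ x - b
d = -g
for _ in range(2):
    alpha = -np.dot(g, d) / np.dot(d, A @ d)   # exact step for a quadratic
    x = x + alpha * d
    g_new = A @ x - b
    d = -g_new + beta_DY(g_new, g, d) * d
    g = g_new
```

With exact line searches the Dai-Yuan scheme reproduces classical CG on a quadratic, so two iterations recover the exact minimizer here.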
16.
For the parameter in the search direction of supermemory gradient algorithms for unconstrained optimization, an assumption is given that determines a new range of values for the parameter and guarantees that the search direction is a sufficient descent direction for the objective function, yielding a new class of memory gradient algorithms. Global convergence is discussed under Armijo step-size search with the boundedness assumption on the iterate sequence removed, and modified forms of the memory gradient method combining the FR, PR, and HS conjugate gradient formulas are given. Numerical experiments show that the new algorithms are more stable and more effective than the FR, PR, and HS conjugate gradient methods and the supermemory gradient method under Armijo line search.
17.
18.
A new adaptive subspace-minimization three-term conjugate gradient algorithm with nonmonotone line search is introduced and analyzed in this paper. The search directions are computed by minimizing a quadratic approximation of the objective function on special subspaces, and an adaptive rule is proposed for choosing different search directions at each iteration. A significant conclusion is that each choice of the search direction satisfies the sufficient descent condition. With the nonmonotone line search used, we prove that the new algorithm is globally convergent for general nonlinear functions under mild assumptions. Numerical experiments show that the proposed algorithm is promising on the given test problem set.
19.
A New Class of Nonmonotone Memory Gradient Methods and Their Global Convergence (Total citations: 1; self-citations: 0; others: 1)
Building on the nonmonotone Armijo line search, a new nonmonotone line search is proposed, and a class of memory gradient methods under this line search is studied; global convergence is proved under fairly weak conditions. Compared with the nonmonotone Armijo line search, the new line search can produce larger step sizes at each iteration, so the objective function decreases sufficiently and the computational cost of the algorithm is reduced.
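The nonmonotone idea can be sketched as follows: accept a step once the trial value falls below the maximum of the last few objective values (rather than the current value) plus the usual Armijo decrease term. The history length and constants below are illustrative.

```python
import numpy as np

def nonmonotone_armijo(f, x, d, slope, recent_f, c=1e-4, rho=0.5, alpha=1.0, max_iter=50):
    """Backtrack until f(x + alpha*d) <= max(recent_f) + c*alpha*slope,
    where slope = grad(x).d < 0 and recent_f holds the last M objective values."""
    ref = max(recent_f)
    for _ in range(max_iter):
        if f(x + alpha * d) <= ref + c * alpha * slope:
            return alpha
        alpha *= rho
    return alpha

f = lambda x: float(np.dot(x, x))
x = np.array([3.0])
d = np.array([-6.0])                 # steepest descent: -grad f(x) = -2x
slope = float(np.dot(2.0 * x, d))    # = -36
# A monotone reference f(x) = 9 would reject alpha = 1 (the trial value is also 9);
# the nonmonotone reference max{9, 25} accepts it.
alpha = nonmonotone_armijo(f, x, d, slope, recent_f=[9.0, 25.0])
```

This is exactly the mechanism the abstract credits for the larger steps: the acceptance threshold is relaxed by the recent history, so fewer backtracking evaluations are needed.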
20.
A simple conic model function, in which the Hessian approximation is a scalar matrix, is constructed using the function values and gradient values of the function being minimized. Based on this conic model function, a new nonmonotone line search method is proposed, and its convergence results are proved under certain conditions. Numerical results show that the new algorithm is effective.
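A scalar Hessian approximation of the kind mentioned here can be built from successive steps and gradient differences in Barzilai-Borwein fashion, gamma = (s^T y) / (s^T s). This sketch shows only the scalar-matrix ingredient, not the paper's full conic model:

```python
import numpy as np

def scalar_hessian(s, y):
    """Scalar gamma such that gamma*I approximates the Hessian in the
    least-squares secant sense: minimize ||gamma*s - y||, which gives
    gamma = (s^T y) / (s^T s)."""
    return np.dot(s, y) / np.dot(s, s)

# On f(x) = 1.5*||x||^2 the Hessian is exactly 3*I, so gamma recovers 3.
s = np.array([1.0, 2.0])     # step s = x_{k+1} - x_k
y = 3.0 * s                  # gradient difference y = g_{k+1} - g_k
gamma = scalar_hessian(s, y)
```

Because the approximation is a single scalar, the model is cheap to build and store, which is what makes it attractive inside a line-search framework.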