Similar Literature
20 similar articles found (search time: 0 ms)
1.
Globally Convergent Algorithms for Unconstrained Optimization (cited by 2)
A new globalization strategy for solving an unconstrained minimization problem is proposed, based on the idea of combining Newton's direction and the steepest descent direction within each iteration. Global convergence is guaranteed from an arbitrary initial point. The search direction in each iteration is chosen to be as close to Newton's direction as possible, and may be Newton's direction itself. Asymptotically the Newton step is taken in each iteration, so the local convergence is quadratic. Numerical experiments are also reported. A possible combination of a quasi-Newton direction with the steepest descent direction is also considered in our numerical experiments, and the differences between the proposed strategy and several other strategies are discussed.
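The core idea — blending the Newton direction with steepest descent inside each iteration — can be sketched as follows. This is a minimal illustration, not the paper's exact switching rule; the descent test and the Armijo backtracking constants are illustrative assumptions.

```python
import numpy as np

def combined_newton_descent(f, grad, hess, x0, tol=1e-8, max_iter=100):
    """Minimize f by preferring the Newton direction, falling back to
    steepest descent when the Newton step is unavailable or not a
    sufficient descent direction (a sketch of the general strategy)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        try:
            d = np.linalg.solve(hess(x), -g)  # Newton direction
            # Accept it only if it is (numerically) a descent direction.
            if g @ d > -1e-10 * np.linalg.norm(g) * np.linalg.norm(d):
                d = -g
        except np.linalg.LinAlgError:
            d = -g  # singular Hessian: steepest descent
        # Armijo backtracking line search.
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x = x + t * d
    return x
```

On a strictly convex quadratic the Newton step is always accepted, so the minimizer is found in one iteration, consistent with the asymptotically quadratic rate described above.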

2.
The study of quasi-Newton methods for minimizing general objective functions, and of their global convergence, has become one of the most fundamental open problems in the theory of quasi-Newton methods. This paper investigates the problem further: a new class of generalized quasi-Newton algorithms for unconstrained optimization is proposed and, combined with the Goldstein line search, is proved to be globally convergent for minimizing general nonconvex objective functions.

3.
Combining a nonmonotone trust region method with a nonmonotone line search technique, a new algorithm for unconstrained optimization is proposed. A line search is performed at every trust region step, so each iteration achieves sufficient descent and the iteration is accelerated. Under suitable conditions, the algorithm is proved to be globally convergent with a locally superlinear convergence rate. Numerical experiments show that the algorithm is very effective.

4.
In this paper, the non-quasi-Newton family with inexact line search applied to unconstrained optimization problems is studied. A new update formula for the non-quasi-Newton family is proposed. It is proved that the resulting algorithm, with either a Wolfe-type or an Armijo-type line search, converges globally and Q-superlinearly if the function to be minimized has a Lipschitz continuous gradient.

5.
We attempt to exploit the information provided by objective function values in limited-memory algorithms. First, using interpolation conditions, a new quadratic approximation of the objective function is constructed, yielding a new weak secant equation. Combining this weak secant equation with that of Yuan [1], we obtain a family of limited-memory BFGS-type algorithms that includes the standard L-BFGS, and prove the convergence of this family. Numerical experiments on test functions selected from the standard test library CUTE show that the numerical performance of the whole family is similar to that of standard L-BFGS.

6.
By applying a two-parameter Broyden-like update formula, a new approach is provided for studying the open problem of the convergence, for general objective functions, of quasi-Newton algorithms for unconstrained optimization.

7.
Trust region (TR) algorithms are a class of recently developed algorithms for nonlinear optimization. A new family of TR algorithms for unconstrained optimization, which extends the usual TR method, is presented in this paper. When the objective function is bounded below and continuously differentiable, and the norm of the Hessian approximations grows at most linearly with the iteration number, we prove the global convergence of the algorithms. Limited numerical results are reported, which indicate that our new TR algorithm is competitive.

8.
The BFGS method is the most effective of the quasi-Newton methods for solving unconstrained optimization problems. Wei, Li, and Qi [16] have proposed some modified BFGS methods based on the new quasi-Newton equation B_{k+1} s_k = y_k^*, where y_k^* is the sum of y_k and A_k s_k, and A_k is some matrix. The average performance of Algorithm 4.3 in [16] is better than that of the BFGS method, but its superlinear convergence had remained open. This article proves the superlinear convergence of Algorithm 4.3 under some suitable conditions.
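For reference, the standard BFGS update that the modified quasi-Newton equation alters is sketched below. The modified methods of [16] would pass y_k^* = y_k + A_k s_k in place of y_k; the specific matrix A_k is the paper's and is not reproduced here. A quick check confirms that the update satisfies the secant equation B_{k+1} s_k = y_k.

```python
import numpy as np

def bfgs_update(B, s, y):
    """One standard BFGS update of the Hessian approximation:
    B+ = B - (B s s^T B) / (s^T B s) + (y y^T) / (y^T s).
    The modified methods in the abstract would supply y* = y + A s here
    instead of y. Requires the curvature condition y^T s > 0."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
```

By construction the new matrix maps s to y exactly, which is the defining property of any quasi-Newton (secant) update.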

9.
We consider quasidifferentiable functions in the sense of Demyanov and Rubinov, i.e., functions that are directionally differentiable and whose directional derivative can be expressed as a difference of two sublinear functions, so that the subdifferential, called the quasidifferential, consists of a pair of sets. For these functions a generalized gradient algorithm is proposed. Its behaviour is studied in detail for the special class of continuously subdifferentiable functions. Numerical test results are given. Finally, the general quasidifferentiable case is simulated by means of perturbed subdifferentials, where we make use of the non-uniqueness of the quasidifferential representation.

10.
This paper proposes a first-order curve method for unconstrained optimization and gives a global convergence result. For first-order curves defined by differential equations, the notion of asymptotic convergence exponent is introduced, and the asymptotic convergence exponents of several commonly used curve models are computed.

11.
A New Super-Memory Gradient Algorithm for Unconstrained Optimization (cited by 3)
Shi Zhenjun, Advances in Mathematics (《数学进展》), 2006, 35(3): 265-274
This paper proposes a new super-memory gradient algorithm for unconstrained optimization. The algorithm takes a linear combination of the negative gradient at the current point and the negative gradient at the previous point as the search direction, with the step size determined by an exact line search or an Armijo search. Under very weak conditions the algorithm is proved to be globally convergent with a linear convergence rate. Because the algorithm avoids storing and computing matrices associated with the objective function, it is suitable for large-scale unconstrained optimization problems. Numerical experiments show that the algorithm is more effective than ordinary conjugate gradient algorithms.
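A minimal sketch of the direction described here — a linear combination of the current and previous negative gradients, with Armijo backtracking. The fixed weight `beta` and the descent safeguard are illustrative assumptions, not the paper's rule for choosing the combination coefficients.

```python
import numpy as np

def memory_gradient(f, grad, x0, beta=0.3, tol=1e-6, max_iter=5000):
    """Memory-gradient sketch: search along a combination of the current
    and previous negative gradients. No matrix is stored or factored,
    which is what makes this kind of method attractive at large scale."""
    x = np.asarray(x0, dtype=float)
    g_prev = grad(x)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -(1 - beta) * g - beta * g_prev   # combine the two negative gradients
        if g @ d >= 0:                        # safeguard: fall back to steepest descent
            d = -g
        t = 1.0                               # Armijo backtracking
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        g_prev = g
        x = x + t * d
    return x
```

Only two gradient vectors are kept between iterations, so the memory cost is O(n) regardless of problem size.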

12.
This paper proposes a new family of quasi-Newton methods for solving unconstrained optimization problems and proves the following property: under exact line search, the search directions and iterates generated at each step by all methods in the family depend only on the parameter ρ. The family can be viewed as a generalization of the Huang family of quasi-Newton methods.

13.
Based on fractional (rational) approximation, this paper proposes a new class of secant methods for single-variable unconstrained optimization, and proves that the convergence order of the method is √2 + 1. The performance of the new method is further analyzed, and numerical experiments compare the new method with the classical Newton method and other modified secant-type methods on single-variable unconstrained optimization problems. Both the theoretical and the numerical results show that the new secant method is effective.
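For comparison, the classical secant iteration applied to f′ — the usual baseline such methods are measured against — is sketched below. Its convergence order is the golden ratio (1 + √5)/2 ≈ 1.618, which the paper's fractional-approximation variant improves to √2 + 1 ≈ 2.414.

```python
def secant_minimize(df, x0, x1, tol=1e-10, max_iter=100):
    """Classical secant iteration on the derivative df = f' to find a
    stationary point of a one-variable function. Each step interpolates
    df linearly through the two most recent points."""
    f0, f1 = df(x0), df(x1)
    for _ in range(max_iter):
        if abs(f1) < tol or f1 == f0:
            break
        # RHS is evaluated with the old values before reassignment.
        x0, f0, x1 = x1, f1, x1 - f1 * (x1 - x0) / (f1 - f0)
        f1 = df(x1)
    return x1
```

When f′ is linear (f quadratic), a single secant step lands exactly on the stationary point, since the secant line coincides with f′ itself.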

14.
The filled function method is an important deterministic algorithm for global optimization; the key to the method is constructing a filled function with good properties. A new filled function for unconstrained global optimization is constructed. The function is continuously differentiable and contains only one parameter. By analyzing its properties, a corresponding algorithm is designed. Numerical experiments show that the algorithm is simple and effective.

15.
In this paper, we present a rank-one well-posed method for unconstrained optimization with the following desirable properties: the updated matrices are symmetric positive definite, and under suitable conditions the method is globally convergent for nonconvex functions. Numerical results are also reported.

16.
A Comparative Study of Four Unconstrained Optimization Algorithms (cited by 1)
From the viewpoint of numerical experiments, four algorithms — the conjugate gradient method, the BFGS quasi-Newton method, the DFP quasi-Newton method, and the truncated Newton method — are compared on three test problems (one of which is a constructed example of variable size). Analysis of the test results shows that the truncated Newton method has an advantage in solving large-scale optimization problems, providing a useful reference for research on large-scale optimization algorithms.

17.
Unconstrained Optimization Reformulations of Variational Inequality Problems (cited by 12)
Recently, Peng considered a merit function for the variational inequality problem (VIP), which constitutes an unconstrained differentiable optimization reformulation of VIP. In this paper, we generalize the merit function proposed by Peng and study various properties of the generalized function. We call this function the D-gap function. We give conditions under which any stationary point of the D-gap function is a solution of VIP and conditions under which it provides a global error bound for VIP. We also present a descent method for solving VIP based on the D-gap function.

18.
An Efficient Hybrid Conjugate Gradient Method for Unconstrained Optimization (cited by 22)
Recently, we proposed a nonlinear conjugate gradient method that produces a descent search direction at every iteration and converges globally provided that the line search satisfies the weak Wolfe conditions. In this paper, we study methods related to this new nonlinear conjugate gradient method. Specifically, if the scalar β_k, relative to the one in the new method, belongs to some interval, then the corresponding methods are proved to be globally convergent; otherwise, we are able to construct a convex quadratic example showing that the methods need not converge. Numerical experiments are made for two combinations of the new method and the Hestenes–Stiefel conjugate gradient method. The initial results show that one of the hybrid methods is especially efficient for the given test problems.
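One common way to hybridize the Hestenes–Stiefel formula with a globally convergent scheme is to truncate β_k^{HS} into the interval [0, β_k^{DY}]. The sketch below illustrates that idea — it is not the authors' exact scheme, and it uses a simple Armijo backtracking search with a descent safeguard in place of the weak Wolfe conditions assumed in the abstract.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=10000):
    """Hybrid nonlinear conjugate gradient sketch: beta_k is the
    Hestenes-Stiefel value truncated into [0, beta_k^DY]."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = 1.0                                  # Armijo backtracking
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        # beta_HS = (g+ . y)/(d . y), beta_DY = ||g+||^2/(d . y); truncate at 0.
        beta = max(0.0, min(g_new @ y, g_new @ g_new) / dy) if dy > 0 else 0.0
        d = -g_new + beta * d
        if g_new @ d > -1e-4 * (g_new @ g_new):  # insufficient descent: restart
            d = -g_new
        x, g = x_new, g_new
    return x
```

The safeguard guarantees every accepted direction is a sufficient descent direction, which is what makes the backtracking loop well defined.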

19.
We consider multi-step quasi-Newton methods for unconstrained optimization. These methods were introduced by Ford and Moghrabi (Appl. Math., vol. 50, pp. 305–323, 1994; Optimization Methods and Software, vol. 2, pp. 357–370, 1993), who showed how interpolating curves could be used to derive a generalization of the Secant Equation (the relation normally employed in the construction of quasi-Newton methods). One of the most successful of these multi-step methods makes use of the current approximation to the Hessian to determine the parameterization of the interpolating curve in the variable-space and, hence, the generalized updating formula. In this paper, we investigate new parameterization techniques involving the approximate Hessian, in an attempt to determine a better Hessian approximation at each iteration and, thus, improve the numerical performance of such algorithms.

20.
We present a family of local-search-based heuristics for Quadratic Unconstrained Binary Optimization (QUBO), all of which start with a (possibly fractional) initial point, sequentially improving its quality by rounding or switching the value of one variable, until arriving at a local optimum. The effects of various parameters on the efficiency of these methods are analyzed through computational experiments carried out on thousands of randomly generated problems having 20 to 2500 variables. Tested on numerous benchmark problems, the performance of the most competitive variant (ACSIOM) was shown to compare favorably with that of other published procedures. G. Tavares was supported in part by the Portuguese FCT and by the FSE in the context of the III Quadro Comunitário de Apoio. P. L. Hammer: our co-author and friend, Dr. Peter L. Hammer, died in a tragic car accident while the final version of this paper was being prepared.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号