Similar Literature
Found 20 similar records (search time: 31 ms)
1.
The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, so as to avoid possibly large eigenvalues in the Hessian approximations of the objective function. It has been proved in the literature that this method is globally and superlinearly convergent when the objective function is convex (or even uniformly convex). We propose to solve unconstrained nonconvex optimization problems by a self-scaling BFGS algorithm with nonmonotone line search. Nonmonotone line search has been recognized in numerical practice as a competitive approach for solving large-scale nonlinear problems. We consider two different nonmonotone line search forms and study the global convergence of the resulting nonmonotone self-scaling BFGS algorithms. We prove that, under a condition weaker than that in the literature, both forms of the self-scaling BFGS algorithm are globally convergent for unconstrained nonconvex optimization problems.
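The nonmonotone acceptance rule such algorithms build on can be sketched concretely. Below is a minimal backtracking search using a Grippo–Lampariello–Lucidi-style condition, which compares the trial value against the maximum of recent function values rather than the current one; the function name, default constants, and history handling are illustrative, not taken from the paper.

```python
import numpy as np

def nonmonotone_armijo(f, grad_f, x, d, f_hist, c=1e-4, rho=0.5, max_backtracks=50):
    """Backtracking line search with a nonmonotone (GLL-style) condition:
    the trial point is compared against the maximum of the last few stored
    function values (f_hist) rather than against f(x) alone, so occasional
    increases in f are tolerated."""
    f_ref = max(f_hist)            # reference value: max over recent iterates
    g_dot_d = grad_f(x) @ d        # directional derivative; must be negative
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + c * alpha * g_dot_d:
            return alpha
        alpha *= rho               # backtrack
    return alpha
```

For a simple quadratic with a steepest-descent direction, the search accepts the first step length satisfying the relaxed condition.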

2.
Techniques for obtaining safely positive definite Hessian approximations with self-scaling and modified quasi-Newton updates are combined to obtain "better" curvature approximations in line search methods for unconstrained optimization. It is shown that this class of methods, like the BFGS method, is globally and superlinearly convergent for convex functions. Numerical experiments with this class, using the well-known BFGS, DFP, and a modified SR1 update, are presented to illustrate some advantages of the new techniques. These experiments show that the performance of several combined methods is substantially better than that of the standard BFGS method. Similar improvements are also obtained if the simple sufficient function reduction condition on the steplength is used instead of the strong Wolfe conditions.
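For concreteness, one common form of self-scaling rescales the inverse Hessian approximation by the Oren–Luenberger factor before applying the standard BFGS formula; the curvature condition sᵀy > 0 keeps the result positive definite. The sketch below is a generic illustration of that idea, not the specific safeguarded updates of this paper, and the skip tolerance is an assumption.

```python
import numpy as np

def self_scaling_bfgs_update(H, s, y):
    """One self-scaling BFGS update of an inverse-Hessian approximation H.
    H is first scaled by the Oren-Luenberger factor tau = s'y / (y'Hy),
    then the standard BFGS inverse update is applied."""
    sy = s @ y
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return H                      # skip the update: curvature condition fails
    Hy = H @ y
    tau = sy / (y @ Hy)               # self-scaling factor
    H = tau * H
    Hy = tau * Hy                     # rescale the cached product as well
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)
```

The updated matrix satisfies the secant condition H⁺y = s and stays positive definite whenever sᵀy > 0.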

3.
This paper presents a norm-descent approximate Gauss–Newton-based BFGS method for solving systems of nonlinear symmetric equations. The direction generated by the method is always a descent direction, regardless of the line search used. Under appropriate conditions, we prove the global and superlinear convergence of the method, and numerical results are reported.

4.

A displacement aggregation strategy is proposed for the curvature pairs stored in a limited-memory BFGS (a.k.a. L-BFGS) method such that the resulting (inverse) Hessian approximations are equal to those that would be derived from a full-memory BFGS method. This means that, if a sufficiently large number of pairs are stored, then an optimization algorithm employing the limited-memory method can achieve the same theoretical convergence properties as when full-memory (inverse) Hessian approximations are stored and employed, such as a local superlinear rate of convergence under assumptions that are common for attaining such guarantees. To the best of our knowledge, this is the first work in which a local superlinear convergence rate guarantee is offered by a quasi-Newton scheme that neither stores all curvature pairs throughout the entire run of the optimization algorithm nor stores an explicit (inverse) Hessian approximation. Numerical results are presented to show that displacement aggregation within an adaptive L-BFGS scheme can lead to better performance than standard L-BFGS.


5.
In this paper, we propose a trust region method for minimizing a function whose Hessian matrix at the solutions may be singular. The global convergence of the method is obtained under mild conditions. Moreover, we show that if the objective function is an LC^2 function, the method possesses local superlinear convergence under the local error bound condition, without requiring an isolated nonsingular solution. This is the first regularized Newton method with a trust region technique that possesses local superlinear (quadratic) convergence without the assumption that the Hessian of the objective function at the solution is nonsingular. Preliminary numerical experiments show the efficiency of the method. This work is partly supported by the National Natural Science Foundation of China (Grant Nos. 70302003, 10571106, 60503004, 70671100) and the Science Foundation of Beijing Jiaotong University (2007RC014).

6.
Convergence properties of a class of multi-directional parallel quasi-Newton algorithms for the solution of unconstrained minimization problems are studied in this paper. At each iteration these algorithms generate several different quasi-Newton directions, and then apply line searches to determine step lengths along each direction, simultaneously. The next iterate is obtained among these trial points by choosing the lowest point in the sense of function reductions. Different quasi-Newton updating formulas from the Broyden family are used to generate a main sequence of Hessian matrix approximations. Based on the BFGS and the modified BFGS updating formulas, the global and superlinear convergence results are proved. It is observed that all the quasi-Newton directions asymptotically approach the Newton direction in both direction and length when the iterate sequence converges to a local minimum of the objective function, and hence the result of superlinear convergence follows.

7.
On the limited memory BFGS method for large scale optimization
We study the numerical performance of a limited memory quasi-Newton method for large scale optimization, which we call the L-BFGS method. We compare its performance with that of the method developed by Buckley and LeNir (1985), which combines cycles of BFGS steps and conjugate direction steps. Our numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir, and is better able to use additional storage to accelerate convergence. We show that the L-BFGS method can be greatly accelerated by means of a simple scaling. We then compare the L-BFGS method with the partitioned quasi-Newton method of Griewank and Toint (1982a). The results show that, for some problems, the partitioned quasi-Newton method is clearly superior to the L-BFGS method. However, we find that for other problems the L-BFGS method is very competitive due to its low iteration cost. We also study the convergence properties of the L-BFGS method, and prove global convergence on uniformly convex problems. This work was supported by the Applied Mathematical Sciences subprogram of the Office of Energy Research, U.S. Department of Energy, under contract DE-FG02-87ER25047, and by National Science Foundation Grant No. DCR-86-02071.
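The low iteration cost of L-BFGS comes from the two-loop recursion, which applies the inverse Hessian approximation to the gradient implicitly from the m most recent curvature pairs, without ever forming a matrix. A minimal sketch follows; the default H0 = gamma*I scaling with gamma = s'y / y'y is the kind of simple scaling referred to above, and the function name is illustrative.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list, gamma=None):
    """Two-loop recursion: compute d = -H_k * grad from the stored curvature
    pairs (s_i, y_i), oldest first, without forming H_k explicitly.
    gamma scales the initial matrix H_0 = gamma * I."""
    q = grad.copy()
    rhos = [1.0 / (s @ y) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):  # newest first
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if gamma is None and s_list:                 # default H_0 scaling
        gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
    r = (gamma if gamma is not None else 1.0) * q
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r += (a - b) * s                         # oldest first
    return -r
```

With no stored pairs the recursion reduces to steepest descent, and with pairs generated by a quadratic it reproduces the action of the BFGS inverse approximation.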

8.
《Optimization》2012,61(1):85-99
In this article, we propose a BFGS method for solving symmetric nonlinear equations. The presented method possesses some favourable properties: (a) the generated sequence of iterates is norm descent; (b) the generated sequence of quasi-Newton matrices is positive definite; and (c) the method possesses global and superlinear convergence. Numerical results show that the presented method is promising.

9.
In a recent paper McCormick and Ritter consider two classes of algorithms, namely methods of conjugate directions and quasi-Newton methods, for the problem of minimizing a function of n variables F(x). They show that the former methods possess an n-step superlinear rate of convergence while the latter are every-step superlinear and therefore inherently superior. In this paper a simple and computationally inexpensive modification of a method of conjugate directions is presented. It is shown that the modified method is a quasi-Newton method and is thus every-step superlinearly convergent. It is also shown that under certain assumptions on the second derivatives of F the rate of convergence of the modified method is n-step quadratic. This work was supported by the National Research Council of Canada under Research Grant A8189.

10.
In this paper, we propose a BFGS trust-region method for solving symmetric nonlinear equations. The global convergence and the superlinear convergence of the presented method are established under favorable conditions. Numerical results show that the new algorithm is effective.

11.
In this paper, we consider Newton's method for solving the system of necessary optimality conditions of optimization problems with equality and inequality constraints. The principal drawbacks of the method are the need for a good starting point, the inability to distinguish between local maxima and local minima, and, when inequality constraints are present, the necessity to solve a quadratic programming problem at each iteration. We show that all these drawbacks can be overcome to a great extent without sacrificing the superlinear convergence rate by making use of exact differentiable penalty functions introduced by Di Pillo and Grippo (Ref. 1). We also show that there is a close relationship between the class of penalty functions of Di Pillo and Grippo and the class of Fletcher (Ref. 2), and that the region of convergence of a variation of Newton's method can be enlarged by making use of one of Fletcher's penalty functions. This work was supported by the National Science Foundation, Grant No. ENG-79-06332.

12.
A version of the conjugate Gram-Schmidt method (devised by Hestenes), which requires only function evaluations, is considered. It is shown that N-step superlinear convergence to a minimum is possible when applying this method to nonquadratic functions. This work was supported by an NPS Foundation Research Grant.

13.
In this paper, we propose a BFGS (Broyden–Fletcher–Goldfarb–Shanno)-SQP (sequential quadratic programming) method for nonlinear inequality constrained optimization. At each step, the method generates a direction by solving a quadratic programming subproblem. A good feature of this subproblem is that it is always consistent. Moreover, we propose a practical update formula for the quasi-Newton matrix. Under mild conditions, we prove the global and superlinear convergence of the method. We also present some numerical results.

14.
15.
The forward–backward splitting method (FBS) for minimizing a nonsmooth composite function can be interpreted as a (variable-metric) gradient method on a continuously differentiable function which we call the forward–backward envelope (FBE). This allows one to extend algorithms for smooth unconstrained optimization and apply them to nonsmooth (possibly constrained) problems. Since the FBE can be computed by simply evaluating forward–backward steps, the resulting methods rely on a similar black-box oracle as FBS. We propose an algorithmic scheme that enjoys the same global convergence properties as FBS when the problem is convex, or when the objective function possesses the Kurdyka–Łojasiewicz property at its critical points. Moreover, when using quasi-Newton directions the proposed method achieves superlinear convergence provided that the usual second-order sufficiency conditions on the FBE hold at the limit point of the generated sequence. Such conditions translate into milder requirements on the original function involving generalized second-order differentiability. We show that BFGS fits our framework and that the limited-memory variant L-BFGS is well suited for large-scale problems, greatly outperforming FBS or its accelerated version in practice, as well as ADMM and other problem-specific solvers. The analysis of superlinear convergence is based on an extension of the Dennis–Moré theorem for the proposed algorithmic scheme.
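A forward–backward step and the FBE value at the same point can be computed from identical oracle calls. The sketch below uses the L1 norm as a concrete stand-in for the general nonsmooth term g (so the prox is soft thresholding) and evaluates the FBE as f(x) + grad_f(x)'(x1 - x) + ||x1 - x||^2/(2*gamma) + g(x1), where x1 is the forward–backward step; function names are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal map of t * ||.||_1 (elementwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fb_step_and_envelope(x, f, grad_f, lam, gamma):
    """One forward-backward step for min f(x) + lam*||x||_1, plus the
    value of the forward-backward envelope (FBE) at x."""
    g = grad_f(x)
    x_next = soft_threshold(x - gamma * g, gamma * lam)   # forward, then prox
    diff = x_next - x
    fbe = (f(x) + g @ diff + (0.5 / gamma) * (diff @ diff)
           + lam * np.abs(x_next).sum())
    return x_next, fbe
```

At a fixed point of the forward–backward map, the FBE value coincides with the composite objective value, which is what lets smooth-optimization machinery drive the nonsmooth problem.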

16.
This work is an attempt to develop multiobjective versions of some well-known single objective quasi-Newton methods, including BFGS, self-scaling BFGS (SS-BFGS), and Huang BFGS (H-BFGS). A comprehensive and comparative study of these methods is presented in this paper. The Armijo line search is used for the implementation of these methods. The numerical results show that the Armijo rule does not work the same way for the multiobjective case as for the single objective case, because it imposes a large computational effort and significantly decreases the speed of convergence. Hence, we consider two cases of all multiobjective versions of the quasi-Newton methods: in the presence of the Armijo line search and in the absence of any line search. Moreover, the convergence of these methods without using any line search is shown under some mild conditions. Also, by introducing a multiobjective subproblem for finding the quasi-Newton multiobjective search direction, a simple representation of the Karush–Kuhn–Tucker conditions is derived. The H-BFGS quasi-Newton multiobjective optimization method provides higher-order accuracy in approximating the second-order curvature of the problem functions than the BFGS and SS-BFGS methods, and thus has some benefits compared to the other methods, as shown in the numerical results. All methods proposed in this paper are evaluated and compared with each other in different aspects. To do so, some well-known test problems and performance assessment criteria are employed. Moreover, these methods are compared with each other with regard to the expended CPU time, the number of iterations, and the number of function evaluations.

17.
Convergence analysis of a modified BFGS method on convex minimizations
A modified BFGS method is proposed for unconstrained optimization. The global convergence and the superlinear convergence on convex functions are established under suitable assumptions. Numerical results show that this method is promising.

18.
We prove the superlinear convergence of a nonmonotone BFGS algorithm on convex objective functions under suitable conditions.

19.
A family of superlinearly convergent projected quasi-Newton algorithms
By combining the gradient projection method with quasi-Newton methods, this paper presents a family of algorithms, containing two groups of parameters, for solving nonlinear programming problems with general linear constraints. Under certain conditions, the global convergence of the family and the superlinear convergence rate of one of its subfamilies are proved, and special cases such as the projected DFP method and the projected BFGS method are given.

20.
Using a modified BFGS formula, this paper proposes a BFGS trust-region method that incorporates an Armijo line search technique, and proves the global and superlinear convergence of the method under certain conditions. Preliminary numerical results show that the method is effective.
