Similar Documents
20 similar documents retrieved.
1.
The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, in order to avoid possibly large eigenvalues in the Hessian approximations of the objective function. It has been proved in the literature that this method is globally and superlinearly convergent when the objective function is convex (or even uniformly convex). We propose to solve unconstrained nonconvex optimization problems by a self-scaling BFGS algorithm with nonmonotone line search. Nonmonotone line search has been recognized in numerical practice as a competitive approach for solving large-scale nonlinear problems. We consider two different nonmonotone line search forms and study the global convergence of the corresponding nonmonotone self-scaling BFGS algorithms. We prove that, under a condition weaker than that in the literature, both forms of the self-scaling BFGS algorithm are globally convergent for unconstrained nonconvex optimization problems.
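For reference, a widely used form of the self-scaling BFGS update and a max-type nonmonotone Armijo condition are given below (a generic sketch, not necessarily the exact variants analysed in the paper):
\[
B_{k+1} = \tau_k\Bigl(B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k}\Bigr) + \frac{y_k y_k^{\top}}{y_k^{\top} s_k},
\qquad
\tau_k = \frac{y_k^{\top} s_k}{s_k^{\top} B_k s_k},
\]
\[
f(x_k + \alpha_k d_k) \le \max_{0 \le j \le m(k)} f(x_{k-j}) + \delta\,\alpha_k \nabla f(x_k)^{\top} d_k,
\]
with s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k).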

2.
Quasi-Newton methods in conjunction with piecewise sequential quadratic programming are investigated for solving mathematical programs with equilibrium constraints, in particular problems with complementarity constraints. Local convergence as well as superlinear convergence of these quasi-Newton methods can be established under suitable assumptions. In particular, several well-known quasi-Newton methods such as BFGS and DFP are proved to exhibit local and superlinear convergence.

3.
Optimization, 2012, 61(2): 249-263
New algorithms for solving unconstrained optimization problems are presented, based on the idea of combining two types of descent directions: the anti-gradient direction and either the Newton or a quasi-Newton direction. The use of the latter directions improves the convergence rate. Global and superlinear convergence properties of these algorithms are established. Numerical experiments on some unconstrained test problems are reported, and the proposed algorithms are compared with some existing similar methods. This comparison demonstrates the efficiency of the proposed combined methods.

4.
The convergence properties of different updating methods for the multipliers in augmented Lagrangians are considered. It is assumed that the updating of the multipliers takes place after each line search of a quasi-Newton method. Two of the updating methods are shown to be linearly convergent locally, while a third method has superlinear convergence locally. Modifications of the algorithms to ensure global convergence are considered. The results of a computational comparison with other methods are presented. This work was supported by the Swedish Institute of Applied Mathematics.
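For orientation, the classical first-order (Hestenes-Powell) multiplier update for minimizing f(x) subject to h(x) = 0 is sketched below; it is the prototypical linearly convergent scheme, which is why superlinearly convergent second-order updates are of interest. This is a generic illustration and not necessarily one of the schemes compared in the paper:
\[
L_c(x,\lambda) = f(x) + \lambda^{\top} h(x) + \tfrac{c}{2}\,\|h(x)\|^2,
\qquad
\lambda_{k+1} = \lambda_k + c\, h(x_k).
\]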

5.
The forward–backward splitting method (FBS) for minimizing a nonsmooth composite function can be interpreted as a (variable-metric) gradient method applied to a continuously differentiable function which we call the forward–backward envelope (FBE). This allows one to extend algorithms for smooth unconstrained optimization and apply them to nonsmooth (possibly constrained) problems. Since the FBE can be computed by simply evaluating forward–backward steps, the resulting methods rely on a black-box oracle similar to that of FBS. We propose an algorithmic scheme that enjoys the same global convergence properties as FBS when the problem is convex, or when the objective function possesses the Kurdyka–Łojasiewicz property at its critical points. Moreover, when using quasi-Newton directions the proposed method achieves superlinear convergence provided that the usual second-order sufficiency conditions on the FBE hold at the limit point of the generated sequence. Such conditions translate into milder requirements on the original function involving generalized second-order differentiability. We show that BFGS fits our framework and that the limited-memory variant L-BFGS is well suited for large-scale problems, greatly outperforming FBS and its accelerated version in practice, as well as ADMM and other problem-specific solvers. The analysis of superlinear convergence is based on an extension of the Dennis–Moré theorem for the proposed algorithmic scheme.
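A minimal numerical sketch of the forward-backward step and of the envelope value, assuming the illustrative composite problem 0.5*||Ax - b||^2 + lam*||x||_1; the function names and the particular choice of smooth and nonsmooth parts are assumptions, not taken from the paper:

import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward_envelope(x, A, b, lam, gamma):
    # smooth part f(x) = 0.5 * ||A x - b||^2, nonsmooth part g(x) = lam * ||x||_1
    residual = A @ x - b
    f_val = 0.5 * residual @ residual
    grad = A.T @ residual
    # forward-backward step: z = prox_{gamma*g}(x - gamma * grad f(x))
    z = soft_threshold(x - gamma * grad, gamma * lam)
    d = z - x
    # FBE value: f(x) + <grad f(x), z - x> + ||z - x||^2 / (2*gamma) + g(z)
    fbe = f_val + grad @ d + (0.5 / gamma) * (d @ d) + lam * np.abs(z).sum()
    return fbe, z

Evaluating the FBE therefore costs exactly one forward-backward step, which is why the quasi-Newton scheme described above uses the same black-box oracle as FBS.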

6.
We present a superlinearly convergent exact penalty method for solving constrained nonlinear least squares problems, in which the projected exact penalty Hessian is approximated by using a structured secant updating scheme. We give general conditions for the two-step superlinear convergence of the algorithm and prove that the projected structured Broyden–Fletcher–Goldfarb–Shanno (BFGS), Powell-symmetric-Broyden (PSB), and Davidon–Fletcher–Powell (DFP) update formulas satisfy these conditions. Then we extend the results to the projected structured convex Broyden family update formulas. Extensive testing results obtained by an implementation of our algorithms, as compared to the results obtained by several other competent algorithms, demonstrate the efficiency and robustness of the proposed approach.

7.
A Family of Superlinearly Convergent Projected Quasi-Newton Algorithms
Combining gradient projection with quasi-Newton methods, this paper presents a family of algorithms, containing two groups of parameters, for nonlinear programming problems with general linear constraints. Under certain conditions, the global convergence of the algorithm family and the superlinear convergence rate of a subfamily are proved, and some special cases such as the projected DFP and projected BFGS methods are given.

8.
Techniques for obtaining safely positive definite Hessian approximations with self-scaling and modified quasi-Newton updates are combined to obtain better curvature approximations in line search methods for unconstrained optimization. It is shown that this class of methods, like the BFGS method, has global and superlinear convergence for convex functions. Numerical experiments with this class, using the well-known quasi-Newton BFGS, DFP and a modified SR1 update, are presented to illustrate some advantages of the new techniques. These experiments show that the performance of several combined methods is substantially better than that of the standard BFGS method. Similar improvements are also obtained if the simple sufficient function reduction condition on the steplength is used instead of the strong Wolfe conditions.
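One standard device for keeping a BFGS-type update safely positive definite is Powell's damping (shown here only as a point of comparison; the paper's own modified update may differ), which replaces y_k by
\[
\bar{y}_k = \theta_k y_k + (1 - \theta_k) B_k s_k,
\qquad
\theta_k =
\begin{cases}
1, & s_k^{\top} y_k \ge 0.2\, s_k^{\top} B_k s_k,\\
\dfrac{0.8\, s_k^{\top} B_k s_k}{s_k^{\top} B_k s_k - s_k^{\top} y_k}, & \text{otherwise},
\end{cases}
\]
so that s_k^{\top}\bar{y}_k \ge 0.2\, s_k^{\top} B_k s_k > 0 and the updated matrix remains positive definite.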

9.
Optimization, 2012, 61(1): 85-99
In this article, we propose a BFGS method for solving symmetric nonlinear equations. The presented method possesses some favourable properties: (a) the generated sequence of iterates is norm descent; (b) the generated quasi-Newton matrices are positive definite; and (c) the method possesses global and superlinear convergence. Numerical results show that the presented method is effective.

10.
In this paper, we propose a BFGS (Broyden–Fletcher–Goldfarb–Shanno)-SQP (sequential quadratic programming) method for nonlinear inequality constrained optimization. At each step, the method generates a direction by solving a quadratic programming subproblem. A good feature of this subproblem is that it is always consistent. Moreover, we propose a practical update formula for the quasi-Newton matrix. Under mild conditions, we prove the global and superlinear convergence of the method. We also present some numerical results.
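For context, the standard SQP quadratic programming subproblem for min f(x) subject to g_i(x) \le 0 reads
\[
\min_{d}\ \nabla f(x_k)^{\top} d + \tfrac12\, d^{\top} B_k d
\quad \text{s.t.}\quad g_i(x_k) + \nabla g_i(x_k)^{\top} d \le 0,\ \ i = 1,\dots,m,
\]
with B_k the quasi-Newton matrix. This standard subproblem can be infeasible; the consistency guarantee mentioned in the abstract comes from the paper's modified subproblem, which is not reproduced here.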

11.
Analysis of a self-scaling quasi-Newton method
We study the self-scaling BFGS method of Oren and Luenberger (1974) for solving unconstrained optimization problems. For general convex functions, we prove that the method is globally convergent with inexact line searches. We also show that the directions generated by the self-scaling BFGS method approach Newton's direction asymptotically. This would ensure superlinear convergence if, in addition, the search directions were well-scaled, but we show that this is not always the case. We find that the method has a major drawback: to achieve superlinear convergence it may be necessary to evaluate the function twice per iteration, even very near the solution. An example is constructed to show that the step-sizes required to achieve a superlinear rate converge to 2 and 0.5 alternately. This work was supported by National Science Foundation Grant CCR-9101359, and by the Department of Energy Grant DE-FG02-87ER25047. This work was performed while the author was visiting Northwestern University.

12.
A descent method is given for minimizing a nondifferentiable function which can be locally approximated by pointwise minima of convex functions. At each iterate the algorithm finds several directions by solving several linear or quadratic programming subproblems. These directions are then used in an Armijo-like search for the next iterate. A feasible direction extension to inequality constrained minimization problems is also presented. The algorithms converge to points satisfying necessary optimality conditions which are sharper than the ones involved in convergence results for algorithms based on the Clarke subdifferential. This research was sponsored by Project 02.15.

13.
This paper develops a reduced Hessian method for solving inequality constrained optimization problems. At each iteration, the proposed method solves a quadratic subproblem, which is always feasible owing to the introduction of a slack variable, to generate a search direction, and then computes the steplength by a standard line search along this direction using the l penalty function. A new update criterion is proposed to generate the quasi-Newton matrices, whose dimensions may be variable, approximating the reduced Hessian of the Lagrangian. Global convergence is established under mild conditions. Moreover, local R-linear and superlinear convergence are shown under certain conditions.

14.
A Class of Modified BFGS Algorithms and Their Convergence Analysis
For unconstrained optimization problems, based on a local quadratic model approximation of the objective function, this paper proposes a class of modified BFGS algorithms, called MBFGS algorithms. The update formula for B_k contains a parameter θ ∈ [0, 1]; when θ = 1 it reduces to the classical BFGS formula, while for θ ∈ [0, 1) the resulting formula no longer belongs to the quasi-Newton class. Under the assumption that the objective function is uniformly convex, the global convergence and local superlinear convergence of the proposed algorithm are proved.

15.
In a recent paper McCormick and Ritter consider two classes of algorithms, namely methods of conjugate directions and quasi-Newton methods, for the problem of minimizing a function of n variables F(x). They show that the former methods possess an n-step superlinear rate of convergence while the latter are every-step superlinear and therefore inherently superior. In this paper a simple and computationally inexpensive modification of a method of conjugate directions is presented. It is shown that the modified method is a quasi-Newton method and is thus every-step superlinearly convergent. It is also shown that under certain assumptions on the second derivatives of F the rate of convergence of the modified method is n-step quadratic. This work was supported by the National Research Council of Canada under Research Grant A8189.

16.
Quasi-Newton equations play a central role in quasi-Newton methods for optimization, and various quasi-Newton equations are available. This paper gives a survey of these quasi-Newton equations and studies the properties of quasi-Newton methods with updates satisfying different quasi-Newton equations. These include single-step quasi-Newton equations that use only gradient information, single-step equations that use both gradient and function value information, and multi-step quasi-Newton equations that use gradient information from the last m steps. The main properties studied include the finite termination property, invariance, heredity of positive definite updates, consistency of search directions, global convergence and local superlinear convergence.
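For reference, the classical single-step quasi-Newton (secant) equation is
\[
B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k);
\]
modified equations replace y_k by a vector that also incorporates function values, and multi-step equations interpolate data from the last m iterates. The specific modified and multi-step forms surveyed in the paper are not reproduced here.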

17.
A Modified BFGS Algorithm for Unconstrained Optimization
In this paper we present a modified BFGS algorithm for unconstrained optimization. The BFGS algorithm updates an approximate Hessian which satisfies the most recent quasi-Newton equation. The quasi-Newton condition can be interpreted as the interpolation condition that the gradient value of the local quadratic model matches that of the objective function at the previous iterate. Our modified algorithm requires that the function value is matched, instead of the gradient value, at the previous iterate. The modified algorithm preserves the global and local superlinear convergence properties of the BFGS algorithm. Numerical results are presented, which suggest that a slight improvement has been achieved.
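To spell out the function-value interpolation condition (a sketch consistent with the abstract; the paper's exact update formula is not reproduced): requiring the quadratic model at x_{k+1},
\[
q_{k+1}(x) = f_{k+1} + g_{k+1}^{\top}(x - x_{k+1}) + \tfrac12 (x - x_{k+1})^{\top} B_{k+1} (x - x_{k+1}),
\]
to match the function value at the previous iterate, q_{k+1}(x_k) = f_k, is equivalent to
\[
s_k^{\top} B_{k+1} s_k = 2\bigl(f_k - f_{k+1} + g_{k+1}^{\top} s_k\bigr),
\]
a condition that can be enforced by rescaling y_k in the ordinary BFGS update.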

18.
We present a new matrix-free method for the computation of negative curvature directions based on the eigenstructure of minimal-memory BFGS matrices. We determine the eigenvalues of these matrices via simple formulas and compute the desirable eigenvectors in explicit form. Consequently, a negative curvature direction is computed in a way that avoids the storage and factorization of any matrix. We propose a modification of the L-BFGS method in which no information is kept from old iterations, so that memory requirements are minimal. The proposed algorithm incorporates a curvilinear path and a linesearch procedure that combines two search directions: a memoryless quasi-Newton direction and a direction of negative curvature. Results of numerical experiments for large scale problems are also presented.
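A small numerical illustration of the eigenstructure being exploited, assuming the standard memoryless BFGS update of the identity (the paper's closed-form eigenvalue and eigenvector expressions are not reproduced; the spectrum is simply checked with numpy):

import numpy as np

rng = np.random.default_rng(0)
n = 6
s = rng.standard_normal(n)   # step s_k = x_{k+1} - x_k
y = rng.standard_normal(n)   # gradient difference y_k

# memoryless BFGS update of the identity: B = I - s s^T / (s^T s) + y y^T / (y^T s)
B = np.eye(n) - np.outer(s, s) / (s @ s) + np.outer(y, y) / (y @ s)

eigvals, eigvecs = np.linalg.eigh(B)
print(eigvals)
# B - I has rank at most two, so at most two eigenvalues differ from 1 and the
# corresponding eigenvectors lie in span{s, y}; this low-rank structure is what
# makes a matrix-free, closed-form eigen-analysis possible.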

19.
In this paper, some Newton and quasi-Newton algorithms for the solution of inequality constrained minimization problems are considered. All the algorithms described produce sequences {x_k} converging q-superlinearly to the solution. Furthermore, under mild assumptions, a q-quadratic convergence rate in x is also attained. Other features of these algorithms are that only the solution of linear systems of equations is required at each iteration and that the strict complementarity assumption is never invoked. First, the superlinear or quadratic convergence rate of a Newton-like algorithm is proved. Then, a simpler version of this algorithm is studied, and it is shown that it is superlinearly convergent. Finally, quasi-Newton versions of the previous algorithms are considered and, provided the sequence defined by the algorithms converges, a characterization of superlinear convergence extending the result of Boggs, Tolle, and Wang is given. This research was supported by the National Research Program Metodi di Ottimizzazione per la Decisioni, MURST, Roma, Italy.

20.
This work is an attempt to develop multiobjective versions of some well-known single objective quasi-Newton methods, including BFGS, self-scaling BFGS (SS-BFGS), and the Huang BFGS (H-BFGS). A comprehensive and comparative study of these methods is presented in this paper. The Armijo line search is used in the implementation of these methods. The numerical results show that the Armijo rule does not work the same way for the multiobjective case as for the single objective case: it imposes a large computational effort and significantly decreases the speed of convergence. Hence, we consider two variants of all multiobjective versions of the quasi-Newton methods: with the Armijo line search and without any line search. Moreover, the convergence of these methods without any line search is shown under some mild conditions. Also, by introducing a multiobjective subproblem for finding the quasi-Newton multiobjective search direction, a simple representation of the Karush–Kuhn–Tucker conditions is derived. The H-BFGS quasi-Newton multiobjective optimization method provides higher accuracy in approximating the second-order curvature of the problem functions than the BFGS and SS-BFGS methods, and therefore has some advantages over the other methods, as shown in the numerical results. All methods proposed in this paper are evaluated and compared with each other in different respects, using some well-known test problems and performance assessment criteria, as well as in terms of CPU time, number of iterations, and number of function evaluations.
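A common formulation of the ingredients mentioned in the abstract (a generic sketch; the paper's subproblem and line search may differ in details): the quasi-Newton multiobjective search direction at x solves
\[
\min_{d \in \mathbb{R}^n}\ \max_{i=1,\dots,m}\ \bigl\{\nabla f_i(x)^{\top} d + \tfrac12\, d^{\top} B_i d\bigr\},
\]
and the multiobjective Armijo rule accepts a step length \alpha only when
\[
f_i(x + \alpha d) \le f_i(x) + \sigma\, \alpha\, \nabla f_i(x)^{\top} d \quad \text{for all } i = 1,\dots,m,
\]
which illustrates the extra cost noted above: every backtracking trial requires evaluating all m objectives.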
