A total of 20 similar references were retrieved.
1.
Liying Liu Shengwei Yao Zengxin Wei 《Journal of Computational and Applied Mathematics》2008,220(1-2):422-438
In this paper, a new nonmonotone MBFGS algorithm for unconstrained optimization is proposed. Under suitable assumptions, the global and superlinear convergence of the new nonmonotone MBFGS algorithm on convex objective functions is established. Numerical experiments show that the new nonmonotone MBFGS algorithm is competitive with the MBFGS algorithm and with the nonmonotone BFGS algorithm.
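For orientation, the nonmonotone line searches used in methods of this kind typically accept a step whenever the new function value lies below the maximum of the last few iterates rather than below the current one. The following is a minimal sketch of a Grippo-Lampariello-Lucidi style nonmonotone Armijo backtracking rule; the function names and parameter values are illustrative assumptions, not the exact rule of this paper.

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, f_hist, M=10, sigma=1e-4, rho=0.5, max_backtracks=30):
    """Accept alpha if f(x + alpha*d) <= max(last M f-values) + sigma*alpha*g'd.

    Generic nonmonotone (GLL-type) backtracking; not the exact rule of the
    cited MBFGS paper.
    """
    f_ref = max(f_hist[-M:])          # nonmonotone reference value
    gd = float(np.dot(g, d))          # directional derivative, assumed negative
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + sigma * alpha * gd:
            return alpha
        alpha *= rho
    return alpha                      # fall back to the last trial step
```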
2.
In this paper, a new nonmonotone BFGS algorithm for unconstrained optimization is introduced. Under mild conditions, the global convergence of this new algorithm on convex functions is proved. Some numerical experiments show that this new nonmonotone BFGS algorithm is competitive with the BFGS algorithm.
3.
《Optimization》2012,61(4):981-992
In this paper, we consider a trust-region method for solving nonlinear equations which employs a new nonmonotone technique. A strong nonmonotone strategy and a weaker nonmonotone strategy can both be obtained by choosing a parameter adaptively, so the disadvantages of the traditional nonmonotone strategy are avoided. The method does not need to compute the Jacobian matrix at every iteration, which reduces the computational work and running time. Theoretical analysis shows that the new algorithm preserves global convergence under classical assumptions, and superlinear and quadratic convergence are established under suitable conditions. Numerical experiments demonstrate the efficiency and effectiveness of the proposed method for solving nonlinear equations.
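In a nonmonotone trust-region framework of this kind, the actual-versus-predicted reduction ratio is measured against a reference value that blends the current function value with the maximum over recent iterations. The sketch below illustrates one such reference value; the weighting rule and parameter names are assumptions for illustration, not the adaptive choice analysed in the paper.

```python
def nonmonotone_ratio(f, x, step, f_hist, pred_reduction, eta=0.5, M=10):
    """Trust-region ratio test with a nonmonotone reference value.

    eta close to 1 gives a strong nonmonotone strategy, eta close to 0 an
    almost monotone one; the blending rule here is a hypothetical example.
    """
    f_max = max(f_hist[-M:])                        # worst recent value
    f_ref = eta * f_max + (1.0 - eta) * f_hist[-1]  # nonmonotone reference
    actual_reduction = f_ref - f(x + step)
    return actual_reduction / pred_reduction        # large ratio: accept step, expand radius
```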
4.
《Optimization》2012,61(1):85-99
In this article, we propose a BFGS method for solving symmetric nonlinear equations. The presented method possesses some favourable properties: (a) the generated sequence of iterates is norm descent; (b) the generated sequence of quasi-Newton matrices is positive definite; and (c) the method possesses global and superlinear convergence. Numerical results show that the presented method performs well.
5.
BABAIE-KAFAKI Saman 《Science China Mathematics》2011,(9)
By making a convex combination of the modified secant equations proposed by Yuan and Wei et al., a hybrid secant equation, and with it a modified BFGS algorithm, is proposed. The hybridization parameter is computed effectively using the information available from recent iterations. Under proper conditions, it is shown that the proposed algorithm is globally, locally and superlinearly convergent. Using the performance profile introduced by Dolan and Moré, a comparison between the implementations of the proposed algori...
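The core construction is a convex combination of two secant vectors that then enters the ordinary BFGS formula. A minimal sketch is given below; how the hybridization parameter theta is computed from recent-iteration information is not reproduced here, and the skip tolerance is an assumption.

```python
import numpy as np

def hybrid_bfgs_update(B, s, y1, y2, theta):
    """BFGS update with a hybrid secant vector y = theta*y1 + (1-theta)*y2,
    where y1 and y2 come from two (modified) secant equations.

    theta in [0, 1] is assumed to be supplied by some hybridization rule."""
    y = theta * y1 + (1.0 - theta) * y2
    Bs = B @ s
    sBs = float(s @ Bs)
    ys = float(y @ s)
    if ys <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return B                      # skip the update to preserve positive definiteness
    return B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / ys
```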
6.
Hong Xia YIN Dong Lei DU 《Acta Mathematica Sinica (English Series)》2007,23(7):1233-1240
The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, in order to avoid possibly large eigenvalues in the Hessian approximations of the objective function. It has been proved in the literature that this method is globally and superlinearly convergent when the objective function is convex (or even uniformly convex). We propose to solve unconstrained nonconvex optimization problems by a self-scaling BFGS algorithm with nonmonotone line search. Nonmonotone line search has been recognized in numerical practice as a competitive approach for solving large-scale nonlinear problems. We consider two different nonmonotone line search forms and study the global convergence of the resulting nonmonotone self-scaling BFGS algorithms. We prove that, under a condition weaker than that in the literature, both forms of the self-scaling BFGS algorithm are globally convergent for unconstrained nonconvex optimization problems.
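The scaling itself is a single scalar applied to the current approximation before the rank-two correction. The following sketch shows an Oren-Luenberger style scaled BFGS update as a generic illustration; the particular scaling and line search of this paper are not reproduced.

```python
import numpy as np

def self_scaling_bfgs_update(B, s, y):
    """Scaled BFGS update: B is multiplied by tau = y's / s'Bs before the
    usual correction, damping overly large eigenvalues of B (generic sketch)."""
    Bs = B @ s
    sBs = float(s @ Bs)
    ys = float(y @ s)
    if ys <= 0.0 or sBs <= 0.0:
        return B                              # skip when curvature information is unusable
    tau = ys / sBs                            # self-scaling factor
    return tau * (B - np.outer(Bs, Bs) / sBs) + np.outer(y, y) / ys
```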
7.
Motivated by the advantages of nonmonotone line search in seeking optimal solutions of optimization problems, a new class of nonmonotone cautious BFGS algorithms is proposed. Unlike existing methods, the algorithmic parameter that controls the degree of nonmonotonicity is not fixed; instead, its value is adjusted automatically using available information on the objective function and its gradient, so as to improve the numerical behaviour of the algorithm. Under suitable assumptions, the global convergence of the new nonmonotone cautious BFGS algorithm is established. The algorithm was tested using benchmark optimization problems...
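Two ingredients mentioned here can be sketched concretely: a cautious (Li-Fukushima type) condition that decides whether the BFGS matrix is updated at all, and an adaptive choice of the nonmonotonicity parameter. The adaptation rule below is purely hypothetical, used only to illustrate the idea of adjusting the parameter from recent function information rather than fixing it.

```python
import numpy as np

def cautious_update_ok(s, y, g, eps=1e-6, alpha=1.0):
    """Cautious BFGS test: update B only if the measured curvature
    y's/||s||^2 exceeds eps*||g||^alpha; otherwise keep B unchanged."""
    return float(y @ s) / float(s @ s) >= eps * np.linalg.norm(g) ** alpha

def adaptive_nonmonotone_memory(f_hist, m_min=1, m_max=10):
    """Hypothetical adaptation rule: use a longer nonmonotone memory when the
    objective has been decreasing steadily, a shorter one otherwise."""
    if len(f_hist) < 2:
        return m_min
    decreases = sum(f_hist[i + 1] < f_hist[i] for i in range(len(f_hist) - 1))
    frac = decreases / (len(f_hist) - 1)
    return int(round(m_min + frac * (m_max - m_min)))
```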
8.
On the superlinear local convergence of a filter-SQP method
Transition to superlinear local convergence is shown for a modified version of the trust-region filter-SQP method for nonlinear programming introduced by Fletcher, Leyffer, and Toint [8]. In this modification, the original trust-region SQP steps can be used without an additional second-order correction. The main change consists in using the Lagrangian function value instead of the objective function value in the filter, together with an appropriate infeasibility measure. Moreover, it is shown that the modified trust-region filter-SQP method has the same global convergence properties as the original algorithm in [8]. Mathematics Subject Classification (2000): 90C55, 65K05, 90C30
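For readers unfamiliar with filter methods, a trial point is accepted when it is not dominated by any pair already stored in the filter. The sketch below states the standard acceptance test with the Lagrangian value phi in place of the objective, as described above; the margin parameters beta and gamma are illustrative assumptions.

```python
def acceptable_to_filter(phi, h, filter_entries, beta=0.99, gamma=1e-4):
    """A trial point (phi, h) is acceptable if, against every stored entry
    (phi_j, h_j), it sufficiently improves either the Lagrangian value phi
    or the infeasibility measure h (generic filter test, sketch only)."""
    return all(phi <= phi_j - gamma * h_j or h <= beta * h_j
               for (phi_j, h_j) in filter_entries)
```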
9.
Nonsmooth Equation Based BFGS Method for Solving KKT Systems in Mathematical Programming
Li D. H. Yamashita N. Fukushima M. 《Journal of Optimization Theory and Applications》2001,109(1):123-167
In this paper, we present a BFGS method for solving a KKT system in mathematical programming, based on a nonsmooth-equation reformulation of the KKT system. We successively split the nonsmooth equation into equivalent equations with a particular structure. Based on this splitting, we develop a BFGS method in which the subproblems are systems of linear equations with symmetric positive-definite coefficient matrices. A suitable line search is introduced under which the generated iterates exhibit an approximate norm descent property. The method is well defined and, under suitable conditions, converges to a KKT point globally and superlinearly without any convexity assumption on the problem.
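As background, one common way to recast a KKT system as a nonsmooth system of equations is to replace each complementarity condition by an NCP function such as the Fischer-Burmeister function. The sketch below shows this generic reformulation for orientation; it is not necessarily the particular splitting used in the paper, and the function arguments are assumptions.

```python
import numpy as np

def fischer_burmeister(a, b):
    """phi(a, b) = sqrt(a^2 + b^2) - a - b; phi = 0 iff a >= 0, b >= 0, a*b = 0."""
    return np.sqrt(a * a + b * b) - a - b

def kkt_residual(x, lam, grad_f, jac_g, g):
    """Nonsmooth residual of the KKT system of min f(x) s.t. g(x) <= 0:
    stationarity of the Lagrangian plus one complementarity equation per
    constraint (generic reformulation, sketch only)."""
    stationarity = grad_f(x) + jac_g(x).T @ lam
    complementarity = fischer_burmeister(-g(x), lam)   # encodes g<=0, lam>=0, lam'g=0
    return np.concatenate([stationarity, complementarity])
```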
10.
This paper discusses the global convergence of a class of nonmonotone conjugate gradient methods (NM methods) for nonconvex objective functions. This class of methods includes the nonmonotone counterparts of the modified Polak-Ribière method and the modified Hestenes-Stiefel method as special cases.
11.
A globally convergent BFGS method with nonmonotone line search for non-convex minimization
In this paper, we propose a modified BFGS (Broyden–Fletcher–Goldfarb–Shanno) method with nonmonotone line search for unconstrained optimization. Under some mild conditions, we show that the method is globally convergent without a convexity assumption on the objective function. We also report some preliminary numerical results to show the efficiency of the proposed method.
12.
In this paper, a modified limited memory BFGS method for solving large-scale unconstrained optimization problems is proposed. A notable feature of the proposed method is that it possesses the global convergence property without a convexity assumption on the objective function. Under some suitable conditions, the global convergence of the proposed method is proved. Some numerical results are reported which illustrate that the proposed method is efficient.
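Limited-memory BFGS methods of this type compute the search direction from the most recent curvature pairs by the standard two-loop recursion, sketched below; the modifications that let the cited method dispense with convexity are not reproduced here.

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns -H*g built from the stored
    (s, y) pairs, with the usual scaling H0 = (s'y / y'y) * I (generic sketch)."""
    q = g.astype(float).copy()
    stack = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest to oldest
        rho = 1.0 / float(y @ s)
        a = rho * float(s @ q)
        q -= a * y
        stack.append((rho, a, s, y))
    gamma = float(s_list[-1] @ y_list[-1]) / float(y_list[-1] @ y_list[-1]) if s_list else 1.0
    r = gamma * q
    for rho, a, s, y in reversed(stack):                   # oldest to newest
        b = rho * float(y @ r)
        r += (a - b) * s
    return -r
```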
13.
A successive quadratic programming algorithm with global and superlinear convergence properties
Masao Fukushima 《Mathematical Programming》1986,35(3):253-264
This paper presents a successive quadratic programming algorithm for solving general nonlinear programming problems. In order to avoid the Maratos effect, direction-finding subproblems are derived by modifying the second-order approximations to both the objective and constraint functions of the problem. We prove that the algorithm possesses global and superlinear convergence properties. This work was supported in part by a Scientific Research Grant-in-Aid from the Ministry of Education, Science and Culture, Japan.
14.
Ph. L. Toint 《Mathematical Programming》1986,36(3):290-306
Global convergence is proved for a partitioned BFGS algorithm when applied to a partially separable problem with a convex decomposition. This case covers a known practical optimization method for large-dimensional unconstrained problems. Inexact solution of the linear system defining the search direction and variants of the steplength rule are also shown to be acceptable without affecting the global convergence properties.
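In the partitioned setting, the objective is a sum of element functions, each depending on only a few variables, and each element keeps its own small quasi-Newton matrix. The sketch below shows how such element matrices could be assembled into a full approximation; the index sets and data layout are assumptions for illustration.

```python
import numpy as np

def assemble_partitioned_hessian(n, elements):
    """Partially separable f(x) = sum_i f_i(x[U_i]): each element i carries a
    small BFGS matrix B_i on its own variables U_i, and the full approximation
    is obtained by scattering the element contributions (sketch only).

    'elements' is a list of (U_i, B_i) with U_i a list of variable indices."""
    B = np.zeros((n, n))
    for idx, Bi in elements:
        B[np.ix_(idx, idx)] += Bi      # scatter element curvature into the full matrix
    return B
```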
15.
In this paper, an active set limited BFGS algorithm is proposed for bound constrained optimization. Global convergence is established under some suitable conditions. Numerical results show that the given method is effective.
16.
An active set limited memory BFGS algorithm for large-scale bound constrained optimization is proposed. The active sets are estimated by an identification technique, and the search direction is determined by a lower-dimensional system of linear equations in the free subspace. Implementations of the method on the CUTE test problems are described, which show the efficiency of the proposed algorithm. The work was supported by 973 Project grant 2004CB719402 and NSF of China grant 10471036.
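A common way to estimate the active set for bound constraints is to mark variables that sit (numerically) at a bound with the gradient pointing outward, and to restrict the quasi-Newton step to the remaining free variables. The sketch below shows such an estimate; the tolerance and the rule itself are assumptions, not necessarily the identification technique of the paper.

```python
import numpy as np

def estimate_active_set(x, g, lower, upper, eps=1e-8):
    """Estimate active bounds for l <= x <= u: a variable is taken as active
    if it is (numerically) at a bound and the gradient pushes it outward.
    The limited memory BFGS step is then computed on the free variables."""
    at_lower = (x <= lower + eps) & (g > 0)
    at_upper = (x >= upper - eps) & (g < 0)
    active = at_lower | at_upper
    return active, ~active            # boolean masks: active set, free subspace
```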
17.
We extend the Mizuno-Todd-Ye predictor-corrector algorithm for solving monotone linear complementarity problems. We prove that the extended algorithm is globally Q-linearly convergent and solves problems with integer data of bitlength L in at most O(√n L) iterations. We also prove that the duality gap converges to zero Q-superlinearly for problems having strictly complementary solutions. Our results generalize the results obtained by Ye, Tapia, and Zhang for linear programming.
18.
Since 1965, there has been significant progress in the theoretical study of quasi-Newton methods for solving nonlinear equations, especially in local convergence analysis. However, studies of the global convergence of quasi-Newton methods are relatively scarce, especially for the BFGS method. To ensure global convergence, some merit function, such as the squared norm merit function, is typically used. In this paper, we propose an algorithm for solving nonlinear monotone equations which combines the BFGS method and the hyperplane projection method. We prove that the proposed BFGS method converges globally if the equation is monotone and Lipschitz continuous, without any differentiability requirement on the equation, which makes it possible to solve some nonsmooth equations. An attractive property of the proposed method is that its global convergence is independent of any merit function. We also report some numerical results to show the efficiency of the proposed method.
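The hyperplane projection idea (in the style of Solodov and Svaiter) can be stated in a few lines: a trial point along the quasi-Newton direction defines a hyperplane separating the current iterate from the solution set of the monotone equation, and the next iterate is the projection of the current one onto that hyperplane. The sketch below illustrates the generic projection step only, not the full algorithm of the paper.

```python
import numpy as np

def hyperplane_projection_step(x, d, F, alpha):
    """Given a trial point z = x + alpha*d with F(z)'(x - z) > 0, project x
    onto the hyperplane {w : F(z)'(w - z) = 0} separating x from the
    solutions of the monotone equation F(w) = 0 (generic sketch)."""
    z = x + alpha * d
    Fz = F(z)
    return x - (float(Fz @ (x - z)) / float(Fz @ Fz)) * Fz
```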
19.
This paper studies an algorithm for minimizing a convex function based upon a combination of polyhedral and quadratic approximation. The method was given earlier, but without a good specification for updating the algorithm's curvature matrix. Here, for the case of one-dimensional minimization, we provide a specification that ensures convergence even in cases where the curvature scalar tends to zero or infinity. Under mild additional assumptions, we show that the convergence is superlinear.
20.
Janos Karatson 《Applications of Mathematics》2005,50(3):277-290
A mesh-independent bound is given for the superlinear convergence of the CGM for preconditioned self-adjoint linear elliptic problems using suitable equivalent operators. The results rely on K-condition numbers and related estimates for compact Hilbert-Schmidt operators in Hilbert space. This research was supported by the Hungarian National Research Fund OTKA under grant No. T043765.