Similar Documents
20 similar documents found.
1.
Let B be a positive definite symmetric approximation to the second derivative matrix of the objective function of a minimization calculation. We study modifications of the BFGS method that apply a scaling technique to the columns of a conjugate direction matrix Z satisfying Z^T B Z = I. For a simple scaling technique similar to the ones considered by Powell (1987, 1989) we show that, due to a two-iteration cycle, linear convergence can occur when the method is applied to a quadratic function and Wolfe's line search is employed, although quadratic termination can be proved for exact line searches. We then suggest a different scaling technique that prevents certain columns, thought to contain important curvature information, from being scaled. For this algorithm we prove global and superlinear convergence and demonstrate the superiority of our method over the BFGS formula on a range of ill-conditioned optimization problems. Moreover, we present an implementation of our algorithm that requires only 3n^2 + O(n) multiplications per iteration.
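For context, a minimal numpy sketch of the standard BFGS update of B that this column-scaling technique builds on; the function and its curvature safeguard are an illustrative reconstruction, not code from the paper.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of a positive definite Hessian approximation B.

    s is the step x_{k+1} - x_k, y the gradient difference g_{k+1} - g_k.
    """
    sy = s @ y
    if sy <= 0:            # curvature condition fails: skip the update
        return B
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy
```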

2.
The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, to avoid possible large eigenvalues in the Hessian approximations of the objective function. It has been proved in the literature that this method is globally and superlinearly convergent when the objective function is convex (or even uniformly convex). We propose to solve unconstrained nonconvex optimization problems by a self-scaling BFGS algorithm with nonmonotone line search. Nonmonotone line search has been recognized in numerical practice as a competitive approach for solving large-scale nonlinear problems. We consider two different nonmonotone line search forms and study the global convergence of the resulting nonmonotone self-scaling BFGS algorithms. We prove that, under a weaker condition than that in the literature, both forms of the self-scaling BFGS algorithm are globally convergent for unconstrained nonconvex optimization problems.
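A minimal sketch of one step of a common self-scaling BFGS update (Oren-Luenberger scaling of the inverse Hessian approximation H); the paper's precise scaling rule and its nonmonotone acceptance test are not reproduced here.

```python
import numpy as np

def self_scaling_bfgs_update(H, s, y):
    """One self-scaling BFGS update of the inverse Hessian approximation H.

    H is first multiplied by tau = (y's)/(y'Hy) to damp large
    eigenvalues, then the usual BFGS update is applied.
    """
    sy = s @ y
    if sy <= 0:              # curvature condition violated: skip
        return H
    Hy = H @ y
    tau = sy / (y @ Hy)      # Oren-Luenberger scaling factor
    rho = 1.0 / sy
    V = np.eye(len(s)) - rho * np.outer(s, y)
    return V @ (tau * H) @ V.T + rho * np.outer(s, s)
```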

3.
In this work we present and analyze a new scaled conjugate gradient algorithm and its implementation, based on an interpretation of the secant equation and on the inexact Wolfe line search conditions. The best spectral conjugate gradient algorithm SCG of Birgin and Martínez (2001), which is mainly a scaled variant of Perry's method (1977), is modified in such a manner as to overcome the lack of positive definiteness of the matrix defining the search direction. This modification is based on the quasi-Newton BFGS updating formula. The computational scheme is embedded in the restart philosophy of Beale–Powell. The parameter scaling the gradient is selected as the spectral gradient or in an anticipative manner by means of a formula using the function values at two successive points. Under very mild conditions it is shown that, for strongly convex functions, the algorithm is globally convergent. Preliminary computational results, for a set of 500 unconstrained optimization test problems, show that this new scaled conjugate gradient algorithm substantially outperforms the spectral conjugate gradient algorithm SCG. The author was awarded the Romanian Academy Grant 168/2003.
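A sketch of the spectral choice of the scaling parameter together with a Perry-type direction, assuming the generic scaled conjugate gradient form; the paper's anticipative scaling formula and its BFGS-based positive definiteness fix are not reproduced.

```python
def scaled_cg_direction(g_new, g_old, s):
    """Sketch of a spectral scaled conjugate gradient direction.

    theta is the spectral (Rayleigh-quotient) scaling parameter
    s's / s'y; beta follows a Perry-type formula. This is the generic
    scaled-CG form, not necessarily the paper's exact variant.
    """
    y = g_new - g_old
    sy = s @ y
    if sy <= 0:                  # fall back to steepest descent
        return -g_new
    theta = (s @ s) / sy         # spectral gradient parameter
    beta = (theta * (g_new @ y) - (g_new @ s)) / sy
    return -theta * g_new + beta * s
```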

4.
Nonsmooth optimization via quasi-Newton methods
We investigate the behavior of quasi-Newton algorithms applied to minimize a nonsmooth function f, not necessarily convex. We introduce an inexact line search that generates a sequence of nested intervals containing a set of points of nonzero measure satisfying the Armijo and Wolfe conditions if f is absolutely continuous along the line. Furthermore, the line search is guaranteed to terminate if f is semi-algebraic. It seems quite difficult to establish a convergence theorem for quasi-Newton methods applied to such general classes of functions, so we give a careful analysis of a special but illuminating case, the Euclidean norm: in one variable using the inexact line search, and in two variables assuming that the line search is exact. In practice, we find that when f is locally Lipschitz and semi-algebraic with bounded sublevel sets, the BFGS (Broyden–Fletcher–Goldfarb–Shanno) method with the inexact line search almost always generates sequences whose cluster points are Clarke stationary and whose function values converge R-linearly to a Clarke stationary value. We give references documenting the successful use of BFGS in a variety of nonsmooth applications, particularly the design of low-order controllers for linear dynamical systems. We conclude with a challenging open question.
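A minimal sketch of a bracketing Armijo-Wolfe line search in the spirit described (double the step until a bracket closes, then bisect); tolerances and the handling of nonsmooth f are simplified, and the parameter names are illustrative.

```python
import numpy as np

def armijo_wolfe_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bracketing line search for the Armijo and (weak) Wolfe conditions.

    Doubles t while the bracket is open, then bisects the nested
    intervals, which contain points satisfying both conditions.
    """
    f0, g0d = f(x), grad(x) @ d
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        armijo = f(x + t * d) <= f0 + c1 * t * g0d
        wolfe = grad(x + t * d) @ d >= c2 * g0d
        if armijo and wolfe:
            return t
        if not armijo:
            hi = t               # step too long: shrink the bracket
        else:
            lo = t               # Armijo holds, Wolfe fails: grow
        t = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
    return t
```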

5.
The descent property is very important for an iterative method to be globally convergent. In this paper, we propose a way to construct sufficient descent directions for unconstrained optimization. We then apply the technique to derive a PSB (Powell-Symmetric-Broyden) based method. The PSB-based method locally reduces to the standard PSB method with unit steplength. Under appropriate conditions, we show that the PSB-based method with Armijo or Wolfe line search is globally and superlinearly convergent for uniformly convex problems. We also report numerical experiments; the results show that the PSB-based method is competitive with the standard BFGS method.
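For reference, a numpy sketch of the standard PSB update that the proposed method reduces to locally; the paper's sufficient-descent direction construction is not reproduced.

```python
import numpy as np

def psb_update(B, s, y):
    """Standard Powell-Symmetric-Broyden (PSB) update of B.

    Enforces the secant equation B_new s = y via a symmetric
    rank-two correction built from the residual r = y - B s.
    """
    r = y - B @ s          # residual of the secant equation
    ss = s @ s
    return (B + (np.outer(r, s) + np.outer(s, r)) / ss
              - ((r @ s) / ss**2) * np.outer(s, s))
```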

6.
A family of new conjugate gradient methods is proposed based on Perry's idea; the family satisfies the descent property or the sufficient descent property for any line search. In addition, based on a scaling technique and a restarting strategy, a family of scaling symmetric Perry conjugate gradient methods with restarting procedures is presented. The memoryless BFGS method and the SCALCG method are special members of the two new families, respectively. Moreover, several concrete new algorithms are suggested. Under Wolfe line searches, the global convergence of the two families is proven by spectral analysis, for both uniformly convex and nonconvex functions. Preliminary numerical comparisons with the CG_DESCENT and SCALCG algorithms show that these new algorithms are very effective for large-scale unconstrained optimization problems. Finally, a remark for further research is given.
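As an illustration of one special member of the family, a matrix-free sketch of the memoryless BFGS direction (the BFGS update of the identity applied to -g); this is the standard formula, not necessarily the paper's exact parametrization.

```python
def memoryless_bfgs_direction(g, s, y):
    """Memoryless BFGS search direction d = -H g, where H is the
    BFGS update of the identity matrix; O(n) work per call."""
    sy = s @ y
    if sy <= 0:                  # curvature fails: steepest descent
        return -g
    gs, gy = g @ s, g @ y
    return (-g + (gy / sy) * s + (gs / sy) * y
            - (1.0 + (y @ y) / sy) * (gs / sy) * s)
```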

7.
Consider the BFGS quasi-Newton method applied to a general non-convex function with continuous second derivatives. This paper constructs a four-dimensional example showing that the BFGS method need not converge. The example is perfect in the following sense: (a) all the stepsizes are exactly equal to one, and the unit stepsize is also accepted by various line searches, including the Wolfe line search and the Armijo line search; (b) the objective function is strongly convex along each search direction, although it is not strongly convex itself, and the unit stepsize is the unique minimizer of each line search function, so the example also applies to the global line search and to the line search that always picks the first local minimizer; (c) the objective function is polynomial and hence infinitely continuously differentiable. If the convexity requirement (b) on the line search function is relaxed, we are able to construct a relatively simple polynomial example.

8.
Techniques for obtaining safely positive definite Hessian approximations with self-scaling and modified quasi-Newton updates are combined to obtain "better" curvature approximations in line search methods for unconstrained optimization. It is shown that this class of methods, like the BFGS method, is globally and superlinearly convergent for convex functions. Numerical experiments with this class, using the well-known BFGS, DFP and a modified SR1 quasi-Newton updates, are presented to illustrate some advantages of the new techniques. These experiments show that the performance of several of the combined methods is substantially better than that of the standard BFGS method. Similar improvements are also obtained if a simple sufficient function reduction condition on the steplength is used instead of the strong Wolfe conditions.

9.
Optimization, 2012, 61(4): 549-570
The best spectral conjugate gradient algorithm of Birgin and Martínez (2001) [A spectral conjugate gradient method for unconstrained optimization, Applied Mathematics and Optimization, 43, 117-128], which is mainly a scaled variant of Perry's method (1977) [A class of conjugate gradient algorithms with a two-step variable metric memory, Discussion Paper 269, Center for Mathematical Studies in Economics and Management Science, Northwestern University], is modified in such a way as to overcome the lack of positive definiteness of the matrix defining the search direction. This modification is based on the quasi-Newton BFGS updating formula. The computational scheme is embedded into the restart philosophy of Beale–Powell. The parameter scaling the gradient is selected as the spectral gradient or in an anticipative way by means of a formula using the function values at two successive points. Under very mild conditions it is shown that, for strongly convex functions, the algorithm is globally convergent. Computational results and performance profiles for a set of 700 unconstrained optimization problems show that this new scaled nonlinear conjugate gradient algorithm substantially outperforms known conjugate gradient methods, including the spectral conjugate gradient SCG of Birgin and Martínez, the scaled Fletcher-Reeves and Polak-Ribière algorithms, and CONMIN of Shanno and Phua (1976) [Algorithm 500: minimization of unconstrained multivariate functions, ACM Transactions on Mathematical Software, 2, 87-94].

10.
We describe a method for constructing an n-orthogonal coordinate system in constant curvature spaces. The proposed construction is a modification of the Krichever method for producing an orthogonal coordinate system in n-dimensional Euclidean space. To demonstrate how this method works, we construct examples of orthogonal coordinate systems on the two-dimensional sphere and the hyperbolic plane in the case when the spectral curve is reducible and all irreducible components are isomorphic to a complex projective line.

11.
The linear conjugate gradient method is optimal for convex quadratic minimization due to the Krylov subspace minimization property. The introduction of the limited-memory BFGS method and the Barzilai-Borwein gradient method, however, heavily restricted the use of the conjugate gradient method for large-scale nonlinear optimization. This is, to a great extent, due to the requirement of a relatively exact line search at each iteration and the loss of conjugacy of the search directions on various occasions. By contrast, the limited-memory BFGS method and the Barzilai-Borwein gradient method share the so-called asymptotic one-stepsize-per-line-search property; namely, the trial stepsize in the method is asymptotically accepted by the line search when the iterate is close to the solution. This paper focuses on the analysis of the subspace minimization conjugate gradient method of Yuan and Stoer (1995). Specifically, by choosing the parameter in the method according to the Barzilai-Borwein idea, we are able to derive some efficient Barzilai-Borwein conjugate gradient (BBCG) methods. Initial numerical experiments show that one of the variants, BBCG3, is especially efficient among many others without line searches. This variant of the BBCG method might enjoy the asymptotic one-stepsize-per-line-search property and become a strong candidate for large-scale nonlinear optimization.
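For reference, the two classical Barzilai-Borwein trial stepsizes that drive the parameter choice; how BBCG3 combines them with the Yuan-Stoer subspace parameter is not reproduced here.

```python
def bb_stepsizes(s, y):
    """The two Barzilai-Borwein stepsizes, BB1 = s's / s'y and
    BB2 = s'y / y'y; these are the trial stepsizes that are
    asymptotically accepted without further line search."""
    sy = s @ y               # assumed positive (curvature condition)
    bb1 = (s @ s) / sy
    bb2 = sy / (y @ y)
    return bb1, bb2
```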

12.
Let Σ be a set of n-dimensional polytopes. A set Ω of n-dimensional polytopes is said to be an element set for Σ if each polytope in Σ is the union of a finite number of polytopes in Ω identified along (n − 1)-dimensional faces. In this paper, we consider n-dimensional polytopes in general and extend the notion of element sets to higher dimensions. In particular, we show that in 4-space the element number of the six convex regular polychora is at least 2, and that in n-space (n ≥ 5) the element number is 3, unless n + 1 is a square number.

13.
A new adaptive scaled Broyden-Fletcher-Goldfarb-Shanno (BFGS) method for unconstrained optimization is presented. The third term in the standard BFGS update formula is scaled in order to reduce the large eigenvalues of the approximation to the Hessian of the minimizing function. Under the inexact Wolfe line search conditions, the global convergence of the adaptive scaled BFGS method is proved under very general conditions, without assuming convexity of the minimizing function. Using 80 unconstrained optimization test functions with a medium number of variables, preliminary numerical experiments show that this variant of the scaled BFGS method is more efficient than the standard BFGS update and than some other scaled BFGS methods.
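A sketch of a BFGS update whose third (rank-one) term is scaled by a factor delta, as the abstract describes; the paper's adaptive formula for delta is not reproduced, so delta is left as a parameter here.

```python
import numpy as np

def scaled_third_term_bfgs(B, s, y, delta):
    """BFGS update with the third term y y'/(y's) scaled by delta.

    Choosing delta < 1 damps large eigenvalues of the Hessian
    approximation; the adaptive choice of delta is the paper's
    contribution and is not reconstructed here.
    """
    Bs = B @ s
    return (B - np.outer(Bs, Bs) / (s @ Bs)
              + delta * np.outer(y, y) / (s @ y))
```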

14.
In 1960, J. B. Rosen gave his famous gradient projection method in [1], but the convergence of the algorithm remained unproved for a long time. Many authors paid much attention to this problem: X. S. Zhang proved in [2] (1984) that the limit point of the sequence {x_k} generated by Rosen's algorithm is a K-T point in the 3-dimensional case, provided {x_k} is convergent, and D. Z. Du proved in [3] (1986) that Rosen's algorithm is convergent in the 4-dimensional case. In [4] (1986), the author of this paper gave a general proof of the convergence of Rosen's gradient projection method for the n-dimensional case. Since Rosen's method requires exact line search, which is very difficult to carry out on a computer, this paper presents a line search method with discrete steps and proves the convergence of the resulting algorithm.
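For context, a numpy sketch of the core of Rosen's method, the gradient projected onto the null space of the active constraint matrix; the discrete-step line search proposed in the paper is not reproduced.

```python
import numpy as np

def projected_gradient_direction(g, A):
    """Rosen-type projected gradient direction for active constraints
    A x = b: project -g onto the null space of A via
    P = I - A'(A A')^{-1} A, assuming A has full row rank."""
    AAt = A @ A.T
    P = np.eye(len(g)) - A.T @ np.linalg.solve(AAt, A)
    return -P @ g
```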

15.
This research presents a new constrained optimization approach for solving systems of nonlinear equations. Particular advantages are realized when all of the equations are convex. For example, a global algorithm for finding a zero of a convex real-valued function of one variable is developed: if the algorithm terminates finitely, then either it has computed a zero or determined that none exists; if an infinite sequence is generated, either that sequence converges to a zero or again no zero exists. For solving n-dimensional convex equations, the constrained optimization algorithm is able to determine that the system of equations has no solution. Global convergence of the algorithm is established under weaker conditions than previously known, in which case the algorithm reduces to Newton's method together with a constrained line search at each iteration. It is also shown how this approach has led to a new algorithm for solving the linear complementarity problem.
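As an illustration of globalized one-dimensional zero-finding, a safeguarded Newton-bisection sketch for a convex f bracketed by f(lo) < 0 < f(hi); this is not the paper's algorithm, which in addition can certify that no zero exists.

```python
def convex_zero(f, df, lo, hi, tol=1e-10, max_iter=100):
    """Safeguarded Newton/bisection for a zero of a convex f on [lo, hi],
    assuming f(lo) < 0 < f(hi). Newton steps are kept inside the
    bracket; otherwise the method falls back to bisection."""
    x = hi
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) <= tol:
            return x
        if fx < 0:
            lo = x                     # zero lies to the right of x
        else:
            hi = x                     # zero lies to the left of x
        d = df(x)
        x_new = x - fx / d if d != 0 else 0.5 * (lo + hi)
        if not (lo < x_new < hi):      # safeguard: bisect the bracket
            x_new = 0.5 * (lo + hi)
        x = x_new
    return x
```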

16.
This article studies a modified BFGS algorithm for solving smooth unconstrained strongly convex minimization problems. The modified BFGS method is based on the new quasi-Newton equation B_{k+1} s_k = y_k^*, where y_k^* = y_k + A_k s_k and A_k is a matrix. Wei, Li and Qi [WLQ] have proven that the average performance of two of these algorithms is better than that of the classical one. In this paper, we prove the global convergence of these algorithms under a general line search rule.
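A sketch of one common instance of such a modified secant vector, where the correction A_k s_k reduces to a scalar multiple of s_k (a Wei-Li-Qi-type formula); whether this matches the exact A_k of [WLQ] is an assumption.

```python
def modified_y(y, s, f_old, f_new, g_old, g_new):
    """Modified gradient difference y* = y + A_k s_k for the secant
    equation B_{k+1} s_k = y*; here A_k s_k is taken as the scalar
    correction theta/(s's) * s, with theta built from function and
    gradient values at the two iterates."""
    theta = 2.0 * (f_old - f_new) + (g_new + g_old) @ s
    return y + (theta / (s @ s)) * s
```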

17.
The Busemann–Petty problem asks whether origin-symmetric convex bodies in R^n with smaller areas of all central hyperplane sections necessarily have smaller n-dimensional volume. The solution was completed at the end of the 1990s: the answer is affirmative if n ≤ 4 and negative if n ≥ 5. Since the answer is negative in most dimensions, it is natural to ask what information about the volumes of central sections of two bodies does allow one to compare the n-dimensional volumes of these bodies in all dimensions. In this article we give an answer to this question in terms of certain powers of the Laplace operator applied to the section function of the body.

18.
The complex Busemann-Petty problem asks whether origin-symmetric convex bodies in C^n with smaller central hyperplane sections necessarily have smaller volume. The answer is affirmative if n ≤ 3 and negative if n ≥ 4. Since the answer is negative in most dimensions, it is natural to ask what conditions on the (n − 1)-dimensional volumes of the central sections of complex convex bodies with complex hyperplanes allow one to compare the n-dimensional volumes. In this article we give necessary conditions on the section function in order to obtain an affirmative answer in all dimensions. The result is the complex analogue of [16].

19.
This article concerns new off-diagonal estimates on the remainder and its derivatives in the pointwise Weyl law on a compact n-dimensional Riemannian manifold. As an application, we prove that, near any non-self-focal point, the scaling limit of the spectral projector of the Laplacian onto frequency windows of constant size is a normalized Bessel function depending only on n.

20.
On the Nonmonotone Line Search
The technique of nonmonotone line search has received many successful applications and extensions in nonlinear optimization. This paper provides some basic analyses of the nonmonotone line search. Specifically, we analyze nonmonotone line search methods for general nonconvex functions along several different lines. The analyses are helpful in establishing the global convergence of a nonmonotone line search method under weaker conditions on the search direction. We also explore the relations between nonmonotone line search and R-linear convergence, assuming that the objective function is uniformly convex. In addition, taking the inexact Newton method as an example, we observe a numerical drawback of the original nonmonotone line search and suggest a standard Armijo line search when the nonmonotone line search condition is not satisfied by the prior trial steplength. Numerical results show the usefulness of this suggestion for the inexact Newton method.
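A minimal sketch of the Grippo-Lampariello-Lucidi nonmonotone Armijo rule that underlies this technique, accepting a step against the maximum of the last M function values; the paper's modified rule and its Armijo fallback are not reproduced.

```python
def nonmonotone_armijo(f, x, d, g, f_hist, delta=1e-4, M=10, t=1.0):
    """GLL-style nonmonotone Armijo backtracking.

    Accepts stepsize t once f(x + t d) <= max of the last M stored
    function values plus the Armijo decrease term delta * t * g'd.
    """
    f_ref = max(f_hist[-M:])     # reference value: recent maximum
    gd = g @ d                   # directional derivative, assumed < 0
    for _ in range(60):          # cap the number of backtracks
        if f(x + t * d) <= f_ref + delta * t * gd:
            return t
        t *= 0.5
    return t
```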
