Similar Literature (20 results)
1.
Multi-step quasi-Newton methods for optimization
Quasi-Newton methods update, at each iteration, the existing Hessian approximation (or its inverse) by means of data deriving from the step just completed. We show how “multi-step” methods (employing, in addition, data from previous iterations) may be constructed by means of interpolating polynomials, leading to a generalization of the “secant” (or “quasi-Newton”) equation. The issue of positive-definiteness in the Hessian approximation is addressed and shown to depend on a generalized version of the condition which is required to hold in the original “single-step” methods. The results of extensive numerical experimentation indicate strongly that computational advantages can accrue from such an approach (by comparison with “single-step” methods), particularly as the dimension of the problem increases.
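
For context, the following minimal Python sketch shows the ordinary single-step BFGS update and the classical secant equation B_new s = y that the multi-step approach generalizes; the function name and the toy quadratic are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard single-step BFGS update of a Hessian approximation B.

    Enforces the classical secant (quasi-Newton) equation B_new @ s = y,
    which multi-step methods generalize using data from several past steps.
    Positive definiteness is preserved only when the curvature condition
    s @ y > 0 holds.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Tiny check on a quadratic f(x) = 0.5 * x @ A @ x, whose true Hessian is A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
x0, x1 = np.array([1.0, 1.0]), np.array([0.4, 0.7])
s, y = x1 - x0, A @ x1 - A @ x0          # step and gradient difference
B = bfgs_update(np.eye(2), s, y)
print(np.allclose(B @ s, y))             # secant equation holds: True
```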

2.
Recent progress in unconstrained nonlinear optimization without derivatives
We present an introduction to a new class of derivative-free methods for unconstrained optimization. We start by discussing the motivation for such methods and why they are in high demand by practitioners. We then review the past developments in this field, before introducing the features that characterize the newer algorithms. In the context of a trust region framework, we focus on techniques that ensure a suitable “geometric quality” of the considered models. We then outline the class of algorithms based on these techniques, as well as their respective merits. We finally conclude the paper with a discussion of open questions and perspectives. Current reports available by anonymous ftp from the directory “pub/reports” on thales.math.fundp.ac.be. WWW: http://www.fundp.ac.be/~phtoint/pht/publications.html.

3.
A quasi-Newton trust-region method
The classical trust-region method for unconstrained minimization can be augmented with a line search that finds a point that satisfies the Wolfe conditions. One can use this new method to define an algorithm that simultaneously satisfies the quasi-Newton condition at each iteration and maintains a positive-definite approximation to the Hessian of the objective function. This new algorithm has strong global convergence properties and is robust and efficient in practice.
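
As a rough illustration of one ingredient involved (not the paper's algorithm itself), the sketch below checks the weak Wolfe conditions for a candidate step; a step accepted by these conditions gives s @ y > 0, which is what allows a positive-definite quasi-Newton update. The function name, constants, and test problem are assumptions.

```python
import numpy as np

def satisfies_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Check the (weak) Wolfe conditions for a step alpha along direction d.

    Sufficient decrease:  f(x + a d) <= f(x) + c1 * a * grad(x) @ d
    Curvature:            grad(x + a d) @ d >= c2 * grad(x) @ d
    A Wolfe step guarantees s @ y > 0, so a BFGS update stays positive definite.
    """
    g0_d = grad(x) @ d
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g0_d
    curvature = grad(x + alpha * d) @ d >= c2 * g0_d
    return armijo and curvature

# Illustrative use on a simple quadratic with a steepest-descent direction.
f = lambda x: 0.5 * x @ x
grad = lambda x: x
x = np.array([2.0, -1.0])
print(satisfies_wolfe(f, grad, x, -grad(x), alpha=1.0))   # True: this step lands on the minimizer
```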

4.
A conjugate-gradient optimization method which is invariant to nonlinear scaling of a quadratic form is introduced. The technique has the property that the search directions generated are identical to those produced by the classical Fletcher-Reeves algorithm applied to the quadratic form. The approach enables certain nonquadratic functions to be minimized in a finite number of steps. Several examples which illustrate the efficacy of the method are included.
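
For reference, a minimal sketch of the classical Fletcher-Reeves iteration on a quadratic form with an exact line search is given below; it is not the scaling-invariant method of the paper, and the helper name and test data are assumptions.

```python
import numpy as np

def fletcher_reeves_quadratic(A, b, x, n_iter=None):
    """Classical Fletcher-Reeves CG on the quadratic 0.5 * x @ A @ x - b @ x.

    Uses the exact line search a_k = -g_k @ d_k / (d_k @ A @ d_k); the search
    directions then coincide with linear CG, and the minimizer is reached in
    at most n steps (n = number of variables).
    """
    n_iter = n_iter or len(b)
    g = A @ x - b
    d = -g
    for _ in range(n_iter):
        alpha = -(g @ d) / (d @ A @ d)        # exact step along d
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves beta
        d = -g_new + beta * d
        g = g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(np.allclose(fletcher_reeves_quadratic(A, b, np.zeros(2)), np.linalg.solve(A, b)))  # True
```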

5.
The equal-inner-product decomposition algorithm for a positive-definite matrix with respect to a vector is applied to quasi-Newton algorithms for unconstrained optimization. A new algorithm is proposed in which the search direction is determined from the equal-inner-product decomposition matrix of the updated matrix, together with new iteration formulas that are equivalent to the DFP and BFGS update formulas.

6.
It has become customary to compare the performance of unconstrained optimization algorithms on families of extended symmetric test functions. In this paper, results are presented which indicate that the performance of the variable metric algorithm on such functions is greatly distorted by rounding errors that destroy the special nature of these functions. A simple method of overcoming this difficulty is demonstrated, and it confirms the theoretical result that the number of iterations required to solve such problems is independent of the dimension. This research was supported by the Science and Engineering Research Council.
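
For illustration only, the sketch below implements one common extended test function (the extended Rosenbrock function), whose separable structure is the kind of special property that rounding errors can destroy; the specific test family used in the paper may differ.

```python
import numpy as np

def extended_rosenbrock(x):
    """Extended Rosenbrock function, a typical 'extended' test problem.

    The n-dimensional function (n even) is a sum of independent 2-dimensional
    Rosenbrock terms, so in exact arithmetic a variable-metric method should
    need the same number of iterations regardless of n; rounding errors can
    break this separable structure in practice.
    """
    x = np.asarray(x, dtype=float)
    x_odd, x_even = x[0::2], x[1::2]
    return np.sum(100.0 * (x_even - x_odd**2) ** 2 + (1.0 - x_odd) ** 2)

print(extended_rosenbrock(np.ones(10)))        # 0.0 at the minimizer (1, ..., 1)
print(extended_rosenbrock(np.full(10, -1.2)))  # value at the point (-1.2, ..., -1.2)
```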

7.
Many practical optimization problems involve nonsmooth (that is, not necessarily differentiable) functions of thousands of variables. In the paper [Haarala, Miettinen, Mäkelä, Optimization Methods and Software, 19, (2004), pp. 673–692] we have described an efficient method for large-scale nonsmooth optimization. In this paper, we introduce a new variant of this method and prove its global convergence for locally Lipschitz continuous objective functions, which are not necessarily differentiable or convex. In addition, we give some encouraging results from numerical experiments.

8.
We wish to examine the conjugate gradient and quasi-Newton minimization algorithms. A relation noted by Nazareth is extended to an algorithm in which conjugate gradient and quasi-Newton search directions occur together and which can be interpreted as a conjugate gradient algorithm with a changing metric.
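
One standard way to make such a relation concrete (a hedged illustration, not necessarily the construction used in the paper) is the "memoryless" BFGS direction obtained by updating the identity matrix with only the most recent step, which reduces to a Hestenes-Stiefel conjugate-gradient direction under exact line searches.

```python
import numpy as np

def memoryless_bfgs_direction(g_new, s, y):
    """Search direction d = -H g_new, where H is the BFGS update of the identity
    matrix using only the most recent step s and gradient change y
    ('memoryless' BFGS, in the spirit of Shanno/Nazareth).

    With exact line searches (g_new @ s == 0) this direction coincides with the
    Hestenes-Stiefel conjugate-gradient direction, one way to read a conjugate
    gradient method as a quasi-Newton method with a changing metric.
    """
    n = len(s)
    rho = 1.0 / (y @ s)
    V = np.eye(n) - rho * np.outer(s, y)
    H = V @ V.T + rho * np.outer(s, s)
    return -H @ g_new

# Compare with the Hestenes-Stiefel direction when g_new is orthogonal to s.
s = np.array([1.0, 0.0, 0.0])
y = np.array([2.0, 1.0, 0.5])
g_new = np.array([0.0, 1.0, -1.0])            # g_new @ s == 0 (exact line search)
d_prev = s                                    # previous direction (up to the step length)
beta_hs = (g_new @ y) / (d_prev @ y)
print(np.allclose(memoryless_bfgs_direction(g_new, s, y), -g_new + beta_hs * d_prev))  # True
```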

9.
Recent attempts to assess the performance of SSVM algorithms for unconstrained minimization problems differ in their evaluations from earlier assessments. Nevertheless, the new experiments confirm earlier observations that, on certain types of problems, the SSVM algorithms are far superior to other variable metric methods. This paper presents a critical review of these recent assessments and discusses some current interpretations advanced to explain the behavior of SSVM methods. The paper examines the new empirical results, in light of the original self-scaling theory, and introduces a new interpretation of these methods based on an L-function model of the objective function. This interpretation sheds new light on the performance characteristics of the SSVM methods, which contributes to the understanding of their behavior and helps in characterizing classes of problems which can benefit from the self-scaling approach. The subject of this paper was presented at the ORSA/TIMS National Meeting in New York, 1978. This work was done while the author was with the Analysis Research Group, Xerox Palo Alto Research Center, Palo Alto, California.

10.
Optimization, 2012, 61(10): 1717-1727

In this paper, we present a class of approximating matrices as a function of a scalar parameter that includes the Davidon-Fletcher-Powell and Broyden-Fletcher-Goldfarb-Shanno methods as special cases. A powerful iterative descent method for finding a local minimum of a function of several variables is described. The new method maintains the positive definiteness of the approximating matrices. For a region in which the function depends quadratically on the variables, no more than n iterations are required, where n is the number of variables. A set of computational results verifying the superiority of the new method is presented.
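
A minimal sketch of such a one-parameter (Broyden) family of updates is given below; phi = 0 and phi = 1 recover BFGS and DFP respectively. The parametrization shown is the standard textbook one and is only assumed to correspond to the class described in the paper.

```python
import numpy as np

def broyden_family_update(B, s, y, phi):
    """One-parameter (Broyden) family of Hessian-approximation updates.

    phi = 0 gives BFGS and phi = 1 gives DFP; every member satisfies the
    secant equation B_new @ s = y, and for s @ y > 0 the update preserves
    positive definiteness for phi in [0, 1].
    """
    Bs = B @ s
    sBs = s @ Bs
    bfgs = B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / (y @ s)
    v = y / (y @ s) - Bs / sBs
    return bfgs + phi * sBs * np.outer(v, v)

s = np.array([0.5, -0.2, 1.0])
y = np.array([1.0, 0.1, 0.8])             # curvature condition s @ y > 0
B = np.eye(3)
for phi in (0.0, 0.5, 1.0):                # BFGS, a blend, DFP
    print(np.allclose(broyden_family_update(B, s, y, phi) @ s, y))   # True each time
```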

11.
The paper compares a factorized sparse quasi-Newton update of Goldfarb with a nonfactorized BFGS sparse update of Shanno on a series of test problems, with numerical results strongly favoring the unfactorized update. Analysis of Goldfarb's method is done to explain the poor numerical performance. Two specific conjugate gradient methods for solving the required systems of linear equations with the unfactorized update are described and tested. This research was supported by the National Science Foundation under Grant No. MCS-77-07327.
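
For reference, a plain linear conjugate-gradient solver of the kind that can be applied to the linear systems mentioned above is sketched below; it is a generic textbook routine, not one of the two specific methods tested in the paper.

```python
import numpy as np

def conjugate_gradient_solve(A, b, tol=1e-10, max_iter=None):
    """Plain (unpreconditioned) linear conjugate gradient for A x = b, A SPD.

    This is the kind of inner iteration one can use for the linear systems
    arising from an unfactorized sparse quasi-Newton update, since it needs
    only products A @ v rather than a factorization of A.
    """
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter or len(b)):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 0.5], [0.0, 0.5, 2.0]])
b = np.array([1.0, -2.0, 0.5])
print(np.allclose(conjugate_gradient_solve(A, b), np.linalg.solve(A, b)))  # True
```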

12.
Two new methods for unconstrained optimization are presented. Both methods employ a hybrid direction strategy which is a modification of Powell's 1970 dogleg strategy. They also employ a projection technique introduced by Davidon in his 1975 algorithm which uses projection images of x and g in updating the approximate Hessian. The first method uses Davidon's optimally conditioned update formula, while the second uses only the BFGS update. Both methods performed well without Powell's special iterations and singularity safeguards, and the numerical results are very promising. This research was supported by the National Science Foundation under Grant No. GJ-40903.
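
The sketch below shows a textbook dogleg step for the trust-region subproblem, to make the "hybrid direction strategy" idea concrete; it is neither Powell's 1970 strategy nor the authors' modification, and the function name and example data are assumptions.

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Textbook dogleg step for the trust-region subproblem
    min_d  g @ d + 0.5 * d @ B @ d   subject to  ||d|| <= delta   (B positive definite).

    The step follows the steepest-descent (Cauchy) segment and then bends toward
    the full quasi-Newton step; hybrid direction strategies blend the two in a
    similar spirit.
    """
    d_newton = -np.linalg.solve(B, g)
    if np.linalg.norm(d_newton) <= delta:
        return d_newton                                   # full step fits in the region
    d_cauchy = -(g @ g) / (g @ B @ g) * g                 # unconstrained steepest-descent minimizer
    if np.linalg.norm(d_cauchy) >= delta:
        return -delta * g / np.linalg.norm(g)             # truncated steepest-descent step
    # Otherwise walk from the Cauchy point toward the Newton point until ||d|| = delta.
    diff = d_newton - d_cauchy
    a, b_, c = diff @ diff, 2 * (d_cauchy @ diff), d_cauchy @ d_cauchy - delta**2
    tau = (-b_ + np.sqrt(b_**2 - 4 * a * c)) / (2 * a)
    return d_cauchy + tau * diff

g = np.array([1.0, 2.0])
B = np.array([[2.0, 0.0], [0.0, 1.0]])
print(np.round(dogleg_step(g, B, delta=2.0), 3))          # a dogleg step of length 2
```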

13.
A conjecture of Dixon relating to the behaviour of variable metric methods on functions with special symmetry is validated under suitable conditions. The relation between Huang's class and Oren's class is explored. Then the equivalence of Davidon's and Oren and Spedicato's approaches to optimal conditioning is demonstrated.

14.
This paper introduces an algorithm for convex minimization which includes quasi-Newton updates within a proximal point algorithm that depends on a preconditioned bundle subalgorithm. The method uses the Hessian of a certain outer function which depends on the Jacobian of a proximal point mapping which, in turn, depends on the preconditioner matrix and on a Lagrangian Hessian relative to a certain tangent space. Convergence is proved under boundedness assumptions on the preconditioner sequence. Research supported by NSF Grant No. DMS-9402018 and by Institut National de Recherche en Informatique et en Automatique, France.

15.
The quasi-Newton method is a well-known and effective method for solving optimization problems. Since it is a line search method, requiring a line search procedure once a search direction has been determined at each iteration, a line search rule must be chosen to select the step size along that direction. In this paper, we propose a new inexact line search rule for the quasi-Newton method and establish some global convergence results for this method. These results are useful in designing new quasi-Newton methods. Moreover, we analyze the convergence rate of the quasi-Newton method with the new line search rule.
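
As a point of reference (not the new rule proposed in the paper), the sketch below shows standard Armijo backtracking, the simplest common inexact line search; the function name, constants, and example objective are illustrative assumptions.

```python
import numpy as np

def armijo_backtracking(f, grad, x, d, alpha0=1.0, rho=0.5, c1=1e-4, max_iter=50):
    """Standard Armijo backtracking, one common inexact line search rule.

    Shrinks the step until f(x + a d) <= f(x) + c1 * a * grad(x) @ d.
    This only illustrates the general role of an inexact line search in a
    quasi-Newton method; the paper proposes a different rule.
    """
    fx, slope = f(x), grad(x) @ d
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c1 * alpha * slope:
            return alpha
        alpha *= rho
    return alpha

f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
x = np.array([0.0, 0.0])
d = -grad(x)                                    # a descent direction
print(armijo_backtracking(f, grad, x, d))       # an acceptable step size
```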

16.
The effect of nonlinearly scaling the objective function on the variable-metric method is investigated, and Broyden's update is modified so that a property of invariance to the scaling is satisfied. A new three-parameter class of updates is generated, and criteria for an optimal choice of the parameters are given. Numerical experiments compare the performance of a number of algorithms of the resulting class. The author is indebted to Professor S. S. Oren, Economic Engineering Department, Stanford University, Stanford, California, for stimulating discussions during the development of this paper. He also recognizes the financial support by the National Research Council of Italy (CNR) for his stay at Stanford University.

17.
This paper presents a quasi-Newton-type algorithm for nonconvex multiobjective optimization. In this algorithm, the iterations are repeated until the termination condition is met, namely when a suitable descent direction can no longer be found. Under suitable assumptions, global convergence is established.

18.
A three-parameter family of nonlinear conjugate gradient methods

In this paper, we propose a three-parameter family of conjugate gradient methods for unconstrained optimization. The three-parameter family of methods not only includes the already existing six practical nonlinear conjugate gradient methods, but also subsumes some other families of nonlinear conjugate gradient methods as its subfamilies. With Powell's restart criterion, the three-parameter family of methods with the strong Wolfe line search is shown to ensure the descent property of each search direction. Some general convergence results are also established for the three-parameter family of methods. This paper can also be regarded as a brief review of nonlinear conjugate gradient methods.
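
For orientation, the sketch below lists the beta formulas of the six classical practical conjugate gradient methods (FR, PRP, HS, DY, CD, LS) that such a multi-parameter family reproduces as special cases; the family's own parametrization is not reproduced here, and the helper name and example vectors are assumptions.

```python
import numpy as np

def cg_betas(g_new, g_old, d_old):
    """The six classical nonlinear conjugate-gradient beta formulas.

    These are the practical methods a multi-parameter family of conjugate
    gradient methods can recover as special parameter choices; the next
    direction is d_new = -g_new + beta * d_old.
    """
    y = g_new - g_old
    return {
        "FR": (g_new @ g_new) / (g_old @ g_old),       # Fletcher-Reeves
        "PRP": (g_new @ y) / (g_old @ g_old),          # Polak-Ribiere-Polyak
        "HS": (g_new @ y) / (d_old @ y),               # Hestenes-Stiefel
        "DY": (g_new @ g_new) / (d_old @ y),           # Dai-Yuan
        "CD": -(g_new @ g_new) / (d_old @ g_old),      # Conjugate Descent (Fletcher)
        "LS": -(g_new @ y) / (d_old @ g_old),          # Liu-Storey
    }

g_old = np.array([1.0, -2.0])
g_new = np.array([0.3, 0.4])
d_old = -g_old
for name, beta in cg_betas(g_new, g_old, d_old).items():
    print(name, round(beta, 4))
```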



19.
It is well known that the sufficient descent condition is very important to the global convergence of the nonlinear conjugate gradient method. In this paper, some modified conjugate gradient methods which possess this property are presented. The global convergence of these proposed methods with the weak Wolfe–Powell (WWP) line search rule is established for nonconvex functions under suitable conditions. Numerical results are reported. This work is supported by Guangxi University SF grant X061041 and China NSF grant 10761001.
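
A minimal sketch of the sufficient descent condition itself is given below; the constant c and the example vectors are illustrative assumptions.

```python
import numpy as np

def sufficient_descent(g, d, c=0.1):
    """Check the sufficient descent condition  g @ d <= -c * ||g||**2.

    Directions satisfying this uniformly (for a fixed c > 0) are what modified
    conjugate gradient methods of the kind described above are designed to
    guarantee, independently of the line search.
    """
    return g @ d <= -c * (g @ g)

g = np.array([1.0, -2.0])
print(sufficient_descent(g, -g))                    # steepest descent: True (for c <= 1)
print(sufficient_descent(g, np.array([1.0, 0.0])))  # not a sufficient descent direction: False
```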

20.
Smoothed penalty algorithms for optimization of nonlinear models
We introduce an algorithm for solving nonlinear optimization problems with general equality and box constraints. The proposed algorithm is based on smoothing of the exact l1-penalty function and solving the resulting problem by any box-constrained optimization method. We introduce a general algorithm and present theoretical results for updating the penalty and smoothing parameter. We apply the algorithm to optimization problems for nonlinear traffic network models and report on numerical results for a variety of network problems and different solvers for the subproblems.
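
To make the idea concrete, the sketch below uses one common smoothing of |t|, namely sqrt(t^2 + mu^2) - mu, inside an l1-type penalty for equality constraints; the paper's particular smoothing function and its parameter update rules are not reproduced here, and all names and data are illustrative assumptions.

```python
import numpy as np

def smoothed_abs(t, mu):
    """A standard smooth approximation of |t|:  sqrt(t**2 + mu**2) - mu.

    As mu -> 0 this tends to |t| uniformly; it is one of several smoothings
    that can replace the kinks of an exact l1 penalty term.
    """
    return np.sqrt(t**2 + mu**2) - mu

def smoothed_l1_penalty(x, constraints, r, mu):
    """Penalty part r * sum_i smoothed_abs(c_i(x)) of a smoothed exact-l1 merit
    function for equality constraints c_i(x) = 0 (the objective f(x) would be
    added to this in a full merit function)."""
    return r * sum(smoothed_abs(c(x), mu) for c in constraints)

# Example: penalize violation of the single constraint x0 + x1 - 1 = 0.
constraints = [lambda x: x[0] + x[1] - 1.0]
x = np.array([2.0, 0.5])
for mu in (1.0, 0.1, 0.001):
    print(round(smoothed_l1_penalty(x, constraints, r=10.0, mu=mu), 4))
# Values approach 10 * |x0 + x1 - 1| = 15 as the smoothing parameter shrinks.
```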
