Similar Documents
20 similar documents found.
1.
The quasi-Newton method is a well-known effective method for solving optimization problems. Since it is a line search method, which needs a line search procedure after determining a search direction at each iteration, a rule must be chosen for selecting the step size along that direction. In this paper, we propose a new inexact line search rule for the quasi-Newton method and establish some global convergence results for this method. These results are useful in designing new quasi-Newton methods. Moreover, we analyze the convergence rate of the quasi-Newton method with the new line search rule.
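The paper's new rule itself is not reproduced in this abstract. For reference, here is a minimal sketch of the classical Armijo backtracking rule that such inexact line searches refine; the function names and parameter values are illustrative, not the paper's:

```python
import numpy as np

def armijo_backtracking(f, grad, x, d, alpha0=1.0, rho=0.5, c=1e-4):
    """Shrink alpha until the sufficient-decrease condition holds:
    f(x + alpha*d) <= f(x) + c * alpha * grad(x)^T d."""
    fx = f(x)
    slope = grad(x) @ d          # must be negative for a descent direction
    alpha = alpha0
    while f(x + alpha * d) > fx + c * alpha * slope:
        alpha *= rho
    return alpha
```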

2.
A Diagonal Sparse Quasi-Newton Method for Unconstrained Optimization Problems
A diagonal sparse quasi-Newton method is proposed for unconstrained optimization problems. The algorithm uses an Armijo inexact line search and, at each iteration, approximates the correction matrix of the quasi-Newton method by a diagonal matrix, which markedly reduces the storage and work needed to compute the search direction and offers a new approach to solving large-scale unconstrained optimization problems. Under the usual assumptions, global convergence and a linear convergence rate are proved, and superlinear convergence properties are analyzed. Numerical experiments show that the algorithm is more effective than the conjugate gradient method and is well suited to large-scale unconstrained optimization problems.
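The abstract does not give the exact diagonal correction formula. The sketch below assumes a simple safeguarded coordinate-wise secant update, which conveys the storage savings; the names and safeguard bounds are hypothetical:

```python
import numpy as np

def diagonal_qn_direction(g, D, eps=1e-8):
    """Search direction -D^{-1} g with D stored as a vector:
    O(n) storage and work instead of O(n^2) for a full matrix."""
    return -g / np.maximum(D, eps)

def diagonal_secant_update(D, s, y, lo=1e-4, hi=1e4):
    """Coordinate-wise secant condition D_i * s_i ~ y_i, i.e.
    D_i ~ y_i / s_i, clipped to stay positive and bounded."""
    D_new = np.divide(y, s, out=D.copy(), where=np.abs(s) > 1e-12)
    return np.clip(D_new, lo, hi)
```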

3.
In this paper, we consider a multivariate spectral projected gradient (MSPG) method for bound constrained optimization. Combined with a quasi-Newton property, the multivariate spectral projected gradient method allows an individual adaptive step size along each coordinate direction. On the basis of nonmonotone line search, global convergence is established. A numerical comparison with the traditional SPG method shows that the method is promising.
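A rough sketch of what one MSPG step could look like, assuming a per-coordinate Barzilai-Borwein-type step size followed by projection onto the box; the safeguards are illustrative, not the paper's, and the nonmonotone line search around the step is omitted:

```python
import numpy as np

def mspg_step(x, g, s, y, lo, hi, lam_min=1e-10, lam_max=1e10):
    """One multivariate spectral projected gradient step on a box:
    an individual (safeguarded) step size lam_i ~ s_i / y_i per
    coordinate, then projection onto [lo, hi]."""
    lam = np.divide(s, y, out=np.ones_like(x), where=np.abs(y) > 1e-12)
    lam = np.clip(lam, lam_min, lam_max)
    return np.clip(x - lam * g, lo, hi)
```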

4.
A Lampariello-Modified Diagonal Sparse Quasi-Newton Algorithm with a New Nonmonotone Line Search Rule
孙清滢  崔彬  王长钰 《计算数学》2008,30(3):255-268
This paper develops a Lampariello-modified diagonal sparse quasi-Newton algorithm with a new nonmonotone line search rule for unconstrained optimization. The new step-size rule is similar to the Grippo nonmonotone line search rule and contains the Grippo rule as a special case. At each line search the new rule yields a larger step size than the Grippo rule while preserving the global convergence of the algorithm. Numerical examples show that the algorithm is effective and suitable for large-scale problems.
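The new rule itself is not stated in the abstract. For context, here is a minimal sketch of the Grippo rule that it generalizes; M, sigma, and rho are illustrative parameter choices, and `history` is a list of recent function values maintained by the caller:

```python
def grippo_nonmonotone(f, x, d, g, history, M=10, sigma=1e-4, rho=0.5):
    """Grippo-style nonmonotone Armijo test: accept alpha when
    f(x + alpha*d) <= max of the last M function values
                      + sigma * alpha * g^T d.
    With M = 1 this reduces to the monotone Armijo rule."""
    f_ref = max(history[-M:])   # reference value over recent iterates
    slope = g @ d               # negative for a descent direction
    alpha = 1.0
    while f(x + alpha * d) > f_ref + sigma * alpha * slope:
        alpha *= rho
    return alpha
```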

5.
An efficient class of descent methods for unconstrained optimization problems is the line search methods, in which a step size must be chosen at each iteration after a descent direction is determined. There are many ways to choose the step sizes, such as the exact line search, the Armijo line search, the Goldstein line search, and the Wolfe line search. In this paper we propose a new inexact line search for a general descent method and establish some global convergence properties. This new line search has many advantages compared with other similar inexact line searches. Moreover, we analyze the global convergence and local convergence rate of some special descent methods with the new line search. Preliminary numerical results show that the new line search is effective and efficient in practical computation.
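As a small illustration of the inexact rules being compared, here is a checker for the (weak) Wolfe conditions; the paper's new rule is different and not reproduced here, and the constants c1 and c2 are the usual illustrative defaults:

```python
def satisfies_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Check the (weak) Wolfe conditions for step size alpha:
    sufficient decrease plus a curvature condition."""
    fx, gx_d = f(x), grad(x) @ d
    x_new = x + alpha * d
    armijo = f(x_new) <= fx + c1 * alpha * gx_d
    curvature = grad(x_new) @ d >= c2 * gx_d
    return armijo and curvature
```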

6.
Convergence properties of a class of multi-directional parallel quasi-Newton algorithms for the solution of unconstrained minimization problems are studied in this paper. At each iteration these algorithms generate several different quasi-Newton directions, and then apply line searches to determine step lengths along each direction simultaneously. The next iterate is obtained among these trial points by choosing the lowest point in the sense of function reduction. Different quasi-Newton updating formulas from the Broyden family are used to generate a main sequence of Hessian matrix approximations. Based on the BFGS and the modified BFGS updating formulas, the global and superlinear convergence results are proved. It is observed that all the quasi-Newton directions asymptotically approach the Newton direction in both direction and length when the iterate sequence converges to a local minimum of the objective function, and hence the result of superlinear convergence follows.
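A schematic of the selection step described above, assuming a `line_search` helper that returns a step length; both names are hypothetical, and the "in parallel" aspect is left to the caller:

```python
def parallel_qn_step(f, x, directions, line_search):
    """Line-search along each quasi-Newton direction (in principle
    in parallel) and keep the trial point with the lowest f-value."""
    trials = [x + line_search(f, x, d) * d for d in directions]
    return min(trials, key=f)
```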

7.
In this work, we present a new hybrid conjugate gradient method based on the convex hybridization of the conjugate gradient update parameters of DY and HS+, adapting a quasi-Newton philosophy. The hybridization parameter is computed by minimizing the distance between the hybrid conjugate gradient direction and the self-scaling memoryless BFGS direction. Furthermore, a significant property of our proposed method is that it ensures sufficient descent independent of the accuracy of the line search. The global convergence of the proposed method is established provided that the line search satisfies the Wolfe conditions. Our numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection indicate that our proposed method is preferable and in general superior to classic conjugate gradient methods in terms of efficiency and robustness.
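The distance-minimizing choice of the hybridization parameter is paper-specific and omitted here. A sketch of the convex hybridization itself, with theta taken as given; the function name is illustrative, and d_old^T y is assumed positive (guaranteed under the Wolfe conditions):

```python
import numpy as np

def hybrid_cg_direction(g_new, g_old, d_old, theta):
    """Convex hybridization of the DY and HS+ update parameters:
    beta = (1 - theta) * beta_DY + theta * beta_HS+,  theta in [0, 1].
    (The paper chooses theta by matching a self-scaling memoryless
    BFGS direction; here theta is simply an input.)"""
    y = g_new - g_old
    dy = d_old @ y                            # positive under Wolfe
    beta_dy = (g_new @ g_new) / dy
    beta_hsp = max((g_new @ y) / dy, 0.0)     # HS+ truncates at zero
    beta = (1.0 - theta) * beta_dy + theta * beta_hsp
    return -g_new + beta * d_old
```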

8.
In this paper, we suggest another accelerated conjugate gradient algorithm for which both the descent and the conjugacy conditions are guaranteed. The search direction is selected as a linear combination of the gradient and the previous direction. The coefficients in this linear combination are selected in such a way that both the descent and the conjugacy conditions are satisfied at every iteration. The algorithm introduces a modified Wolfe line search, in which the parameter in the second Wolfe condition is modified at every iteration. It is shown that, both for uniformly convex functions and for general nonlinear functions, the algorithm with the strong Wolfe line search generates directions bounded away from infinity. The algorithm uses an acceleration scheme that modifies the step length so as to improve the reduction of the function values along the iterations. Numerical comparisons with some conjugate gradient algorithms on a set of 75 unconstrained optimization problems with different dimensions show that the computational scheme outperforms the known conjugate gradient algorithms of Hestenes and Stiefel; Polak, Ribière and Polyak; and Dai and Yuan, as well as the hybrid Dai and Yuan method, CG_DESCENT with Wolfe line search, and the quasi-Newton L-BFGS.

9.
The study of quasi-Newton methods, and of their global convergence, for minimizing general objective functions has become one of the most fundamental open problems in the theory of quasi-Newton methods. This paper investigates the problem further: a new class of generalized quasi-Newton algorithms for unconstrained optimization is proposed, and, in combination with the Goldstein line search, the global convergence of the algorithms for minimizing general nonconvex objective functions is proved.

10.
By applying a two-parameter Broyden-like family of correction formulas, this paper provides a new approach to the open problem of the convergence of quasi-Newton-type algorithms for general objective functions in unconstrained optimization.

11.
In this paper, the classical Gauss-Newton method for the unconstrained least squares problem is modified by introducing a quasi-Newton approximation to the second-order term of the Hessian. Various quasi-Newton formulas are considered, and numerical experiments show that most of them are more efficient on large residual problems than the Gauss-Newton method and a general purpose minimization algorithm based upon the BFGS formula. A particular quasi-Newton formula is shown numerically to be superior. Further improvements are obtained by using a line search that exploits the special form of the function.
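The particular quasi-Newton formula found superior is not identified in the abstract. Below is a sketch of the structured step such methods plug into, where A stands for a quasi-Newton approximation of the second-order term (its update formula is not shown, and the function name is illustrative):

```python
import numpy as np

def structured_gn_step(J, r, A):
    """Step for nonlinear least squares f(x) = 0.5 * ||r(x)||^2.
    The true Hessian is J^T J + S, with S the second-order term
    that plain Gauss-Newton drops; A approximates S."""
    g = J.T @ r                             # gradient of f
    return np.linalg.solve(J.T @ J + A, -g)
```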

12.
For unconstrained programming with nonconvex objective functions, this article gives a modified BFGS algorithm. The idea of the algorithm is to modify the approximate Hessian matrix so as to obtain a descent direction and to guarantee the effectiveness of the quasi-Newton iteration pattern. We prove the global convergence properties of the algorithm in association with a general form of line search, and prove the quadratic convergence rate of the algorithm under some conditions.
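The article's exact modification is not given in the abstract. A common way to realize the same idea is a Li-Fukushima-type shift of y that restores the curvature condition for nonconvex f; the sketch below is an assumption of that kind, not the paper's formula:

```python
import numpy as np

def modified_bfgs_update(B, s, y, eps=1e-6):
    """BFGS update with y shifted so the curvature condition
    y_hat^T s > 0 holds even for nonconvex f, keeping B
    positive definite."""
    sty, sts = s @ y, s @ s
    r = max(0.0, eps - sty / sts)   # shift only when curvature fails
    y_hat = y + r * s               # now y_hat^T s >= eps * s^T s
    Bs = B @ s
    return (B - np.outer(Bs, Bs) / (s @ Bs)
              + np.outer(y_hat, y_hat) / (y_hat @ s))
```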

13.
In this paper, a new class of three-term memory gradient methods with a nonmonotone line search technique for unconstrained optimization is presented. Global convergence properties of the new methods are discussed. By combining the quasi-Newton method with the new method, the former is modified to have the global convergence property. Numerical results show that the new algorithm is efficient.

14.
We consider an efficient trust-region framework which employs a new nonmonotone line search technique for unconstrained optimization problems. Unlike the traditional nonmonotone trust-region method, our proposed algorithm avoids resolving the subproblem whenever a trial step is rejected. Instead, it performs a nonmonotone Armijo-type line search in the direction of the rejected trial step to construct a new point. Theoretical analysis indicates that the new approach preserves global convergence to first-order critical points under classical assumptions. Moreover, superlinear and quadratic convergence are established under suitable conditions. Numerical experiments show the efficiency and effectiveness of the proposed approach for solving unconstrained optimization problems.
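A schematic of the control flow described above, with `solve_subproblem` a hypothetical helper returning a trial step and its predicted reduction, `f_ref` a nonmonotone reference value maintained by the caller, and all parameter values illustrative:

```python
def tr_step_with_fallback(f, grad, x, solve_subproblem, f_ref,
                          delta, eta=0.1, sigma=1e-4, rho=0.5):
    """Solve the trust-region subproblem once; if the trial step is
    rejected by the (nonmonotone) ratio test, backtrack along it with
    a nonmonotone Armijo test instead of re-solving the subproblem."""
    d, pred = solve_subproblem(x, delta)     # trial step, predicted decrease
    if (f_ref - f(x + d)) / pred >= eta:     # nonmonotone acceptance ratio
        return x + d, 2.0 * delta            # accept; enlarge the radius
    alpha = 1.0
    slope = grad(x) @ d                      # d is a descent direction here
    while f(x + alpha * d) > f_ref + sigma * alpha * slope:
        alpha *= rho                         # nonmonotone Armijo backtrack
    return x + alpha * d, 0.5 * delta        # keep the point, shrink radius
```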

15.
The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, so as to avoid possible large eigenvalues in the Hessian approximations of the objective function. It has been proved in the literature that this method has global and superlinear convergence when the objective function is convex (or even uniformly convex). We propose to solve unconstrained nonconvex optimization problems by a self-scaling BFGS algorithm with nonmonotone line search. Nonmonotone line search has been recognized in numerical practice as a competitive approach for solving large-scale nonlinear problems. We consider two different nonmonotone line search forms and study the global convergence of these nonmonotone self-scaling BFGS algorithms. We prove that, under a condition weaker than that in the literature, both forms of the self-scaling BFGS algorithm are globally convergent for unconstrained nonconvex optimization problems.
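A sketch of one common self-scaling choice, the Oren-Luenberger factor, applied before the BFGS correction; the paper's precise scaling and its nonmonotone search are not reproduced, and the function name is illustrative:

```python
import numpy as np

def self_scaling_bfgs_update(B, s, y):
    """Scale B by tau = (y^T s)/(s^T B s) before the usual BFGS
    correction; the scaling damps overly large eigenvalues in the
    Hessian approximation.  Assumes the curvature y^T s > 0."""
    Bs = B @ s
    tau = (y @ s) / (s @ Bs)     # Oren-Luenberger scaling factor
    B, Bs = tau * B, tau * Bs
    return (B - np.outer(Bs, Bs) / (s @ Bs)
              + np.outer(y, y) / (y @ s))
```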

16.
Quasi-Newton Methods for Unconstrained Optimization
A revised algorithm is given for unconstrained optimization using quasi-Newton methods. The method is based on recurring the factorization of an approximation to the Hessian matrix. Knowledge of this factorization allows greater flexibility when choosing the direction of search while minimizing the adverse effects of rounding error. The control of rounding error is particularly important when analytical derivatives are unavailable, and a modification of the algorithm to accept finite-difference approximations to the derivatives is given.

17.
Numerical Algorithms - This paper presents a class of low memory quasi-Newton methods with standard backtracking line search for large-scale unconstrained minimization. The methods are derived by...

18.
A quasi-Newton trust-region method
The classical trust-region method for unconstrained minimization can be augmented with a line search that finds a point that satisfies the Wolfe conditions. One can use this new method to define an algorithm that simultaneously satisfies the quasi-Newton condition at each iteration and maintains a positive-definite approximation to the Hessian of the objective function. This new algorithm has strong global convergence properties and is robust and efficient in practice.

19.
This paper concerns the memoryless quasi-Newton method, that is, the quasi-Newton method for which the approximation to the inverse of the Hessian is, at each step, updated from the identity matrix. Hence its search direction can be computed without the storage of matrices. In this paper, a scaled memoryless symmetric rank one (SR1) method for solving large-scale unconstrained optimization problems is developed. The basic idea is to incorporate the SR1 update within the framework of the memoryless quasi-Newton method. However, it is well known that the SR1 update may not preserve positive definiteness even when updated from a positive definite matrix. We therefore propose a memoryless SR1 method that is updated from a positive scaling of the identity, where the scaling factor is derived in such a way that positive definiteness of the updating matrices is preserved while the conditioning of the scaled memoryless SR1 update is improved. Under very mild conditions it is shown that, for strictly convex objective functions, the method is globally convergent with a linear rate of convergence. Numerical results show that the optimally scaled memoryless SR1 method is very encouraging.
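How the paper derives the scaling factor is not shown in the abstract. Below is a sketch of the memoryless SR1 direction itself, with gamma supplied by the caller; the degeneracy safeguard is illustrative:

```python
import numpy as np

def memoryless_sr1_direction(g, s, y, gamma):
    """Search direction from a memoryless SR1 update of gamma * I
    (inverse-Hessian form): no matrix is stored, only the vectors
    s and y from the latest step. gamma is the scaling factor the
    paper chooses to keep the update positive definite."""
    v = s - gamma * y
    vty = v @ y
    if abs(vty) < 1e-12 * np.linalg.norm(v) * np.linalg.norm(y):
        return -gamma * g                  # skip a degenerate update
    return -(gamma * g + (v @ g) / vty * v)
```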

20.
1. Introduction. A line search method for minimizing a real function f generates a sequence x_1, x_2, ... of points by applying the iteration x_{k+1} = x_k + a_k p_k, k = 1, 2, .... (1) In a quasi-Newton method the search direction p_k is chosen so that B_k p_k = -g_k, where B_k is (usually) a positive definite matrix and g_k denotes ∇f(x_k). For the BFGS update (see [2], for example), the matrices B_k are defined by the formula

$$B_{k+1} = B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k} + \frac{y_k y_k^{\top}}{y_k^{\top} s_k}, \qquad (2)$$

where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. It is well known that if B_1 is positive definite and s_k^T y_k > 0 (3) then all matrices B_{k+1}, k = 1, 2, ..., generated by (2) are positive definite. Thus p_k is a descent direction.
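A small numerical check of the claim that (2) preserves positive definiteness whenever condition (3) holds; the names follow the text above, and the curvature threshold is an illustrative safeguard:

```python
import numpy as np

def bfgs_update(B, s, y):
    """Formula (2): B - (B s s^T B)/(s^T B s) + (y y^T)/(y^T s)."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Starting from B_1 = I and applying (2) only when s^T y > 0,
# every eigenvalue of B_k stays positive, as the text asserts.
rng = np.random.default_rng(0)
B = np.eye(5)
for _ in range(20):
    s, y = rng.standard_normal(5), rng.standard_normal(5)
    if s @ y > 0.1:          # enforce condition (3), safely away from zero
        B = bfgs_update(B, s, y)
assert np.all(np.linalg.eigvalsh(B) > 0)
```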
