Similar Articles
20 similar records found.
1.
Numerical Algorithms - This paper is focused on improving the global convergence of the modified BFGS algorithm with the Yuan-Wei-Lu line search formula. This improvement has been achieved by presenting a...

2.
Yuan Gonglin, Li Pengyuan, Lu Junyu. Numerical Algorithms (2022) 91(1): 353-365
Numerical Algorithms - The BFGS method, which has great numerical stability, is one of the quasi-Newton line search methods. However, the global convergence of the BFGS method with a Wolfe line...
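As a hedged illustration of the classical scheme both entries above build on, the sketch below runs plain BFGS with a Wolfe line search (via scipy); it is not the authors' modified algorithm, and the Rosenbrock test problem is only a stand-in.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    """Plain BFGS with a Wolfe line search (illustrative sketch only)."""
    n = x0.size
    H = np.eye(n)                    # inverse-Hessian approximation
    x, g = x0.astype(float), grad(x0)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                   # quasi-Newton search direction
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:            # Wolfe search failed; conservative fallback
            alpha = 1e-3
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-10:               # curvature condition keeps H positive definite
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

print(bfgs(rosen, rosen_der, np.array([-1.2, 1.0])))  # approaches [1, 1]
```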

3.
Du Shouqiang. Operations Research Transactions (2012) 16(4): 105-111
A Levenberg-Marquardt method with a Goldstein line search is given for solving nonlinear equations; its global convergence is proved under fairly mild conditions, and the method is then applied to solving generalized complementarity problems.
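A minimal Levenberg-Marquardt sketch for F(x) = 0, assuming the common damping choice mu_k = ||F(x_k)||^2 and, for simplicity, Armijo-style backtracking on the merit function 0.5*||F||^2 in place of the paper's Goldstein rule; the toy system is an assumption.

```python
import numpy as np

def lm_step(F, J, x, mu):
    """Solve (J'J + mu*I) d = -J'F for the LM trial direction."""
    Jx, Fx = J(x), F(x)
    return np.linalg.solve(Jx.T @ Jx + mu * np.eye(x.size), -Jx.T @ Fx)

def levenberg_marquardt(F, J, x0, tol=1e-8, max_iter=100):
    x = x0.astype(float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        mu = np.linalg.norm(Fx) ** 2        # assumed damping choice
        d = lm_step(F, J, x, mu)
        phi = 0.5 * Fx @ Fx                 # merit function 0.5*||F||^2
        slope = (J(x).T @ Fx) @ d           # its directional derivative
        alpha = 1.0                         # simple backtracking line search
        while 0.5 * F(x + alpha * d) @ F(x + alpha * d) > phi + 1e-4 * alpha * slope:
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x = x + alpha * d
    return x

# toy system: x0^2 + x1^2 - 1 = 0, x0 - x1 = 0
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2*x[0], 2*x[1]], [1.0, -1.0]])
print(levenberg_marquardt(F, J, np.array([2.0, 0.5])))
```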

4.
In this article, based on the modified secant equation, we propose a modified Hestenes-Stiefel (HS) conjugate gradient method which has a form similar to the CG-DESCENT method proposed by Hager and Zhang (SIAM J Optim 16:170-192, 2005). The presented method generates sufficient descent directions without any line search. Under some mild conditions, we show that it is globally convergent with the Armijo line search. Moreover, the R-linear convergence rate of the modified HS method is established. Preliminary numerical results show that the proposed method is promising and competitive with the well-known CG-DESCENT method.
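For reference, the published Hager-Zhang (CG-DESCENT) direction that this entry compares against can be sketched as follows; the entry's own modified HS formula is not reproduced here.

```python
import numpy as np

def hz_direction(g_new, d_old, y):
    """CG-DESCENT (Hager-Zhang) search direction:
    beta = (y - 2*d*||y||^2/(d@y)) @ g_new / (d@y).
    It is known to satisfy the sufficient descent bound
    g @ d <= -(7/8)*||g||^2 regardless of the line search."""
    dy = d_old @ y
    beta = (y - 2.0 * d_old * (y @ y) / dy) @ g_new / dy
    return -g_new + beta * d_old
```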

5.
In this paper, we are concerned with conjugate gradient methods for solving unconstrained optimization problems. It is well known that the direction generated by a conjugate gradient method may not be a descent direction of the objective function. In this paper, we make a small modification to the Fletcher-Reeves (FR) method such that the direction generated by the modified method is always a descent direction for the objective function. This property depends neither on the line search used nor on the convexity of the objective function. Moreover, the modified method reduces to the standard FR method if the line search is exact. Under mild conditions, we prove that the modified method with an Armijo-type line search is globally convergent even if the objective function is nonconvex. We also present some numerical results to show the efficiency of the proposed method. Supported by the 973 project (2004CB719402) and the NSF foundation (10471036) of China.
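One way to realize the two properties the abstract claims (a descent direction for any step size, reducing to standard FR under an exact line search) is to rescale the gradient term; this is a sketch in that spirit, not necessarily the authors' exact formula.

```python
import numpy as np

def modified_fr_direction(g_new, g_old, d_old):
    """FR direction with the gradient term rescaled so that
    g_new @ d == -||g_new||^2 holds for ANY step size, hence d is
    always a descent direction.  With an exact line search
    (g_new @ d_old == 0) theta == 1 and standard FR is recovered."""
    beta_fr = (g_new @ g_new) / (g_old @ g_old)
    theta = 1.0 + beta_fr * (g_new @ d_old) / (g_new @ g_new)
    return -theta * g_new + beta_fr * d_old
```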

6.
In this paper, a modified limited memory BFGS method for solving large-scale unconstrained optimization problems is proposed. A remarkable feature of the proposed method is that it possesses global convergence property without convexity assumption on the objective function. Under some suitable conditions, the global convergence of the proposed method is proved. Some numerical results are reported which illustrate that the proposed method is efficient.
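The standard limited-memory machinery behind any L-BFGS variant is the two-loop recursion; the sketch below is that textbook recursion, not the modification proposed in this entry.

```python
import numpy as np

def lbfgs_direction(g, pairs):
    """Two-loop recursion: applies the implicit inverse Hessian of
    limited-memory BFGS to the gradient g.  `pairs` is a list of the
    last m (s, y) pairs, oldest first, each with s @ y > 0."""
    q = g.copy()
    stack = []
    for s, y in reversed(pairs):           # newest to oldest
        rho = 1.0 / (s @ y)
        a = rho * (s @ q)
        stack.append((a, rho, s, y))
        q -= a * y
    if pairs:                              # standard scaling H0 = gamma * I
        s, y = pairs[-1]
        q *= (s @ y) / (y @ y)
    for a, rho, s, y in reversed(stack):   # oldest to newest
        b = rho * (y @ q)
        q += (a - b) * s
    return -q                              # search direction
```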

7.
In this paper, we develop a memory gradient method for unconstrained optimization. The main characteristic of this method is that the next iterate is obtained without any line search. Under certain conditions, we obtain strong global convergence of the proposed method.
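A generic memory-gradient direction (not this paper's specific scheme) mixes the steepest-descent direction with the previous direction; the cap on beta below is one standard way to keep it a descent direction.

```python
import numpy as np

def memory_gradient_direction(g, d_old, eta=0.4):
    """d = -g + beta * d_old with beta = eta * ||g|| / ||d_old||.
    By Cauchy-Schwarz, g @ d <= -(1 - eta) * ||g||^2 < 0, so the
    direction is a descent direction for any eta in (0, 1)."""
    if d_old is None or not np.any(d_old):
        return -g
    beta = eta * np.linalg.norm(g) / np.linalg.norm(d_old)
    return -g + beta * d_old
```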

8.
9.
In this paper, we propose a modified BFGS (Broyden–Fletcher–Goldfarb–Shanno) method with nonmonotone line search for unconstrained optimization. Under some mild conditions, we show that the method is globally convergent without a convexity assumption on the objective function. We also report some preliminary numerical results to show the efficiency of the proposed method.
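The nonmonotone ingredient of such methods is typically a Grippo-Lampariello-Lucidi-type rule that compares against the worst of the last few function values; a sketch of that rule (not necessarily the exact variant used here), with numpy arrays assumed for x, d, g:

```python
def nonmonotone_armijo(f, x, d, g, f_hist, sigma=1e-4, rho=0.5, max_back=50):
    """Accept step a once f(x + a*d) <= max(recent f-values) + sigma*a*(g@d).
    f_hist is a bounded window of past function values, kept by the caller,
    e.g. collections.deque(maxlen=10), appended to after each accepted step."""
    f_ref = max(f_hist)          # nonmonotone reference value
    slope = g @ d                # negative for a descent direction
    a = 1.0
    for _ in range(max_back):
        if f(x + a * d) <= f_ref + sigma * a * slope:
            break
        a *= rho
    return a
```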

10.
Oviedo Harry. Numerical Algorithms (2022) 91(3): 1183-1203
Numerical Algorithms - In this paper, we analyze the global convergence of a general nonmonotone line search method on Riemannian manifolds. To this end, we introduce some properties for the...

11.
The conjugate gradient method is a useful and powerful approach for solving large-scale minimization problems. Liu and Storey developed a conjugate gradient method which has good numerical performance but no global convergence guarantee under traditional line searches such as the Armijo, Wolfe, and Goldstein line searches. In this paper we propose a new nonmonotone line search for the Liu-Storey conjugate gradient method (LS for short). The new nonmonotone line search guarantees the global convergence of the LS method and has good numerical performance. By estimating the Lipschitz constant of the derivative of the objective function in the new nonmonotone line search, we can find an adequate step size and substantially decrease the number of function evaluations at each iteration. Numerical results show that the new approach is effective in practical computation.
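The Liu-Storey direction itself is standard and can be sketched directly; the entry's contribution, the new nonmonotone line search with a Lipschitz estimate, is not reproduced here.

```python
import numpy as np

def ls_direction(g_new, g_old, d_old):
    """Liu-Storey conjugate gradient direction:
    beta_LS = g_new @ (g_new - g_old) / (-(d_old @ g_old))."""
    beta = g_new @ (g_new - g_old) / -(d_old @ g_old)
    return -g_new + beta * d_old
```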

12.
A modified BFGS algorithm is presented in this paper for solving unconstrained optimization problems whose Hessian matrix at the minimum point of the convex function is rank deficient. The main idea of the algorithm is first to add a modification term to the convex function to obtain an equivalent model, and then to simplify the model to obtain the modified BFGS algorithm. The superlinear convergence of the algorithm is proved. Compared with the tensor algorithms presented by R. B. Schnabel (see [4], [5]), this method is more efficient for solving singular unconstrained optimization problems in terms of computational cost and complexity.

13.
This article studies a modified BFGS algorithm for solving smooth unconstrained strongly convex minimization problems. The modified BFGS method is based on the new quasi-Newton equation B_{k+1} s_k = y_k^*, where y_k^* = y_k + A_k s_k and A_k is a matrix. Wei, Li and Qi [WLQ] have proven that the average performance of two of those algorithms is better than that of the classical one. In this paper, we prove the global convergence of these algorithms associated with a general line search rule.
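A sketch of a BFGS update driven by the modified vector y* = y + A s. The entry leaves A_k general; the scalar correction below (in the Wei-Li-Qi spirit, built from function values and gradients) is an assumption, one concrete instance rather than the paper's definition.

```python
import numpy as np

def modified_bfgs_update(H, s, y, f_old, f_new, g_old, g_new):
    """BFGS update of the inverse approximation H, with y replaced by
    y* = y + A s.  Here A s is an assumed Wei-Li-Qi-style scalar
    correction: theta/(s@s) * s with theta from function values."""
    theta = 2.0 * (f_old - f_new) + (g_new + g_old) @ s
    y_star = y + (theta / (s @ s)) * s
    sy = s @ y_star
    if sy <= 1e-10:                    # skip update to keep H positive definite
        return H
    rho = 1.0 / sy
    n = len(s)
    V = np.eye(n) - rho * np.outer(s, y_star)
    return V @ H @ V.T + rho * np.outer(s, s)
```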

14.
Global convergence of a class of conjugate gradient methods under generalized line search
Global convergence results are established for a class of conjugate gradient methods under a generalized line search.

15.
In this paper, the HS conjugate gradient method for minimizing a continuously differentiable function f on R^n is modified to have the global convergence property. Firstly, it is shown that, using a reverse modulus of continuity function and a forcing function, the new method for solving unconstrained optimization can work for a continuously differentiable function with Curry-Altman's step size rule and a bounded level set. Secondly, by using a comparison technique, some general convergence properties of the new method with the Armijo step size rule are established. Numerical results show that the new algorithms are efficient.

16.
In this note we drop a Lipschitz constant from a Grippo-Lucidi-type step length rule recently proposed by Shi and Shen [Z. Shi, J. Shen, Convergence of PRP method with new nonmonotone line search, Applied Mathematics and Computation 181(1) (2006) 423-431]; the original convergence result remains valid.

17.
Global convergence is proved for a partitioned BFGS algorithm when applied to a partially separable problem with a convex decomposition. This case covers a known practical optimization method for large-dimensional unconstrained problems. Inexact solution of the linear system defining the search direction and variants of the step length rule are also shown to be acceptable without affecting the global convergence properties.

18.
We study the global convergence of a two-parameter family of conjugate gradient methods in which the line search procedure is replaced by a fixed formula for the stepsize. This feature is significant when the line search is expensive in a particular application. In addition to the convergence results, we present computational results for various conjugate gradient methods without line search, including those discussed by Sun and Zhang (Ann. Oper. Res. 103 (2001) 161-173).
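A representative fixed-stepsize formula from this no-line-search literature (an assumption; not necessarily the entry's exact two-parameter family) uses a Lipschitz constant L of the gradient, with numpy arrays assumed for g and d:

```python
def fixed_stepsize(g, d, L, delta=0.5):
    """alpha = -delta * (g @ d) / (L * ||d||^2), with delta in (0, 1).
    For a descent direction (g @ d < 0) this alpha is positive and
    needs no function evaluations, replacing the line search."""
    return -delta * (g @ d) / (L * (d @ d))
```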

19.
For unconstrained programs with a non-convex objective function, this article gives a modified BFGS algorithm associated with a general line search model. The idea of the algorithm is to modify the approximate Hessian matrix so as to obtain a descent direction while guaranteeing the validity of the new quasi-Newton iteration equation B_{k+1} s_k = y_k^*, where y_k^* is the sum of y_k and A_k s_k, and A_k is some matrix. The global convergence of the algorithm associated with the general form of line search is proved.

20.
1. Introduction. In this paper we analyze the convergence of multiplicative iterative algorithms for the minimization of a differentiable function defined on the positive orthant of R^n. The algorithm is suggested by Eggermont [1], and is related to the EM [2] (Expectation-Maximization) algorithm for positron emission tomography [3] and image reconstruction [4]. We consider the problem min f(x) s.t. x >= 0. The multiplicative iterative algorithms update each component j = 1, 2, ..., n multiplicatively, with the step parameter determined through a line search. While Iusem [5] established an elegant conv...
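A hedged sketch of one common multiplicative form for minimization over the positive orthant; the exact update of this entry is not recoverable from the garbled source, so the rule below is an assumption.

```python
import numpy as np

def multiplicative_step(x, g, lam):
    """One multiplicative iteration x_j <- x_j * (1 - lam * g_j), where
    g is the gradient of f at x.  Positivity of x is preserved as long
    as lam * max(g) < 1, so any line search over lam must respect that
    bound."""
    return x * (1.0 - lam * g)
```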
