Similar Documents
20 related records found.
1.
《Optimization》2012,61(2):163-179
In this article, we consider the global convergence of the Polak–Ribière–Polyak conjugate gradient method (PRP method for short) for minimizing functions that have Lipschitz continuous partial derivatives. A novel form of non-monotone line search is proposed to guarantee the global convergence of the PRP method. It is also shown that the PRP method has a linear convergence rate under some mild conditions when the non-monotone line search reduces to a related monotone line search. The new non-monotone line search needs to estimate the Lipschitz constant of the gradients of objective functions, for which two practical estimations are proposed to help find a suitable initial step size for the PRP method. Numerical results show that the new line search approach is efficient in practical computation.
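For orientation, the sketch below shows a generic PRP conjugate gradient loop with a plain monotone Armijo backtracking line search; it is not the paper's non-monotone rule (which additionally estimates the Lipschitz constant of the gradient), and the function names and tolerances are illustrative assumptions.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic PRP conjugate gradient loop with Armijo backtracking.

    Illustrative sketch only: the paper's non-monotone line search is
    replaced here by a standard monotone Armijo rule.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking Armijo: f(x + a d) <= f(x) + c1 * a * g^T d
        a, c1, fx, gTd = 1.0, 1e-4, f(x), g @ d
        while f(x + a * d) > fx + c1 * a * gTd:
            a *= 0.5
        x_new = x + a * d
        g_new = grad(x_new)
        beta = g_new @ (g_new - g) / (g @ g)  # PRP parameter
        d = -g_new + beta * d                 # PRP direction update
        if g_new @ d >= 0:                    # common safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x
```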

2.
The conjugate gradient method is one of the most commonly used methods in optimization and is widely applied to large-scale optimization problems; different choices of the parameter β_k yield different conjugate gradient methods. This paper presents a class of conjugate gradient algorithms with three parameters. Under the given conditions, the chosen β_k is proved to generate a descent direction at every iteration, and the algorithm is globally convergent under the strong Wolfe line search.
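For reference (the usual statement in the literature, not quoted from this paper), the strong Wolfe line search mentioned above requires, for constants 0 < δ < σ < 1,
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\, g_k^{T} d_k,
\qquad
\bigl|g(x_k + \alpha_k d_k)^{T} d_k\bigr| \le \sigma\,\bigl|g_k^{T} d_k\bigr|.
\]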

3.
The conjugate gradient method is a useful and powerful approach for solving large-scale minimization problems. Liu and Storey developed a conjugate gradient method which has good numerical performance but no global convergence under traditional line searches such as the Armijo, Wolfe, and Goldstein line searches. In this paper we propose a new nonmonotone line search for the Liu–Storey conjugate gradient method (LS for short). The new nonmonotone line search guarantees the global convergence of the LS method and has good numerical performance. By estimating the Lipschitz constant of the derivative of the objective function in the new nonmonotone line search, we can find an adequate step size and substantially decrease the number of function evaluations at each iteration. Numerical results show that the new approach is effective in practical computation.
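For context (standard in the literature, not taken from this abstract), the Liu–Storey parameter is commonly written as
\[
\beta_k^{LS} = \frac{g_k^{T}(g_k - g_{k-1})}{-\,d_{k-1}^{T} g_{k-1}}.
\]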

4.
On the superlinear local convergence of a filter-SQP method
Transition to superlinear local convergence is shown for a modified version of the trust-region filter-SQP method for nonlinear programming introduced by Fletcher, Leyffer, and Toint [8]. In this approach, the original trust-region SQP steps can be used without an additional second-order correction. The main modification consists in using the Lagrangian function value instead of the objective function value in the filter, together with an appropriate infeasibility measure. Moreover, it is shown that the modified trust-region filter-SQP method has the same global convergence properties as the original algorithm in [8]. Mathematics Subject Classification (2000): 90C55, 65K05, 90C30.

5.
The conjugate gradient method is a useful and powerful approach for solving large-scale minimization problems. Liu and Storey developed a conjugate gradient method which has good numerical performance but no global convergence result under traditional line searches such as the Armijo, Wolfe, and Goldstein line searches. In this paper a convergent version of the Liu–Storey conjugate gradient method (LS for short) is proposed for minimizing functions that have Lipschitz continuous partial derivatives. By estimating the Lipschitz constant of the derivative of the objective function, we can find an adequate step size at each iteration so as to guarantee the global convergence and improve the efficiency of the LS method in practical computation.
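As a side note on how a Lipschitz estimate can yield a step size (a standard argument, not necessarily the exact rule used in the paper): if the gradient is L-Lipschitz, then
\[
f(x_k + \alpha d_k) \le f(x_k) + \alpha\, g_k^{T} d_k + \frac{L}{2}\,\alpha^{2}\,\|d_k\|^{2},
\]
and minimizing the right-hand side over α gives the step
\[
\alpha_k = \frac{-\,g_k^{T} d_k}{L\,\|d_k\|^{2}},
\]
so any upper estimate of L produces a safe, if possibly conservative, step size.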

6.
Global convergence result for conjugate gradient methods
Conjugate gradient optimization algorithms depend on the search directions, …

7.
Based on the well-known PRP conjugate gradient method and exploiting the structure of the CG_DESCENT conjugate gradient method, this paper proposes a modified PRP conjugate gradient method for solving large-scale unconstrained optimization problems. The method generates a sufficient descent direction at every iteration, independently of any line search condition. Under the standard Wolfe line search, the global convergence and linear convergence rate of the modified PRP method are proved. Numerical results show that the modified PRP method is very effective on the given test problems.
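For reference (standard usage, not quoted from the paper), a search direction d_k is said to satisfy the sufficient descent condition when, for some constant c > 0 independent of k,
\[
g_k^{T} d_k \le -\,c\,\|g_k\|^{2} \quad\text{for all } k.
\]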

9.
We extend the Mizuno–Todd–Ye predictor-corrector algorithm to solve monotone linear complementarity problems. We prove that the extended algorithm is globally Q-linearly convergent and solves problems with integer data of bitlength L in at most O(√n L) iterations. We also prove that the duality gap converges to zero Q-superlinearly for problems having strictly complementary solutions. Our results generalize the results obtained by Ye, Tapia, and Zhang for linear programming.

10.
This paper is concerned with the open problem of whether the DFP method with inexact line search converges globally to the minimum of a uniformly convex function. We study this problem by way of a Gauss-Newton approach rather than an ordinary Newton approach. We also propose a derivative-free line search that can be implemented conveniently by a backtracking process and has the attractive property that any iterative method with this line search generates a sequence of iterates that is approximately norm descent. Moreover, if the Jacobian matrices are uniformly nonsingular, then the generated sequence converges. Under appropriate conditions, we establish global and superlinear convergence of the proposed Gauss-Newton based DFP method, which supports a positive answer to the open problem.

11.
Conjugate gradient methods are a widely used class of methods for solving large-scale unconstrained optimization problems. A new nonlinear conjugate gradient (CG) method is proposed, and theoretical analysis shows that the new algorithm possesses the sufficient descent property under several line search conditions. A global convergence theorem for the new CG algorithm is further proved. Finally, extensive numerical experiments are carried out, and the results show that, compared with several traditional CG methods, the new algorithm is computationally more efficient.

12.
This paper is concerned with proving theoretical results related to the convergence of the conjugate gradient (CG) method for solving positive definite symmetric linear systems. Considering the inverse of the projection of the inverse of the matrix, new relations for ratios of the A-norm of the error and the norm of the residual are provided, starting from some earlier results of Sadok (Numer. Algorithms 2005; 40:201–216). The proofs of our results rely on the well-known correspondence between the CG method and the Lanczos algorithm.
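As general background (the standard algorithm, not the paper's analysis), a minimal conjugate gradient solver for a symmetric positive definite system Ax = b looks like this:

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Standard CG for a symmetric positive definite matrix A.

    A minimal background sketch of the iteration whose error and
    residual norms are analysed in the cited work.
    """
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    r = b - A @ x              # residual
    p = r.copy()               # search direction
    rs = r @ r
    max_iter = n if max_iter is None else max_iter
    for _ in range(max_iter):
        if np.sqrt(rs) <= tol:
            break
        Ap = A @ p
        alpha = rs / (p @ Ap)  # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p   # conjugate direction update
        rs = rs_new
    return x
```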

13.
It is well-known that the HS method and the PRP method may not converge for nonconvex optimization even with exact line search. Some globalization techniques have been proposed, for instance, the PRP+ globalization technique and the Grippo-Lucidi globalization technique for the PRP method. In this paper, we propose a new efficient globalization technique for general nonlinear conjugate gradient methods for nonconvex minimization. This new technique makes sufficient use of the information in the previous search direction. Under suitable conditions, we prove that nonlinear conjugate gradient methods with this new technique are globally convergent for nonconvex minimization if the line search satisfies the Wolfe conditions or the Armijo condition. Extensive numerical experiments are reported to show the efficiency of the proposed technique.
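For reference (standard notation, not from this abstract), the PRP+ globalization mentioned above truncates the PRP parameter at zero:
\[
\beta_k^{PRP+} = \max\Bigl\{\,\frac{g_k^{T}(g_k - g_{k-1})}{\|g_{k-1}\|^{2}},\ 0\Bigr\}.
\]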

14.
《Optimization》2012,61(12):2679-2691
In this article, we present an improved three-term conjugate gradient algorithm for large-scale unconstrained optimization. The search directions in the developed algorithm are proved to satisfy an approximate secant equation as well as the Dai–Liao conjugacy condition. With the standard Wolfe line search and a restart strategy, global convergence of the algorithm is established under mild conditions. In tests on 75 benchmark problems with dimensions from 1000 to 10,000, the algorithm outperforms state-of-the-art algorithms from the literature, requiring less CPU time and fewer iterations for large-scale unconstrained optimization.
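For reference (the usual statement, not quoted from the paper), the Dai–Liao conjugacy condition requires, for some parameter t ≥ 0,
\[
d_k^{T} y_{k-1} = -\,t\, s_{k-1}^{T} g_k,
\qquad y_{k-1} = g_k - g_{k-1},\quad s_{k-1} = x_k - x_{k-1}.
\]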

15.
16.
Global convergence of a class of conjugate gradient methods under a generalized line search
Global convergence results are established for a class of conjugate gradient methods under a generalized line search.

17.
1. Introduction. We consider the global convergence of conjugate gradient methods for the unconstrained nonlinear optimization problem min f(x), where f: R^n → R^1 is continuously differentiable and its gradient is denoted by g. We consider only the case where the methods are implemented without regular restarts. The iterative formula is given by x_{k+1} = x_k + λ_k d_k (1.1), and the search direction d_k is defined by d_1 = −g_1 and d_k = −g_k + β_k d_{k−1} for k ≥ 2, where β_k is a scalar, λ_k is a steplength, and g_k denotes g(x_k). The best-known formulas fo…
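Since the excerpt breaks off here, the following are the best-known choices of β_k as commonly given in the conjugate gradient literature (listed for context; the paper's own list is not shown in the excerpt):
\[
\beta_k^{FR} = \frac{\|g_k\|^{2}}{\|g_{k-1}\|^{2}},
\qquad
\beta_k^{PRP} = \frac{g_k^{T}(g_k - g_{k-1})}{\|g_{k-1}\|^{2}},
\qquad
\beta_k^{HS} = \frac{g_k^{T}(g_k - g_{k-1})}{d_{k-1}^{T}(g_k - g_{k-1})}.
\]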

18.
This paper discusses the order-preserving convergence for spectral approximation of the self-adjoint completely continuous operator T. Under the condition that the approximate operator Th converges to T in norm, it is proven that the k-th eigenvalue of Th converges to the k-th eigenvalue of T. (The positive eigenvalues are sorted in decreasing order and the negative eigenvalues in increasing order.) We then apply this result to conforming elements, nonconforming elements, and mixed elements for eigenvalue problems of self-adjoint elliptic differential operators, and prove that the k-th approximate eigenvalue obtained by these methods converges to the k-th exact eigenvalue.

19.
20.
This paper studies methods for a class of nonsmooth equations involving a special max-type function. Using smoothing functions of the max function and the absolute value function, the nonsmooth equation problem is reformulated, and a smoothing conservative DPRP conjugate gradient method is proposed. Global convergence of the smoothing conservative DPRP conjugate gradient method is established under general conditions, and numerical experiments are reported to demonstrate the effectiveness of the method.
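As background (commonly used smoothing functions, not necessarily the ones chosen in this paper), the absolute value and the pairwise max can be smoothed with a parameter μ > 0 as
\[
|t| \approx \phi_\mu(t) = \sqrt{t^{2} + \mu^{2}},
\qquad
\max\{a, b\} = \frac{a + b + |a - b|}{2} \approx \frac{a + b + \phi_\mu(a - b)}{2},
\]
and letting μ → 0 recovers the original nonsmooth functions.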
