Similar Documents
 20 similar documents found (search time: 31 ms)
1.
Summary. The method of shortest residuals (SR) was presented by Hestenes and studied by Pytlak. If the function is quadratic and the line search is exact, the SR method reduces to the linear conjugate gradient method. In this paper, we put forward a formulation of the SR method for inexact line searches. We prove that, if the stepsizes satisfy the strong Wolfe conditions, both the Fletcher-Reeves and Polak-Ribière-Polyak versions of the SR method converge globally. When the Wolfe conditions are used, the two versions are also convergent provided that the stepsizes are uniformly bounded; if the stepsizes are not bounded, an example is constructed to show that they need not converge. Numerical results show that the SR method is a promising alternative to the standard nonlinear conjugate gradient method. Received June 25, 1996 / Revised version received April 1, 1997 / Published online July 28, 1999
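As a concrete illustration of a nonlinear conjugate gradient iteration under the strong Wolfe conditions (a generic Fletcher-Reeves sketch, not the SR method itself), the following uses SciPy's `line_search`, which enforces the strong Wolfe conditions; `c2 < 1/2` is chosen because the descent property of the FR method requires it. The test problem and fallback step are illustrative choices.

```python
import numpy as np
from scipy.optimize import line_search

def fletcher_reeves_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """Nonlinear CG with the Fletcher-Reeves beta and a strong-Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy's line_search enforces the strong Wolfe conditions;
        # c2 < 1/2 guarantees descent directions for the FR method.
        alpha = line_search(f, grad, x, d, gfk=g, c2=0.4)[0]
        if alpha is None:                    # line search failed: restart along -g
            d = -g
            res = line_search(f, grad, x, d, gfk=g, c2=0.4)[0]
            alpha = res if res is not None else 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves formula
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Convex quadratic test problem: minimize 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = fletcher_reeves_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                       lambda x: A @ x - b, np.zeros(2))
```

On a quadratic with a near-exact line search this reduces to linear CG, which is the reduction the abstract mentions.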

2.
Summary. We show that the example given in [Dai, Y., Yuan, Y. (1999): Global convergence of the method of shortest residuals, Numerische Mathematik 83, 581–598] does not contradict the results of [Pytlak, R. (1994): On the convergence of conjugate gradient algorithms, IMA J. Numerical Analysis 14, 443–460]. Received September 9, 2000 / Revised version received November 28, 2000 / Published online July 25, 2001

3.
PDE-based sensitivity filtering removes the checkerboard patterns and numerical instabilities that appear in continuum structural topology optimization; the filter is, in essence, a Helmholtz partial differential equation with Neumann boundary conditions. For large-scale PDE sensitivity filtering, the finite element method yields the algebraic system, which is then solved by the conjugate gradient algorithm, the multigrid algorithm, and the multigrid-preconditioned conjugate gradient algorithm; the influence of accuracy, filter radius, and mesh size on topology optimization efficiency is also studied. The results show that, compared with the conjugate gradient and multigrid algorithms, the multigrid-preconditioned conjugate gradient algorithm needs the fewest iterations and the least running time, greatly improving the efficiency of topology optimization.
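After discretization, the Helmholtz-type filter equation becomes a symmetric positive definite linear system, which preconditioned CG solves. Below is a minimal sketch: a 1D finite-difference discretization of (I − r²d²/dx²), with a simple Jacobi preconditioner standing in for the (more involved) multigrid preconditioner; grid size and filter radius are illustrative choices, not taken from the paper.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-8, max_iter=500):
    """Preconditioned CG for an SPD system; M_inv applies the preconditioner.
    A multigrid V-cycle would be plugged in as M_inv; here we use plain Jacobi."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# 1D Helmholtz filter operator (I - r^2 d^2/dx^2), finite-difference discretized
n, r_filt, h = 100, 0.05, 1.0 / 100
diag = 1.0 + 2.0 * r_filt**2 / h**2
off = -r_filt**2 / h**2
A = (np.diag(np.full(n, diag)) + np.diag(np.full(n - 1, off), 1)
     + np.diag(np.full(n - 1, off), -1))
b = np.random.default_rng(0).random(n)
x = pcg(A, b, M_inv=lambda v: v / diag)   # Jacobi stand-in for multigrid
```

The point of the comparison in the abstract is that a stronger preconditioner (multigrid) shrinks the iteration count far more than Jacobi does, at the cost of a more expensive `M_inv`.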

4.
We capitalize upon the known relationship between pairs of orthogonal and minimal residual methods (or, biorthogonal and quasi-minimal residual methods) in order to estimate how much smaller the residuals or quasi-residuals of the minimizing methods can be compared to those of the corresponding Galerkin or Petrov–Galerkin method. Examples of such pairs are the conjugate gradient (CG) and the conjugate residual (CR) methods, the full orthogonalization method (FOM) and the generalized minimal residual (GMRES) method, the CGNE and BiCG versions of applying CG to the normal equations, as well as the biconjugate gradient (BiCG) and the quasi-minimal residual (QMR) methods. Also the pairs consisting of the (bi)conjugate gradient squared (CGS) and the transpose-free QMR (TFQMR) methods can be added to this list if the residuals at half-steps are included, and further examples can be created easily. The analysis is more generally applicable to the minimal residual (MR) and quasi-minimal residual (QMR) smoothing processes, which are known to provide the transition from the results of the first method of such a pair to those of the second one. By an interpretation of these smoothing processes in coordinate space we deepen the understanding of some of the underlying relationships and introduce a unifying framework for minimal residual and quasi-minimal residual smoothing. This framework includes the general notion of QMR-type methods.
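The MR smoothing transition the abstract refers to can be sketched as follows: given the residuals r_k of a Galerkin-type method, each smoothed residual minimizes the norm along the line through the previous smoothed residual and r_k, so ‖s_k‖ ≤ min(‖s_{k−1}‖, ‖r_k‖). A minimal sketch (`mr_smoothing` is a hypothetical name; the example data is synthetic):

```python
import numpy as np

def mr_smoothing(residuals):
    """Minimal-residual smoothing: from the residuals r_k of a (Petrov-)Galerkin
    method, build smoothed residuals s_k with ||s_k|| <= min(||s_{k-1}||, ||r_k||)."""
    s = residuals[0].copy()
    smoothed = [s.copy()]
    for r in residuals[1:]:
        dr = r - s
        denom = dr @ dr
        # eta minimizes ||s + eta * (r - s)|| along the line through s and r
        eta = 0.0 if denom == 0.0 else -(s @ dr) / denom
        s = s + eta * dr
        smoothed.append(s.copy())
    return smoothed

rng = np.random.default_rng(1)
residuals = [rng.standard_normal(5) for _ in range(6)]
smoothed = mr_smoothing(residuals)
```

Since η = 0 recovers s_{k−1} and η = 1 recovers r_k, the minimizer can be no worse than either endpoint, which is exactly the "transition" property the abstract exploits.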

5.
A Modified HS Conjugate Gradient Algorithm and Its Global Convergence
Shi Zhenjun, Mathematica Numerica Sinica (《计算数学》), 2001, 23(4): 393-406
1. Introduction. In 1952, M. Hestenes and E. Stiefel proposed the conjugate gradient method for solving positive definite linear systems [1]. In 1964, R. Fletcher and C. Reeves extended the method to the unconstrained optimization problem min f(x), x ∈ R^n, (1) where f: R^n → R^1 is continuously differentiable; write g_k = ∇f(x_k), x_k ∈ R^n. If the sequence {x_k} is generated by the algorithm with β_k = [g_k^T(g_k − g_{k−1})] / [d_{k−1}^T(g_k − g_{k−1})] (Hestenes-Stiefel), (4) then the algorithm is called the Hestenes-Stiefel conjugate gradient…
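Formula (4) above, the Hestenes-Stiefel parameter, can be sketched directly (function name is illustrative):

```python
import numpy as np

def beta_hs(g_new, g_old, d_old):
    """Hestenes-Stiefel parameter:
    beta_k = g_k^T (g_k - g_{k-1}) / d_{k-1}^T (g_k - g_{k-1})."""
    y = g_new - g_old          # gradient difference y_{k-1}
    return (g_new @ y) / (d_old @ y)
```

The new search direction is then d_k = −g_k + β_k d_{k−1}.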

6.
In this paper, we present a new hybrid conjugate gradient algorithm for unconstrained optimization. The method is a convex combination of the Liu-Storey and Fletcher-Reeves conjugate gradient methods. We also prove that the search direction of any hybrid conjugate gradient method that is a convex combination of two conjugate gradient methods satisfies the Dai-Liao conjugacy condition and, under a suitable condition, coincides with the Newton direction; moreover, this property does not depend on any line search. We further prove that, modulo the value of the parameter t, the Newton-direction condition is equivalent to the Dai-Liao conjugacy condition. The strong Wolfe line search conditions are used, and the global convergence of the new method is proved. Numerical comparisons show that the present hybrid conjugate gradient algorithm is efficient.
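The convex-combination parameter can be sketched as below; the particular rule for choosing t is the paper's contribution and is not reproduced here, so t is left as a free argument:

```python
import numpy as np

def beta_hybrid(g_new, g_old, d_old, t):
    """Convex combination (t in [0, 1]) of the Liu-Storey and
    Fletcher-Reeves parameters; t = 0 gives pure LS, t = 1 pure FR."""
    y = g_new - g_old
    beta_ls = -(g_new @ y) / (d_old @ g_old)       # Liu-Storey
    beta_fr = (g_new @ g_new) / (g_old @ g_old)    # Fletcher-Reeves
    return (1.0 - t) * beta_ls + t * beta_fr

g_old = np.array([1.0, 1.0])
g_new = np.array([2.0, -1.0])
d_old = np.array([-1.0, -1.0])
```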

7.
A new conjugate gradient method is proposed in this paper by applying Powell's symmetrical technique to conjugate gradient methods. Using Wolfe line searches, the global convergence of the method is analyzed via spectral analysis of the conjugate gradient iteration matrix and Zoutendijk's condition. On this basis, some concrete descent algorithms are developed. Numerical experiments are presented to verify their performance, and the results show that these algorithms are competitive with the PRP+ algorithm. Finally, a brief discussion of the newly proposed method is given.

8.
The search direction in unconstrained minimization algorithms for large-scale problems is usually computed as an iterate of the (preconditioned) conjugate gradient method applied to the minimization of a local quadratic model. In line-search procedures this direction is required to satisfy an angle condition, which says that the angle between the negative gradient at the current point and the direction is bounded away from π/2. In this paper, it is shown that the angle between conjugate gradient iterates and the negative gradient strictly increases as the conjugate gradient algorithm proceeds. Therefore, interrupting the conjugate gradient sub-algorithm when the angle condition fails is theoretically justified. Copyright © 2002 John Wiley & Sons, Ltd.

9.
The development of the Lanczos algorithm for finding eigenvalues of large sparse symmetric matrices was followed by that of block forms of the algorithm. In this paper, similar extensions are carried out for a relative of the Lanczos method, the conjugate gradient algorithm. The resulting block algorithms are useful for simultaneously solving multiple linear systems or for solving a single linear system in which the matrix has several separated eigenvalues or is not easily accessed on a computer. We develop a block biconjugate gradient algorithm for general matrices, and develop block conjugate gradient, minimum residual, and minimum error algorithms for symmetric semidefinite matrices. Bounds on the rate of convergence of the block conjugate gradient algorithm are presented, and issues related to computational implementation are discussed. Variants of the block conjugate gradient algorithm applicable to symmetric indefinite matrices are also developed.
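In block CG, the scalar α and β of ordinary CG become small s×s matrices, one system solve per iteration for s right-hand sides. A minimal sketch following the standard block recurrences (not necessarily this paper's exact variant; the test matrix is synthetic):

```python
import numpy as np

def block_cg(A, B, tol=1e-8, max_iter=200):
    """Block CG: solve A X = B for all s columns of B simultaneously (A SPD).
    The scalar alpha, beta of ordinary CG become small s-by-s matrices."""
    X = np.zeros_like(B)
    R = B - A @ X
    P = R.copy()
    for _ in range(max_iter):
        if np.linalg.norm(R) < tol:          # Frobenius norm of the block residual
            break
        AP = A @ P
        alpha = np.linalg.solve(P.T @ AP, P.T @ R)   # s x s system, no explicit inverse
        X = X + P @ alpha
        R_new = R - AP @ alpha
        beta = np.linalg.solve(R.T @ R, R_new.T @ R_new)
        P = R_new + P @ beta
        R = R_new
    return X

rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20.0 * np.eye(20)      # SPD test matrix
B = rng.standard_normal((20, 3))     # three right-hand sides at once
X = block_cg(A, B)
```

A practical implementation would also guard against rank deficiency in the block direction P, which the abstract's "computational implementation" discussion addresses.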

10.
In this paper, a truncated conjugate gradient method with an inexact Gauss-Newton technique is proposed for solving nonlinear systems. The iterative direction is obtained by the conjugate gradient method solving the inexact Gauss-Newton equation. Global convergence and the local superlinear convergence rate of the proposed algorithm are established under some reasonable conditions. Finally, some numerical results are presented to illustrate the effectiveness of the proposed algorithm.
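The basic structure (an outer Gauss-Newton loop whose step solves J^T J d = −J^T F only approximately by truncated CG) can be sketched as below; the truncation rule (10% of the gradient norm) and the test system are illustrative assumptions, not the paper's choices:

```python
import numpy as np

def cg_matfree(Av, b, tol, max_iter=100):
    """Plain matrix-free CG for SPD systems, truncated at a residual tolerance."""
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rr = r @ r
    for _ in range(max_iter):
        if np.sqrt(rr) < tol:
            break
        Ap = Av(p)
        alpha = rr / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rr_new = r @ r
        p = r + (rr_new / rr) * p
        rr = rr_new
    return x

def gauss_newton_cg(F, J, x0, tol=1e-10, max_outer=50):
    """Inexact Gauss-Newton for F(x) = 0: each step solves the Gauss-Newton
    equation J^T J d = -J^T F only approximately, by truncated CG."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        Fx, Jx = F(x), J(x)
        if np.linalg.norm(Fx) < tol:
            break
        g = Jx.T @ Fx
        # truncate the inner CG once its residual drops below 10% of ||g||
        d = cg_matfree(lambda v: Jx.T @ (Jx @ v), -g,
                       tol=max(0.1 * np.linalg.norm(g), 1e-14))
        x = x + d
    return x

# Linear test system F(x) = A x - b, so the method should recover A^{-1} b
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 4.0])
x = gauss_newton_cg(lambda x: A @ x - b, lambda x: A, np.zeros(2))
```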

11.
This paper considers the numerical simulation of an optimal control problem for an evolving dam, using the conjugate gradient method. The paper considers the free boundary value problem related to time-dependent fluid flow in a homogeneous rectangular earth dam. The dam is taken to be sufficiently long that the flow can be considered two-dimensional. On the left and right walls of the dam there is a reservoir of fluid at a level that depends on time. This problem can be transformed into a variational inequality on a fixed domain. The numerical techniques used are based on a linear finite element method to approximate the state equations and a conjugate gradient algorithm to solve the discrete optimal control problem. The algorithm is based on Armijo's rule from unconstrained optimization theory. The convergence of the discrete optimal solutions to the continuous optimal solutions, and the convergence of the conjugate gradient algorithm, are proved. A numerical example is given to determine the location of the minimum surface.

12.
This paper proposes a class of three-term hybrid conjugate gradient algorithms for unconstrained optimization. The new algorithm combines the Hestenes-Stiefel and Dai-Yuan methods, and its convergence under the Wolfe line search is proved without imposing a descent condition. Numerical experiments also show the advantage of this hybrid conjugate gradient algorithm over the HS and PRP methods.

13.
Drawing on the structure of the spectral gradient method and the HS conjugate gradient method, a spectral HS projection algorithm is developed for solving systems of nonlinear monotone equations. The algorithm inherits the small storage and simple computations of the spectral gradient and conjugate gradient methods and requires no derivative information, so it is suitable for large-scale nonsmooth nonlinear monotone systems. Under suitable conditions, convergence of the algorithm is proved, and numerical experiments demonstrate its effectiveness.
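The projection skeleton underlying such derivative-free methods (a generic Solodov-Svaiter-type scheme) can be sketched as below. Note the search direction here is simply −F(x); the paper's spectral-HS direction replaces it, so this shows only the hyperplane-projection mechanism, not the proposed algorithm. The test map is an illustrative monotone linear function.

```python
import numpy as np

def projection_method(F, x0, sigma=1e-4, rho=0.5, tol=1e-8, max_iter=1000):
    """Hyperplane-projection method for monotone F(x) = 0.
    Derivative-free: only values of F are used."""
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx          # simplified direction; the paper uses a spectral-HS rule
        # backtracking line search: find t with -F(x + t d)^T d >= sigma t ||d||^2
        t = 1.0
        while -(F(x + t * d) @ d) < sigma * t * (d @ d):
            t *= rho
            if t < 1e-12:
                break
        z = x + t * d
        Fz = F(z)
        # project x onto the hyperplane {y : F(z)^T (y - z) = 0},
        # which separates x from the solution set by monotonicity
        x = x - ((Fz @ (x - z)) / (Fz @ Fz)) * Fz
        Fx = F(x)
    return x

# Monotone linear test map F(x) = 0.5 x - b, whose zero is x = 2 b
b = np.array([1.0, 2.0])
x = projection_method(lambda x: 0.5 * x - b, np.zeros(2))
```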

14.
Convergence Analysis of Three-Term Conjugate Gradient Methods
Dai Yuhong, Yuan Yaxiang, Mathematica Numerica Sinica (《计算数学》), 1999, 21(3): 355-362
1. Introduction. Consider line search methods for solving unconstrained smooth optimization problems, in which x_1 is given, d_k is the search direction, and α_k is the step length. In classical conjugate gradient methods, for k ≥ 2 the search direction d_k is composed of the negative gradient −g_k and the previous search direction d_{k−1}, with d_1 = −g_1 and β_k a parameter. Many formulas for β_k have been proposed; two well-known ones are the FR and PRP formulas, given respectively by … (here and below ‖·‖ denotes the Euclidean norm). Beale proposed a three-term restarted conjugate gradient method whose search direction has the form …, where d_t is the restart direction; Powell introduced suitable restart criteria for this method and obtained very good numerical results. In this paper, we study search directions…

15.
Simple versions of the conjugate gradient algorithm and the Lanczos method are discussed, and some merits of the latter are described. A variant of Lanczos is proposed which maintains robust linear independence of the Lanczos vectors by keeping them in secondary storage and occasionally making use of them. The main applications are to problems in which (1) the cost of the matrix-vector product dominates other costs, (2) there is a sequence of right hand sides to be processed, and (3) the eigenvalue distribution of A is not too favorable.

16.
A convex-combination conjugate gradient algorithm is proposed and applied to parameter estimation for ARIMA models. The new algorithm is constructed as a convex combination of a modified spectral conjugate gradient algorithm and a conjugate gradient algorithm, and has the following properties: 1) it satisfies the conjugacy condition; 2) it automatically satisfies sufficient descent. Global convergence of the new algorithm under the standard Wolfe line search is proved. Finally, numerical experiments show that, by tuning the convex-combination parameter, the new algorithm is faster and more effective, and a concrete example confirms the model…

17.
A Hybrid HS-DY Conjugate Gradient Method
Dai Zhifeng, Chen Lanping, Mathematica Numerica Sinica (《计算数学》), 2005, 27(4): 429-436
Building on the HS and DY methods and combining their strengths, this paper proposes a new hybrid conjugate gradient method for unconstrained optimization. Under the Wolfe line search, and without imposing a descent condition, global convergence of the algorithm is proved. Numerical experiments show that the new algorithm is more effective than the HS and PR methods.

18.
Gradient projection methods are an effective class of algorithms for constrained optimization and occupy an important place in the field. However, the projection they employ is an orthogonal projection that contains no second-order derivative information of the objective and constraint functions, so the convergence rate is not entirely satisfactory. This paper introduces a notion of conjugate projection and uses it to construct a conjugate-projection variable-metric algorithm for general linear or nonlinear constraints, proving global convergence of the algorithm under certain conditions. Since the conjugate projection appropriately incorporates second-order derivative information of the objective and constraint functions, faster convergence can be expected. Numerical results show that the algorithm is effective.

19.
In this paper we propose a fundamentally different conjugate gradient method, in which the well-known parameter βk is computed by an approximation of the Hessian/vector product through finite differences. For search direction computation, the method uses a forward difference approximation to the Hessian/vector product in combination with a careful choice of the finite difference interval. For the step length computation we suggest an acceleration scheme able to improve the efficiency of the algorithm. Under common assumptions, the method is proved to be globally convergent. It is shown that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in function values is significantly improved. Numerical comparisons with conjugate gradient algorithms including CONMIN by Shanno and Phua [D.F. Shanno, K.H. Phua, Algorithm 500, minimization of unconstrained multivariate functions, ACM Trans. Math. Softw. 2 (1976) 87–94], SCALCG by Andrei [N. Andrei, Scaled conjugate gradient algorithms for unconstrained optimization, Comput. Optim. Appl. 38 (2007) 401–416; N. Andrei, Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Optim. Methods Softw. 22 (2007) 561–571; N. Andrei, A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Appl. Math. Lett. 20 (2007) 645–650], and new conjugacy condition and related new conjugate gradient by Li, Tang and Wei [G. Li, C. Tang, Z. Wei, New conjugacy condition and related new conjugate gradient methods for unconstrained optimization, J. Comput. Appl. Math. 202 (2007) 523–539] or truncated Newton TN by Nash [S.G. Nash, Preconditioning of truncated-Newton methods, SIAM J. on Scientific and Statistical Computing 6 (1985) 599–616] using a set of 750 unconstrained optimization test problems show that the suggested algorithm outperforms these conjugate gradient algorithms as well as TN.
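The forward-difference Hessian/vector approximation at the core of the method can be sketched as below; the difference-interval heuristic shown is a common default, not the paper's careful interval choice, and the quadratic test function is an illustration (for which the approximation is exact up to rounding):

```python
import numpy as np

def hessvec_fd(grad, x, v, eps=None):
    """Forward-difference approximation of the Hessian/vector product H(x) v,
    needing only two gradient evaluations and no second derivatives."""
    if eps is None:
        # common heuristic interval (an assumption, not the paper's exact rule)
        eps = (np.sqrt(np.finfo(float).eps) * (1.0 + np.linalg.norm(x))
               / max(np.linalg.norm(v), 1e-12))
    return (grad(x + eps * v) - grad(x)) / eps

# For a quadratic f = 0.5 x^T A x - b^T x, the exact product is A v
A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])
b = np.ones(3)
x0 = np.array([0.5, -1.0, 2.0])
v = np.array([1.0, 2.0, -1.0])
hv = hessvec_fd(lambda x: A @ x - b, x0, v)
```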

20.
This paper studies conjugate gradient iterative methods for the matrix equation AX = B under a real-submatrix constraint, together with its optimal approximation problem. First, block partitioning is used to transform AX = B into two lower-order equations, and an iterative algorithm is constructed from the conjugate gradient idea; then the finite-step termination of the algorithm is proved; finally, numerical examples verify the effectiveness of the algorithm.
