Similar documents
20 similar documents found (search time: 93 ms)
1.
In this paper, we focus on the stochastic inverse eigenvalue problem of constructing a stochastic matrix from prescribed partial eigendata. A Riemannian variant of the Fletcher–Reeves conjugate gradient method is proposed for solving a general unconstrained minimization problem on a Riemannian manifold, and its global convergence is established under some assumptions. We then reformulate the inverse problem as a nonlinear least squares problem over a matrix oblique manifold and investigate the application of the proposed geometric method to this problem. The method is also applied to the case of prescribed entries and the case of column stochastic matrices. Finally, numerical tests are reported to illustrate that the proposed geometric method is effective for solving the inverse problem.

2.
Based on the well-known PRP conjugate gradient method and the structure of the CG_DESCENT conjugate gradient method, this paper proposes a modified PRP conjugate gradient method for large-scale unconstrained optimization. At every iteration the method generates a sufficient descent search direction, independently of any line search condition. Under the standard Wolfe line search, the global convergence and linear convergence rate of the modified PRP method are proved. Numerical results show that the modified PRP method is very effective on the given test problems.
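The abstract does not give the authors' modified PRP formula, so as a generic illustration of a PRP-type method with a descent safeguard, here is a minimal PRP+ sketch in Python with Armijo backtracking; all names, tolerances, and the test function are illustrative, not taken from the paper:

```python
import numpy as np

def prp_plus_cg(f, grad, x0, tol=1e-8, max_iter=5000):
    """Minimal PRP+ conjugate gradient sketch with Armijo backtracking.

    Classical PRP+ variant (negative beta truncated to zero), with a
    steepest-descent restart to guarantee a descent direction.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP+ parameter: truncate negative values at zero (Powell's restart idea)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        if g_new @ d >= 0.0:      # safeguard: fall back to steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: the 2-D Rosenbrock function, minimizer (1, 1)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = prp_plus_cg(f, grad, np.array([-1.2, 1.0]))
```

The restart safeguard enforces descent whenever the PRP+ update fails to, which is what guarantees the Armijo-only backtracking loop terminates.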

4.
In order to propose a scaled conjugate gradient method, the memoryless BFGS preconditioned conjugate gradient method suggested by Shanno and the spectral conjugate gradient method suggested by Birgin and Martínez are hybridized following Andrei’s approach. Since the proposed method is designed based on a revised form of a modified secant equation suggested by Zhang et al., one of its interesting features is applying the available function values in addition to the gradient values. It is shown that, for the uniformly convex objective functions, search directions of the method fulfill the sufficient descent condition which leads to the global convergence. Numerical comparisons of the implementations of the method and an efficient scaled conjugate gradient method proposed by Andrei, made on a set of unconstrained optimization test problems of the CUTEr collection, show the efficiency of the proposed modified scaled conjugate gradient method in the sense of the performance profile introduced by Dolan and Moré.

5.
In this work, we present a new hybrid conjugate gradient method based on the approach of the convex hybridization of the conjugate gradient update parameters of DY and HS+, adapting a quasi-Newton philosophy. The hybridization parameter is computed by minimizing the distance between the hybrid conjugate gradient direction and the self-scaling memoryless BFGS direction. Furthermore, a significant property of our proposed method is that it ensures sufficient descent independent of the accuracy of the line search. The global convergence of the proposed method is established provided that the line search satisfies the Wolfe conditions. Our numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection indicate that our proposed method is preferable and in general superior to classic conjugate gradient methods in terms of efficiency and robustness.

6.
The subject of this work is accelerating data uncertainty quantification. In particular, we are interested in expediting the stochastic estimation of the diagonal of the inverse covariance (precision) matrix that holds a wealth of information concerning the quality of data collections, especially when the matrices are symmetric positive definite and dense. Schemes built on direct methods incur a prohibitive cubic cost. Recently proposed iterative methods can remedy this but the overall cost is raised again as the convergence of stochastic estimators can be slow. The motivation behind our approach stems from the fact that the computational bottleneck in stochastic estimation is the application of the precision matrix on a set of appropriately selected vectors. The proposed method combines block conjugate gradient with a block-seed approach for multiple right-hand sides, taking advantage of the nature of the right-hand sides and the fact that the diagonal is not sought to high accuracy. Our method is applicable if the matrix is only known implicitly and also produces a matrix-free diagonal preconditioner that can be applied to further accelerate the method. Numerical experiments confirm that the approach is promising and helps contain the overall cost of diagonal estimation as the number of samples grows.
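The stochastic estimator being accelerated can be sketched in a few lines: probe diag(A⁻¹) with Rademacher vectors v, apply A⁻¹ matrix-free through plain CG, and average v ⊙ A⁻¹v. This is a basic Hutchinson-type estimator, not the paper's block-seed method; it only illustrates why the matrix applications dominate the cost. All names here are illustrative:

```python
import numpy as np

def cg_solve(matvec, b, tol=1e-10, max_iter=1000):
    """Plain conjugate gradient for SPD systems, matrix-free."""
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs) < tol:
            break
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def estimate_diag_inverse(matvec, n, n_samples=100, rng=None):
    """Hutchinson-style estimate of diag(A^{-1}): E[v * (A^{-1} v)], Rademacher v."""
    rng = np.random.default_rng(rng)
    acc = np.zeros(n)
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=n)
        acc += v * cg_solve(matvec, v)   # one precision-matrix solve per sample
    return acc / n_samples

# Small SPD sanity check: for a diagonal A the estimator is exact per sample
A = np.diag([2.0, 3.0, 5.0])
est = estimate_diag_inverse(lambda x: A @ x, 3, n_samples=50, rng=0)
```

Each sample costs one full CG solve, which is exactly the bottleneck the block-seed approach amortizes across the multiple right-hand sides.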

7.
In this Note, we formulate a sparse Krylov-based algorithm for solving large-scale linear systems of algebraic equations arising from the discretization of randomly parametrized (or stochastic) elliptic partial differential equations (SPDEs). We analyze the proposed sparse conjugate gradient (CG) algorithm within the framework of inexact Krylov subspace methods, prove its convergence and study its abstract computational cost. Numerical studies conducted on stochastic diffusion models show that the proposed sparse CG algorithm outperforms the classical CG method when the sought solutions admit a sparse representation in a polynomial chaos basis. In such cases, the sparse CG algorithm recovers almost exactly the sparsity pattern of the exact solutions, which enables accelerated convergence. In the case when the SPDE solution does not admit a sparse representation, the convergence of the proposed algorithm is very similar to the classical CG method.

8.
Based on a singular value analysis on an extension of the Polak–Ribière–Polyak method, a nonlinear conjugate gradient method with the following two optimal features is proposed: the condition number of its search direction matrix is minimum and, in addition, the distance of its search direction from the search direction of a descent nonlinear conjugate gradient method proposed by Zhang et al. is minimum. Under proper conditions, global convergence of the method can be achieved. To enhance efficiency of the proposed method, Powell’s truncation of the conjugate gradient parameters is used. The method is computationally compared with the nonlinear conjugate gradient method proposed by Zhang et al. and a modified Polak–Ribière–Polyak method proposed by Yuan. Results of numerical comparisons show efficiency of the proposed method in the sense of the Dolan–Moré performance profile.

9.
连淑君  王长钰 《应用数学》2007,20(1):120-127
In this paper we discuss a family of conjugate gradient methods that can be viewed as convex combinations of the FR and DY methods. Two Armijo-type line searches are proposed, and the global convergence of the family of conjugate gradient methods is established under both line searches.

10.
This paper studies the numerical computation of symmetric solutions of the equivalent matrix equations of a class of matrix equations involving high-order inverse powers, which arise in control theory, stochastic filtering, and related fields. Newton's method is used to compute symmetric solutions of the equivalent matrix equation, and a modified conjugate gradient method is used to find the symmetric solution or symmetric least-squares solution of the linear matrix equation derived at each Newton iteration, yielding a double-iteration algorithm for the symmetric solutions of this class of matrix equations. Numerical examples verify that the double-iteration algorithm is effective.

11.
The conjugate gradient method is one of the most effective methods for large-scale unconstrained optimization. This paper improves the parameter formula of the HS conjugate gradient method to obtain a new formula and builds an algorithmic framework on it. It is proved, independently of any line search condition, that the iterative directions generated by the framework all satisfy the sufficient descent condition, and global convergence is established under the standard Wolfe line search. Finally, numerical tests of the new algorithm show that the improved method is effective.

12.
Using the structures of the spectral gradient method and the HS conjugate gradient method, a spectral HS projection algorithm is developed for solving nonlinear monotone systems of equations. The algorithm inherits the low storage and simple computation of the spectral gradient and conjugate gradient methods and requires no derivative information, so it is well suited to large-scale nonsmooth nonlinear monotone systems. Convergence of the algorithm is proved under suitable conditions, and numerical experiments demonstrate its effectiveness.
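As a rough illustration of a derivative-free projection method for monotone equations, here is a generic hyperplane-projection scheme in the Solodov–Svaiter mold. It uses the plain direction -F(x) rather than the spectral HS direction of the abstract, and all parameter values are illustrative:

```python
import numpy as np

def monotone_projection_solve(F, x0, tol=1e-8, max_iter=5000):
    """Derivative-free hyperplane-projection method for monotone F(x) = 0."""
    x = x0.astype(float)
    sigma, rho = 1e-4, 0.5
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx
        # backtracking search for a trial point z = x + t*d such that
        # -F(z)^T d >= sigma * t * ||d||^2 (needs only F-values, no derivatives)
        t = 1.0
        while -F(x + t * d) @ d < sigma * t * (d @ d) and t > 1e-12:
            t *= rho
        z = x + t * d
        Fz = F(z)
        # project x onto the hyperplane {y : F(z)^T (y - z) = 0},
        # which separates x from the solution set by monotonicity
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
    return x

# Monotone linear example: F(x) = A x - b with a positive-definite A
A = np.array([[3.0, 1.0], [0.0, 2.0]])
b = A @ np.array([1.0, 1.0])
sol = monotone_projection_solve(lambda x: A @ x - b, np.zeros(2))
```

Only evaluations of F appear, which is exactly why such schemes fit large-scale nonsmooth monotone systems.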

13.
In this paper, we propose a three-term conjugate gradient method via the symmetric rank-one update. The basic idea is to exploit the good properties of the SR1 update in providing quality Hessian approximations to construct a conjugate gradient search direction that requires no matrix storage and possesses the sufficient descent property. Numerical experiments on a set of standard unconstrained optimization problems showed that the proposed method is superior to many well-known conjugate gradient methods in terms of efficiency and robustness.
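The SR1-based construction is not spelled out in the abstract, but a well-known three-term direction of the same flavor (the Zhang-Zhou-Li three-term PRP direction, used here purely as an illustration) shows how an extra term yields sufficient descent without storing any matrices:

```python
import numpy as np

def three_term_prp(f, grad, x0, tol=1e-8, max_iter=5000):
    """Three-term PRP sketch: d = -g + beta*d_old - theta*y.

    With beta = g^T y / ||g_old||^2 and theta = g^T d_old / ||g_old||^2,
    the two extra terms cancel in g^T d, so g^T d = -||g||^2 holds exactly
    (sufficient descent for free), with no matrix storage.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        gn2 = g @ g
        if np.sqrt(gn2) < tol:
            break
        # Armijo backtracking (always terminates since d is a descent direction)
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        beta = (g_new @ y) / gn2
        theta = (g_new @ d) / gn2
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Simple strictly convex test problem, minimizer (3, -2)
f = lambda x: (x[0] - 3.0)**2 + 5.0 * (x[1] + 2.0)**2
grad = lambda x: np.array([2.0 * (x[0] - 3.0), 10.0 * (x[1] + 2.0)])
xbest = three_term_prp(f, grad, np.array([0.0, 0.0]))
```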

14.
In this paper, a truncated conjugate gradient method with an inexact Gauss-Newton technique is proposed for solving nonlinear systems. The iterative direction is obtained by the conjugate gradient method solving the inexact Gauss-Newton equation. Global convergence and the local superlinear convergence rate of the proposed algorithm are established under some reasonable conditions. Finally, some numerical results are presented to illustrate the effectiveness of the proposed algorithm.
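A minimal sketch of the idea, assuming the standard Gauss-Newton equation J^T J p = -J^T F solved approximately by CG with a residual-based forcing tolerance; the paper's concrete truncation rule is not in the abstract, so the forcing term and all names below are illustrative:

```python
import numpy as np

def cg_truncated(A, b, tol, max_iter=50):
    """Truncated CG for the SPD Gauss-Newton system A p = b."""
    p = np.zeros_like(b)
    r = b.copy()
    d = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs) < tol:          # stop early: an inexact direction suffices
            break
        Ad = A @ d
        alpha = rs / (d @ Ad)
        p += alpha * d
        r -= alpha * Ad
        rs_new = r @ r
        d = r + (rs_new / rs) * d
        rs = rs_new
    return p

def inexact_gauss_newton(F, J, x0, tol=1e-10, max_iter=50):
    x = x0.astype(float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        Jx = J(x)
        # forcing tolerance tied to the current residual -> superlinear locally
        eta = min(0.1, np.linalg.norm(Fx))
        p = cg_truncated(Jx.T @ Jx, -Jx.T @ Fx, eta * np.linalg.norm(Fx))
        x = x + p
    return x

# Example: intersect the unit circle with the line x0 = x1
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]])
root = inexact_gauss_newton(F, J, np.array([2.0, 1.0]))
```

Shrinking the CG tolerance with the residual is the usual inexact-Newton device that recovers a superlinear local rate despite truncating the inner solves.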

15.
Conjugate Gradient Methods with Armijo-type Line Searches
Abstract: Two Armijo-type line searches are proposed in this paper for nonlinear conjugate gradient methods. Under these line searches, global convergence results are established for several famous conjugate gradient methods, including the Fletcher–Reeves method, the Polak–Ribière–Polyak method, and the conjugate descent method.

16.
Conjugate gradient methods are iterative methods for solving large-scale unconstrained optimization problems, and much recent research has thus focused on developing more effective variants. In this paper, we propose another hybrid conjugate gradient method as a linear combination of the Dai-Yuan (DY) method and the Hestenes-Stiefel (HS) method. The sufficient descent condition and the global convergence of this method are established using generalized Wolfe line search conditions. Compared to other conjugate gradient methods, the proposed method gives good numerical results and is effective.
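The combination idea can be sketched with a fixed weight λ between the HS and DY parameters; the paper determines the combination and uses generalized Wolfe conditions, whereas this sketch fixes λ and uses simple Armijo backtracking with a restart safeguard, so all parameter choices are illustrative:

```python
import numpy as np

def hybrid_dy_hs_cg(f, grad, x0, lam=0.5, tol=1e-8, max_iter=1000):
    """Hybrid CG sketch: beta = lam*beta_HS + (1 - lam)*beta_DY."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        beta_hs = (g_new @ y) / (d @ y)        # Hestenes-Stiefel parameter
        beta_dy = (g_new @ g_new) / (d @ y)    # Dai-Yuan parameter
        d = -g_new + (lam * beta_hs + (1.0 - lam) * beta_dy) * d
        if g_new @ d >= 0.0:                   # safeguard: restart on non-descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Strictly convex quadratic test problem: minimizer solves A x = b
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
xmin = hybrid_dy_hs_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                       lambda x: A @ x - b, np.zeros(2))
```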

17.
The conjugate gradient method is a useful and powerful approach for solving large-scale minimization problems. Liu and Storey developed a conjugate gradient method, which has good numerical performance but no global convergence result under traditional line searches such as Armijo, Wolfe and Goldstein line searches. In this paper a convergent version of Liu–Storey conjugate gradient method (LS in short) is proposed for minimizing functions that have Lipschitz continuous partial derivatives. By estimating the Lipschitz constant of the derivative of objective functions, we can find an adequate step size at each iteration so as to guarantee the global convergence and improve the efficiency of LS method in practical computation.

18.
Jiang Xianzhen, Liao Wei, Yin Jianghua, Jian Jinbao. Numerical Algorithms, 2022, 91(1): 161-191

In this paper, based on the hybrid conjugate gradient method and the convex combination technique, a new family of hybrid three-term conjugate gradient methods is proposed for solving unconstrained optimization. The conjugate parameter in the search direction is a hybrid of the Dai-Yuan conjugate parameter and an arbitrary one. The search direction is then the sum of the negative gradient direction and a convex combination of the last search direction and the gradient at the previous iteration. Without choosing any specific conjugate parameters, we show that the search directions generated by the family always possess the descent property independently of the line search technique, and that the family is globally convergent under usual assumptions and the weak Wolfe line search. To verify the effectiveness of the presented family, we further design a specific conjugate parameter and perform medium- and large-scale numerical experiments for smooth unconstrained optimization and image restoration problems. The numerical results show the encouraging efficiency and applicability of the proposed methods, even compared with state-of-the-art methods.


19.
Satisfying the sufficient descent condition is a strength of a conjugate gradient method. Here, it is shown that, under the Wolfe line search conditions, the search directions generated by the memoryless BFGS conjugate gradient algorithm proposed by Shanno satisfy the sufficient descent condition for uniformly convex functions.

20.
A new conjugate gradient method is proposed in this paper by applying Powell’s symmetrical technique to conjugate gradient methods. Using Wolfe line searches, the global convergence of the method is analyzed via the spectral analysis of the conjugate gradient iteration matrix and Zoutendijk’s condition. Based on this, some concrete descent algorithms are developed. Numerical experiments are presented to verify their performance, and the results show that these algorithms are competitive with the PRP+ algorithm. Finally, a brief discussion of the newly proposed method is given.
