20 similar documents found; search took 15 ms.
1.
The recently designed non-linear conjugate gradient method of Dai and Kou [SIAM J Optim. 2013;23:296–320] is currently very efficient for solving large-scale unconstrained minimization problems, owing to its simple iterative form, low storage requirement and its closeness to the scaled memoryless BFGS method. Because of these attractive properties, the method has been successfully extended in recent years to solve high-dimensional symmetric non-linear equations. Nevertheless, its numerical performance in solving convex constrained monotone equations has never been explored. In this paper, combining it with the projection method of Solodov and Svaiter, we develop a family of non-linear conjugate gradient methods for convex constrained monotone equations. The proposed methods require neither the Jacobian of the equations nor the storage of any matrix at each iteration, which makes them well suited to high-dimensional non-smooth problems. We prove the global convergence of the proposed class of methods and establish its R-linear convergence rate under some reasonable conditions. Finally, numerical experiments show that the proposed methods are efficient and promising.
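For readers unfamiliar with the Solodov–Svaiter projection technique that several entries below build on, here is a minimal Python sketch of one hyperplane-projection iteration for a monotone map F on a box. All names, parameter values and the toy problem are illustrative, not taken from the cited paper:

```python
import numpy as np

def projection_step(F, x, d, lo, hi, sigma=1e-4, shrink=0.5):
    """One Solodov-Svaiter-style projection iteration (illustrative sketch).

    F  : monotone map R^n -> R^n
    x  : current iterate inside the box [lo, hi]
    d  : search direction (e.g. from a CG-type formula)
    """
    # Backtracking line search: find z = x + t*d with -F(z)^T d >= sigma*t*||d||^2
    t = 1.0
    while True:
        z = x + t * d
        if -(F(z) @ d) >= sigma * t * (d @ d):
            break
        t *= shrink
        if t < 1e-12:          # safeguard against an endless search
            return x
    Fz = F(z)
    # Project x onto the hyperplane {y : F(z)^T (y - z) = 0}, then onto the box
    x_new = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
    return np.clip(x_new, lo, hi)

# Toy monotone system: F(x) = x - 1, solution x* = 1 inside the box [0, 2]
F = lambda x: x - 1.0
x = np.zeros(3)
for _ in range(50):
    x = projection_step(F, x, -F(x), 0.0, 2.0)
```

The key point of the technique is that each iteration needs only function values of F, no Jacobian, which is why the methods above are advertised as derivative-free and matrix-free.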
2.
陈香萍, Mathematics in Practice and Theory (数学的实践与认识), 2017(13):168–175
This paper extends a modified CG_DESCENT conjugate gradient method and establishes an effective derivative-free projection algorithm for solving systems of non-linear monotone equations. Under suitable line-search conditions, the global convergence of the algorithm is proved. Since the new algorithm requires no derivative information, it is suitable for large-scale non-smooth systems of non-linear monotone equations. Extensive numerical experiments show that the new algorithm is effective on the given test problems.
3.
Drawing on the structure of the spectral gradient method and the HS conjugate gradient method, a spectral HS projection algorithm is constructed for solving systems of non-linear monotone equations. The algorithm inherits the low storage and computational simplicity of the spectral gradient and conjugate gradient methods, and it requires no derivative information, so it is suitable for large-scale non-smooth systems of non-linear monotone equations. Under appropriate conditions, the convergence of the algorithm is proved, and numerical experiments demonstrate its effectiveness.
4.
Conjugate gradient methods are a widely used class of methods for solving large-scale unconstrained optimization problems. This paper proposes a new non-linear conjugate gradient (CG) method; theoretical analysis shows that the new algorithm has the sufficient descent property under several line-search conditions. A global convergence theorem for the new CG algorithm is further proved. Finally, extensive numerical experiments show that, compared with several classical CG methods, the new algorithm is computationally more efficient.
5.
Pengjie Liu Xiaoyu Wu Hu Shao Yan Zhang Shuhan Cao 《Numerical Linear Algebra with Applications》2023,30(2):e2471
In this work, by combining hyperplane projection and hybrid techniques, three scaled three-term conjugate gradient methods are extended to solve systems of constrained monotone nonlinear equations; the developed methods have the advantages of low storage and of using function values only. The new methods satisfy the sufficient descent condition independently of any line search criterion, and the three new methods are proved to converge globally under some mild conditions. Numerical experiments on constrained monotone nonlinear equations and image de-blurring problems illustrate that the proposed methods are numerically effective and efficient.
6.
D. G. Luenberger 《Journal of Optimization Theory and Applications》1974,14(5):477-495
A new programming algorithm for nonlinear constrained optimization problems is proposed. The method is based on the penalty function approach and thereby circumvents the necessity to maintain feasibility at each iteration, but it also behaves much like the gradient projection method. Although only first-order information is used, the algorithm converges asymptotically at a rate which is independent of the magnitude of the penalty term; hence, unlike the simple gradient method, the asymptotic rate of the proposed method is not affected by the ill-conditioning associated with the introduction of the penalty term. It is shown that the asymptotic rate of convergence of the proposed method is identical with that of the gradient projection method.
Dedicated to Professor M. R. Hestenes. This research was supported by the National Science Foundation, Grant No. GK-16125.
7.
The convergence of Rosen's gradient method is a long-standing problem in nonlinear programming. Recently, progress has been made by several researchers. In this paper, we completely resolve the problem. This author's work was supported in part by AF OSR-86-0078, NSF DMS-86-06225, and NSF of China.
8.
A new class of conjugate projection gradient algorithms
Using the concept of conjugate projection introduced in [5], combined with ideas of Du Dingzhu [3], this paper proposes a new class of conjugate gradient projection algorithms. Under certain conditions, the algorithms are proved to be globally convergent with a superlinear convergence rate.
9.
《Optimization》2012,61(2):163-179
In this article, we consider the global convergence of the Polak–Ribière–Polyak conjugate gradient method (abbreviated PRP method) for minimizing functions that have Lipschitz continuous partial derivatives. A novel form of non-monotone line search is proposed to guarantee the global convergence of the PRP method. It is also shown that the PRP method has a linear convergence rate under some mild conditions when the non-monotone line search reduces to a related monotone line search. The new non-monotone line search needs to estimate the Lipschitz constant of the gradients of objective functions, for which two practical estimates are proposed to help find a suitable initial step size for the PRP method. Numerical results show that the new line search approach is efficient in practical computation.
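The PRP update at the heart of this entry has a compact form. A minimal sketch follows; the paper's non-monotone line search is replaced here by a plain Armijo backtracking search, and the PRP+ safeguard and restart rule are standard textbook additions, not claims about the paper:

```python
import numpy as np

def prp_cg(f, grad, x, max_iter=1000, tol=1e-8):
    """Polak-Ribiere-Polyak CG with Armijo backtracking (illustrative sketch)."""
    g = grad(x)
    d = -g.copy()
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along the descent direction d
        t, c = 1.0, 1e-4
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP formula: beta = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2, clipped at 0 (PRP+)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        if g_new @ d >= 0:      # restart with steepest descent if not a descent direction
            d = -g_new.copy()
        x, g = x_new, g_new
    return x

# Toy strictly convex quadratic: the minimizer solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = prp_cg(f, grad, np.zeros(2))
```

The restart test illustrates why the choice of line search matters for PRP: without extra conditions the PRP direction need not be a descent direction, which is exactly the gap the paper's non-monotone line search is designed to close.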
10.
ZENG QINGGUANG, Applied Mathematics: A Journal of Chinese Universities (高校应用数学学报, English Series), 1997, 12(1):117–125
In this paper, we provide a new generalized gradient projection algorithm for nonlinear programming problems with linear constraints. The algorithm has a simple structure and is practical and stable. Under weaker assumptions, we prove the global convergence of the algorithm.
11.
《Optimization》2012,61(9):1791-1806
12.
13.
《Optimization》2012,61(12):2679-2691
In this article, we present an improved three-term conjugate gradient algorithm for large-scale unconstrained optimization. The search directions in the developed algorithm are proved to satisfy an approximate secant equation as well as the Dai–Liao conjugacy condition. With the standard Wolfe line search and a restart strategy, global convergence of the algorithm is established under mild conditions. On 75 benchmark test problems with dimensions from 1000 to 10,000, the numerical results indicate that the algorithm outperforms state-of-the-art algorithms from the literature, requiring less CPU time and fewer iterations on large-scale unconstrained optimization problems.
14.
CaiXia Kou, Science China Mathematics (中国科学 数学, English edition), 2014, 57(3):635–648
Conjugate gradient methods have played a special role in solving large-scale nonlinear problems. Recently, the author and Dai proposed an efficient nonlinear conjugate gradient method called CGOPT, obtained by seeking the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method. In this paper, we make use of two types of modified secant equations to improve the CGOPT method. Under some assumptions, the improved methods are shown to be globally convergent. Numerical results are also reported.
15.
《Optimization》2012,61(10):1631-1648
In this paper, we develop a three-term conjugate gradient method involving a spectral quotient, which always satisfies the famous Dai–Liao conjugacy condition and the quasi-Newton secant equation, independently of any line search. This new three-term conjugate gradient method can be regarded as a variant of the memoryless Broyden–Fletcher–Goldfarb–Shanno quasi-Newton method with respect to the spectral quotient. By combining this method with the projection technique proposed by Solodov and Svaiter in 1998, we establish a derivative-free three-term projection algorithm for large-scale nonlinear monotone systems of equations. We prove the global convergence of the algorithm and obtain an R-linear convergence rate under some mild conditions. Numerical results show that our projection algorithm is effective and robust, and compares favorably with the TTDFP algorithm proposed by Liu and Li [A three-term derivative-free projection method for nonlinear monotone system of equations. Calcolo. 2016;53:427–450].
16.
Masao Fukushima 《Journal of Computational and Applied Mathematics》1990,30(3):329-339
This paper presents a conjugate gradient method for solving systems of linear inequalities. The method is of dual optimization type and consists of two phases which can be implemented in a common framework. Phase 1 either finds the minimum-norm solution of the system or detects the inconsistency of the system. In the latter event, the method proceeds to Phase 2, in which an approximate least-squares solution to the system is obtained. The method is particularly suitable for large-scale problems because it preserves the sparsity structure of the problem. Its efficiency is shown by computational comparisons with an SOR-type method.
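The least-squares flavour of Phase 2 can be illustrated with the standard CGLS iteration: conjugate gradients applied implicitly to the normal equations AᵀAx = Aᵀb. This is a generic sketch, not the paper's dual two-phase scheme:

```python
import numpy as np

def cgls(A, b, max_iter=100, tol=1e-10):
    """Conjugate gradient on the normal equations A^T A x = A^T b (CGLS).
    Only matrix-vector products with A and A^T are needed, so the
    sparsity structure of A is preserved."""
    x = np.zeros(A.shape[1])
    r = b - A @ x          # residual in the data space
    s = A.T @ r            # residual of the normal equations
    p = s.copy()
    gamma = s @ s
    for _ in range(max_iter):
        if np.sqrt(gamma) < tol:
            break
        q = A @ p
        alpha = gamma / (q @ q)
        x += alpha * p
        r -= alpha * q
        s = A.T @ r
        gamma_new = s @ s
        p = s + (gamma_new / gamma) * p
        gamma = gamma_new
    return x

# Overdetermined, inconsistent system: CGLS returns its least-squares solution
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 0.0])
x_ls = cgls(A, b)
```

Working with products A @ p and A.T @ r rather than forming AᵀA explicitly is what makes this kind of method attractive for the large sparse problems the abstract mentions.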
17.
《Optimization》2012,61(4):993-1009
Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and have recently been much studied. In this paper, we propose a new two-parameter family of conjugate gradient methods for unconstrained optimization. The two-parameter family not only includes three already existing practical nonlinear conjugate gradient methods, but also contains other families of conjugate gradient methods as subfamilies. The two-parameter family with the Wolfe line search is shown to ensure the descent property of each search direction. Some general convergence results are also established for the family. The numerical results show that the method is efficient on the given test problems. In addition, methods related to this family are discussed in a unified way.
18.
19.
20.
Methods for a class of non-smooth equations involving max-value functions are studied. Using smoothing functions for the max-value function and the absolute-value function, the non-smooth equation problem is transformed, and a smoothing conservative DPRP conjugate gradient method is proposed. Under general conditions, the global convergence of the smoothing conservative DPRP conjugate gradient method is established. Finally, numerical experiments demonstrate the effectiveness of the method.