Similar Documents
 20 similar documents found (search time: 15 ms)
1.
《Optimization》2012,61(10):1631-1648
ABSTRACT

In this paper, we develop a three-term conjugate gradient method involving a spectral quotient, which always satisfies the famous Dai-Liao conjugacy condition and the quasi-Newton secant equation, independently of any line search. This new three-term conjugate gradient method can be regarded as a variant of the memoryless Broyden-Fletcher-Goldfarb-Shanno quasi-Newton method with respect to the spectral quotient. By combining this method with the projection technique proposed by Solodov and Svaiter in 1998, we establish a derivative-free three-term projection algorithm for large-scale nonlinear monotone systems of equations. We prove the global convergence of the algorithm and obtain an R-linear convergence rate under mild conditions. Numerical results show that our projection algorithm is effective and robust, and compares favorably with the TTDFP algorithm proposed by Liu and Li [A three-term derivative-free projection method for nonlinear monotone system of equations. Calcolo. 2016;53:427–450].
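The hyperplane projection step of Solodov and Svaiter, on which the algorithm above is built, can be sketched as follows. This is a minimal illustration with invented names, not the paper's exact algorithm: for a monotone map F, any solution lies on one side of the hyperplane through the trial point z, so projecting the iterate onto that hyperplane never moves away from the solution set.

```python
import numpy as np

def projection_step(x, z, F):
    """One hyperplane-projection step in the style of Solodov and Svaiter (1998).

    F is the (monotone) residual map, x the current iterate, and z a trial
    point produced by a line search along the search direction.  The next
    iterate is the projection of x onto the hyperplane
    {u : F(z)^T (u - z) = 0}, which separates x from the solution set.
    """
    Fz = F(z)
    # Projection of x onto the separating hyperplane through z.
    step = Fz @ (x - z) / (Fz @ Fz)
    return x - step * Fz
```

On a toy monotone system such as F(x) = x, one step strictly reduces the distance to the solution, which is the mechanism behind the global convergence argument.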

2.
《Optimization》2012,61(1):85-99
In this article, we propose a BFGS method for solving symmetric nonlinear equations. The presented method possesses some favourable properties: (a) the generated sequence of iterates is norm descent; (b) the generated sequence of quasi-Newton matrices is positive definite; and (c) the method is globally and superlinearly convergent. Numerical results show that the presented method is promising.
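Property (b) above hinges on a well-known fact about the BFGS update: if the current matrix is symmetric positive definite and the curvature condition s^T y > 0 holds, the updated matrix stays positive definite. The textbook update (not the paper's specific variant) is:

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of the quasi-Newton matrix B.

    B is symmetric positive definite, s the step, y the change in the
    residual (or gradient).  When s @ y > 0, the result is again
    symmetric positive definite and satisfies the secant equation
    B_new @ s = y.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)
```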

3.
Yanyun Ding  Jianwei Li 《Optimization》2017,66(12):2309-2328
The recently designed non-linear conjugate gradient method of Dai and Kou [SIAM J Optim. 2013;23:296–320] is currently very efficient for solving large-scale unconstrained minimization problems, owing to its simple iterative form, low storage requirement and its closeness to the scaled memoryless BFGS method. Because of these attractive properties, the method has been extended successfully to higher-dimensional symmetric non-linear equations in recent years. Nevertheless, its numerical performance on convex constrained monotone equations has never been explored. In this paper, combining it with the projection method of Solodov and Svaiter, we develop a family of non-linear conjugate gradient methods for convex constrained monotone equations. The proposed methods do not require Jacobian information, and they do not store any matrix at each iteration, so they have the potential to solve non-smooth problems of higher dimension. We prove the global convergence of the proposed class of methods and establish its R-linear convergence rate under reasonable conditions. Finally, numerical experiments show that the proposed methods are efficient and promising.

4.
谢锐  吴义虎 《经济数学》2009,26(3):104-110
We propose a BFGS algorithm for solving strongly monotone nonlinear systems of equations. A notable advantage of the algorithm is that the condition number of B_k is much smaller than that of B_k in the GNBFGS algorithm of Li and Fukushima [3]. Moreover, the algorithm is a descent method that requires no derivative computations. Under certain conditions, we prove the global and superlinear convergence of the algorithm. Finally, numerical experiments show that the algorithm performs well, and they confirm that the condition number of B_k in our algorithm is indeed much smaller than in the GNBFGS algorithm.

5.
This paper presents two self-scaling symmetric rank-one (SSR1) Newton-type methods for nonlinear monotone systems of equations: a projected SSR1 method and a projected limited-memory SSR1 method. Both algorithms apply a simple modification to the self-scaling symmetric rank-one correction parameter and adopt a conservative strategy. Under the assumption that the nonlinear monotone function is Lipschitz continuous, global convergence is proved. Preliminary numerical comparisons with a BFGS method of the same type show that the projected self-scaling SR1 algorithms are comparable to their BFGS counterparts on nonlinear monotone systems of equations.

6.
This study presents a novel adaptive trust-region method for solving symmetric nonlinear systems of equations. The new method uses a derivative-free quasi-Newton formula in place of the exact Jacobian. The global convergence and local quadratic convergence of the new method are established without the nondegeneracy assumption of the exact Jacobian. Using the compact limited memory BFGS, we adapt a version of the new method for solving large-scale problems and develop the dogleg scheme for solving the associated trust-region subproblems. The sufficient decrease condition for the adapted dogleg scheme is established. While the efficiency of the present trust-region approach can be improved by using adaptive radius techniques, utilizing the compact limited memory BFGS adjusts this approach to handle large-scale symmetric nonlinear systems of equations. Preliminary numerical results for both medium- and large-scale problems are reported.

7.
刘金魁 《计算数学》2016,38(2):113-124
Building on the well-known PRP conjugate gradient algorithm, this paper studies a derivative-free spectral PRP projection algorithm and proves its global convergence for nonlinear monotone systems of equations with convex constraints. Being derivative-free and requiring little storage, it is well suited to large-scale nonsmooth nonlinear monotone systems. Numerical experiments show that the new algorithm is effective and stable on the given test problems.

8.
The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, in order to avoid possibly large eigenvalues in the Hessian approximations of the objective function. It has been proved in the literature that this method is globally and superlinearly convergent when the objective function is convex (or even uniformly convex). We propose to solve unconstrained nonconvex optimization problems by a self-scaling BFGS algorithm with nonmonotone line search. Nonmonotone line search has been recognized in numerical practice as a competitive approach for solving large-scale nonlinear problems. We consider two different nonmonotone line search forms and study the global convergence of the corresponding nonmonotone self-scaling BFGS algorithms. We prove that, under a condition weaker than those in the literature, both forms of the self-scaling BFGS algorithm are globally convergent for unconstrained nonconvex optimization problems.
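The nonmonotone line search idea referred to above can be sketched with the classical max-based (Grippo-Lampariello-Lucidi style) acceptance test: the sufficient-decrease condition compares against the maximum of a few recent function values rather than the current one. Parameter names here are illustrative, not taken from the paper.

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, f_hist, delta=1e-4, beta=0.5, max_back=50):
    """Nonmonotone backtracking line search.

    f      : objective function
    x, d   : current point and (descent) search direction
    g      : gradient at x, so g @ d < 0
    f_hist : list of recent function values; the acceptance test uses
             their maximum instead of f(x), which is the relaxation
             that distinguishes nonmonotone from monotone Armijo.
    """
    f_ref = max(f_hist)          # reference value: recent maximum, not f(x)
    gtd = g @ d                  # directional derivative, assumed negative
    alpha = 1.0
    for _ in range(max_back):
        if f(x + alpha * d) <= f_ref + delta * alpha * gtd:
            return alpha
        alpha *= beta            # backtrack
    return alpha
```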

9.
In this paper, we present two partitioned quasi-Newton methods for solving partially separable nonlinear equations. When the Jacobian is not available, we propose a partitioned Broyden’s rank one method and show that the full step partitioned Broyden’s rank one method is locally and superlinearly convergent. By using a well-defined derivative-free line search, we globalize the method and establish its global and superlinear convergence. In the case where the Jacobian is available, we propose a partitioned adjoint Broyden method and show its global and superlinear convergence. We also present some preliminary numerical results. The results show that the two partitioned quasi-Newton methods are effective and competitive for solving large-scale partially separable nonlinear equations.

10.
In this paper, a subspace limited memory BFGS algorithm for solving large-scale bound constrained optimization problems is developed. It is a modification of the subspace limited memory quasi-Newton method proposed by Ni and Yuan [Q. Ni, Y.X. Yuan, A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization, Math. Comput. 66 (1997) 1509–1520]. An important property of the proposed method is that more limited memory BFGS updates are used. Under appropriate conditions, the global convergence of the method is established. Implementations of the method on CUTE test problems are presented, indicating that the modifications are beneficial to the performance of the algorithm.
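The limited memory BFGS machinery underlying methods like the one above stores only a few correction pairs and never forms an n x n matrix. A standard two-loop recursion (shown for context only; the subspace method itself is more involved) computes the quasi-Newton direction from those pairs:

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Two-loop recursion computing d = -H @ g for limited memory BFGS.

    s_list, y_list hold the stored correction pairs (oldest first).
    Only O(m n) storage is needed for m pairs, which is what makes
    the method viable at large scale.
    """
    q = g.copy()
    alphas = []
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Initial Hessian scaling gamma = s^T y / y^T y from the newest pair.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    # Second loop: oldest pair to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return -q
```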

11.
In this paper, we propose a BFGS (Broyden–Fletcher–Goldfarb–Shanno)-SQP (sequential quadratic programming) method for nonlinear inequality constrained optimization. At each step, the method generates a direction by solving a quadratic programming subproblem. A good feature of this subproblem is that it is always consistent. Moreover, we propose a practical update formula for the quasi-Newton matrix. Under mild conditions, we prove the global and superlinear convergence of the method. We also present some numerical results.

12.
In this paper, we propose a general smoothing Broyden-like quasi-Newton method for solving a class of nonsmooth equations. Under appropriate conditions, the proposed method converges to a solution of the equation globally and superlinearly. In particular, the proposed method provides the possibility of developing a quasi-Newton method that enjoys superlinear convergence even if strict complementarity fails to hold. We pay particular attention to semismooth equations arising from nonlinear complementarity problems, mixed complementarity problems and variational inequality problems. We show that under certain conditions, the related methods based on the perturbed Fischer–Burmeister function, Chen–Harker–Kanzow–Smale smoothing function and the Gabriel–Moré class of smoothing functions converge globally and superlinearly.

13.
In this paper, we propose a spectral DY-type projection method for nonlinear monotone systems of equations, which is a reasonable combination of the DY conjugate gradient method, the spectral gradient method and the projection technique. Without any differentiability assumption on the system of equations, we establish the global convergence of the proposed method, which does not rely on any merit function. Furthermore, the method is derivative-free and is therefore well suited to large-scale nonlinear monotone systems. Preliminary numerical results show the feasibility and effectiveness of the proposed method.

14.
We extend a modified CG_DESCENT conjugate gradient method and establish an effective derivative-free projection algorithm for nonlinear monotone systems of equations. Under suitable line search conditions, the global convergence of the algorithm is proved. Since the new algorithm requires no derivative information, it is suited to large-scale nonsmooth nonlinear monotone systems. Extensive numerical experiments show that the new algorithm is effective on the given test problems.

15.
The BFGS method is the most effective of the quasi-Newton methods for solving unconstrained optimization problems. Wei, Li, and Qi [16] have proposed some modified BFGS methods based on the new quasi-Newton equation B_{k+1} s_k = y_k^*, where y_k^* is the sum of y_k and A_k s_k, and A_k is some matrix. The average performance of Algorithm 4.3 in [16] is better than that of the BFGS method, but its superlinear convergence was still open. This article proves the superlinear convergence of Algorithm 4.3 under suitable conditions.

16.
Local convergence of quasi-Newton methods for B-differentiable equations
We study local convergence of quasi-Newton methods for solving systems of nonlinear equations defined by B-differentiable functions. We extend the classical linear and superlinear convergence results for general quasi-Newton methods as well as for Broyden's method. We also show how Broyden's method may be applied to nonlinear complementarity problems and illustrate its computational performance on two small examples.
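Broyden's method, analyzed above, replaces the Jacobian with an approximation corrected by a rank-one update chosen so the secant equation B_+ s = y holds. A plain full-step iteration, shown only to illustrate the classical method (not the paper's B-differentiable setting):

```python
import numpy as np

def broyden_solve(F, x0, tol=1e-10, max_iter=100):
    """Broyden's 'good' rank-one quasi-Newton method for F(x) = 0.

    No derivatives of F are evaluated: the Jacobian approximation B is
    corrected after each step by the rank-one update
    B += (y - B s) s^T / (s^T s), which enforces B_new @ s = y.
    """
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                 # initial Jacobian guess
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)    # quasi-Newton step
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx
        B += np.outer(y - B @ s, s) / (s @ s)   # rank-one secant update
        x, Fx = x_new, F_new
    return x
```

On an affine system, for instance, the iteration drives the residual to zero without ever calling a Jacobian.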

17.
We are concerned with defining new globalization criteria for solution methods of nonlinear equations. The current criteria used in these methods require a sufficient decrease of a particular merit function at each iteration of the algorithm. As has been observed in the field of smooth unconstrained optimization, this descent requirement can considerably slow the rate of convergence of the sequence of points produced and, in some cases, can severely degrade the performance of algorithms. The aim of this paper is to show that the global convergence of most methods proposed in the literature for solving systems of nonlinear equations can be obtained using less restrictive criteria that do not enforce a monotonic decrease of the chosen merit function. In particular, we show that a general stabilization scheme, recently proposed for the unconstrained minimization of continuously differentiable functions, can be extended to methods for the solution of nonlinear (nonsmooth) equations. This scheme includes different kinds of relaxation of the descent requirement and opens up the possibility of describing new classes of algorithms where the old monotone line search techniques are replaced with more flexible nonmonotone stabilization procedures. As in the case of smooth unconstrained optimization, this should be the basis for defining more efficient algorithms with very good practical rates of convergence. This material is partially based on research supported by the Air Force Office of Scientific Research Grant AFOSR-89-0410, National Science Foundation Grant CCR-91-57632, and Istituto di Analisi dei Sistemi ed Informatica del CNR.

18.
Techniques for obtaining safely positive definite Hessian approximations with self-scaling and modified quasi-Newton updates are combined to obtain "better" curvature approximations in line search methods for unconstrained optimization. It is shown that this class of methods, like the BFGS method, is globally and superlinearly convergent for convex functions. Numerical experiments with this class, using the well-known quasi-Newton BFGS, DFP and a modified SR1 update, are presented to illustrate some advantages of the new techniques. These experiments show that the performance of several combined methods is substantially better than that of the standard BFGS method. Similar improvements are also obtained if the simple sufficient function reduction condition on the steplength is used instead of the strong Wolfe conditions.
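One classical way to realize the self-scaling idea mentioned above is the Oren-Luenberger scaling: the curvature part of B is multiplied by tau = y^T s / (s^T B s) before the usual BFGS correction, which helps bound the eigenvalues of the approximation. This is the classical scaled update, not necessarily the paper's specific modification:

```python
import numpy as np

def self_scaling_bfgs_update(B, s, y):
    """Self-scaling BFGS update with Oren-Luenberger scaling factor.

    The existing curvature in B is rescaled by tau before the BFGS
    correction terms are applied; the secant equation B_new @ s = y
    still holds, and positive definiteness is preserved when s @ y > 0.
    """
    Bs = B @ s
    tau = (y @ s) / (s @ Bs)           # Oren-Luenberger scaling factor
    return tau * (B - np.outer(Bs, Bs) / (s @ Bs)) + np.outer(y, y) / (s @ y)
```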

19.
徐建军 《应用数学》1991,4(4):78-85
This paper presents synchronized parallel iterative schemes, suitable for MIMD machines, for Broyden's method and a column-updating quasi-Newton method for solving nonlinear systems of equations, together with their local convergence theorems. Numerical results also verify the convergence.

20.
An extended descent framework for variational inequalities
In this paper, we develop a very general descent framework for solving asymmetric, monotone variational inequalities. We introduce two classes of differentiable merit functions and the associated global convergence frameworks, which include, as special instances, the projection, Newton, quasi-Newton, linear Jacobi, and nonlinear methods. The generic algorithm is very flexible and consequently well suited for exploiting any particular structure of the problem. This research was supported by the National Science and Engineering Research Council of Canada, Grant A5789, and by the Department of National Defence of Canada, Grant FUHBP.
