Similar Documents
20 similar documents found.
1.
A modified conjugate gradient method is presented for solving unconstrained optimization problems, which possesses the following properties: (i) the sufficient descent property is satisfied without any line search; (ii) the search direction automatically stays in a trust region; (iii) the Zoutendijk condition holds for the Wolfe–Powell line search technique; (iv) the method inherits an important property of the well-known Polak–Ribière–Polyak (PRP) method: the tendency to turn towards the steepest descent direction if a small step is generated away from the solution, which prevents a sequence of tiny steps. The global convergence and the linear convergence rate of the method are established. Numerical results show that the method is promising.
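The abstract leans on the classical PRP formula, so a minimal sketch of the standard PRP direction update may help. This is the textbook PRP iteration, not the paper's specific modification, which the abstract does not give.

```python
import numpy as np

def prp_direction(g_new, g_old, d_old):
    """Classical Polak-Ribiere-Polyak search direction.

    beta = g_new'(g_new - g_old) / ||g_old||^2. Near a solution a small step
    makes g_new close to g_old, so beta is close to 0 and the direction falls
    back to steepest descent -- the self-correcting behaviour the abstract
    mentions.
    """
    y = g_new - g_old
    beta = g_new.dot(y) / g_old.dot(g_old)
    return -g_new + beta * d_old
```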

2.
The Barzilai–Borwein (BB) gradient method has received much attention due to its simplicity and numerical efficiency. By incorporating a nonmonotone line search, Raydan (SIAM J Optim. 1997;7:26–33) successfully extended the BB gradient method to general unconstrained optimization problems, making it competitive with conjugate gradient methods. However, the numerical results reported by Raydan are poor for very ill-conditioned problems, where the effect of the degree of nonmonotonicity becomes noticeable. In this paper, we focus on the nonmonotone line search technique used in the global Barzilai–Borwein (GBB) gradient method. We improve the performance of the GBB gradient method by proposing an adaptive nonmonotone line search based on the morphology of the objective function. We also prove the global convergence and the R-linear convergence rate of the proposed method under reasonable assumptions. Finally, we report numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection. The results show the efficiency of the proposed method in the sense of the performance profile introduced by Dolan and Moré (Math Program. 2002;91:201–213).
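For reference, a minimal sketch of the two classical BB stepsizes and a Grippo–Lampariello–Lucidi-style nonmonotone Armijo test. The paper's adaptive scheme varies the degree of nonmonotonicity with the objective's morphology; the fixed-memory version below is the GBB baseline it improves on, and all constants are illustrative.

```python
import numpy as np

def bb_stepsize(s, y, variant=1):
    """Barzilai-Borwein stepsizes, with s = x_k - x_{k-1}, y = g_k - g_{k-1}:
    BB1 = s's / s'y, BB2 = s'y / y'y."""
    return s.dot(s) / s.dot(y) if variant == 1 else s.dot(y) / y.dot(y)

def nonmonotone_armijo(f, x, g, d, f_hist, gamma=1e-4, tau=0.5, alpha=1.0):
    """Backtrack until f(x + alpha*d) <= max(recent f-values) + gamma*alpha*g'd.

    f_hist holds the last M objective values (the nonmonotone memory);
    assumes d is a descent direction, i.e. g'd < 0.
    """
    f_ref = max(f_hist)
    while f(x + alpha * d) > f_ref + gamma * alpha * g.dot(d):
        alpha *= tau
    return alpha
```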

3.
This article presents enhancement strategies for the Hermitian and skew-Hermitian splitting (HSS) method based on gradient iterations. The spectral properties are exploited for the parameter estimation, often resulting in better convergence. In particular, steepest descent with early stopping can generate a rough estimate of the optimal upper bound, which is preferable to an arbitrary choice, since the latter often causes stability problems or slow convergence. In addition, delayed gradient methods are considered as inner solvers for the splitting method. Experiments verify the effectiveness of the proposed estimation strategies and show that delayed gradient methods are competitive with the conjugate gradient method at low precision.
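For context, the textbook HSS iteration (Bai, Golub, and Ng) that the article enhances alternates solves with the shifted Hermitian and skew-Hermitian parts of A. A minimal dense-matrix sketch, with the shift alpha standing for the parameter the article estimates and dense solves standing in for the inner solvers it studies:

```python
import numpy as np

def hss_solve(A, b, alpha, x0=None, tol=1e-8, max_iter=500):
    """Hermitian/skew-Hermitian splitting iteration for A x = b.

    Splits A = H + S with H = (A + A^H)/2 Hermitian and S = (A - A^H)/2
    skew-Hermitian, then alternates the two shifted half-step solves.
    """
    n = A.shape[0]
    I = np.eye(n)
    H = (A + A.conj().T) / 2
    S = (A - A.conj().T) / 2
    x = np.zeros(n, dtype=A.dtype) if x0 is None else x0
    for _ in range(max_iter):
        x_half = np.linalg.solve(alpha * I + H, (alpha * I - S) @ x + b)
        x = np.linalg.solve(alpha * I + S, (alpha * I - H) @ x_half + b)
        if np.linalg.norm(A @ x - b) <= tol * np.linalg.norm(b):
            break
    return x
```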

4.
An efficient conjugate gradient method with sufficient descent
For large-scale unconstrained optimization problems, this paper proposes a sufficient-descent conjugate gradient formula and builds the corresponding algorithm. Without relying on any line search condition, the algorithm generates a sufficient descent direction at every iteration. If the step size is determined by the standard inexact Wolfe line search, good global convergence is obtained under conventional assumptions. Finally, large-scale numerical experiments are carried out and profiled with the performance plots of Dolan and Moré; the results show that the algorithm is effective.
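The two conditions this abstract relies on are easy to state precisely. A minimal sketch of the sufficient descent test and the standard (weak) Wolfe conditions, with c, c1, and c2 as illustrative constants:

```python
import numpy as np

def is_sufficient_descent(g, d, c=1e-3):
    """Sufficient descent condition: g'd <= -c * ||g||^2."""
    return g.dot(d) <= -c * g.dot(g)

def satisfies_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Standard (weak) Wolfe conditions for a trial step alpha along d."""
    slope = grad(x).dot(d)
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * slope
    curvature = grad(x + alpha * d).dot(d) >= c2 * slope
    return armijo and curvature
```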

5.
In this paper, we describe an implementation and give performance results for a conjugate gradient algorithm for unconstrained optimization. The algorithm is based upon the Nazareth three-term formula and incorporates Allwright preconditioning matrices and restart tests. The performance results for this combination compare favorably with existing codes. The support of the Science and Engineering Research Council is gratefully acknowledged.

6.
It is well known that the sufficient descent condition is very important to the global convergence of nonlinear conjugate gradient methods. In this paper, some modified conjugate gradient methods which possess this property are presented. The global convergence of the proposed methods with the weak Wolfe–Powell (WWP) line search rule is established for nonconvex functions under suitable conditions. Numerical results are reported. This work is supported by Guangxi University SF grant X061041 and China NSF grant 10761001.
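A common textbook way to produce a step satisfying the weak Wolfe–Powell conditions is a bisection-style bracketing search. The abstract only assumes such a step exists, so the sketch below is a generic scheme, not the authors' procedure:

```python
import numpy as np

def weak_wolfe_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bracketing search for a step satisfying the weak Wolfe-Powell
    conditions; assumes d is a descent direction."""
    lo, hi, alpha = 0.0, np.inf, 1.0
    f0, slope0 = f(x), grad(x).dot(d)
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * slope0:
            hi = alpha                      # Armijo fails: step too long
        elif grad(x + alpha * d).dot(d) < c2 * slope0:
            lo = alpha                      # curvature fails: step too short
        else:
            return alpha
        alpha = (lo + hi) / 2 if np.isfinite(hi) else 2 * lo
    return alpha
```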

7.
Hybrid spectral gradient method for the unconstrained minimization problem
We present a hybrid algorithm that combines a genetic algorithm with the Barzilai–Borwein gradient method. Under specific assumptions, the new method guarantees convergence to a stationary point of a continuously differentiable function from any initial point. Our preliminary numerical results indicate that the new methodology finds the global minimum efficiently and frequently, in comparison with the globalized Barzilai–Borwein method and the genetic algorithm of the Genetic Algorithms Toolbox of MATLAB.
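A toy sketch of the hybridization idea: evolve a small population and polish the elite individual with a few BB gradient steps. The function names, the mutation-only GA loop, and all parameters are illustrative inventions, not the authors' algorithm or the MATLAB toolbox API:

```python
import numpy as np

def bb_refine(f, grad, x, iters=20, alpha0=1e-3):
    """A few Barzilai-Borwein gradient steps used as local refinement."""
    g, alpha = grad(x), alpha0
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s.dot(y) > 0:
            alpha = s.dot(s) / s.dot(y)   # BB1 stepsize
        x, g = x_new, g_new
    return x

def hybrid_ga_bb(f, grad, dim, pop=20, gens=50, sigma=0.5, seed=None):
    """Toy genetic-algorithm loop whose best individual is polished by BB
    steps -- the hybridization idea of the abstract, not the exact scheme."""
    rng = np.random.default_rng(seed)
    P = rng.normal(size=(pop, dim))
    for _ in range(gens):
        P = P[np.argsort([f(p) for p in P])]    # rank population by fitness
        P[0] = bb_refine(f, grad, P[0])         # local BB polish of the elite
        children = P[:pop // 2] + sigma * rng.normal(size=(pop // 2, dim))
        P[pop - pop // 2:] = children           # mutation-only offspring
    return min(P, key=f)
```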

8.
A modified PRP conjugate gradient method
This paper gives a modified PRP method which is globally convergent for nonconvex functions and R-linearly convergent for uniformly convex functions. Furthermore, the presented method has the sufficient descent property and automatically keeps its search directions in a trust region without any line search technique. Numerical results indicate that the new method is promising on the given test problems. This work is supported by Guangxi University SF grant X061041 and China NSF grant 10761001.

9.
In this paper we propose a new line search algorithm that ensures global convergence of the Polak-Ribière conjugate gradient method for the unconstrained minimization of nonconvex differentiable functions. In particular, we show that with this line search every limit point produced by the Polak-Ribière iteration is a stationary point of the objective function. Moreover, we define adaptive rules for the choice of the parameters such that the first stationary point along a search direction is eventually accepted when the algorithm converges to a minimum point with a positive definite Hessian matrix. Under strong convexity assumptions, the known global convergence results can be recovered as a special case. From a computational point of view, we may expect that an algorithm incorporating the step-size acceptance rules proposed here will retain the good features of the Polak-Ribière method while avoiding pathological situations. This research was supported by Agenzia Spaziale Italiana, Rome, Italy.

10.
Two new methods for unconstrained optimization are presented. Both methods employ a hybrid direction strategy which is a modification of Powell's 1970 dogleg strategy. They also employ a projection technique introduced by Davidon in his 1975 algorithm which uses projection images of x and g in updating the approximate Hessian. The first method uses Davidon's optimally conditioned update formula, while the second uses only the BFGS update. Both methods performed well without Powell's special iterations and singularity safeguards, and the numerical results are very promising. This research was supported by the National Science Foundation under Grant No. GJ-40903.
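For reference, the inverse-Hessian BFGS update that the second method relies on; this is the standard formula with a curvature safeguard, independent of the dogleg and projection machinery described above:

```python
import numpy as np

def bfgs_update(H, s, y):
    """Inverse-Hessian BFGS update.

    H: current inverse Hessian approximation; s = x_{k+1} - x_k;
    y = g_{k+1} - g_k. Skips the update if the curvature s'y is not
    safely positive.
    """
    sy = s.dot(y)
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return H
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)
```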

11.
The conjugate gradient method is a useful and powerful approach for solving large-scale minimization problems. Liu and Storey developed a conjugate gradient method which has good numerical performance but lacks a global convergence guarantee under traditional line searches such as the Armijo, Wolfe, and Goldstein line searches. In this paper we propose a new nonmonotone line search for the Liu-Storey conjugate gradient method (LS for short). The new nonmonotone line search guarantees the global convergence of the LS method and performs well numerically. By estimating the Lipschitz constant of the derivative of the objective function within the new nonmonotone line search, we can find an adequate step size and substantially decrease the number of function evaluations at each iteration. Numerical results show that the new approach is effective in practical computation.
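For context, the Liu–Storey direction itself; the paper's contribution is the nonmonotone line search wrapped around it, not this formula:

```python
import numpy as np

def ls_direction(g_new, g_old, d_old):
    """Liu-Storey conjugate gradient direction.

    beta^LS = g_new'(g_new - g_old) / (-d_old'g_old); the denominator is
    positive whenever d_old is a descent direction (d_old'g_old < 0).
    """
    beta = g_new.dot(g_new - g_old) / (-d_old.dot(g_old))
    return -g_new + beta * d_old
```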

12.
In this paper, we propose a three-term conjugate gradient method via the symmetric rank-one (SR1) update. The basic idea is to exploit the good properties of the SR1 update in providing quality Hessian approximations, so as to construct a conjugate gradient search direction that requires no matrix storage and possesses the sufficient descent property. Numerical experiments on a set of standard unconstrained optimization problems show that the proposed method is superior to many well-known conjugate gradient methods in terms of efficiency and robustness.
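One way to realize the matrix-free idea is a memoryless SR1 inverse update: setting H_k = I in the SR1 formula makes d = -Hg expand into terms in g, s, and y alone. This is a plausible reading of the abstract, not necessarily the authors' exact three-term formula:

```python
import numpy as np

def memoryless_sr1_direction(g, s, y, eps=1e-10):
    """Matrix-free direction from a memoryless SR1 inverse update.

    With H_k = I, the SR1 inverse update gives
    H = I + (s - y)(s - y)' / ((s - y)'y), so d = -H g needs only the
    vectors g, s, y -- no matrix is stored.
    """
    u = s - y
    denom = u.dot(y)
    if abs(denom) < eps * np.linalg.norm(u) * np.linalg.norm(y):
        return -g                      # skip the update when ill-defined
    return -g - (u.dot(g) / denom) * u
```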

13.
In this paper, a new type of stepsize for the gradient method, the approximate optimal stepsize, is introduced to interpret the Barzilai–Borwein (BB) method, and an efficient gradient method with an approximate optimal stepsize for strictly convex quadratic minimization is presented. Based on a multi-step quasi-Newton condition, we construct a new quadratic approximation model to generate the approximate optimal stepsize. We then truncate it with the two well-known BB stepsizes to improve numerical performance and use the resulting value as the stepsize of the gradient method. We establish the global convergence and R-linear convergence of the proposed method. Numerical results show that the proposed method outperforms some well-known gradient methods.
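The truncation step the abstract describes clips the approximate optimal stepsize into the interval spanned by the two BB stepsizes. A minimal sketch (by the Cauchy–Schwarz inequality, alpha_BB2 <= alpha_BB1 whenever s'y > 0):

```python
import numpy as np

def truncated_stepsize(alpha_approx, s, y):
    """Clip an approximate optimal stepsize into [alpha_BB2, alpha_BB1],
    with s = x_k - x_{k-1} and y = g_k - g_{k-1}; assumes s'y > 0."""
    alpha_bb1 = s.dot(s) / s.dot(y)
    alpha_bb2 = s.dot(y) / y.dot(y)
    return min(max(alpha_approx, alpha_bb2), alpha_bb1)
```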

14.
刘亚君, 刘新为. 《计算数学》 2016, 38(1): 96-112
Gradient methods are an important class of methods for unconstrained optimization, and their numerical behaviour depends heavily on the choice of step size. Noting that the BB step size implicitly carries second-order information about the objective function, this paper combines the BB method with the trust-region framework: the reciprocal of the BB step size is used to approximate the Hessian of the objective, while the trust-region subproblem is used to choose the gradient step size more flexibly, yielding monotone and nonmonotone trust-region BB methods for unconstrained optimization. Global convergence of the algorithms is proved under suitable assumptions. Numerical experiments show that, compared with existing BB-type methods for unconstrained optimization, the decrease of e_k = ||x_k - x*|| in the nonmonotone trust-region BB method exhibits a more pronounced staircase pattern and monotonicity, and hence faster convergence.
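A minimal sketch of the basic step: the reciprocal of the BB stepsize serves as a scalar Hessian model, so the trust-region subproblem has the closed-form solution below. This illustrates the idea in the abstract, not the authors' full monotone/nonmonotone algorithm:

```python
import numpy as np

def tr_bb_step(g, s_prev, y_prev, delta):
    """Trust-region step with the scalar BB Hessian model (1/alpha_BB) * I.

    The subproblem  min g's + ||s||^2 / (2 * alpha_BB)  s.t. ||s|| <= delta
    is solved by a gradient step of length min(alpha_BB * ||g||, delta).
    Falls back to the boundary step when the curvature s'y is not positive.
    """
    sy = s_prev.dot(y_prev)
    g_norm = np.linalg.norm(g)
    alpha_bb = s_prev.dot(s_prev) / sy if sy > 0 else delta / g_norm
    return -min(alpha_bb, delta / g_norm) * g
```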

15.
Conjugate gradient methods are a widely used class of methods for solving large-scale unconstrained optimization problems. This paper proposes a new nonlinear conjugate gradient (CG) method; the theoretical analysis shows that the new algorithm enjoys sufficient descent under a variety of line search conditions. A global convergence theorem for the new CG algorithm is then proved. Finally, extensive numerical experiments are reported; the results show that the new algorithm is computationally more efficient than several traditional classes of CG methods.

16.
A three-parameter family of nonlinear conjugate gradient methods
In this paper, we propose a three-parameter family of conjugate gradient methods for unconstrained optimization. The family not only includes the six existing practical nonlinear conjugate gradient methods but also subsumes some other families of nonlinear conjugate gradient methods as subfamilies. With Powell's restart criterion, the family with the strong Wolfe line search is shown to ensure the descent property of each search direction. Some general convergence results are also established. This paper can also be regarded as a brief review of nonlinear conjugate gradient methods.
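A plausible form of such a family (the paper's exact parameterization may differ) blends the numerators and denominators of the six classical formulas; each choice of (lam, mu, omega) in {0, 1}^3 below recovers one of FR, PRP, HS, DY, CD, or LS:

```python
import numpy as np

def beta_three_param(g_new, g_old, d_old, lam, mu, omega):
    """Three-parameter beta blending the six classical CG formulas.

    lam=1, mu=0, omega=0 -> FR;   lam=0, mu=0, omega=0 -> PRP;
    lam=0, mu=1, omega=0 -> HS;   lam=1, mu=1, omega=0 -> DY;
    lam=1, mu=0, omega=1 -> CD;   lam=0, mu=0, omega=1 -> LS.
    """
    y = g_new - g_old
    num = lam * g_new.dot(g_new) + (1 - lam) * g_new.dot(y)
    den = ((1 - mu - omega) * g_old.dot(g_old)
           + mu * d_old.dot(y)
           - omega * d_old.dot(g_old))
    return num / den
```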


17.
The conjugate gradient method is a useful and powerful approach for solving large-scale minimization problems. Liu and Storey developed a conjugate gradient method which has good numerical performance but lacks a global convergence result under traditional line searches such as the Armijo, Wolfe, and Goldstein line searches. In this paper a convergent version of the Liu–Storey conjugate gradient method (LS for short) is proposed for minimizing functions that have Lipschitz continuous partial derivatives. By estimating the Lipschitz constant of the derivative of the objective function, we can find an adequate step size at each iteration, so as to guarantee global convergence and improve the efficiency of the LS method in practical computation.
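The step-size rule suggested by the abstract can be sketched as follows: estimate the local Lipschitz constant of the gradient from the last two iterates and minimize the resulting quadratic upper bound along the search direction. The exact estimation rule in the paper may differ:

```python
import numpy as np

def lipschitz_step(g, d, x_prev, x_curr, g_prev, g_curr):
    """Trial step from a local Lipschitz-constant estimate.

    L ~ ||g_k - g_{k-1}|| / ||x_k - x_{k-1}|| estimates the Lipschitz
    constant of the derivative; minimizing the quadratic upper bound
    alpha*g'd + (L/2)*alpha^2*||d||^2 over alpha gives the step below.
    Assumes d is a descent direction, so the result is positive.
    """
    L = np.linalg.norm(g_curr - g_prev) / np.linalg.norm(x_curr - x_prev)
    return -g.dot(d) / (L * d.dot(d))
```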

18.
Efficient generalized conjugate gradient algorithms, part 1: Theory
The effect of inexact line search on conjugacy is studied in unconstrained optimization. A generalized conjugate gradient method based on this effect is proposed and shown to have global convergence for a twice continuously differentiable function with a bounded level set.

19.
Efficient generalized conjugate gradient algorithms, part 2: Implementation
In Part 1 of this paper (Ref. 1), a new generalized conjugate gradient algorithm was proposed and its convergence investigated. In this second part, the new algorithm is compared numerically with other modified conjugate gradient methods and with limited-memory quasi-Newton methods.

20.
This paper describes a gradient projection-multiplier method for solving the general nonlinear programming problem. The algorithm poses a sequence of unconstrained optimization problems which are solved using a new projection-like formula to define the search directions. The unconstrained minimization of the augmented objective function determines points where the gradient of the Lagrangian function is zero. Points satisfying the constraints are located by applying an unconstrained algorithm to a penalty function. New estimates of the Lagrange multipliers and basis constraints are made at points satisfying either a Lagrangian condition or a constraint satisfaction condition. The penalty weight is increased only to prevent cycling. The numerical effectiveness of the algorithm is demonstrated on a set of test problems. The author gratefully acknowledges the helpful suggestions of W. H. Ailor, J. L. Searcy, and D. A. Schermerhorn during the preparation of this paper. The author would also like to thank D. M. Himmelblau for supplying a number of interesting test problems.
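For reference, the generic augmented objective that multiplier methods of this kind minimize for equality constraints c(x) = 0; a minimal sketch of the standard ingredient, not the paper's projection formula or multiplier update:

```python
import numpy as np

def augmented_lagrangian(f, c, x, lam, rho):
    """Augmented objective L_A = f(x) + lam'c(x) + (rho/2)||c(x)||^2.

    Unconstrained minimization of L_A in x locates points where the
    gradient of the (shifted-multiplier) Lagrangian vanishes, matching
    the role the abstract assigns to the augmented objective.
    """
    cx = np.asarray(c(x))
    return f(x) + lam.dot(cx) + 0.5 * rho * cx.dot(cx)
```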

