Similar Literature
20 similar documents found (search time: 78 ms)
1.
In this paper, a new class of memoryless non-quasi-Newton methods for solving unconstrained optimization problems is proposed, and the global convergence of the method with inexact line search is proved. Furthermore, we propose a hybrid method that mixes the memoryless non-quasi-Newton method with the memoryless Perry-Shanno quasi-Newton method. The global convergence of this hybrid memoryless method is proved under mild assumptions. Preliminary results show that these new methods are efficient on the given test problems. In particular, the memoryless non-quasi-Newton method requires little storage and computation, so it can efficiently solve large-scale optimization problems.

2.
In this paper, a recursive quadratic programming algorithm is proposed and studied. The line search functions used are Han's nondifferentiable penalty functions with a second-order penalty term. To avoid the Maratos effect, Fukushima's mixed direction is used as the direction of the line search. Finally, we prove the global convergence and the local second-order convergence of the algorithm.

3.
We propose a line search exact penalty method with a bi-objective strategy for nonlinear semidefinite programming. At each iteration, we solve a linear semidefinite program to test whether the linearized constraints are consistent. The search direction is generated by a piecewise quadratic-linear model of the exact penalty function. The penalty parameter depends only on the information at the current iterate. The line search strategy is penalty-free. Global and local convergence…

4.
Global convergence of a modified spectral CD conjugate gradient method
In this paper, we present a new nonlinear modified spectral CD conjugate gradient method for solving large-scale unconstrained optimization problems. The direction generated by the method is a descent direction for the objective function, and this property depends neither on the line search rule nor on the convexity of the objective function. Moreover, the modified method reduces to the standard CD method if the line search is exact. Under some mild conditions, we prove that the modified method with line search is globally convergent even if the objective function is nonconvex. Preliminary numerical results show that the proposed method is very promising.
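Under exact line search the modified method reduces to the standard CD (conjugate descent) method; a minimal sketch of that baseline on a quadratic is given below. The function name `cd_conjugate_gradient` and the quadratic test setup are illustrative assumptions, not the paper's modified spectral variant:

```python
import numpy as np

def cd_conjugate_gradient(grad, x0, A, iters=50):
    """Minimize f(x) = 0.5 x^T A x with Fletcher's CD beta formula.
    Exact line search is used (valid for quadratics), under which the
    modified method in the abstract coincides with standard CD."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-10:
            break
        alpha = -(g @ d) / (d @ (A @ d))     # exact step for a quadratic
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (-(d @ g))  # CD: beta = ||g+||^2 / (-d^T g)
        d = -g_new + beta * d
        g = g_new
    return x
```

On an n-dimensional strictly convex quadratic, this recursion terminates at the minimizer in at most n iterations, which makes it a convenient sanity check.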

5.
A kind of direct method is presented for the solution of optimal control problems with state constraints. These methods are sequential quadratic programming methods. At every iteration, a quadratic program, obtained by a quadratic approximation to the Lagrangian function and linear approximations to the constraints, is solved to get a search direction for a merit function. The merit function is formulated by augmenting the Lagrangian function with a penalty term. A line search is carried out along the search direction to determine a step length such that the merit function is decreased. The methods presented in this paper include continuous sequential quadratic programming methods and discrete sequential quadratic programming methods.
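The merit-function construction described above can be sketched as follows, assuming a quadratic penalty on equality constraints; the signature `merit(f, ceq, x, lam, c)` and the quadratic penalty form are assumptions, since the abstract does not specify the penalty term:

```python
import numpy as np

def merit(f, ceq, x, lam, c):
    """Merit function of the kind described in the abstract: the
    Lagrangian f(x) + lam^T c(x) augmented with a quadratic penalty
    (weight c) on the equality-constraint residual ceq(x)."""
    cx = ceq(x)
    return f(x) + lam @ cx + 0.5 * c * (cx @ cx)
```

The line search then requires this scalar to decrease along the SQP search direction.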

6.
In this paper, a new class of three-term memory gradient methods with a nonmonotone line search technique for unconstrained optimization is presented. Global convergence properties of the new methods are discussed. By combining the quasi-Newton method with the new method, the former is modified to have the global convergence property. Numerical results show that the new algorithm is efficient.
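A common nonmonotone line search of the kind mentioned above is the Grippo-Lampariello-Lucidi rule, sketched below; the paper may use a different rule, and the function name and defaults are illustrative:

```python
import numpy as np

def nonmonotone_armijo(f, x, d, g, f_hist, delta=1e-4, rho=0.5, max_back=30):
    """Nonmonotone Armijo rule: accept alpha once
    f(x + alpha*d) <= max(recent f values) + delta * alpha * g.d,
    where f_hist is a sliding window of recent objective values.
    Returns the last trial step if backtracking is exhausted."""
    f_ref = max(f_hist)   # reference value: worst recent objective
    alpha = 1.0
    for _ in range(max_back):
        if f(x + alpha * d) <= f_ref + delta * alpha * (g @ d):
            return alpha
        alpha *= rho       # backtrack
    return alpha
```

Compared with the monotone Armijo rule, using the windowed maximum as the reference lets the iterates occasionally increase the objective, which often helps on ill-conditioned problems.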

7.
A new contraction principle in Menger spaces
In the present work we introduce a new type of contraction mapping by using a specific function and obtain certain fixed point results in Menger spaces. The work is in line with research on generalizing Banach's contraction principle. We extend the notion of an altering distance function to Menger spaces and obtain fixed point results.

8.
A new conjugate gradient method is proposed in this paper. For any (inexact) line search, our scheme satisfies the sufficient descent property. The method is proved to be globally convergent if the restricted Wolfe-Powell line search is used. Preliminary numerical results show that it is efficient.
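The Wolfe-Powell acceptance test underlying such convergence results can be sketched as a simple check. This shows only the two classical Wolfe inequalities; the "restricted" variant in the abstract adds further bounds not reproduced here:

```python
import numpy as np

def satisfies_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Check the standard Wolfe conditions for step alpha along
    direction d: sufficient decrease plus the curvature condition."""
    fx, gx = f(x), grad(x)
    x_new = x + alpha * d
    armijo = f(x_new) <= fx + c1 * alpha * (gx @ d)   # sufficient decrease
    curvature = grad(x_new) @ d >= c2 * (gx @ d)      # curvature condition
    return bool(armijo and curvature)
```

Very small steps typically pass the decrease test but fail the curvature test, which is what forces the line search to make real progress.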

9.
The minimax optimization model introduced in this paper is an important model that has received some attention in recent years. In this paper, the application of the minimax model to selecting a distribution center location is first introduced. Then a new algorithm with a nonmonotone line search for solving non-decomposable minimax optimization is proposed. We prove that the new algorithm is globally convergent. Numerical results show that the proposed algorithm is effective.

10.
In this paper, an unconstrained optimization method using the nonmonotone second-order Goldstein line search is proposed. By using negative curvature information from the Hessian, the generated sequence is shown to converge to a stationary point satisfying the second-order optimality conditions. Numerical tests on a set of standard test problems confirm the efficiency of the new method.
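For reference, the classical (first-order) Goldstein acceptance test reads as below; the paper's second-order, nonmonotone variant adds curvature terms that are not reproduced in this sketch:

```python
import numpy as np

def satisfies_goldstein(f, grad, x, d, alpha, c=0.25):
    """Classical Goldstein test (0 < c < 1/2): the new objective value
    must lie between two linear decrease bounds,
    f(x) + (1-c)*alpha*g.d <= f(x + alpha*d) <= f(x) + c*alpha*g.d."""
    fx = f(x)
    gd = grad(x) @ d
    fn = f(x + alpha * d)
    return bool(fx + (1 - c) * alpha * gd <= fn <= fx + c * alpha * gd)
```

The lower bound rules out overly small steps without requiring a second gradient evaluation, which is the usual reason for preferring Goldstein over Wolfe.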

11.
In this paper, the non-quasi-Newton family with inexact line search applied to unconstrained optimization problems is studied. A new update formula for the non-quasi-Newton family is proposed. It is proved that the resulting algorithm with either a Wolfe-type or an Armijo-type line search converges globally and Q-superlinearly if the function to be minimized has a Lipschitz continuous gradient.

12.
Convergence of the non-quasi-Newton family on nonconvex functions
陈兰平  焦宝聪 《计算数学》2000,22(3):369-378
1. Introduction. For unconstrained optimization problems, quasi-Newton methods are among the most mature and most widely used solution methods. Over the past two decades or so, the convergence properties of quasi-Newton methods have been a major focus of theoretical research on nonlinear optimization algorithms. The study of quasi-Newton algorithms with inexact line search began with Powell [1] in 1976, who proved the global and superlinear convergence of the BFGS algorithm with the Wolfe line search. In 1978, Byrd, Nocedal and Ya-Xiang Yuan [3] successfully extended Powell's result to the restricted Broyden convex family. In 1989, Nocedal [4] proved, under the assumption that the objective function is uniformly convex, the global convergence of the BFGS algorithm with a backtracking line search…
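The BFGS recursion discussed in this introduction can be sketched as the standard inverse-Hessian update; the non-quasi-Newton family studied in the paper modifies this update, so only the classical case it generalizes is shown:

```python
import numpy as np

def bfgs_update(H, s, y):
    """Standard BFGS update of the inverse Hessian approximation H,
    from the step s = x_{k+1} - x_k and gradient change
    y = g_{k+1} - g_k.  Requires the curvature condition y.s > 0,
    which the Wolfe line search guarantees."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)
```

By construction the updated matrix satisfies the secant equation H_{k+1} y = s, which is the defining property the convergence analyses above build on.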

13.
This paper studies whether the non-quasi-Newton method for unconstrained optimization problems is globally convergent when a nonmonotone line search is used. Under the condition that the objective function is uniformly convex, we prove that the non-quasi-Newton family is globally convergent.

14.
《Optimization》2012,61(4):993-1009
Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and they have recently received much study. In this paper, we propose a new two-parameter family of conjugate gradient methods for unconstrained optimization. The two-parameter family not only includes three already existing practical nonlinear conjugate gradient methods, but also contains other families of conjugate gradient methods as subfamilies. The two-parameter family with the Wolfe line search is shown to ensure the descent property of each search direction. Some general convergence results are also established for the family. Numerical results show that the method is efficient on the given test problems. In addition, methods related to this family are discussed in a unified way.

15.
A three-parameter family of nonlinear conjugate gradient methods

In this paper, we propose a three-parameter family of conjugate gradient methods for unconstrained optimization. The three-parameter family not only includes six already existing practical nonlinear conjugate gradient methods, but also subsumes some other families of nonlinear conjugate gradient methods as its subfamilies. With Powell's restart criterion, the three-parameter family with the strong Wolfe line search is shown to ensure the descent property of each search direction. Some general convergence results are also established for the family. This paper can also be regarded as a brief review of nonlinear conjugate gradient methods.



16.
Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems. Recently, they have been much studied. This paper proposes a three-parameter family of hybrid conjugate gradient methods. Two important features of the family are that (i) it can avoid the propensity of small steps, namely, if a small step is generated away from the solution point, the next search direction will be close to the negative gradient direction; and (ii) its descent property and global convergence are likely to be achieved provided that the line search satisfies the Wolfe conditions. Some numerical results with the family are also presented.



17.
A BFGS-SQP algorithm is given for solving LC1 linearly constrained optimization problems, in which the step length is determined by the Armijo line search rule. To generalize the BFGS-SQP algorithm, this paper replaces the Armijo rule with the Wolfe line search rule; the analysis shows that the global convergence and superlinear convergence of the BFGS-SQP algorithm still hold.

18.
In this paper, we consider the DFP algorithm without exact line search. We strengthen the conditions on the line search and prove that, under the new line search conditions, the DFP algorithm is globally convergent, Q-superlinearly convergent, and n-step quadratically convergent.
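The DFP inverse-Hessian update that the abstract analyzes can be sketched directly from its closed form (the helper name `dfp_update` is illustrative):

```python
import numpy as np

def dfp_update(H, s, y):
    """DFP update of the inverse Hessian approximation H, with
    s = x_{k+1} - x_k and y = g_{k+1} - g_k:
    H+ = H - (H y y^T H) / (y^T H y) + (s s^T) / (s^T y)."""
    Hy = H @ y
    return H - np.outer(Hy, Hy) / (y @ Hy) + np.outer(s, s) / (s @ y)
```

Like BFGS, the update satisfies the secant equation H_{k+1} y = s, and the line search conditions discussed in the abstract keep s^T y positive so the update stays well defined.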

19.
In this paper, we focus on solving a class of nonlinear complementarity problems with non-Lipschitzian functions. We first introduce a generalized class of smoothing functions for the plus function. By combining it with Robinson's normal equation, we reformulate the complementarity problem as a family of parameterized smoothing equations. Then a smoothing Newton method combined with a new nonmonotone line search scheme is employed to compute a solution of the smoothing equations. The global and local superlinear convergence of the proposed method is proved under mild assumptions. Preliminary numerical results, obtained by applying the proposed approach to nonlinear complementarity problems arising from free boundary problems, are reported. They show that the smoothing function and the nonmonotone line search scheme proposed in this paper are effective.
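One classical smoothing of the plus function is the Chen-Harker-Kanzow-Smale function, sketched below; the paper's generalized class is not reproduced, and this is only an illustrative instance of the idea:

```python
import numpy as np

def smoothed_plus(x, mu):
    """Chen-Harker-Kanzow-Smale smoothing of the plus function
    (x)_+ = max(x, 0):  phi_mu(x) = (sqrt(x^2 + 4 mu^2) + x) / 2.
    Smooth in x for any mu > 0, and phi_mu(x) -> max(x, 0) as mu -> 0."""
    return 0.5 * (np.sqrt(x * x + 4.0 * mu * mu) + x)
```

Driving mu to zero along the Newton iterations recovers the original nonsmooth complementarity conditions while keeping each subproblem differentiable.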

20.
Global convergence of a new family of conjugate gradient methods
杜学武  徐成贤 《数学研究》1999,32(3):277-280
A new family of conjugate gradient methods for solving unconstrained optimization problems is proposed, and the descent property and global convergence of one of its subfamilies under an inexact line search are proved.
