Similar Literature
1.
In this paper, a new spectral PRP conjugate gradient algorithm is developed for solving unconstrained optimization problems, in which the search direction is a combination of the gradient and the previously obtained direction, and the steplength is determined by a Wolfe-type inexact line search. It is proved that the search direction at each iteration is a descent direction of the objective function. Under mild conditions, a global convergence theorem for the proposed method is established. Numerical results show that the algorithm is promising, particularly in comparison with several existing mainstream methods.
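For orientation only, here is a minimal sketch of a PRP-type conjugate gradient loop in Python; the paper's spectral scaling and its Wolfe-type line search are not given in the abstract, so a plain PRP coefficient and a backtracking (Armijo) step are used instead, and `f`/`grad` are assumed user-supplied callables:

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Illustrative PRP conjugate gradient iteration (not the paper's
    exact spectral variant). Backtracking stands in for the Wolfe search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:          # restart if d is not a descent direction
            d = -g
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho        # Armijo backtracking
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new @ (g_new - g) / (g @ g)   # PRP coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```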

2.
Multicriteria optimization with a multiobjective golden section line search
This work presents an algorithm for multiobjective optimization that is structured as follows: (i) a descent direction is calculated within the cone of descent and feasible directions, and (ii) a multiobjective line search is conducted along this direction, with a new multiobjective golden section segment partitioning scheme that directly finds line-constrained efficient points dominating the current one. This multiobjective line search procedure exploits the structure of the line-constrained efficient set, achieving a faster contraction rate of the search segment than the single-objective golden section line search. The proposed multiobjective optimization algorithm converges to points that satisfy the Kuhn-Tucker first-order necessary conditions for efficiency (the Pareto-critical points). Numerical results on two antenna design problems support the conclusion that the proposed method can robustly solve difficult nonlinear multiobjective problems defined in terms of computationally expensive black-box objective functions.
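For reference, the classical single-objective golden section line search, whose ~0.618 contraction rate the paper's multiobjective partitioning scheme improves upon, can be sketched as follows (the multiobjective variant itself is not reproduced here):

```python
import math

def golden_section(phi, a, b, tol=1e-6):
    """Classical golden section line search: minimizes a unimodal phi
    on [a, b], shrinking the bracket by a factor ~0.618 per iteration."""
    inv_phi = (math.sqrt(5) - 1) / 2
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    fc, fd = phi(c), phi(d)
    while b - a > tol:
        if fc < fd:                      # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = phi(c)
        else:                            # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = phi(d)
    return 0.5 * (a + b)
```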

3.
In this paper, we propose a strongly sub-feasible direction method for the solution of inequality constrained optimization problems whose objective functions are not necessarily differentiable. The algorithm combines the subgradient aggregation technique with the ideas of the generalized cutting plane method and the strongly sub-feasible direction method; as a result, a new direction-finding subproblem and a new line search strategy are presented. The algorithm can not only accept infeasible starting points but also preserve the "strong sub-feasibility" of the current iterate without unduly increasing the objective value. Moreover, once a feasible iterate occurs, the method automatically becomes a feasible descent algorithm. Global convergence is proved, and some preliminary numerical results show that the proposed algorithm is efficient.

4.
A new subspace minimization conjugate gradient algorithm with a nonmonotone Wolfe line search is proposed and analyzed. In the scheme, we propose two choices of the search direction by minimizing a quadratic approximation of the objective function in special subspaces, and state criteria for how to choose between them. Under the given conditions, we obtain the significant conclusion that each choice of direction satisfies the sufficient descent property. Based on a measure of how close the objective function is to a quadratic, a new strategy for choosing the initial stepsize in the line search is presented. With the nonmonotone Wolfe line search, we prove global convergence of the proposed method for general nonlinear functions under mild assumptions. Numerical comparisons with the well-known CGOPT and CG_DESCENT codes show that the proposed algorithm is very promising.
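Generically, subspace minimization CG methods obtain the direction from a quadratic model restricted to a low-dimensional subspace; one common form (the paper's particular subspaces and model are not specified in the abstract) is

```latex
d_k = \arg\min_{d \in \Omega_k} \; g_k^{\top} d + \tfrac{1}{2}\, d^{\top} B_k d,
\qquad \Omega_k = \operatorname{span}\{g_k,\ s_{k-1}\},
```

where B_k approximates the Hessian and s_{k-1} = x_k - x_{k-1}.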

5.
Global convergence of a class of hybrid conjugate gradient methods under Wolfe line search
This paper presents a new conjugate gradient formula, which is equivalent to the DY formula under exact line search, and establishes its relevant properties. Combining the new formula with the DY formula, a new hybrid conjugate gradient method is proposed; the new algorithm generates a descent direction under the Wolfe line search, its global convergence is proved, and numerical examples are given.
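For reference, the Dai-Yuan (DY) coefficient with which the new formula is hybridized (the new formula itself is not reproduced in the abstract) is

```latex
\beta_k^{\mathrm{DY}} = \frac{\|g_{k+1}\|^2}{d_k^{\top}(g_{k+1} - g_k)},
\qquad d_{k+1} = -g_{k+1} + \beta_k d_k .
```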

6.
Using the SQP method, the generalized projection technique, and the idea of strongly sub-feasible direction methods, a new rapidly convergent algorithm allowing arbitrary initial points is established for inequality constrained optimization. At each iteration, the algorithm only needs to solve one quadratic subprogram, which always has a feasible solution, or to compute a "first-order" strongly sub-feasible descent auxiliary search direction via generalized projection; the steplength is generated by a combination of curve search and straight-line search. Under fairly mild conditions, the algorithm possesses global, strong, superlinear, and quadratic convergence. Effective numerical experiments with the algorithm are reported.

7.
This paper constructs a gradient-free algorithm for solving nonlinear unconstrained optimization problems. Based on the traditional pattern search method, after each unsuccessful iteration the algorithm makes full use of the information at the existing iteration points to construct an approximate descent direction and generate a new iterate. Under rather weak conditions, the algorithm is globally convergent. Numerical experiments show that, compared with the traditional pattern search method, the amount of computation is markedly reduced.

8.
By applying a generalized projection correction technique to modify the search direction, improving the assumptions, and adopting a new type of first-order correction direction combined with the SQP technique, a new feasible descent SQP algorithm is established for the nonlinear constrained optimization problem (P); global convergence of the algorithm is proved under fairly mild assumptions. Since the new algorithm requires only modest storage, it is suitable for computation on large-scale optimization problems.

9.
This paper proposes a new feasible QP-free algorithm for nonlinear inequality constrained optimization problems. The new algorithm retains the advantages of existing algorithms and has the following features: (1) each iteration requires solving only three systems of linear equations with the same coefficient matrix, so the computational cost is low; (2) a feasible descent direction is obtained by solving a single system of linear equations, overcoming the earlier difficulty of solving two linear systems separately to obtain a descent direction and a feasible direction and then taking their convex combination; (3) all iterates are feasible points and are not required to be strict interior points; (4) a trial line search is adopted, which can further reduce the computational cost; (5) the algorithm has few parameters, and numerical experiments show that it has good numerical performance and strong stability.

10.
In this paper, a stochastic approximation (SA) algorithm with a new adaptive step size scheme is proposed. The new scheme uses a fixed number of previous noisy function values to adjust the step at every iteration. The algorithm is formulated for a general descent direction, and almost sure convergence is established. The case in which the negative gradient is chosen as the search direction is also considered. The algorithm is tested on a set of standard test problems. Numerical results show good performance and verify the efficiency of the algorithm compared with some existing algorithms with adaptive step sizes.
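As a hedged sketch of the idea (the paper's exact adaptation rule is not given in the abstract), an SA loop whose step grows or shrinks according to a window of recent noisy function values might look like this; `noisy_f` and `noisy_grad` are assumed callables, and the grow/shrink rule is purely illustrative:

```python
import numpy as np
from collections import deque

def sa_adaptive(noisy_f, noisy_grad, x0, a0=0.1, window=5, n_iter=500,
                grow=1.1, shrink=0.7):
    """Illustrative SA iteration with a window-based adaptive step size;
    the paper's actual scheme is not reproduced here."""
    x, a = np.asarray(x0, dtype=float), a0
    history = deque(maxlen=window)
    for _ in range(n_iter):
        d = -noisy_grad(x)          # noisy negative gradient as descent direction
        x = x + a * d
        val = noisy_f(x)
        if len(history) == window:  # adapt once the window is full
            a *= grow if val < np.median(list(history)) else shrink
        history.append(val)
    return x
```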

11.
An efficient class of descent methods for unconstrained optimization problems is the line search methods, in which a step size must be chosen at each iteration after a descent direction has been determined. There are many ways to choose the step sizes, such as the exact line search and the Armijo, Goldstein, and Wolfe line searches. In this paper we propose a new inexact line search for a general descent method and establish some global convergence properties. This new line search has many advantages compared with other similar inexact line searches. Moreover, we analyze the global convergence and local convergence rate of some special descent methods equipped with the new line search. Preliminary numerical results show that the new line search is effective and efficient in practical computation.
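The abstract does not specify the new line search itself, but the standard conditions it is compared against are easy to state; a small checker for the weak Wolfe conditions, with conventional parameter values, might read:

```python
import numpy as np

def satisfies_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Check the weak Wolfe conditions for step alpha along direction d
    (0 < c1 < c2 < 1); the first test alone is the Armijo condition."""
    g = grad(x)
    sufficient_decrease = f(x + alpha * d) <= f(x) + c1 * alpha * (g @ d)
    curvature = grad(x + alpha * d) @ d >= c2 * (g @ d)
    return sufficient_decrease and curvature
```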

12.
In this article, unconstrained minimax problems are discussed, and a sequential quadratic programming (SQP) algorithm with a new nonmonotone line search is presented. At each iteration, a descent search direction is obtained by solving a quadratic program (QP). To circumvent the Maratos effect, a high-order correction direction is obtained by solving another QP, and a new nonmonotone line search is performed. Under reasonable conditions, global convergence and a superlinear convergence rate are established. The results of numerical experiments are reported to show the effectiveness of the proposed algorithm.

13.
In this paper, a class of optimization problems with equality and inequality constraints is discussed. First, the original problem is transformed into an associated simpler problem with only inequality constraints and a parameter; the latter problem is shown to be equivalent to the original one if the parameter is large enough (but finite). A feasible descent SQP algorithm for the simplified problem is then presented. At each iteration of the proposed algorithm, a master direction is obtained by solving a quadratic program (which always has a feasible solution). With two corrections of the master direction by two simple explicit formulas, the algorithm generates a feasible descent direction for the simplified problem and a high-order correction direction that avoids the Maratos effect without requiring strict complementarity, and then performs a curve search to obtain the next iterate. Thanks to the new high-order correction technique, global and superlinear convergence are obtained under mild conditions without strict complementarity. Finally, numerical experiments with an efficient implementation are reported.

14.
An assumption is given on the parameter in the line search direction of super-memory gradient algorithms for unconstrained programming, which determines a new range of values for this parameter and guarantees that the search direction is a sufficient descent direction of the objective function; on this basis a new class of memory gradient algorithms is proposed. Global convergence of the algorithm is discussed under Armijo stepsize search without assuming boundedness of the sequence of iterates, and modified forms of the memory gradient method combined with the FR, PR, and HS conjugate gradient formulas are given. Numerical experiments show that the new algorithm is more stable and more efficient than the FR, PR, and HS conjugate gradient methods and the super-memory gradient method under Armijo line search.

15.
Feasible Direction Interior-Point Technique for Nonlinear Optimization
We propose a feasible direction approach for the minimization by interior-point algorithms of a smooth function under smooth equality and inequality constraints. It consists of the iterative solution, in the primal and dual variables, of the Karush–Kuhn–Tucker first-order optimality conditions. At each iteration, a descent direction is defined by solving a linear system. In a second stage, the linear system is perturbed so as to deflect the descent direction and obtain a feasible descent direction. A line search is then performed to get a new interior point and ensure global convergence. Based on this approach, first-order, Newton, and quasi-Newton algorithms can be obtained. To introduce the method, we first consider the inequality constrained problem and present a globally convergent basic algorithm. Particular first-order and quasi-Newton versions of this algorithm are also stated. Then, equality constraints are included. This method, which is simple to code, does not require the solution of quadratic programs, and it is neither a penalty method nor a barrier method. Several practical applications and numerical results show that our method is robust and efficient.
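For the inequality constrained case min f(x) subject to g(x) ≤ 0 that the method treats first, the KKT first-order conditions it iterates on read (equality constraints are added analogously):

```latex
\nabla f(x) + \nabla g(x)\,\lambda = 0, \qquad
\lambda_i\, g_i(x) = 0, \quad \lambda_i \ge 0, \quad g_i(x) \le 0, \quad i = 1,\dots,m.
```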

16.
Ferris and Mangasarian proposed the PVD (parallel variable distribution) algorithm for solving optimization problems, in which the variables are split into primary and auxiliary variables and distributed over p processors; besides updating its own primary variables, each processor also updates the auxiliary variables along given directions, which greatly improves the robustness and flexibility of the algorithm. Based on reference [6], this paper proposes a modified SQP-type PVD algorithm whose search direction is constructed as a combination of a descent direction and a feasible direction, and a high-order correction of this direction is introduced so that the algorithm effectively prevents the Maratos effect and can cope with inconsistent constraints arising when solving the subproblems. Under suitable conditions, global convergence of the algorithm is derived.

17.
In this paper, we suggest another accelerated conjugate gradient algorithm for which both the descent and the conjugacy conditions are guaranteed. The search direction is selected as a linear combination of the gradient and the previous direction, with the coefficients in this linear combination selected in such a way that both the descent and the conjugacy conditions are satisfied at every iteration. The algorithm introduces a modified Wolfe line search, in which the parameter in the second Wolfe condition is modified at every iteration. It is shown that, both for uniformly convex functions and for general nonlinear functions, the algorithm with the strong Wolfe line search generates directions bounded away from infinity. The algorithm uses an acceleration scheme that modifies the step length so as to improve the reduction of the function values along the iterations. Numerical comparisons with some conjugate gradient algorithms, using a set of 75 unconstrained optimization problems of different dimensions, show that the computational scheme outperforms known conjugate gradient algorithms such as Hestenes-Stiefel, Polak-Ribière-Polyak, Dai-Yuan and the hybrid Dai-Yuan, and CG_DESCENT with Wolfe line search, as well as the quasi-Newton L-BFGS method.
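To illustrate the idea (the paper's actual coefficient formulas are not given in the abstract), one way to pick the two coefficients in d = -a*g + b*s is to impose a descent condition and the pure conjugacy condition and solve the resulting 2x2 linear system:

```python
import numpy as np

def descent_conjugate_direction(g, s, y):
    """Solve for d = -a*g + b*s such that d @ g = -||g||^2 (descent)
    and d @ y = 0 (conjugacy), with s = x_{k+1} - x_k, y = g_{k+1} - g_k.
    Illustrative only; the paper's coefficients may differ."""
    A = np.array([[-(g @ g), s @ g],
                  [-(g @ y), s @ y]])
    rhs = np.array([-(g @ g), 0.0])
    a, b = np.linalg.solve(A, rhs)   # assumes the 2x2 system is nonsingular
    return -a * g + b * s
```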

18.
The PSB (Powell-Symmetric-Broyden) algorithm is a very important algorithm that has been used extensively in trust region methods. However, there are few studies of line search type PSB algorithms, primarily because the direction generated by this class of algorithms is not necessarily a descent direction of the objective function. In this paper, by combining a nonmonotone line search technique with the PSB method, we propose a nonmonotone PSB algorithm for solving unconstrained optimization. Under proper conditions, we establish global convergence and superlinear convergence of the proposed algorithm; we also verify its efficiency through numerical experiments.
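For reference, the PSB update itself is standard; given the step s = x_{k+1} - x_k and gradient difference y = g_{k+1} - g_k, it modifies the Hessian approximation B as follows:

```python
import numpy as np

def psb_update(B, s, y):
    """Powell-Symmetric-Broyden update of the Hessian approximation B:
    B+ = B + [(y-Bs)s^T + s(y-Bs)^T]/(s^T s) - [(y-Bs)^T s/(s^T s)^2] ss^T."""
    r = y - B @ s
    ss = s @ s
    return B + (np.outer(r, s) + np.outer(s, r)) / ss \
             - ((r @ s) / ss ** 2) * np.outer(s, s)
```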

19.
A stochastic approximation (SA) algorithm with new adaptive step sizes for solving unconstrained minimization problems in a noisy environment is proposed. The new adaptive step size scheme uses order statistics of a fixed number of previous noisy function values as a criterion for accepting good steps and rejecting bad ones. The scheme allows the algorithm to move in bigger steps and avoid steps proportional to $1/k$ when it is expected that larger steps will improve the performance. An algorithm with the new adaptive scheme is defined for a general descent direction, and almost sure convergence is established. The performance of the new algorithm is tested on a set of standard test problems and compared with relevant algorithms. Numerical results support the theoretical expectations and verify the efficiency of the algorithm regardless of the chosen search direction and noise level. Numerical results on problems arising in machine learning are also presented: a linear regression problem is considered using a real data set. The results suggest that the proposed algorithm shows promise.

20.
A NEW STEPSIZE FOR THE STEEPEST DESCENT METHOD
The steepest descent method is the simplest gradient method for optimization. It is well known that exact line searches along each steepest descent direction may converge very slowly. An important result was given by Barzilai and Borwein: their stepsize is proved to be superlinearly convergent for convex quadratics in two-dimensional space and performs quite well for high-dimensional problems. The BB method is not monotone, however, so it is not easy to generalize to general nonlinear functions unless certain nonmonotone techniques are applied. It is therefore very desirable to find stepsize formulae that enable fast convergence and possess the monotone property. Such a stepsize α_k for the steepest descent method is suggested in this paper. An algorithm using the new stepsize in even iterations and exact line search in odd iterations is proposed. Numerical results are presented, which confirm that the new method can find the exact solution within three iterations for two-dimensional problems. The new method is very efficient for small-scale problems. A modified version of the new method is also presented, in which the new technique for selecting the stepsize is used after every two exact line searches. The modified algorithm is comparable to the Barzilai-Borwein method for large-scale problems and better for small-scale problems.
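The paper's new monotone stepsize formula is not given in the abstract; for comparison, a minimal Barzilai-Borwein gradient loop on the quadratic f(x) = (1/2) x^T A x - b^T x (so g = Ax - b), which the new method is benchmarked against, is sketched below:

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=1000):
    """Barzilai-Borwein gradient method on a convex quadratic; uses the
    BB1 stepsize (s @ s)/(s @ y). Shown only as the comparison baseline."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b
    alpha = 1.0 / np.linalg.norm(g)    # cheap initial step; no history yet
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)      # BB1 stepsize
        x, g = x_new, g_new
    return x
```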
