Similar documents
20 similar documents found for this query.
1.
A quasi-physical transformation and algorithm for solving the vertex cover problem
Building on the existing quasi-physical approach to the CNF-SAT problem, this paper transforms the vertex cover problem into a continuous objective-optimization problem and then proposes a new competitive descent gradient method, yielding an efficient, practical and fast algorithm for the vertex cover problem.
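The abstract does not reproduce the quasi-physical transformation itself; as a rough illustration of the general idea (relaxing a discrete covering problem to a continuous objective and descending its gradient), here is a minimal penalty-based sketch. The objective, penalty weight `mu`, step size and rounding threshold are all assumptions for illustration, not the authors' formulation.

```python
# Hypothetical sketch: relax vertex cover to a box-constrained problem
#   min  sum(x_i) + mu * sum_(u,v) max(0, 1 - x_u - x_v)^2   on [0,1]^n,
# descend its gradient, then round. Not the authors' quasi-physical objective.
def solve_vertex_cover_relaxation(n, edges, mu=10.0, lr=0.002, iters=3000):
    x = [0.5] * n
    for _ in range(iters):
        grad = [1.0] * n                          # gradient of sum(x_i)
        for u, v in edges:
            slack = max(0.0, 1.0 - x[u] - x[v])   # uncovered-edge penalty
            grad[u] -= 2.0 * mu * slack
            grad[v] -= 2.0 * mu * slack
        # projected gradient step back onto the box [0,1]^n
        x = [min(1.0, max(0.0, xi - lr * gi)) for xi, gi in zip(x, grad)]
    return x

edges = [(0, 1), (1, 2), (0, 2)]                  # triangle graph
x = solve_vertex_cover_relaxation(3, edges)
cover = {i for i, xi in enumerate(x) if xi >= 0.45}   # heuristic rounding
```

On the triangle, the relaxation settles near x_i = 0.5 for every vertex, and rounding yields a feasible (if not minimum) cover; the penalty formulation only enforces x_u + x_v >= 1 approximately, which is why the rounding threshold sits below 0.5.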

2.
Sun Qingying, 《计算数学》 (Mathematica Numerica Sinica), 2004, 26(4): 401-412
Using a generalized projection matrix, this paper gives a new range of values for the parameters of the super-memory gradient algorithm for unconstrained programming, which guarantees a super-memory gradient generalized-projection descent direction for the objective function. Combined with techniques for handling an arbitrary initial point, a super-memory gradient generalized-projection algorithm with arbitrary initial point is established for optimization problems with nonlinear inequality constraints, and its convergence is proved under fairly weak conditions. Versions of the algorithm incorporating the FR, PR and HS conjugate-gradient parameters are also given, thereby extending the classical conjugate gradient method to constrained programming. Numerical examples show that the algorithm is effective.

3.
Li Chaoqiong, Li Feng, 《运筹学学报》 (Operations Research Transactions), 2010, 24(1): 101-114
The LQP alternating direction method is a highly effective method for monotone variational inequality problems with separable structure. It fully exploits the separable structure of the objective function, decomposing the original problem into several more tractable subproblems, and is well suited to large-scale problems. For monotone variational inequalities with three separable operators, a partially parallel splitting LQP alternating direction method is proposed by combining the augmented Lagrangian algorithm with the LQP alternating direction method. Two descent directions of the new algorithm are constructed and combined into a new descent direction, along which an optimal step size is given. Global convergence of the new algorithm is proved under fairly weak assumptions.

4.
A new SQP algorithm for equality-constrained optimization problems is proposed. A quasi-Newton method applied to an augmented Lagrangian function yields an equality-constrained quadratic programming subproblem, from which a descent direction is obtained. The penalty parameter adjusts automatically and is prevented from tending to infinity. To overcome the Maratos effect, the augmented Lagrangian is used as the merit function together with a second-order correction step. Under suitable conditions, the algorithm is shown to be globally convergent with a superlinear convergence rate.
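As background for the augmented-Lagrangian machinery such methods build on, a minimal sketch of the plain augmented Lagrangian loop on a toy equality-constrained problem (this is not the paper's SQP algorithm: there is no QP subproblem, no quasi-Newton update, and the penalty parameter is fixed):

```python
def aug_lagrangian_demo(rho=10.0, outer=20, inner=500, lr=0.01):
    # Plain augmented Lagrangian loop for:
    #   min x1^2 + x2^2   subject to   x1 + x2 - 1 = 0.
    # Inner minimization by gradient descent; the multiplier follows the
    # classical update  lam <- lam + rho * h(x).
    x = [0.0, 0.0]
    lam = 0.0
    for _ in range(outer):
        for _ in range(inner):
            h = x[0] + x[1] - 1.0         # constraint residual
            t = lam + rho * h             # shared first-order term
            g = [2.0 * x[0] + t, 2.0 * x[1] + t]
            x = [xi - lr * gi for xi, gi in zip(x, g)]
        lam += rho * (x[0] + x[1] - 1.0)  # multiplier update
    return x, lam

x, lam = aug_lagrangian_demo()  # expect x near (0.5, 0.5), lam near -1
```

The exact solution is x = (0.5, 0.5) with multiplier -1, which the loop recovers; the SQP variant in the paper replaces the slow inner gradient loop with a quadratic programming step.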

5.
A new class of super-memory gradient methods for unconstrained optimization problems is studied, and its global convergence and linear convergence rate are analyzed. The algorithm generates new iterates via a multi-step curve search rule, determining the descent direction and step size simultaneously at each iteration, and requires no matrix computation or storage, making it suitable for large-scale optimization problems. Numerical experiments show that the algorithm is effective.

7.
Using a generalized projection matrix, conditions on the parameters of the three-term memory gradient algorithm for unconstrained programming are given, with ranges determined so as to guarantee a three-term memory gradient generalized-projection descent direction for the objective function. A three-term memory gradient generalized-projection algorithm is established for optimization problems with nonlinear equality and inequality constraints, and its convergence is proved. Versions incorporating the FR, PR and HS conjugate-gradient parameters are also given, extending the classical conjugate gradient algorithm to constrained programming. Numerical examples show that the algorithm is effective.

8.
A new class of memory gradient methods and their global convergence
Memory gradient methods for unconstrained optimization problems are studied. Using information from the current and previous iterates to generate the descent direction, a new class of unconstrained optimization algorithms is obtained, and global convergence is proved under the Wolfe line search. The new algorithm is simple in structure, requires no matrix computation or storage, and is suitable for large-scale optimization problems. Numerical experiments show that the algorithm is effective.
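The core idea, reusing the previous direction, can be sketched in a few lines. This is a generic memory-gradient illustration with Armijo backtracking rather than the Wolfe search analyzed in the paper, and the sizing rule for beta_k is one simple assumed choice that keeps d_k a descent direction, not the authors' rule.

```python
def memory_gradient(f, grad, x0, iters=1000):
    # Memory-gradient sketch: d_k = -g_k + beta_k * d_{k-1}. The sizing
    # beta_k = 0.1 * ||g_k|| / ||d_{k-1}|| is an assumed rule that keeps
    # d_k a descent direction (d_k . g_k <= -0.9 * ||g_k||^2).
    x = list(x0)
    d_prev = None
    for _ in range(iters):
        g = grad(x)
        gnorm = sum(gi * gi for gi in g) ** 0.5
        if gnorm < 1e-10:
            break
        if d_prev is None:
            d = [-gi for gi in g]
        else:
            dnorm = sum(di * di for di in d_prev) ** 0.5
            beta = 0.1 * gnorm / dnorm
            d = [-gi + beta * di for gi, di in zip(g, d_prev)]
        # Armijo backtracking: halve t until sufficient decrease holds
        slope = sum(gi * di for gi, di in zip(g, d))
        t = 1.0
        while f([xi + t * di for xi, di in zip(x, d)]) > f(x) + 0.1 * t * slope:
            t *= 0.5
        x = [xi + t * di for xi, di in zip(x, d)]
        d_prev = d
    return x

f = lambda x: x[0] ** 2 + 10.0 * x[1] ** 2
grad = lambda x: [2.0 * x[0], 20.0 * x[1]]
x = memory_gradient(f, grad, [5.0, 1.0])
```

Note that, as the abstract emphasizes, only the previous direction vector is stored; no matrix ever appears.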

9.
The classical variational inequality with a Lipschitz continuous, strongly monotone operator is studied on a nonempty closed convex subset of a Hilbert space. To solve this variational inequality, a new class of three-step relaxed hybrid steepest-descent methods is introduced, and strong convergence of the algorithm is proved under suitable assumptions on the algorithm parameters.
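For context, the simplest member of this family is the one-step projection method x_{k+1} = P_C(x_k - lam * F(x_k)), which converges for a Lipschitz, strongly monotone F when the step lam is small enough; the paper's three-step relaxed hybrid scheme adds relaxation and hybrid steps on top of this basic iteration. A minimal finite-dimensional sketch (the operator, set, and step size are illustrative assumptions):

```python
def solve_vi(F, project, x0, lam=0.5, iters=100):
    # One-step projection method for the variational inequality
    #   find x* in C with  <F(x*), y - x*> >= 0  for all y in C:
    #   x_{k+1} = P_C(x_k - lam * F(x_k)).
    x = list(x0)
    for _ in range(iters):
        y = [xi - lam * fi for xi, fi in zip(x, F(x))]
        x = project(y)
    return x

box = lambda y: [min(1.0, max(0.0, yi)) for yi in y]  # P_C for C = [0,1]^2
F = lambda x: [x[0] + 1.0, x[1] - 0.5]                # F(x) = x - b, b = (-1, 0.5)
x = solve_vi(F, box, [1.0, 1.0])
```

For this affine F the VI solution is the projection of b onto the box, namely (0, 0.5), and the iteration contracts toward it geometrically.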

10.
An identification method for the elastic coefficient in the one-dimensional viscoelastic wave equation
Cong Wenxiang, 《应用数学》 (Applied Mathematics), 1998, 11(1): 128-130
A new method is given for determining the elastic coefficient in the one-dimensional viscoelastic wave equation. Analysis of the algorithm shows that the method has low computational cost and good numerical stability. Numerical simulations demonstrate its feasibility and effectiveness.

11.
The aim of this paper is to propose a new multiple subgradient descent bundle method for solving unconstrained convex nonsmooth multiobjective optimization problems. Contrary to many existing multiobjective optimization methods, our method treats the objective functions as they are without employing a scalarization in a classical sense. The main idea of this method is to find descent directions for every objective function separately by utilizing the proximal bundle approach, and then trying to form a common descent direction for every objective function. In addition, we prove that the method is convergent and it finds weakly Pareto optimal solutions. Finally, some numerical experiments are considered.
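The "common descent direction" step above has a simple smooth analogue: for two differentiable objectives, the negated minimum-norm point of the segment between the two gradients descends both objectives at once. The paper handles nonsmooth objectives via a proximal bundle instead; the sketch below uses plain gradients and is only an illustration of the idea.

```python
def common_descent_direction(g1, g2):
    # Negate the minimum-norm point of the segment [g1, g2]:
    # minimize ||lam*g1 + (1-lam)*g2||^2 over lam in [0, 1] (closed form),
    # then return d = -(lam*g1 + (1-lam)*g2).
    diff = [a - b for a, b in zip(g1, g2)]
    denom = sum(di * di for di in diff)
    lam = 0.0 if denom == 0.0 else -sum(di * bi for di, bi in zip(diff, g2)) / denom
    lam = min(1.0, max(0.0, lam))
    return [-(lam * a + (1.0 - lam) * b) for a, b in zip(g1, g2)]

# Two conflicting gradients: the common direction descends both objectives
d = common_descent_direction([1.0, 0.0], [0.0, 1.0])
```

Here d = (-0.5, -0.5) makes an obtuse angle with both gradients; when the minimum-norm point is zero, no common descent direction exists and the point is (weakly) Pareto stationary.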

12.
In this paper, based on a new class of conjugate gradient methods proposed by Rivaie, Dai, Omer et al., we propose a class of improved conjugate gradient methods for nonconvex unconstrained optimization. In contrast to those methods, ours possess the following properties: (i) the search direction always satisfies the sufficient descent condition independent of any line search; (ii) the approaches are globally convergent with the standard Wolfe line search or the standard Armijo line search without any convexity assumption. Moreover, our numerical results demonstrate the efficiency of the proposed methods.
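A hedged sketch of this family of methods: it uses an RMIL-type parameter beta_k = g_k.(g_k - g_{k-1}) / ||d_{k-1}||^2 (truncated at zero), an Armijo backtracking search, and a steepest-descent restart whenever the direction is not clearly downhill. The restart safeguard and the specific constants are assumptions for the illustration, not the authors' exact construction (their directions satisfy sufficient descent without such a safeguard).

```python
def cg_rmil_sketch(f, grad, x0, iters=1000):
    # Nonlinear CG with beta_k = g_k.(g_k - g_{k-1}) / ||d_{k-1}||^2, an
    # Armijo line search, and a steepest-descent restart safeguard.
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]
    for _ in range(iters):
        gnorm2 = sum(gi * gi for gi in g)
        if gnorm2 < 1e-20:
            break
        slope = sum(gi * di for gi, di in zip(g, d))
        dnorm = sum(di * di for di in d) ** 0.5
        if slope >= -1e-8 * gnorm2 ** 0.5 * dnorm:   # restart safeguard
            d = [-gi for gi in g]
            slope = -gnorm2
        t = 1.0                                       # Armijo backtracking
        while f([xi + t * di for xi, di in zip(x, d)]) > f(x) + 1e-4 * t * slope:
            t *= 0.5
        x_new = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x_new)
        dnorm2 = sum(di * di for di in d)
        beta = max(0.0, sum(gn * (gn - go) for gn, go in zip(g_new, g)) / dnorm2)
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        x, g = x_new, g_new
    return x

f = lambda x: x[0] ** 2 + 10.0 * x[1] ** 2
grad = lambda x: [2.0 * x[0], 20.0 * x[1]]
x = cg_rmil_sketch(f, grad, [5.0, 1.0])
```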

13.
Recently, it has been observed that several nondifferentiable minimization problems share the property that the question of whether a given point is optimal can be answered by solving a certain bounded least squares problem. If the resulting residual vector, r, vanishes, then the current point is optimal. Otherwise, r is a descent direction. In fact, as we shall see, r points in the steepest descent direction. On the other hand, it is customary to characterize the optimality conditions (and the steepest descent vector) of a convex nondifferentiable function via its subdifferential. Also, it is well known that optimality conditions are usually related to theorems of the alternative. One aim of our survey is to clarify the relations between these subjects. Another aim is to introduce a new type of theorem of the alternative. The new theorems characterize the optimality conditions of discrete l1 approximation problems and multifacility location problems, and provide a simple way to obtain the subdifferential and the steepest descent direction in such problems. A further objective of our review is to demonstrate that the ability to compute the steepest descent direction at degenerate dead points opens a new way of handling degeneracy in active set methods.

14.
Most of the descent methods developed so far suffer from the computational burden of a sequence of constrained quadratic subproblems that must be solved to obtain a descent direction. In this paper we present a class of proximal-type descent methods with a new direction-finding subproblem. In particular, two of them have a linear programming subproblem instead of a quadratic subproblem. Computational experiments with these two methods have been performed on two well-known test problems. The results show that these methods are another very promising approach to nondifferentiable convex optimization.
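The regularization idea shared by proximal-type methods is the proximal step x_{k+1} = argmin_y f(y) + (1/(2c))(y - x_k)^2, which is well defined even for nondifferentiable convex f. A minimal one-dimensional sketch on f(y) = |y - 3|, where the proximal step has a closed form (soft-thresholding); this illustrates only the proximal regularization, not the paper's direction-finding subproblems:

```python
def prox_abs(x, c, a=3.0):
    # Proximal step for f(y) = |y - a|: shift, soft-threshold by c, shift back.
    z = x - a
    if z > c:
        z -= c
    elif z < -c:
        z += c
    else:
        z = 0.0
    return z + a

# Proximal point iteration: x_{k+1} = argmin_y |y - 3| + (1/(2c)) (y - x_k)^2
x, c = 10.0, 0.8
for _ in range(20):
    x = prox_abs(x, c)
```

Each step moves a fixed distance c toward the kink and then lands on the minimizer exactly, despite f being nondifferentiable there; no gradient of f is ever needed.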

15.
Chen Jun, Sun Wenyu, 《东北数学》 (Northeastern Mathematical Journal), 2008, 24(1): 19-30
In this paper, we combine nonmonotone and adaptive techniques with the trust region method for unconstrained minimization problems. We define a new ratio of actual to predicted descent. Then, instead of a monotone sequence, a nonmonotone sequence of function values is employed. With the adaptive technique, the trust region radius Δ_k can be adjusted automatically to improve the efficiency of trust region methods. By means of the Bunch-Parlett factorization, we construct a method with an indefinite dogleg path for solving the trust region subproblem, which can handle an indefinite approximate Hessian B_k. The convergence properties of the algorithm are established. Finally, detailed numerical results are reported to show that our algorithm is efficient.
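For readers unfamiliar with the baseline being modified, here is a plain monotone trust-region loop with a Cauchy-point step and the standard radius-update ratio test. The paper's contributions (nonmonotone ratio, adaptive radius rule, indefinite dogleg path via Bunch-Parlett) are not reproduced; everything below is the textbook skeleton they build on.

```python
def trust_region_cauchy(f, grad, hessvec, x0, delta=1.0, iters=100):
    # Basic trust-region loop: model m(p) = f + g.p + 0.5*p.B.p, Cauchy step,
    # ratio test rho = actual / predicted reduction, radius update.
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        gnorm = sum(gi * gi for gi in g) ** 0.5
        if gnorm < 1e-10:
            break
        Bg = hessvec(x, g)
        gBg = sum(gi * bi for gi, bi in zip(g, Bg))
        # Cauchy point: minimize the quadratic model along -g inside the ball
        tau = 1.0 if gBg <= 0.0 else min(1.0, gnorm ** 3 / (delta * gBg))
        p = [-tau * delta / gnorm * gi for gi in g]
        Bp = hessvec(x, p)
        pred = -(sum(gi * pi for gi, pi in zip(g, p))
                 + 0.5 * sum(pi * bi for pi, bi in zip(p, Bp)))
        xn = [xi + pi for xi, pi in zip(x, p)]
        rho = (f(x) - f(xn)) / pred if pred > 0.0 else -1.0
        if rho > 0.75:
            delta = min(2.0 * delta, 100.0)   # good model fit: expand radius
        elif rho < 0.25:
            delta *= 0.25                     # poor fit: shrink radius
        if rho > 0.0:
            x = xn                            # accept the trial step
    return x

f = lambda x: x[0] ** 2 + 10.0 * x[1] ** 2
grad = lambda x: [2.0 * x[0], 20.0 * x[1]]
hessvec = lambda x, v: [2.0 * v[0], 20.0 * v[1]]
x = trust_region_cauchy(f, grad, hessvec, [5.0, 1.0])
```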

16.
An efficient class of descent methods for unconstrained optimization problems is the line search methods, in which a step size must be chosen at each iteration after a descent direction is determined. There are many ways to choose the step size, such as the exact line search, the Armijo line search, the Goldstein line search, and the Wolfe line search. In this paper we propose a new inexact line search for a general descent method and establish some global convergence properties. This new line search has many advantages compared with other similar inexact line searches. Moreover, we analyze the global convergence and local convergence rate of some special descent methods with the new line search. Preliminary numerical results show that the new line search is usable and efficient in practical computation.
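As a concrete instance of the inexact rules listed above, here is a textbook bisection scheme for the weak Wolfe conditions, shown as the kind of rule the paper compares against (the paper's own line search is different and is not reproduced here):

```python
def wolfe_step(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    # Find t satisfying the weak Wolfe conditions by bracketing/bisection:
    #   f(x + t d) <= f(x) + c1 * t * g0.d    (sufficient decrease)
    #   g(x + t d).d >= c2 * g0.d             (curvature)
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    g0d = dot(grad(x), d)
    lo, hi, t = 0.0, float("inf"), 1.0
    for _ in range(max_iter):
        xt = [xi + t * di for xi, di in zip(x, d)]
        if f(xt) > f(x) + c1 * t * g0d:
            hi = t                                   # step too long: shrink
        elif dot(grad(xt), d) < c2 * g0d:
            lo = t                                   # step too short: grow
        else:
            return t
        t = (lo + hi) / 2.0 if hi < float("inf") else 2.0 * lo
    return t

f = lambda x: 0.5 * x[0] ** 2
grad = lambda x: [x[0]]
t = wolfe_step(f, grad, [4.0], [-4.0])
```

On this quadratic the unit step already satisfies both conditions, so the routine returns t = 1 immediately; in general the sufficient-decrease test caps t from above and the curvature test pushes it from below.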

17.
We consider Newton-like line search descent methods for solving non-linear least-squares problems. The basis of our approach is to choose a method, or parameters within a method, by minimizing a variational measure which estimates the error in an inverse Hessian approximation. In one approach we consider sizing methods and choose the sizing parameters in an optimal way. In another approach we consider various possibilities for hybrid Gauss-Newton/BFGS methods. We conclude that a simple Gauss-Newton/BFGS hybrid is both efficient and robust, and we illustrate this by a range of comparative tests with other methods. These experiments include not only many well-known test problems but also some new classes of large residual problems.
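The Gauss-Newton half of such a hybrid is easy to show in isolation. A one-parameter sketch on a zero-residual exponential fit (data and model are illustrative assumptions; the paper's hybrids switch to BFGS precisely when residuals are large and this pure step degrades):

```python
import math

def gauss_newton_1p(ts, ys, a0=0.0, iters=50):
    # One-parameter Gauss-Newton for  min_a sum_i (exp(a*t_i) - y_i)^2.
    a = a0
    for _ in range(iters):
        r = [math.exp(a * t) - y for t, y in zip(ts, ys)]   # residuals
        J = [t * math.exp(a * t) for t in ts]               # Jacobian column
        JtJ = sum(j * j for j in J)
        Jtr = sum(j * ri for j, ri in zip(J, r))
        if JtJ == 0.0:
            break
        a -= Jtr / JtJ   # solve the normal equation J^T J * da = -J^T r
    return a

ts = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [math.exp(0.5 * t) for t in ts]   # synthetic data with true a = 0.5
a = gauss_newton_1p(ts, ys)
```

Because the residual vanishes at the solution, Gauss-Newton converges quadratically here; with large residuals the neglected second-order term matters, which motivates the BFGS hybridization studied in the paper.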

18.
We propose two linearly convergent descent methods for finding a minimizer of a convex quadratic spline and establish global error estimates for the iterates. One application of such descent methods is to solve convex quadratic programs, since they can be reformulated as problems of unconstrained minimization of convex quadratic splines. In particular, we derive several new linearly convergent algorithms for solving convex quadratic programs. These algorithms could be classified as row-action methods, matrix-splitting methods, and Newton-type methods.
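As one small example of the row-action/matrix-splitting families mentioned above (not the authors' exact algorithms), projected Gauss-Seidel solves a bound-constrained convex QP by sweeping through the coordinates:

```python
def projected_gauss_seidel(A, b, iters=100):
    # Solve  min 0.5*x.A.x - b.x  subject to  x >= 0,
    # with A symmetric positive definite, by projected Gauss-Seidel sweeps:
    # each coordinate is minimized exactly, then clipped at its bound.
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = max(0.0, (b[i] - s) / A[i][i])
    return x

x = projected_gauss_seidel([[2.0, 1.0], [1.0, 2.0]], [1.0, -1.0])
```

For this instance the KKT conditions give x* = (0.5, 0): the second bound is active with a nonnegative reduced gradient, and the sweep reaches this point exactly. Each update touches only one row of A, which is the "row-action" character of the family.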

19.
Sun Qingying, 《数学进展》 (Advances in Mathematics), 2004, 33(5): 598-606
Using the Rosen projection matrix, a three-term memory gradient Rosen projection descent algorithm is established for optimization problems with linear or nonlinear inequality constraints, and its convergence is proved. Versions of the algorithm incorporating the FR, PR and HS conjugate-gradient parameters are also given, thereby extending the classical conjugate gradient method to constrained programming. Numerical examples show that the algorithm is effective.

20.
This paper shows that error bounds can be used as effective tools for deriving complexity results for first-order descent methods in convex minimization. In a first stage, this objective led us to revisit the interplay between error bounds and the Kurdyka-Łojasiewicz (KL) inequality. One can show the equivalence between the two concepts for convex functions having a moderately flat profile near the set of minimizers (as those of functions with Hölderian growth). A counterexample shows that the equivalence is no longer true for extremely flat functions. This fact reveals the relevance of an approach based on KL inequality. In a second stage, we show how KL inequalities can in turn be employed to compute new complexity bounds for a wealth of descent methods for convex problems. Our approach is completely original and makes use of a one-dimensional worst-case proximal sequence in the spirit of the famous majorant method of Kantorovich. Our result applies to a very simple abstract scheme that covers a wide class of descent methods. As a byproduct of our study, we also provide new results for the globalization of KL inequalities in the convex framework. Our main results inaugurate a simple method: derive an error bound, compute the desingularizing function whenever possible, identify essential constants in the descent method and finally compute the complexity using the one-dimensional worst case proximal sequence. Our method is illustrated through projection methods for feasibility problems, and through the famous iterative shrinkage thresholding algorithm (ISTA), for which we show that the complexity bound is of the form \(O(q^{k})\) where the constituents of the bound only depend on error bound constants obtained for an arbitrary least squares objective with \(\ell ^1\) regularization.
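The ISTA iteration whose complexity is analyzed above is the standard forward-backward step for l1-regularized least squares. A minimal dense-matrix sketch on a toy instance whose solution can be checked by hand (the instance, step size, and iteration count are assumptions for illustration):

```python
def ista(A, b, lam, t, iters=500):
    # ISTA for  min 0.5*||A x - b||^2 + lam*||x||_1 :
    #   x <- soft(x - t * A^T (A x - b), t * lam),  with t <= 1/||A^T A||.
    m, n = len(A), len(A[0])
    x = [0.0] * n
    soft = lambda z, k: max(0.0, z - k) - max(0.0, -z - k)  # soft-threshold
    for _ in range(iters):
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]  # A^T r
        x = [soft(xj - t * gj, t * lam) for xj, gj in zip(x, g)]
    return x

# Tiny lasso instance: the optimality conditions give x* = (0.95, 0)
x = ista([[1.0, 0.0], [1.0, 1.0]], [1.0, 1.0], lam=0.1, t=0.3)
```

On well-conditioned instances like this one the iterates contract geometrically, which is the \(O(q^{k})\) behavior the paper derives from error-bound constants of the \(\ell^1\)-regularized least-squares objective.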
